References
- [1] [PDF] Continuous Optimization (Nonlinear and Linear Programming), Aug 9, 2012. In mathematics, continuous optimization relies heavily on various forms of mathematical analysis, especially real analysis and functional ...
- [2] [PDF] Introduction. This book provides an introduction to continuous optimization, the minimization of continuous, real-valued functions of real variables over convex domains.
- [3] [PDF] Numerical Optimization - UCI Mathematics. Our goal in this book is to give a comprehensive description of the most powerful, state-of-the-art techniques for solving continuous optimization problems.
- [4] [PDF] The original Euler's calculus-of-variations method - Edwin F. Taylor. Leonhard Euler's original version of the calculus of variations (1744) used elementary mathematics and was intuitive, geometric, and easily visualized. ...
- [5] Lagrange and the calculus of variations | Lettera Matematica, May 3, 2014. Together with Euler, Lagrange is the inventor of the calculus of variations, a simple and elegant idea that revolutionised the way of solving ...
- [6] [PDF] LV Kantorovich and linear programming - arXiv, Jul 4, 2007. He spoke about the contents of his 1939 book, about resolving multipliers, various models and problems, etc. For the overwhelming majority of the ...
- [7] The (Dantzig) simplex method for linear programming - IEEE Xplore. In 1947, George Dantzig created a simplex algorithm to solve linear programs for planning and decision-making in large-scale enterprises.
- [8] Nonlinear Programming - Project Euclid. H. W. Kuhn and A. W. Tucker; editor: Jerzy Neyman. Second Berkeley Symp. on Math. Statist. and Prob., 1951, pp. 481-492.
- [9]
- [10] [PDF] A Survey of Optimization Methods from a Machine Learning ... - arXiv. As a representative of first-order optimization methods, the stochastic gradient descent method, as well as its variants, has been widely used in ...
- [11] Numerical Optimization | SpringerLink. Numerical Optimization presents a comprehensive and up-to-date description of the most effective methods in continuous optimization.
- [12] [PDF] NONLINEAR PROGRAMMING (Kuhn and Tucker, Second Berkeley Symposium). This work was done under contracts with the Office of Naval Research.
- [13] [PDF] Convex Optimization. This book is about convex optimization, a special class of mathematical optimization problems, which includes least-squares and linear programming ...
- [14] [PDF] Introduction to Optimization, and Optimality Conditions for ... We have the following definitions of local/global, strict/non-strict minima/maxima. Definition 1.1: x ∈ F is a local minimum of P if there exists ε > 0 such that ...
- [15] [PDF] OPTIMALITY CONDITIONS. Hence f(y) ≥ f(x) + ∇f(x)ᵀ(y − x) since ∇²f(x_λ) is positive semi-definite. Therefore, f is convex by (b) of Part (1). Convexity is also preserved by certain ...
- [16] [PDF] Convexity - Stanford AI Lab, May 29, 2018. If f is strictly convex, then there exists at most one local minimum of f in X. Consequently, if it exists, it is the unique global minimum of f ...
- [17] [PDF] 11.4 Maximizing and minimizing functions of two variables. Example: the function f(x) = sin(x) has a relative maximum value at x = π/2 ≈ 1.571, at x = 5π/2 ≈ 7.854, and at x = −3π/2 ≈ −4.712. It has a relative minimum ...
- [18] [PDF] Understanding search behavior via search landscape analysis in ... Consider a given local minimum. Associated with that local minimum is an attraction basin: the set of all candidate structures from which our local search ...
- [19] The theory of Newton's method - ScienceDirect. In this paper we deal only with the theory of Newton's method. We concentrate on the convergence properties, error estimates, complexity and related issues.
- [20] Trust Region Methods | SIAM Publications Library. This is the first comprehensive reference on trust-region methods, a class of numerical algorithms for the solution of nonlinear convex optimization problems.
- [21] [PDF] Constrained Optimization and Lagrange Multiplier Methods. Professor Bertsekas has done research in a broad variety of subjects from optimization theory, control theory, parallel and distributed computation, data ...
- [22] On the Stability of the Direct Elimination Method for Equality ... It is proved that the solution computed by the method is the exact solution of a perturbed problem, and bounds for data perturbations are given.
- [23]
- [24]
- [25] [PDF] SEQUENTIAL QUADRATIC PROGRAMMING METHODS. We review some of the most prominent developments in SQP methods since 1963 and discuss the relationship of SQP methods to other popular methods, including ...
- [26] [PDF] The Simplex Method - Stanford University. When the simplex method is used to solve a linear program, the number of iterations to solve the problem starting from a basic feasible solution is typically a ...
- [27] [PDF] Khachiyan's Linear Programming Algorithm - cs.wisc.edu. Khachiyan's polynomial time algorithm for determining whether a system of linear inequalities is satisfiable is presented, together with a proof of its validity.
- [28] The Cutting-Plane Method for Solving Convex Programs - SIAM.org. This paper introduces a master cutting plane algorithm for nonlinear programming that isolates the points it generates from one another until a solution is ...
- [29] A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse ... In this paper we present a new fast iterative shrinkage-thresholding algorithm (FISTA) which preserves the computational simplicity of ISTA but with a global ...
- [30] [PDF] Distributed Optimization and Statistical Learning via the Alternating ... This review discusses the alternating direction method of multipliers (ADMM), a simple but powerful algorithm that is well suited to distributed convex ...
- [31] [PDF] Stochastic Gradient Descent - Statistics & Data Science. Many variants provide better practical stability and convergence: momentum, acceleration, averaging, coordinate-adapted step sizes, variance reduction, ...
- [32] [PDF] A Stochastic Approximation Method - Columbia University. Herbert Robbins and Sutton Monro, The Annals of Mathematical Statistics, Vol. 22, No. 3 (Sep. 1951), pp. 400-407.
- [33] Handbook of Convergence Theorems for (Stochastic) Gradient ..., Jan 26, 2023. Abstract: This is a handbook of simple proofs of the convergence of gradient and stochastic gradient descent type methods.
- [34] [PDF] Making Gradient Descent Optimal for Strongly Convex Stochastic ... For strongly convex problems, its convergence rate was known to be O(log(T)/T), by running SGD for T iterations and returning the average point.
- [35] [PDF] Accelerating Stochastic Gradient Descent using Predictive Variance ... This paper introduces an explicit variance reduction method for stochastic gradient descent methods. For smooth and strongly convex functions, we prove ...
- [36] Minimizing Finite Sums with the Stochastic Average Gradient - arXiv, Sep 10, 2013. Abstract: We propose the stochastic average gradient (SAG) method for optimizing the sum of a finite number of smooth convex functions.
- [37] Empirical Risk Minimization for Stochastic Convex Optimization: $O ..., Feb 7, 2017. In this work, we strengthen the realm of ERM for SCO by exploiting smoothness and strong convexity conditions to improve the risk bounds.
- [38] [PDF] Lecture 17: Nonsmooth Optimization - cs.wisc.edu. In general, the maximum of (finitely many) smooth functions is a nonsmooth ... What is ∂f(x) for the ℓ1 norm f(x) = ∥x∥₁ := ∑_{i=1}^d |x_i|? With the above ...
- [39] Constructing Bundle Methods for Convex Optimization - ScienceDirect. This paper overviews the theory necessary to construct optimization algorithms based on convex analysis, namely on the use of approximate subdifferentials.
- [40] [PDF] NONSMOOTH OPTIMIZATION AND DESCENT METHODS, Claude ... This paper describes the role of nondifferentiable optimization from the point of view of systems analysis, briefly describes the state of the art, and gives a ...
- [41] [PDF] Proximal Algorithms - Stanford University. This suggests a close connection between proximal operators and gradient methods, and also hints that the proximal operator may be useful in optimization. ...
- [42] Polynomial time algorithms for some classes of constrained ... Abstract: In this paper we consider different classes of nonconvex quadratic problems that can be solved in polynomial time. We present an algorithm for the ...
- [43] Structural Optimization, Dec 13, 2007. Contemporary structural optimization has its roots in the 1960s with Lucien Schmit's seminal paper. Prior to that time there were no texts on ...
- [44] Structural Design by Systematic Synthesis - Vanderplaats. Author: Lucien A. Schmit, Jr.; publication date: September 8-9, 1960. Abstract: The use of analysis as a tool in structural design is well known. However ...
- [45] Model predictive heuristic control: Applications to industrial processes. A new method of digital process control is described. It relies on three principles: (a) the multivariable plant is represented by its impulse responses; ...
- [46] [PDF] Model predictive control: Theory, computation and design. This chapter gives an introduction to methods for the numerical solution of the MPC optimization problem. Numerical optimal control builds on two fields: ...
- [47] Supply Chain Design and Planning – Applications of Optimization ... This chapter describes the optimization models that effectively address the coordination of various decisions concerning the planning and design of the supply ...
- [48] [PDF] History of Optimal Power Flow and Formulations. The optimal power flow problem was first formulated in the 1960s (Carpentier 1962), but has proven to be a very difficult problem to solve. Linear solvers are ...
- [49] Optimal power flows - ScienceDirect.com. Carpentier, "Contribution à l'étude du dispatching économique" (Contribution to the study of economic dispatch), Bulletin de la Société Française des Electriciens, Vol. 3 (August 1962).
- [50] [PDF] Optimization of Low-Thrust Spiral Trajectories by Collocation. ... "Direct trajectory optimization using nonlinear programming and collocation," Journal ... As NASA examines potential missions in the post space shuttle era ...
- [51] Learning representations by back-propagating errors - Nature, Oct 9, 1986. We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in ...
- [52] Practical Bayesian Optimization of Machine Learning Algorithms, Jun 13, 2012. Machine learning algorithms frequently require careful tuning of model hyperparameters, regularization terms, and optimization parameters.
- [53] Early Stopping — But When? | SpringerLink. Abstract: Validation can be used to detect when overfitting starts during supervised training of a neural network; training is then stopped before convergence ...
- [54] [PDF] Language Models are Few-Shot Learners - NIPS papers. We demonstrate that scaling up language models greatly improves task-agnostic, few-shot performance, sometimes even becoming competitive with prior state-of ...