Linearly convergent descent methods for the unconstrained minimization of convex quadratic splines
- 1 July 1995
- journal article
- Published by Springer Nature in Journal of Optimization Theory and Applications
- Vol. 86 (1), 145-172
- https://doi.org/10.1007/bf02193464
Abstract
No abstract available
This publication has 27 references indexed in Scilit:
- A Newton Method for Convex Regression, Data Smoothing, and Quadratic Programming with Bounded Constraints, SIAM Journal on Optimization, 1993
- Remarks on Convergence of the Matrix Splitting Algorithm for the Symmetric Linear Complementarity Problem, SIAM Journal on Optimization, 1993
- Gauss-Seidel method for least-distance problems, Journal of Optimization Theory and Applications, 1992
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization, SIAM Journal on Control and Optimization, 1992
- Error Bound and Convergence Analysis of Matrix Splitting Algorithms for the Affine Variational Inequality Problem, SIAM Journal on Optimization, 1992
- On the Convergence of a Matrix Splitting Algorithm for the Symmetric Monotone Linear Complementarity Problem, SIAM Journal on Control and Optimization, 1991
- Iterative Methods for Large Convex Quadratic Programs: A Survey, SIAM Journal on Control and Optimization, 1987
- Some continuity properties of polyhedral multifunctions, published by Springer Nature, 1981
- Curvilinear path steplength algorithms for minimization which use directions of negative curvature, Mathematical Programming, 1980
- Convergence Conditions for Ascent Methods, SIAM Review, 1969