Second Derivatives for Optimizing Eigenvalues of Symmetric Matrices

Abstract
Let $A$ denote an $n \times n$ real symmetric matrix-valued function depending on a vector of real parameters, $x \in \Re^m$. Assume that $A$ is a twice continuously differentiable function of $x$, with the second derivative satisfying a Lipschitz condition. Consider the following optimization problem: minimize the largest eigenvalue of $A(x)$. Let $x^*$ denote a minimum. Typically, the maximum eigenvalue of $A(x^*)$ is multiple, so the objective function is not differentiable at $x^*$, and straightforward application of Newton's method is not possible. Nonetheless, the formulation of a method with local quadratic convergence is possible. The main idea is to minimize the maximum eigenvalue subject to a constraint that this eigenvalue has a certain multiplicity. The manifold $\Omega$ of matrices with such multiple eigenvalues is parameterized using a matrix exponential representation, leading to the definition of an appropriate Lagrangian function. Consideration of the Hessian of this Lagrangian function...
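
To make the nonsmoothness concrete, here is a minimal numerical sketch (a toy example, not taken from the paper): a one-parameter symmetric family $A(x)$ whose largest eigenvalue is $1 + |x|$, smooth away from the minimizer $x^* = 0$ but kinked exactly where the maximum eigenvalue becomes multiple.

```python
import numpy as np

def A(x):
    # Toy symmetric family (illustrative assumption, not from the paper):
    # eigenvalues are 1 + x and 1 - x, which coalesce at x = 0.
    return np.array([[1.0 + x, 0.0],
                     [0.0,     1.0 - x]])

def lambda_max(x):
    # np.linalg.eigvalsh returns the eigenvalues of a symmetric matrix
    # in ascending order, so the last entry is the largest.
    return np.linalg.eigvalsh(A(x))[-1]

# lambda_max(A(x)) = 1 + |x|: differentiable for x != 0, but not at the
# minimizer x = 0, where the largest eigenvalue has multiplicity two.
for x in (-0.5, -0.1, 0.0, 0.1, 0.5):
    print(f"x = {x:+.1f}   lambda_max = {lambda_max(x):.4f}")
```

This kink at the minimizer is what rules out a straightforward Newton iteration and motivates the paper's approach of imposing the eigenvalue multiplicity as an explicit constraint.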
