Optimal Load Flow Solution Using the Hessian Matrix

Abstract
The rapid convergence that Newton's method achieves through use of the Jacobian matrix has led to an investigation of a higher-order matrix, the Hessian, for even faster convergence. This approach turns out to unify nonlinear programming methods and Newton-based methods. The load flow problem can be defined as the solution of a system of simultaneous equations fᵢ(x) = 0, i = 1, …, n. It can be shown that Newton's method proceeds in a direction that minimizes F = ∑ fᵢ(x)². The Hessian load flow also minimizes F, by assuming that it is a quadratic function, so that the linearization becomes HΔx = -g, where the Hessian H is the matrix of second partial derivatives of F and the vector g is its gradient. The optimal load flow problem can be formulated by including additional terms in F, so that a single algorithm based on the Hessian essentially solves both the normal and the optimal load flow problems. An interesting aspect of the method is that an existing Newton's-method program can be upgraded to a Hessian program quite simply. The H matrix is somewhat less sparse than the corresponding Jacobian, but sparse enough that sparsity techniques should still be used. Furthermore, the Hessian can be obtained entirely from the Jacobian, avoiding extra explicit function evaluations in the program. The paper presents enough detail of the method for implementation of a computer program. Numerical examples are given and compared with results from Newton's method.
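The following Python sketch illustrates the iteration described above: minimizing F = ∑ fᵢ(x)² by solving HΔx = -g at each step, with g = 2Jᵀf and H = 2(JᵀJ + ∑ fᵢ∇²fᵢ). The two-equation residual system, starting point, and tolerance are hypothetical stand-ins for illustration only, not the load flow mismatch equations or the authors' program; in the paper the second-derivative information is recovered from the Jacobian itself, whereas here the toy second partials are simply written out.

import numpy as np

def residuals(x):
    # Hypothetical residuals f_i(x); a real load flow would use the bus
    # power mismatch equations instead.
    return np.array([x[0]**2 + x[1]**2 - 1.0,
                     x[0] - x[1]])

def jacobian(x):
    # J[i, j] = d f_i / d x_j for the residuals above.
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0,        -1.0]])

def gradient_and_hessian(x):
    # For F(x) = sum_i f_i(x)^2:
    #   g = 2 J^T f
    #   H = 2 (J^T J + sum_i f_i * (matrix of second partials of f_i))
    # The second-partials matrices are constant for this toy system.
    f = residuals(x)
    J = jacobian(x)
    g = 2.0 * J.T @ f
    d2f = [np.array([[2.0, 0.0], [0.0, 2.0]]),   # second partials of f_1
           np.array([[0.0, 0.0], [0.0, 0.0]])]   # second partials of f_2
    H = 2.0 * (J.T @ J + sum(fi * Hi for fi, Hi in zip(f, d2f)))
    return g, H

x = np.array([1.0, 0.5])                  # illustrative starting point
for iteration in range(20):
    g, H = gradient_and_hessian(x)
    dx = np.linalg.solve(H, -g)           # solve H Δx = -g
    x = x + dx
    if np.linalg.norm(residuals(x)) < 1e-10:
        break

print(iteration + 1, x, residuals(x))     # converges to x ≈ (0.7071, 0.7071)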
