Quasi-Newton methods are a family of iterative optimization algorithms for finding local maxima and minima of functions, typically applied to unconstrained problems whose objective is twice continuously differentiable. Their defining feature is that they avoid computing the Hessian matrix (the matrix of second partial derivatives) directly, which can be computationally expensive or impractical; instead, they build up an approximation to it (or to its inverse) from successive gradient evaluations.
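As a concrete illustration, below is a minimal sketch of BFGS, the most widely used quasi-Newton method, written with NumPy. The function names, tolerances, and the Rosenbrock test problem are illustrative choices, not part of any particular library's API; the key lines are the quasi-Newton search direction `p = -H @ g` and the rank-two update of the inverse-Hessian approximation `H`.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    """Minimize f from x0 using a BFGS inverse-Hessian approximation (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction

        # Backtracking line search satisfying the Armijo condition
        alpha, c = 1.0, 1e-4
        while f(x + alpha * p) > f(x) + c * alpha * (g @ p):
            alpha *= 0.5

        x_new = x + alpha * p
        g_new = grad(x_new)
        s = x_new - x                  # step taken
        y = g_new - g                  # change in gradient
        sy = s @ y
        if sy > 1e-10:                 # curvature condition; skip update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            # BFGS rank-two update of the inverse Hessian approximation
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function, whose minimum is at [1, 1]
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(bfgs(f, grad, [-1.2, 1.0]))
```

Note that the sketch only ever evaluates `f` and `grad`; no second derivatives are computed, which is exactly the trade-off the quasi-Newton approach makes.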