What are the different algorithms for nonlinear optimization?
1. Common algorithms for unconstrained nonlinear optimization problems

These include the gradient method (steepest descent), the conjugate gradient method, the variable metric (quasi-Newton) method, and the step acceleration (pattern search) method. The first three use the first or second derivatives of the objective function, so they suit problems where the function has an explicit expression whose derivatives exist and are easy to compute. The step acceleration method is the opposite: being derivative-free, it suits objective functions that are complicated, lack an analytical expression, or are not differentiable.
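As a minimal sketch of the first family, the steepest descent method repeatedly steps in the direction of the negative gradient. The function names and the quadratic test problem below are illustrative choices, not from the source:

```python
import numpy as np

def steepest_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=10000):
    """Minimize a function by stepping along its negative gradient
    with a fixed step size until the gradient norm is small."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # first-order stationarity test
            break
        x = x - lr * g
    return x

# Toy problem: f(x, y) = (x - 1)^2 + 2*(y + 2)^2, minimum at (1, -2).
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 2.0)])
x_star = steepest_descent(grad, x0=[0.0, 0.0])
```

Note that only the gradient is needed here; a derivative-free method such as pattern search would instead probe trial points around the current iterate and keep whichever lowers the function value.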

2. Common algorithms for constrained nonlinear optimization problems

Depending on whether the problem is transformed into an unconstrained one, these methods divide into feasible direction methods and penalty function methods (the exterior point method and the interior point method). The interior point method keeps every iterate inside the feasible region, so it suits cases where the objective function is complicated or undefined outside that region; the exterior point method is the opposite, approaching the constrained optimum from outside. Both come in different variants depending on how the penalty function or barrier function is constructed.
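The exterior point idea can be sketched as follows: the constraint violation is added to the objective as a quadratic penalty, the resulting unconstrained problem is solved, and the penalty weight is then increased so the iterates are driven toward feasibility from outside. The function names, step-size rule, and one-dimensional toy problem are my own illustration, not from the source:

```python
def exterior_penalty(grad_f, g, grad_g, x0, mu0=1.0, growth=10.0,
                     outer_iters=8, inner_iters=200):
    """Exterior point (penalty) sketch for: minimize f(x) s.t. g(x) <= 0.
    Each outer pass minimizes f(x) + mu * max(0, g(x))**2 by gradient
    descent, then multiplies the penalty weight mu by `growth`."""
    x = float(x0)
    mu = mu0
    for _ in range(outer_iters):
        lr = 0.4 / (1.0 + mu)  # shrink the step as the penalty stiffens
        for _ in range(inner_iters):
            viol = max(0.0, g(x))  # zero inside the feasible region
            x -= lr * (grad_f(x) + 2.0 * mu * viol * grad_g(x))
        mu *= growth
    return x

# Toy problem: minimize x^2 subject to x >= 1, i.e. g(x) = 1 - x <= 0.
# The constrained minimum is x = 1, approached from the infeasible side.
grad_f = lambda x: 2.0 * x
g = lambda x: 1.0 - x
grad_g = lambda x: -1.0
x_star = exterior_penalty(grad_f, g, grad_g, x0=0.0)
```

An interior point (barrier) variant would instead add a term such as a logarithmic barrier that blows up at the boundary, keeping every iterate strictly feasible; that is why it is preferred when the objective misbehaves outside the feasible region.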