- Ant colony optimization. Probabilistic technique for solving problems which can be reduced to finding good paths through graphs.
- BFGS (Broyden-Fletcher-Goldfarb-Shanno method). Quasi-Newton method for solving unconstrained nonlinear optimization problems.
- Branch and bound. Method to find optimal solutions of discrete and combinatorial optimization problems.
- Conjugate gradient method. Iterative algorithm for the numerical solution of systems of linear equations, whose matrix is symmetric and positive definite.
- Evolution strategy. Technique based on ideas of adaptation and evolution. Operators are mating selection, recombination, mutation, fitness function evaluation, and environmental selection.
- Gauss-Newton. An algorithm for solving nonlinear least squares problems.
- Gradient descent. Approaches a local minimum of a function by taking steps proportional to the negative of the gradient (or the approximate gradient) of the function at the current point.
- Gradient ascent. Approaches a local maximum of a function; like gradient descent, but takes steps proportional to the positive gradient.
- Levenberg-Marquardt. Numerical method for minimizing a sum of squares of several (generally nonlinear) functions that depend on a common set of parameters.
- Line search. Iterative approach to finding a local minimum of an objective function in unconstrained optimization.
- Local search. Metaheuristic for solving hard optimization problems, such as maximizing a criterion among a number of candidate solutions.
- Nelder-Mead method (downhill simplex method). A nonlinear optimization algorithm.
- Newton's method in optimization. The same algorithm used to find roots of equations in one or more dimensions can also be used to find local maxima and local minima of functions, by applying it to the derivative.
- PSO, Particle swarm optimization. Swarm-intelligence method modeled on particles in multidimensional space, each with a position and a velocity.
- Random-restart hill climbing. Meta-algorithm built on top of the hill climbing optimization algorithm.
- Simplex algorithm. An algorithm for solving linear programming problems.
- Simulated annealing. Generic probabilistic meta-algorithm for the global optimization problem, inspired by annealing in metallurgy.
- Steepest descent. See gradient descent.
- Stochastic tunneling. Approach to minimizing a function based on Monte Carlo sampling.
- Tabu search. Metaheuristic search method that uses memory structures to avoid revisiting recently explored solutions.
- Trust region. Another iterative approach to finding a local minimum of an objective function in unconstrained optimization.
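The gradient descent entry above can be sketched in a few lines. A minimal illustration; the quadratic f(x, y) = x² + y² and the fixed step size are illustrative choices, not from the text:

```python
def gradient_descent(grad, x0, step=0.1, iters=100):
    """Repeatedly take steps proportional to the negative gradient."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# f(x, y) = x^2 + y^2 has gradient (2x, 2y) and its minimum at the origin.
minimum = gradient_descent(lambda v: [2 * v[0], 2 * v[1]], [3.0, -4.0])
```

In practice the step size (learning rate) is often chosen by a line search rather than fixed.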
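Newton's method in optimization applies the root-finding iteration to the derivative, x ← x − f′(x)/f″(x). A one-dimensional sketch; the example function f(x) = x⁴/4 − x is an illustrative assumption:

```python
def newton_minimize(df, d2f, x, iters=20):
    """Newton's method for optimization: apply root finding to f'(x)."""
    for _ in range(iters):
        x -= df(x) / d2f(x)
    return x

# Illustrative: f(x) = x**4 / 4 - x, so f'(x) = x**3 - 1 and f''(x) = 3 * x**2;
# the unique minimum is at x = 1.
xmin = newton_minimize(lambda x: x ** 3 - 1, lambda x: 3 * x * x, x=2.0)
```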
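The conjugate gradient entry can be sketched for a small symmetric positive definite system; the 2×2 matrix below is an illustrative example, not from the text:

```python
def conjugate_gradient(A, b, iters=None, tol=1e-10):
    """Solve A x = b for symmetric positive definite A (dense lists)."""
    n = len(b)
    x = [0.0] * n
    r = b[:]            # residual b - A x, with x = 0 initially
    p = r[:]            # first search direction is the residual
    rs = sum(ri * ri for ri in r)
    for _ in range(iters or n):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        # New direction is conjugate to the previous ones.
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)
```

In exact arithmetic the method converges in at most n iterations for an n×n system.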
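Simulated annealing can be sketched as follows; the 1-D objective (x − 2)², the geometric cooling schedule, and all constants are illustrative assumptions:

```python
import math
import random

def simulated_annealing(f, x0, temp=1.0, cooling=0.995, iters=2000, seed=0):
    """Accept worse moves with probability exp(-delta / temp); cool over time."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for _ in range(iters):
        cand = x + rng.uniform(-1.0, 1.0)        # random neighbor
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / temp):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        temp *= cooling                          # "anneal": lower the temperature
    return best

best = simulated_annealing(lambda x: (x - 2.0) ** 2, x0=10.0)
```

Accepting occasional uphill moves at high temperature is what lets the method escape local minima.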
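Random-restart hill climbing, per the entry above, is a meta-algorithm: run greedy hill climbing from several random starting points and keep the best result. The multimodal objective sin(3x) + 0.1x² is an illustrative choice:

```python
import math
import random

def hill_climb(f, x, step=0.1, iters=200, rng=None):
    """Greedy local search: move to a random neighbor only if it improves f."""
    rng = rng or random.Random()
    fx = f(x)
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
    return x, fx

def random_restart(f, lo, hi, restarts=20, seed=0):
    """Meta-algorithm: restart hill climbing from random points, keep the best."""
    rng = random.Random(seed)
    return min((hill_climb(f, rng.uniform(lo, hi), rng=rng)
                for _ in range(restarts)),
               key=lambda t: t[1])

# A multimodal objective with several local minima (illustrative):
f = lambda x: math.sin(3 * x) + 0.1 * x * x
x_best, f_best = random_restart(f, -5.0, 5.0)
```

A single hill climb can get stuck in any local minimum; the restarts are what give a chance of finding the global one.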
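Particle swarm optimization, following the position-and-velocity description above. The swarm size, inertia weight, and acceleration constants are common textbook choices, not prescribed by the text:

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, seed=0):
    """Each particle tracks its own best position; the swarm shares a global best."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5      # inertia, cognitive, and social weights
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

sphere = lambda p: sum(xi * xi for xi in p)   # illustrative objective
best_pos, best_val = pso(sphere)
```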