Line Search Methods
Given a twice differentiable function $f\colon\mathbf{R}^n\to\mathbf{R}$ and a point $x\in\mathbf{R}^n$, this application computes $x_\mathrm{new}$ from the formula \[ x_\mathrm{new} = x + \alpha d \] where $d\in\mathbf{R}^n$ is a search direction along which $f$ decreases, so that $f(x_\mathrm{new}) < f(x)$, and $\alpha > 0$ is a step length (step size) that determines how far $x$ moves along that direction. The application provides two methods for choosing the search direction $d$: the steepest descent method and Newton's method. The steepest descent method takes the negative gradient of $f$ at $x$, \[ d = -\nabla f(x), \] while Newton's method takes \[ d = -[\nabla^2 f(x)]^{-1} \nabla f(x), \] where $\nabla^2 f(x)$ is the Hessian of $f$ at $x$. Given $x$ and $d$, the step length $\alpha$ is computed by a backtracking line search with the Armijo (sufficient decrease) condition.
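The following Python sketch illustrates the computation described above; it is not the application's own implementation, and the function names (backtracking_line_search, line_search_step) and parameter values (alpha0, rho, c) are illustrative assumptions. The gradient and Hessian are assumed to be supplied as callables.

```python
import numpy as np

def backtracking_line_search(f, grad, x, d, alpha0=1.0, rho=0.5, c=1e-4):
    """Shrink alpha until the Armijo (sufficient decrease) condition holds:
       f(x + alpha*d) <= f(x) + c * alpha * grad(x)^T d."""
    alpha = alpha0
    fx = f(x)
    slope = grad(x) @ d                    # directional derivative along d
    while f(x + alpha * d) > fx + c * alpha * slope:
        alpha *= rho                       # backtrack: reduce the step length
    return alpha

def line_search_step(f, grad, x, hess=None):
    """One step x_new = x + alpha*d, with d chosen by steepest descent,
       or by Newton's method when a Hessian callable is given."""
    g = grad(x)
    if hess is None:
        d = -g                              # steepest descent: d = -grad f(x)
    else:
        d = -np.linalg.solve(hess(x), g)    # Newton: d = -[Hess f(x)]^{-1} grad f(x)
    alpha = backtracking_line_search(f, grad, x, d)
    return x + alpha * d
```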
Determining $d$, $\alpha$, and $x_\mathrm{new}$ constitutes a single iteration of the iterative methods used to solve unconstrained optimization problems; the result $x_\mathrm{new}$ can then serve as the starting point of the next iteration. This application, however, performs only one iteration. If you are looking for a complete solver, the unconstrained optimization application is recommended.
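As a hedged illustration of how the single step above would be repeated by a full solver, the snippet below iterates the hypothetical line_search_step from the previous sketch on an example quadratic; the function, starting point, and tolerance are assumptions for demonstration only.

```python
# Example: repeat the single step until the gradient is small,
# which is what a complete unconstrained-optimization solver would do.
f    = lambda x: (x[0] - 1)**2 + 10 * (x[1] + 2)**2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])

x = np.array([5.0, 5.0])                     # user-supplied starting point
while np.linalg.norm(grad(x)) > 1e-6:
    x = line_search_step(f, grad, x)         # steepest descent step
print(x)                                     # approaches the minimizer [1, -2]
```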
Input Data
This application requires the same input format as unconstrained optimization, except that there are no default values for $x$ in this application; they must be entered by the user.