Line Search

Line search is a subroutine for computing the step size in descent-direction-based optimization methods. Suppose we want to minimize a function \(f : \mathbb{R}^n \rightarrow \mathbb{R}\), and that we have a current point \(\mathbf{x}\) and a descent direction \(\mathbf{d}\). Finding the optimal step size \(\alpha\) is then the 1D optimization problem: \[ \min_\alpha f(\mathbf{x}+\alpha \mathbf{d})\]

The Brent-Dekker method is a commonly used algorithm for such 1D optimization problems.
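As an illustration, here is a minimal sketch of an exact line search built on SciPy's minimize_scalar, which implements Brent's method; the quadratic f, point x, and direction d below are placeholder choices, not taken from any particular source.

#+begin_src python
import numpy as np
from scipy.optimize import minimize_scalar

def exact_line_search(f, x, d):
    """Minimize the 1D restriction phi(alpha) = f(x + alpha * d) over alpha."""
    phi = lambda alpha: f(x + alpha * d)
    result = minimize_scalar(phi, method="brent")  # derivative-free Brent minimizer
    return result.x

# Example: f(x) = ||x||^2, stepping from (1, 2) along the steepest-descent direction.
f = lambda x: np.dot(x, x)
x = np.array([1.0, 2.0])
d = -2.0 * x                 # -grad f(x), a descent direction
alpha = exact_line_search(f, x, d)
print(alpha, x + alpha * d)  # alpha is about 0.5, landing at the minimizer (0, 0)
#+end_src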

However, performing an exact line search at every iteration is often too computationally expensive, so approximate line search methods are commonly employed instead. These methods find a value of \(\alpha\) that is not necessarily optimal but satisfies the following conditions, where \(\mathbf{x}^{(k+1)} = \mathbf{x}^{(k)} + \alpha\mathbf{d}^{(k)}\):

1. The sufficient decrease (Armijo) condition: \(f(\mathbf{x}^{(k+1)}) \leq f(\mathbf{x}^{(k)}) + \beta\alpha\nabla_{\mathbf{d}^{(k)}}f(\mathbf{x}^{(k)})\) for some \(\beta \in (0, 1)\).
2. The curvature condition: \(\nabla_{\mathbf{d}^{(k)}}f(\mathbf{x}^{(k+1)}) \geq \sigma\nabla_{\mathbf{d}^{(k)}}f(\mathbf{x}^{(k)})\) for some \(\sigma \in (\beta, 1)\).
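Backtracking line search is a common approximate method that enforces only the sufficient decrease condition. The following is a minimal sketch; f, grad_f, and the constants are chosen purely for illustration.

#+begin_src python
import numpy as np

def backtracking_line_search(f, grad_f, x, d, alpha=1.0, beta=1e-4, shrink=0.5):
    """Shrink alpha geometrically until the sufficient decrease condition holds."""
    fx = f(x)
    slope = grad_f(x) @ d  # directional derivative; negative for a descent direction
    while f(x + alpha * d) > fx + beta * alpha * slope:
        alpha *= shrink
    return alpha

# Example on f(x) = ||x||^2 with the steepest-descent direction.
f = lambda x: np.dot(x, x)
grad_f = lambda x: 2.0 * x
x = np.array([1.0, 2.0])
print(backtracking_line_search(f, grad_f, x, -grad_f(x)))  # 0.5 here
#+end_src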

An alternative to the curvature condition is the strong curvature condition: \(|\nabla_{\mathbf{d}^{(k)}}f(\mathbf{x}^{(k+1)})| \leq -\sigma\nabla_{\mathbf{d}^{(k)}}f(\mathbf{x}^{(k)})\).

Together, the sufficient decrease and curvature conditions are called the Wolfe conditions; with the strong curvature condition in place of the curvature condition, they are called the strong Wolfe conditions.
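To make the conditions concrete, here is a minimal sketch of a predicate that checks whether a candidate step size satisfies the Wolfe (or strong Wolfe) conditions; the names and default constants are illustrative assumptions.

#+begin_src python
import numpy as np

def satisfies_wolfe(f, grad_f, x, d, alpha, beta=1e-4, sigma=0.9, strong=False):
    """Check the Wolfe (or strong Wolfe) conditions for step size alpha."""
    slope0 = grad_f(x) @ d              # directional derivative at x
    slope1 = grad_f(x + alpha * d) @ d  # directional derivative at the new point
    sufficient_decrease = f(x + alpha * d) <= f(x) + beta * alpha * slope0
    if strong:
        curvature = abs(slope1) <= -sigma * slope0  # strong curvature condition
    else:
        curvature = slope1 >= sigma * slope0        # curvature condition
    return sufficient_decrease and curvature

# The exact minimizer along the line satisfies the strong Wolfe conditions.
f = lambda x: np.dot(x, x)
grad_f = lambda x: 2.0 * x
x = np.array([1.0, 2.0])
print(satisfies_wolfe(f, grad_f, x, -grad_f(x), alpha=0.5, strong=True))  # True
#+end_src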
