We always want to keep a point together with the corresponding function
value and gradient. This is because re-evaluating the function f can
be prohibitively expensive, and a point by itself is useless: we always
need to know something about f at that point in order to do anything
useful with it. These three things should be like quarks: never found in
isolation. The initial start point is the only exception.
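The idea can be sketched as a small immutable triple (the class and field names here are hypothetical, not from the library):

```java
// A minimal sketch: a point travels together with the function value and
// the gradient computed at that point, so the three can never drift apart.
final class Evaluated {
    final double[] point;     // x
    final double value;       // f(x)
    final double[] gradient;  // gradient of f at x

    Evaluated(double[] point, double value, double[] gradient) {
        this.point = point;
        this.value = value;
        this.gradient = gradient;
    }
}
```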
Attempts to minimize the differentiable function f, starting at the given
point and moving in the given direction. The function f should not be
evaluated more than maxEvals times. This method returns the approximate
minimum lying on the specified ray, together with the actual number of
evaluations that were needed.
This method provides a default implementation, which makes use of the
simplified line search. This implementation is, however, rather wasteful,
since it calculates function values and gradients multiple times. For
better performance, it is advisable to set simplifiedLineSearch to ???
and override this method directly. Doing so lets you control which
values are cached and which are discarded.
Guideline: use simplifiedLineSearch to quickly prototype and to compare different line search methods. Once you have decided which method is best, implement it properly, taking care to calculate everything exactly once.
twice differentiable function bounded from below
search direction
maximum number of evaluations of function f
next solution approximation; the value and gradient at that point; the alpha used for the line search; and the actual number of function evaluations
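A sketch of what such a line search might look like in Java. All names are hypothetical, and the backtracking strategy below stands in for the simplified line search mentioned above; the convention that f returns its value and gradient in one array is an assumption made so that each call counts as a single evaluation:

```java
import java.util.function.Function;

final class RaySearch {
    /** The accepted point, the value and gradient of f there, the step
     *  alpha, and the number of f-evaluations actually used. */
    static final class Result {
        final double[] point, gradient;
        final double value, alpha;
        final int evaluations;
        Result(double[] point, double value, double[] gradient,
               double alpha, int evaluations) {
            this.point = point; this.value = value; this.gradient = gradient;
            this.alpha = alpha; this.evaluations = evaluations;
        }
    }

    /** f maps x to {f(x), grad_0, ..., grad_{n-1}}, so one call yields the
     *  value and the gradient together. Simple backtracking: halve alpha
     *  until the value decreases or the budget maxEvals is exhausted. */
    static Result minimizeAlongRay(Function<double[], double[]> f,
                                   double[] x, double[] d, int maxEvals) {
        int n = x.length;
        double[] fx = f.apply(x);  // one evaluation: value and gradient
        int evals = 1;
        double alpha = 1.0;
        while (evals < maxEvals) {
            double[] y = new double[n];
            for (int i = 0; i < n; i++) y[i] = x[i] + alpha * d[i];
            double[] fy = f.apply(y);
            evals++;
            if (fy[0] < fx[0]) {  // sufficient progress: accept this step
                return new Result(y, fy[0],
                        java.util.Arrays.copyOfRange(fy, 1, n + 1), alpha, evals);
            }
            alpha /= 2;  // backtrack
        }
        // Budget exhausted: stay at the start point (alpha = 0).
        return new Result(x, fx[0],
                java.util.Arrays.copyOfRange(fx, 1, n + 1), 0.0, evals);
    }
}
```

Note that the result bundles the point with its value and gradient, so the caller never needs to re-evaluate f at the returned point.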
Finds a local minimum of a differentiable function f using its
values and gradient.
If implemented, this minimization method can be used to treat the
minimization process as a source of candidate solutions, where the actual
solution is not the one that minimizes f, but some other point that
maximizes some other fitness function. This can be very
useful for learning algorithms in which the candidate solutions are tested
on a separate validation set in order to avoid overfitting. By default,
this method just throws UnsupportedOperationException.
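A sketch of this selection step in Java, under the assumption (not stated in the original) that the minimizer exposes its iterates as an Iterable; all names are hypothetical:

```java
import java.util.List;
import java.util.function.ToDoubleFunction;

// Instead of returning the minimizer of f, scan the iterates produced
// during minimization and keep the one that maximizes a separate fitness,
// e.g. accuracy on a held-out validation set.
final class CandidateSearch {
    static double[] bestCandidate(Iterable<double[]> iterates,
                                  ToDoubleFunction<double[]> validationFitness) {
        double bestFit = Double.NEGATIVE_INFINITY;
        double[] best = null;
        for (double[] x : iterates) {
            double fit = validationFitness.applyAsDouble(x);
            if (fit > bestFit) { bestFit = fit; best = x; }
        }
        if (best == null) throw new UnsupportedOperationException("no candidates");
        return best;
    }
}
```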
Convenience method for evaluating f. As long as we use this method
to evaluate f, we can be sure that we keep "pvg-triples" together and
aren't losing any valuable information.
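A sketch of such a convenience method in Java (hypothetical names, and again assuming that f returns its value and gradient in one array): by routing every evaluation of f through a single helper, the point, the value, and the gradient are always created together.

```java
import java.util.Arrays;
import java.util.function.Function;

final class Evaluator {
    /** A "pvg-triple": the point together with the value and gradient of f there. */
    static final class Triple {
        final double[] point, gradient;
        final double value;
        Triple(double[] point, double value, double[] gradient) {
            this.point = point; this.value = value; this.gradient = gradient;
        }
    }

    // f maps x to {f(x), grad_0, ..., grad_{n-1}}, so a single call
    // produces all three parts of the triple at once.
    static Triple evaluate(Function<double[], double[]> f, double[] x) {
        double[] out = f.apply(x);
        return new Triple(x, out[0], Arrays.copyOfRange(out, 1, out.length));
    }
}
```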
Generic nonlinear conjugate gradient method for optimizing nonlinear twice differentiable functions that are bounded from below.
The choice of search direction, the concrete implementation of line search and the termination criterion are left abstract.
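A skeleton of such a class might look as follows in Java. This is a hypothetical sketch, not the library's actual API: the three abstract methods correspond to the three customization points named above, and f is again assumed to return its value and gradient in one array.

```java
import java.util.Arrays;
import java.util.function.Function;

// The main loop is fixed; the choice of search direction, the line search,
// and the termination criterion are left abstract.
abstract class NonlinearCG {
    abstract double[] direction(double[] gradient, double[] prevDirection,
                                double[] prevGradient);
    abstract double lineSearch(Function<double[], double[]> f, double[] x, double[] d);
    abstract boolean done(double[] gradient, int iteration);

    // f maps x to {f(x), grad_0, ..., grad_{n-1}}.
    double[] minimize(Function<double[], double[]> f, double[] x0) {
        double[] x = x0.clone();
        double[] prevD = null, prevG = null;
        for (int k = 0; ; k++) {
            double[] fx = f.apply(x);
            double[] g = Arrays.copyOfRange(fx, 1, fx.length);
            if (done(g, k)) return x;
            double[] d = direction(g, prevD, prevG);
            double alpha = lineSearch(f, x, d);
            for (int i = 0; i < x.length; i++) x[i] += alpha * d[i];
            prevD = d; prevG = g;
        }
    }
}
```

The simplest concrete subclass is steepest descent with a fixed step: direction returns the negated gradient, lineSearch returns a constant, and done checks the gradient norm.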