Finds a local minimum of a differentiable function f using its
values and gradient.
If implemented, this method lets the minimization process serve as a
source of candidate solutions, where the actual result is not the point
that minimizes f but some other point along the way that maximizes a
separate fitness function. This can be very useful for learning
algorithms whose candidate solutions are evaluated on a separate
validation set to avoid overfitting. By default, this method throws an
UnsupportedOperationException.
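A minimal sketch of the idea described above; all class and method names here are illustrative assumptions, not the library's actual API. An overridable hook receives every intermediate point, defaults to throwing UnsupportedOperationException, and a subclass keeps the candidate that maximizes a separate fitness function rather than the point that minimizes f:

```java
// Illustrative sketch only: names and signatures are assumptions, not the library's API.
// A minimizer that reports every intermediate point to an overridable hook, so the
// caller can keep the candidate that maximizes a separate fitness function.
abstract class CandidateMinimizer {

    // Default behaviour matches the documentation: unsupported unless overridden.
    protected void onCandidate(double x, double fx) {
        throw new UnsupportedOperationException();
    }

    // Plain gradient descent on f (df is its derivative); each iterate is a candidate.
    double minimize(java.util.function.DoubleUnaryOperator f,
                    java.util.function.DoubleUnaryOperator df,
                    double x0, double learningRate, int iterations) {
        double x = x0;
        for (int i = 0; i < iterations; i++) {
            x -= learningRate * df.applyAsDouble(x);
            onCandidate(x, f.applyAsDouble(x));
        }
        return x;
    }
}

// Example: keep the candidate maximizing a separate "fitness" (here -|x - 1|, a
// stand-in for validation-set performance), not the one that minimizes f itself.
class BestByFitness extends CandidateMinimizer {
    double bestX = Double.NaN;
    double bestFitness = Double.NEGATIVE_INFINITY;

    @Override
    protected void onCandidate(double x, double fx) {
        double fitness = -Math.abs(x - 1.0); // hypothetical validation score
        if (fitness > bestFitness) {
            bestFitness = fitness;
            bestX = x;
        }
    }
}
```

Minimizing f(x) = x^2 from x0 = 3 this way, the minimizer converges toward 0, while bestX retains the iterate that came closest to the fitness optimum at 1 along the descent path.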
A reimplementation of Carl Edward Rasmussen's minimize.m (conjugate gradient
minimization with line searches). It works, but ignores the termination criterion.
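For context, a heavily simplified sketch of the same family of method: nonlinear conjugate gradients (Polack-Ribiere) with a backtracking line search, including the gradient-norm termination check that the note says the reimplementation ignores. This is an assumption-laden illustration, not Rasmussen's actual algorithm, which uses a far more sophisticated line search with cubic and quadratic interpolation and the Wolfe-Powell conditions.

```java
// Simplified sketch of nonlinear conjugate gradients (Polack-Ribiere) with a
// backtracking (Armijo) line search. Not Rasmussen's actual minimize.m logic;
// shown only to illustrate the structure, including a termination criterion.
import java.util.function.Function;

final class SimpleCg {
    static double[] minimize(Function<double[], Double> f,
                             Function<double[], double[]> grad,
                             double[] x0, int maxIter, double tol) {
        double[] x = x0.clone();
        double[] g = grad.apply(x);
        double[] d = negate(g); // initial search direction: steepest descent
        for (int i = 0; i < maxIter; i++) {
            if (norm(g) < tol) break; // termination criterion on the gradient norm
            double t = backtrack(f, x, g, d);
            for (int j = 0; j < x.length; j++) x[j] += t * d[j];
            double[] gNew = grad.apply(x);
            // Polack-Ribiere beta, clipped at zero (acts as an automatic restart).
            double beta = Math.max(0, (dot(gNew, gNew) - dot(gNew, g)) / dot(g, g));
            for (int j = 0; j < d.length; j++) d[j] = -gNew[j] + beta * d[j];
            g = gNew;
        }
        return x;
    }

    // Halve the step until a simple Armijo sufficient-decrease condition holds.
    private static double backtrack(Function<double[], Double> f,
                                    double[] x, double[] g, double[] d) {
        double t = 1.0, fx = f.apply(x), slope = dot(g, d);
        while (true) {
            double[] xt = x.clone();
            for (int j = 0; j < xt.length; j++) xt[j] += t * d[j];
            if (f.apply(xt) <= fx + 1e-4 * t * slope || t < 1e-12) return t;
            t *= 0.5;
        }
    }

    private static double dot(double[] a, double[] b) {
        double s = 0; for (int i = 0; i < a.length; i++) s += a[i] * b[i]; return s;
    }
    private static double norm(double[] a) { return Math.sqrt(dot(a, a)); }
    private static double[] negate(double[] a) {
        double[] r = new double[a.length];
        for (int i = 0; i < a.length; i++) r[i] = -a[i];
        return r;
    }
}
```

On a simple convex quadratic such as f(x) = x1^2 + x2^2 this converges quickly; the `tol` check is exactly the kind of termination criterion the note above refers to.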