org.kramerlab.autoencoder.math.optimization

NonlinearConjugateGradientDescent

abstract class NonlinearConjugateGradientDescent extends Minimizer

Generic nonlinear conjugate gradient method for optimizing nonlinear, twice-differentiable functions that are bounded from below.

The choice of search direction, the concrete implementation of the line search, and the termination criterion are left abstract.

Linear Supertypes
Minimizer, AnyRef, Any
Known Subclasses

Instance Constructors

  1. new NonlinearConjugateGradientDescent(maxLineSearches: Int = 256, maxFunctionEvaluations: Int = 256, maxEvaluationsPerLineSearch: Int = 16)

Type Members

  1. type PointValueGrad[V] = (V, Double, V)

    We always want to keep a point together with the corresponding function value and gradient. This is because re-evaluating the function f can be prohibitively expensive, and a point by itself is useless: we always need to know something about f at that point in order to do anything useful with it. These three things should be like quarks: never in isolation. The initial start point is the only exception.

    Attributes
    protected
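The convention above can be sketched as follows: evaluate f once and keep (point, value, gradient) together as one triple. The 1-D function and all names here are illustrative, not part of the library.

```scala
// Hedged sketch of the PointValueGrad convention, assuming a toy
// one-dimensional "vector space" (V = Double). Not the library's API.
object PvgSketch {
  type PointValueGrad[V] = (V, Double, V)

  // Toy example: f(x) = x^2 with gradient 2x.
  def evaluate(x: Double): PointValueGrad[Double] = (x, x * x, 2 * x)
}
```

A caller would destructure the triple, e.g. `val (point, value, grad) = PvgSketch.evaluate(3.0)`, so the value and gradient are never separated from the point they belong to.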

Abstract Value Members

  1. abstract def initialStep(previousSlope: Double, currentSlope: Double, previousStep: Double): Double

  2. abstract def initialStep(currentSlope: Double): Double

  3. abstract def searchDirectionBeta[V <: VectorSpace[V]](previousSearchDirection: V, previousGrad: V, currentGrad: V): Double

  4. abstract def simplifiedLineSearch(phi: DifferentiableFunction[Double], maxEvals: Int, initialAlpha: Double): Double

    Attributes
    protected
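Two standard choices for searchDirectionBeta are the Fletcher–Reeves and Polak–Ribière formulas. The sketch below is illustrative only: it uses plain Array[Double] instead of the library's VectorSpace[V], and all names are assumptions.

```scala
// Illustrative implementations of two common conjugate gradient beta
// formulas, written for Array[Double] rather than VectorSpace[V].
object BetaSketch {
  def dot(a: Array[Double], b: Array[Double]): Double =
    a.zip(b).map { case (x, y) => x * y }.sum

  // Fletcher–Reeves: beta = |g_k|^2 / |g_{k-1}|^2
  def fletcherReevesBeta(prevGrad: Array[Double], currentGrad: Array[Double]): Double =
    dot(currentGrad, currentGrad) / dot(prevGrad, prevGrad)

  // Polak–Ribière: beta = g_k · (g_k - g_{k-1}) / |g_{k-1}|^2
  def polakRibiereBeta(prevGrad: Array[Double], currentGrad: Array[Double]): Double = {
    val diff = currentGrad.zip(prevGrad).map { case (c, p) => c - p }
    dot(currentGrad, diff) / dot(prevGrad, prevGrad)
  }
}
```

The new search direction is then formed as d_k = -g_k + beta * d_{k-1}, which is what a concrete subclass would compute from the value returned by searchDirectionBeta.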

Concrete Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  13. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  14. def lineSearch[V <: VectorSpace[V]](f: DifferentiableFunction[V], currentPvg: (V, Double, V), direction: V, initialAlpha: Double, maxEvals: Int): ((V, Double, V), Double, Int)

    Attempts to minimize the differentiable function f starting at the given point in the given direction. The function f should not be evaluated more than maxEvals times. This method returns the approximate minimum lying on the specified ray, together with the actual number of evaluations used.

    A default implementation is provided, which delegates to the simplified line search. It is, however, rather wasteful, since it computes function values and gradients multiple times. For better performance, it is advisable to set simplifiedLineSearch to ??? and override this method directly. This gives full control over which values are cached and which are discarded.

    Guideline: use simplifiedLineSearch to prototype quickly and to compare different line search methods. Once the best method has been chosen, implement it properly here, taking care to compute everything just once.

    f
      twice differentiable function bounded from below
    direction
      search direction
    maxEvals
      maximum number of evaluations of function f
    returns
      the next solution approximation with the value and gradient at that point, the alpha used for the line search, and the actual number of function evaluations

    Attributes
    protected
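In the spirit of simplifiedLineSearch, a minimal backtracking line search with an Armijo sufficient-decrease condition might look as follows. This is a sketch under stated assumptions: phi is modeled as a plain value function plus its slope at zero, whereas the real method receives a DifferentiableFunction[Double]; all names are illustrative.

```scala
// Illustrative backtracking line search: shrink alpha until the Armijo
// condition phi(alpha) <= phi(0) + c1 * alpha * phi'(0) holds, or the
// evaluation budget maxEvals is exhausted.
object BacktrackingSketch {
  def backtrack(phi: Double => Double, slopeAtZero: Double,
                maxEvals: Int, initialAlpha: Double,
                c1: Double = 1e-4, shrink: Double = 0.5): Double = {
    val phi0 = phi(0.0)
    var alpha = initialAlpha
    var evals = 1
    while (evals < maxEvals && phi(alpha) > phi0 + c1 * alpha * slopeAtZero) {
      alpha *= shrink
      evals += 1
    }
    alpha
  }
}
```

For example, on phi(a) = (a - 1)^2 with slope -2 at zero and initialAlpha = 4, the step is halved twice and the search settles at alpha = 1, the exact minimizer.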
  15. def manimize[V <: VectorSpace[V]](f: DifferentiableFunction[V], start: V): V

    Definition Classes
    Minimizer
  16. def minimize[V <: VectorSpace[V]](f: DifferentiableFunction[V], startPoint: V, progressObservers: List[Observer[V]] = Nil): V

    Finds a local minimum of a differentiable function f using its values and gradients.

    Definition Classes
    NonlinearConjugateGradientDescent → Minimizer
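The iteration that minimize performs can be sketched self-contained for Array[Double], using a Fletcher–Reeves beta and an exact line search for one particular quadratic. This is purely illustrative: the real method is generic in V <: VectorSpace[V] and delegates to the abstract lineSearch and searchDirectionBeta.

```scala
// Schematic nonlinear CG run on f(x, y) = x^2 + 2y^2, whose gradient is
// (2x, 4y) and whose minimum is at the origin. CG with an exact line
// search solves a 2-D quadratic in two steps.
object CgIterationSketch {
  def dot(a: Array[Double], b: Array[Double]): Double =
    a.zip(b).map { case (x, y) => x * y }.sum

  def run(start: Array[Double]): Array[Double] = {
    def grad(p: Array[Double]) = Array(2 * p(0), 4 * p(1))
    var x = start
    var g = grad(x)
    var d = g.map(-_)                            // initial direction: steepest descent
    for (_ <- 1 to 2) {
      val hd = Array(2 * d(0), 4 * d(1))         // Hessian-vector product for this f
      val alpha = -dot(g, d) / dot(d, hd)        // exact line search on a quadratic
      x = x.zip(d).map { case (xi, di) => xi + alpha * di }
      val gNew = grad(x)
      val beta = dot(gNew, gNew) / dot(g, g)     // Fletcher–Reeves
      d = gNew.zip(d).map { case (gi, di) => -gi + beta * di }
      g = gNew
    }
    x
  }
}
```

Starting from (4, -2), two iterations land at the origin up to floating-point error, which is the textbook behavior a concrete subclass of this class aims to reproduce for general nonlinear f.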
  17. def minimize[V <: VectorSpace[V], Fitness](f: DifferentiableFunction[V], start: V, terminationCriterion: TerminationCriterion[V, (Int, Int)], resultSelector: ResultSelector[V, Fitness], progressObservers: List[Observer[V]])(implicit arg0: Ordering[Fitness]): V

    If implemented, this method allows the minimization process to be used as a source of candidate solutions, where the returned solution is not the point that minimizes f, but some other point that maximizes a separate fitness function. This can be very useful for learning algorithms in which candidate solutions are tested on a separate validation set in order to avoid overfitting. By default, this method simply throws UnsupportedOperationException.

    Definition Classes
    Minimizer
  18. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  19. final def notify(): Unit

    Definition Classes
    AnyRef
  20. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  21. def pointValueGrad[V](f: DifferentiableFunction[V], point: V): (V, Double, V)

    Convenience method for evaluating f. As long as we use this method to evaluate f, we can be sure that the "pvg-triples" stay together and that no valuable information is lost.

  22. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  23. def terminationCriterion(currentValue: Double): Boolean

  24. def toString(): String

    Definition Classes
    AnyRef → Any
  25. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  26. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  27. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from Minimizer

Inherited from AnyRef

Inherited from Any
