org.kramerlab.autoencoder.math.optimization

NonlinearConjugateGradientDescent_Rasmussen

class NonlinearConjugateGradientDescent_Rasmussen extends CubicInterpolationLineSearch with PolakRibiere with SlopeRatioInitialStep

Version of nonlinear conjugate gradient descent based on the minimize.m implementation of Dr. Carl Edward Rasmussen.

Linear Supertypes
SlopeRatioInitialStep, PolakRibiere, CubicInterpolationLineSearch, NonlinearConjugateGradientDescent, Minimizer, AnyRef, Any

Instance Constructors

  1. new NonlinearConjugateGradientDescent_Rasmussen()

Type Members

  1. type PointValueGrad[V] = (V, Double, V)

    We always want to keep a point together with the corresponding function value and gradient. This is because re-evaluation of function f can be prohibitively expensive, and a point by itself is useless: we always need to know something about f at this point in order to do something useful with it. These three things should be like quarks: never in isolation. The entry start point is the only exception.

    Attributes
    protected
    Definition Classes
    NonlinearConjugateGradientDescent
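The pvg-triple idea can be illustrated with a minimal standalone sketch. The quadratic f and the names below are illustrative stand-ins, not library code:

```scala
object PvgSketch {
  // Mirrors the documented type: a point, f at that point, and the gradient of f there.
  type PointValueGrad[V] = (V, Double, V)

  // A cheap 1-D stand-in for an expensive function: f(x) = x^2, f'(x) = 2x.
  def f(x: Double): Double = x * x
  def fGrad(x: Double): Double = 2 * x

  // Evaluate f once and keep all three pieces together from then on,
  // instead of re-evaluating f whenever the value or gradient is needed.
  def evaluate(x: Double): PointValueGrad[Double] = (x, f(x), fGrad(x))

  def main(args: Array[String]): Unit = {
    val (point, value, gradient) = evaluate(3.0)
    println(s"point=$point value=$value gradient=$gradient")
  }
}
```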

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  13. def initialStep(previousSlope: Double, currentSlope: Double, previousStep: Double): Double

  14. def initialStep(slope: Double): Double

  15. def interpolate(leftPvg: (Double, Double, Double), rightPvg: (Double, Double, Double), valueAtZero: Double): Double

    Definition Classes
    CubicInterpolationLineSearch
  16. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  17. def lineSearch[V <: VectorSpace[V]](f: DifferentiableFunction[V], currentPvg: (V, Double, V), direction: V, initialAlpha: Double, maxEvals: Int): ((V, Double, V), Double, Int)

Attempts to minimize the differentiable function f starting at the given point in the given direction. The function f should not be evaluated more than maxEvals times. This method returns the approximate minimum lying on the specified ray, together with the actual number of evaluations needed.

    This method provides a default implementation, which makes use of the simplified line search. This implementation is, however, rather wasteful, since it calculates function values and gradients multiple times. For better performance, it is advisable to set simplifiedLineSearch to ??? and override this method directly. This allows controlling which values are cached and which are discarded.

    Guideline: use simplifiedLineSearch to quickly prototype and to compare different line search methods. Once it's decided which method is the best, implement it properly, taking care to calculate everything just once.

    f

    twice differentiable function bounded from below

    direction

    search direction

    maxEvals

    maximum number of evaluations of function f

    returns

    next solution approximation, value and gradient at this point, alpha used for the line search, actual number of function evaluations

    Attributes
    protected
    Definition Classes
    NonlinearConjugateGradientDescent
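As a rough illustration of the contract (a pvg-triple in; a pvg-triple, the step size, and the evaluation count out), here is a simplified one-dimensional backtracking search with an Armijo sufficient-decrease test. The real implementation uses cubic interpolation, so this sketches only the interface, not the algorithm:

```scala
object LineSearchSketch {
  // (point, value, gradient) in one dimension, mirroring the pvg-triple idea.
  type Pvg = (Double, Double, Double)

  // Stand-in objective: f(x) = (x - 2)^2, minimized at x = 2.
  def f(x: Double): Double = (x - 2) * (x - 2)
  def fGrad(x: Double): Double = 2 * (x - 2)

  // Backtracking search with an Armijo sufficient-decrease test. Returns the
  // best pvg-triple found, the step size used, and the number of f-evaluations,
  // mirroring the documented return shape of lineSearch.
  def lineSearch(current: Pvg, direction: Double,
                 initialAlpha: Double, maxEvals: Int): (Pvg, Double, Int) = {
    val (x0, f0, g0) = current
    val slope = g0 * direction          // directional derivative at alpha = 0
    val c1 = 1e-4                       // Armijo constant
    var alpha = initialAlpha
    var evals = 0
    var result = current
    var done = false
    while (!done && evals < maxEvals) {
      val x = x0 + alpha * direction
      val candidate = (x, f(x), fGrad(x))
      evals += 1
      if (candidate._2 <= f0 + c1 * alpha * slope) { result = candidate; done = true }
      else alpha *= 0.5                 // shrink the step and try again
    }
    (result, alpha, evals)
  }
}
```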
  18. def manimize[V <: VectorSpace[V]](f: DifferentiableFunction[V], start: V): V

    Definition Classes
    Minimizer
  19. def minimize[V <: VectorSpace[V]](f: DifferentiableFunction[V], startPoint: V, progressObservers: List[Observer[V]] = Nil): V

    Finds a local minimum of a differentiable function f using its values and gradients.

    Definition Classes
    NonlinearConjugateGradientDescent → Minimizer
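For intuition about what minimize does, here is a compact nonlinear conjugate gradient loop on a 2-D quadratic. The exact-quadratic step size and the hard-coded Hessian are illustrative simplifications of the real line search, not the library's method:

```scala
object CgSketch {
  type Vec = Array[Double]
  def dot(a: Vec, b: Vec): Double = a.zip(b).map { case (x, y) => x * y }.sum

  // f(x, y) = x^2 + 10 y^2, with gradient (2x, 20y) and Hessian diag(2, 20).
  def grad(p: Vec): Vec = Array(2 * p(0), 20 * p(1))
  def hess(d: Vec): Vec = Array(2 * d(0), 20 * d(1))

  def minimize(start: Vec, iterations: Int): Vec = {
    var x = start
    var g = grad(x)
    var d = g.map(-_)                                  // start with steepest descent
    for (_ <- 0 until iterations) {
      val alpha = -dot(g, d) / dot(d, hess(d))         // exact step on a quadratic
      x = Array(x(0) + alpha * d(0), x(1) + alpha * d(1))
      val gNew = grad(x)
      val diff = Array(gNew(0) - g(0), gNew(1) - g(1))
      val beta = math.max(0.0, dot(gNew, diff) / dot(g, g)) // Polak-Ribiere, clipped at 0
      d = Array(-gNew(0) + beta * d(0), -gNew(1) + beta * d(1))
      g = gNew
    }
    x
  }
}
```

On an n-dimensional quadratic with exact line searches, conjugate gradient converges in at most n iterations, which the test below exploits.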
  20. def minimize[V <: VectorSpace[V], Fitness](f: DifferentiableFunction[V], start: V, terminationCriterion: TerminationCriterion[V, (Int, Int)], resultSelector: ResultSelector[V, Fitness], progressObservers: List[Observer[V]])(implicit arg0: Ordering[Fitness]): V

    If implemented, this minimization method lets the minimization process serve as a source of candidate solutions, where the actual result is not the point that minimizes f, but some other point that maximizes a separate fitness function. This can be very useful for learning algorithms where the candidate solutions are tested on a separate validation set in order to avoid overfitting. By default, this method just throws UnsupportedOperationException.

    Definition Classes
    Minimizer
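The idea can be sketched with a hypothetical result selector: collect the iterates the minimizer produces, then return the one maximizing a separate fitness (e.g. a validation score) rather than the one with the lowest training objective. None of the names below are library API:

```scala
object ResultSelectorSketch {
  // Pick the candidate maximizing `fitness` instead of the last (lowest-f) iterate.
  def selectBest[V, F](candidates: Seq[V], fitness: V => F)(implicit ord: Ordering[F]): V =
    candidates.maxBy(fitness)

  def main(args: Array[String]): Unit = {
    // Iterates of some descent on a training objective minimized at x = 3 ...
    val iterates = Seq(0.0, 1.0, 2.0, 3.0)
    // ... but the validation fitness peaks at x = 2, so that iterate is selected.
    val validationFitness = (x: Double) => -(x - 2.0) * (x - 2.0)
    println(selectBest(iterates, validationFitness))
  }
}
```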
  21. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  22. final def notify(): Unit

    Definition Classes
    AnyRef
  23. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  24. def pointValueGrad[V](f: DifferentiableFunction[V], point: V): (V, Double, V)

    Convenience method for evaluating f. As long as we use this method to evaluate f, we can be sure that we keep "pvg-triples" together and aren't losing any valuable information.

    Definition Classes
    NonlinearConjugateGradientDescent
  25. val rho: Double

    Definition Classes
    CubicInterpolationLineSearch
  26. def searchDirectionBeta[V <: VectorSpace[V]](previousSearchDirection: V, previousGradient: V, currentGradient: V): Double

  27. val sigma: Double

    Definition Classes
    CubicInterpolationLineSearch
  28. def simplifiedLineSearch(phi: DifferentiableFunction[Double], maxEvals: Int, initialGuess: Double): Double

  29. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  30. def terminationCriterion(currentValue: Double): Boolean

  31. def toString(): String

    Definition Classes
    AnyRef → Any
  32. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  33. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  34. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
