org.kramerlab.autoencoder.math.optimization

CubicInterpolationLineSearch

abstract class CubicInterpolationLineSearch extends NonlinearConjugateGradientDescent

Line search algorithm as described in Carl Edward Rasmussen's unpublished (?) document "Function minimization using conjugate gradients: conj" (May 15 1996)

Linear Supertypes
  NonlinearConjugateGradientDescent, Minimizer, AnyRef, Any

Instance Constructors

  1. new CubicInterpolationLineSearch(rho: Double = 0.25, sigma: Double = 0.5)

Type Members

  1. type PointValueGrad[V] = (V, Double, V)

We always want to keep a point together with the corresponding function value and gradient. This is because re-evaluation of the function f can be prohibitively expensive, and a point by itself is useless: we always need to know something about f at this point in order to do something useful with it. These three things should be like quarks: never in isolation. The initial start point is the only exception.

    Attributes
    protected
    Definition Classes
    NonlinearConjugateGradientDescent
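
    To make the "keep them together" idea concrete, here is a minimal sketch (hypothetical helper names, plain Double standing in for a VectorSpace implementation):

    ```scala
    object PvgSketch {
      // (point, f(point), grad f(point)) -- same shape as PointValueGrad[V]
      type PointValueGrad[V] = (V, Double, V)

      // Evaluate f once and keep all three pieces together.
      def pvg(f: Double => Double, grad: Double => Double, x: Double): PointValueGrad[Double] =
        (x, f(x), grad(x))
    }

    // For f(x) = x^2 at x = 3, the triple is (3.0, 9.0, 6.0):
    val triple = PvgSketch.pvg(x => x * x, x => 2 * x, 3.0)
    ```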

Abstract Value Members

  1. abstract def initialStep(previousSlope: Double, currentSlope: Double, previousStep: Double): Double

  2. abstract def initialStep(currentSlope: Double): Double

  3. abstract def searchDirectionBeta[V <: VectorSpace[V]](previousSearchDirection: V, previousGrad: V, currentGrad: V): Double
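
    These three abstract members leave the choice of the β formula and the step-length heuristics to subclasses. As an illustration only (these are common textbook choices, with Array[Double] standing in for V <: VectorSpace[V]; they are not necessarily what this library's concrete subclasses use):

    ```scala
    object LineSearchChoices {
      def dot(a: Array[Double], b: Array[Double]): Double =
        a.zip(b).map { case (x, y) => x * y }.sum

      // Slope-ratio heuristic: reuse the previous step, scaled by how the slope changed.
      def initialStep(previousSlope: Double, currentSlope: Double, previousStep: Double): Double =
        previousStep * previousSlope / currentSlope

      // First iteration, no history yet: derive a step from the current slope alone
      // (a form similar to the one in Rasmussen's minimize code).
      def initialStep(currentSlope: Double): Double =
        1.0 / (1.0 - currentSlope)

      // Polak-Ribiere beta, reset to 0 (i.e. restart with steepest descent) when negative.
      def searchDirectionBeta(previousGrad: Array[Double], currentGrad: Array[Double]): Double = {
        val diff = currentGrad.zip(previousGrad).map { case (c, p) => c - p }
        math.max(0.0, dot(currentGrad, diff) / dot(previousGrad, previousGrad))
      }
    }
    ```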

Concrete Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  13. def interpolate(leftPvg: (Double, Double, Double), rightPvg: (Double, Double, Double), valueAtZero: Double): Double
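
    The entry above carries no description. For orientation, the standard cubic-interpolation step (the textbook formula; only a guess at what this method computes, and note that the real signature also takes valueAtZero, which is omitted here) fits a cubic through two points with known values and slopes and returns its minimizer:

    ```scala
    object CubicSketch {
      // (a, fa, ga) and (b, fb, gb): positions, function values, and slopes
      // of two points on the search line.
      def cubicMin(a: Double, fa: Double, ga: Double,
                   b: Double, fb: Double, gb: Double): Double = {
        val d1 = ga + gb - 3.0 * (fa - fb) / (a - b)
        val d2 = math.signum(b - a) * math.sqrt(d1 * d1 - ga * gb)
        b - (b - a) * (gb + d2 - d1) / (gb - ga + 2.0 * d2)
      }
    }

    // For f(x) = (x - 1)^2 sampled at 0 and 2, the cubic fit recovers the minimum at 1.
    ```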

  14. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  15. def lineSearch[V <: VectorSpace[V]](f: DifferentiableFunction[V], currentPvg: (V, Double, V), direction: V, initialAlpha: Double, maxEvals: Int): ((V, Double, V), Double, Int)

    Attempts to minimize the differentiable function f starting at the given point in the given direction. The function f should not be evaluated more than maxEvals times. This method returns the approximate minimum lying on the specified ray and the actual number of evaluations used.

    This method provides a default implementation that makes use of the simplified line search. This implementation is, however, rather wasteful, since it calculates function values and gradients multiple times. For better performance, it is advisable to set simplifiedLineSearch to ??? and override this method directly. This allows controlling which values are cached and which are discarded.

    Guideline: use simplifiedLineSearch to prototype quickly and to compare different line search methods. Once the best method has been chosen, implement it properly, taking care to calculate everything just once.

    f

    twice differentiable function bounded from below

    direction

    search direction

    maxEvals

    maximum number of evaluations of function f

    returns

    next solution approximation, value and gradient at this point, alpha used for the line search, actual number of function evaluations

    Attributes
    protected
    Definition Classes
    NonlinearConjugateGradientDescent
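
    A toy illustration of the waste described above (hypothetical setup, plain Double in place of V): viewing the problem only as a 1-D function of alpha forces separate evaluations for the value and the slope, while keeping the pvg-triple yields both from a single evaluation.

    ```scala
    object EvalCountSketch {
      var evals = 0
      // f returns (value, gradient); each call counts as one (expensive) evaluation.
      def f(x: Double): (Double, Double) = { evals += 1; (x * x, 2 * x) }

      def run(): Unit = {
        val (x0, d) = (3.0, -1.0)
        // Simplified view: one call for the value, a second for the directional slope.
        def phiValue(a: Double): Double = f(x0 + a * d)._1
        def phiSlope(a: Double): Double = f(x0 + a * d)._2 * d
        phiValue(1.0); phiSlope(1.0)   // two evaluations of f at the same point
        // Caching view: one call yields both pieces at once.
        val (v, g) = f(x0 + 1.0 * d)   // a single evaluation
      }
    }
    ```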
  16. def manimize[V <: VectorSpace[V]](f: DifferentiableFunction[V], start: V): V

    Definition Classes
    Minimizer
  17. def minimize[V <: VectorSpace[V]](f: DifferentiableFunction[V], startPoint: V, progressObservers: List[Observer[V]] = Nil): V

    Finds a local minimum of a differentiable function f using its values and gradients.

    Definition Classes
    NonlinearConjugateGradientDescent → Minimizer
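
    In the spirit of the method above, a stripped-down sketch (emphatically not this library's implementation) of a minimization loop driven only by values and gradients, with a fixed step length standing in for the line search:

    ```scala
    object MinimizeSketch {
      // f returns (value, gradient) at x; the loop only ever consumes this pair.
      def minimize(f: Double => (Double, Double), start: Double, iters: Int): Double = {
        var x = start
        for (_ <- 0 until iters) {
          val (_, g) = f(x)
          x -= 0.1 * g   // fixed step length as a stand-in for the line search
        }
        x
      }
    }

    // Minimizing f(x) = (x - 1)^2 from x = 5 converges toward 1.
    ```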
  18. def minimize[V <: VectorSpace[V], Fitness](f: DifferentiableFunction[V], start: V, terminationCriterion: TerminationCriterion[V, (Int, Int)], resultSelector: ResultSelector[V, Fitness], progressObservers: List[Observer[V]])(implicit arg0: Ordering[Fitness]): V

    If implemented, this minimization method can treat the minimization process as a source of candidate solutions, where the actual result is not the point that minimizes f but some other point that maximizes a separate fitness function. This can be very useful for learning algorithms in which candidate solutions are tested on a separate validation set in order to avoid overfitting. By default, this method just throws UnsupportedOperationException.

    Definition Classes
    Minimizer
  19. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  20. final def notify(): Unit

    Definition Classes
    AnyRef
  21. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  22. def pointValueGrad[V](f: DifferentiableFunction[V], point: V): (V, Double, V)

    Convenience method for evaluating f. As long as we use this method to evaluate f, we can be sure that we keep "pvg-triples" together and aren't losing any valuable information.

    Definition Classes
    NonlinearConjugateGradientDescent
  23. val rho: Double

  24. val sigma: Double

  25. def simplifiedLineSearch(phi: DifferentiableFunction[Double], maxEvals: Int, initialGuess: Double): Double

  26. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  27. def terminationCriterion(currentValue: Double): Boolean

  28. def toString(): String

    Definition Classes
    AnyRef → Any
  29. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  30. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  31. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
