org.kramerlab.autoencoder.math.optimization

ConjugateGradientDescent_HagerZhang

class ConjugateGradientDescent_HagerZhang extends Minimizer

Implementation of the conjugate gradient descent as described in the article "A new conjugate gradient method with guaranteed descent and an efficient line search" by William W. Hager and Hongchao Zhang.

It uses a Polak-Ribiere-Polyak-like update for calculating the next search direction, and an inexact line search based on approximate Wolfe conditions.

This implementation seems to be broken: it does not outperform naive gradient descent even on fairly simple 2D functions.
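For reference, the core of the method is the Hager-Zhang coefficient β_k used to build the next search direction d_{k+1} = -g_{k+1} + β_k d_k. The following self-contained sketch (not the library's code; plain `Array[Double]` vectors and the paper's default η = 0.01 are assumptions) shows the update from the article, including the truncation that guarantees descent:

```scala
object HagerZhangBeta {
  type Vec = Array[Double]

  def dot(a: Vec, b: Vec): Double = (a zip b).map { case (x, y) => x * y }.sum
  def norm(a: Vec): Double = math.sqrt(dot(a, a))

  // Hager-Zhang beta: ((y_k - 2 d_k |y_k|^2 / (d_k . y_k)) . g_{k+1}) / (d_k . y_k),
  // where y_k = g_{k+1} - g_k (gradients at the new and old iterate).
  def beta(gOld: Vec, gNew: Vec, d: Vec): Double = {
    val y  = (gNew zip gOld).map { case (a, b) => a - b }
    val dy = dot(d, y)
    val yy = dot(y, y)
    (dot(y, gNew) - 2.0 * yy * dot(d, gNew) / dy) / dy
  }

  // Truncated version from the paper: beta_bar = max(beta, eta_k) with
  // eta_k = -1 / (|d_k| * min(eta, |g_k|)); eta = 0.01 is the paper's default.
  def betaTruncated(gOld: Vec, gNew: Vec, d: Vec, eta: Double = 0.01): Double = {
    val etaK = -1.0 / (norm(d) * math.min(eta, norm(gOld)))
    math.max(beta(gOld, gNew, d), etaK)
  }
}
```

The truncation is what distinguishes this scheme from a plain Polak-Ribiere-style update: it bounds β_k from below so that the generated directions remain descent directions.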

Linear Supertypes
Minimizer, AnyRef, Any

Instance Constructors

  1. new ConjugateGradientDescent_HagerZhang(configuration: ConjugateGradientDescent_HagerZhangConfiguration)

Type Members

  1. case class History(bisectionSteps: List[Double] = immutable.this.Nil) extends Product with Serializable

    Sometimes it's possible to learn something about the problem while executing line searches, and to adjust some parameters.

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  13. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  14. def lineSearch[V <: VectorSpace[V]](f: DifferentiableFunction[V], position: V, direction: V, history: History): (Double, History)

    Hybrid line search algorithm that starts with a dumb-but-robust backtracking algorithm, and switches to a faster strategy once it detects appropriate conditions.

    Attributes
    protected
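The first, backtracking phase of such a hybrid line search can be sketched as a generic Armijo backtracking on `Array[Double]` (all names and parameters here are illustrative, not the library's actual API):

```scala
object Backtracking {
  type Vec = Array[Double]

  // Armijo backtracking: shrink the step until sufficient decrease holds, i.e.
  // f(x + alpha * d) <= f(x) + c * alpha * (grad(x) . d). Parameters c and
  // shrink are conventional defaults, not taken from the library.
  def lineSearch(
      f: Vec => Double,
      grad: Vec => Vec,
      x: Vec,
      d: Vec,
      alpha0: Double = 1.0,
      c: Double = 1e-4,
      shrink: Double = 0.5,
      maxIter: Int = 50): Double = {
    val fx    = f(x)
    val slope = (grad(x) zip d).map { case (g, di) => g * di }.sum // directional derivative
    def step(a: Double): Vec = (x zip d).map { case (xi, di) => xi + a * di }
    var alpha = alpha0
    var iter  = 0
    while (iter < maxIter && f(step(alpha)) > fx + c * alpha * slope) {
      alpha *= shrink
      iter += 1
    }
    alpha
  }
}
```

For example, minimizing f(x) = x² from x = 1 along the direction d = -2 (the negative gradient), the full step overshoots past the minimum and one halving suffices.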
  15. def manimize[V <: VectorSpace[V]](f: DifferentiableFunction[V], start: V): V

    Definition Classes
    Minimizer
  16. def minimize[V <: VectorSpace[V]](f: DifferentiableFunction[V], start: V, progressObservers: List[Observer[V]] = Nil): V

    Finds a local minimum of a differentiable function f using its values and gradient.

    Definition Classes
    ConjugateGradientDescent_HagerZhang → Minimizer
  17. def minimize[V <: VectorSpace[V], Fitness](f: DifferentiableFunction[V], start: V, terminationCriterion: TerminationCriterion[V, (Int, Int)], resultSelector: ResultSelector[V, Fitness], progressObservers: List[Observer[V]])(implicit arg0: Ordering[Fitness]): V

    If implemented, this minimization method allows the minimization process to be used as a source of candidate solutions, where the actual result is not the point that minimizes f, but some other point that maximizes a separate fitness function. This can be very useful for learning algorithms where the candidate solutions are tested on a separate validation set in order to avoid overfitting. The default implementation just throws UnsupportedOperationException.

    Definition Classes
    Minimizer
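The candidate-selection idea behind that overload can be illustrated with a minimal one-dimensional sketch (plain gradient descent standing in for the conjugate gradient method; every name below is hypothetical, not the library's API):

```scala
object CandidateSelection {
  // Run gradient descent on f (given via its derivative), but return the
  // iterate that maximizes a separate fitness function, e.g. a validation
  // score, rather than the final iterate that minimizes f.
  def minimizeWithSelector(
      grad: Double => Double,
      fitness: Double => Double,
      start: Double,
      lr: Double,
      steps: Int): Double = {
    var x       = start
    var best    = start
    var bestFit = fitness(start)
    for (_ <- 1 to steps) {
      x -= lr * grad(x)                      // ordinary descent step on f
      val fit = fitness(x)
      if (fit > bestFit) { best = x; bestFit = fit } // keep fittest candidate
    }
    best
  }
}
```

With f(x) = x² (so grad(x) = 2x) and a fitness peaked at x = 0.5, the descent trajectory 0.8, 0.64, 0.512, 0.4096, ... passes near 0.5 and the selector returns that intermediate iterate instead of the minimizer at 0.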
  18. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  19. final def notify(): Unit

    Definition Classes
    AnyRef
  20. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  21. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  22. def toString(): String

    Definition Classes
    AnyRef → Any
  23. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  25. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from Minimizer

Inherited from AnyRef

Inherited from Any
