org.kramerlab.autoencoder.math.optimization

GradientDescent

case class GradientDescent(maxIters: Int) extends Minimizer with Product with Serializable

Linear Supertypes
Serializable, Serializable, Product, Equals, Minimizer, AnyRef, Any

Instance Constructors

  1. new GradientDescent(maxIters: Int)

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  10. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  11. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  12. def manimize[V <: VectorSpace[V]](f: DifferentiableFunction[V], start: V): V

    Definition Classes
    Minimizer
  13. val maxIters: Int

  14. def minimize[V <: VectorSpace[V]](f: DifferentiableFunction[V], start: V, progressObservers: List[Observer[V]]): V

    Finds a local minimum of a differentiable function f using its values and gradient.

    Definition Classes
    GradientDescent → Minimizer
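To illustrate what this method computes, here is a minimal self-contained sketch of fixed-step gradient descent. It is not the library's implementation: `DifferentiableFunction` and `VectorSpace` are replaced by plain `Double`, and the `step` size is an assumed illustrative parameter, not part of this class's API.

```scala
// Hypothetical sketch of gradient descent on f(x) = (x - 3)^2,
// whose gradient is 2 * (x - 3). The names step and maxIters are
// illustrative; only maxIters corresponds to this class's constructor.
object GradientDescentSketch {
  def minimize(grad: Double => Double, start: Double,
               maxIters: Int, step: Double = 0.1): Double = {
    var x = start
    var i = 0
    while (i < maxIters) {
      x -= step * grad(x)  // move against the gradient
      i += 1
    }
    x
  }

  def main(args: Array[String]): Unit = {
    val xMin = minimize(x => 2.0 * (x - 3.0), start = 0.0, maxIters = 200)
    println(xMin)  // converges toward the minimizer x = 3
  }
}
```

With a step size small enough for the function's curvature, the iterates contract toward the nearest local minimum; a `progressObservers` list, as in the real signature, would be notified with each intermediate point.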
  15. def minimize[V <: VectorSpace[V], Fitness](f: DifferentiableFunction[V], start: V, terminationCriterion: TerminationCriterion[V, (Int, Int)], resultSelector: ResultSelector[V, Fitness], progressObservers: List[Observer[V]])(implicit arg0: Ordering[Fitness]): V

    If implemented, this minimization method treats the minimization process as a source of candidate solutions: the returned point is not necessarily the one that minimizes f, but some other visited point that maximizes some other fitness function. This can be very useful for learning algorithms where the candidate solutions are tested on a separate validation set in order to avoid overfitting. In general, this method just throws UnsupportedOperationException.

    Definition Classes
    Minimizer
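The candidate-selection idea above can be sketched as follows. This is not the library's code: `TerminationCriterion` and `ResultSelector` are collapsed into a plain fitness function over `Double`, and all names here are illustrative.

```scala
// Hypothetical sketch: run gradient descent on f, but return the visited
// point that scores best on a separate fitness function (e.g. a score on
// a held-out validation set), mimicking selection that avoids overfitting.
object ResultSelectorSketch {
  def minimizeSelecting(grad: Double => Double, start: Double, maxIters: Int,
                        step: Double, fitness: Double => Double): Double = {
    var x = start
    var best = start
    var bestFit = fitness(start)
    for (_ <- 1 to maxIters) {
      x -= step * grad(x)            // ordinary descent step on f
      val fit = fitness(x)
      if (fit > bestFit) {           // keep the fittest candidate seen so far
        bestFit = fit
        best = x
      }
    }
    best
  }

  def main(args: Array[String]): Unit = {
    // Descend on (x - 3)^2, but select the visited point closest to 2:
    val chosen = minimizeSelecting(x => 2.0 * (x - 3.0), 0.0, 20, 0.1,
                                   x => -(x - 2.0) * (x - 2.0))
    println(chosen)  // a point near 2, not the minimizer 3
  }
}
```

The descent trajectory passes through the high-fitness region on its way to the minimizer of f, and the selector retains the best candidate rather than the final iterate.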
  16. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. final def notify(): Unit

    Definition Classes
    AnyRef
  18. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  19. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  20. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  21. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  22. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
