org.kramerlab.autoencoder.neuralnet

NeuralNet

class NeuralNet extends NeuralNetLike[NeuralNet] with Serializable

Base class for all neural networks.

Linear Supertypes
Serializable, Serializable, NeuralNetLike[NeuralNet], Visualizable, (Mat) ⇒ Mat, AnyRef, Any

Instance Constructors

  1. new NeuralNet(layers: List[Layer])
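A construction sketch: a NeuralNet is assembled from a bottom-up list of layers. The layer values below are placeholders; the concrete Layer subclasses and their constructors depend on the library version.

```scala
// `inputUnits`, `connection`, and `hiddenUnits` are placeholder Layer
// instances; real code would construct concrete Layer subclasses here.
val net = new NeuralNet(List(inputUnits, connection, hiddenUnits))
```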

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def activities(input: Mat): List[Mat]

    Definition Classes
    NeuralNetLike
  7. def andThen[A](g: (Mat) ⇒ A): (Mat) ⇒ A

    Definition Classes
    Function1
    Annotations
    @unspecialized()
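Because NeuralNet mixes in (Mat) ⇒ Mat, it composes like any Scala function. A sketch, where `summarize` is a placeholder post-processing function:

```scala
// Compose the net with a placeholder post-processing step.
val summarize: Mat => Double = m => 0.0   // placeholder, e.g. mean activation
val pipeline: Mat => Double = net andThen summarize
```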
  8. def apply(input: Mat): Mat

    Propagates the input from the visible layer up to the top layer.

    Definition Classes
    NeuralNetLike → Function1
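A usage sketch of the forward pass; `input` is a placeholder Mat of input rows:

```scala
// Since NeuralNet is a (Mat) => Mat, it can be applied directly.
val output: Mat = net(input)   // equivalent to net.apply(input)
```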
  9. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  10. def build(layers: List[Layer]): NeuralNet

    Builds a neural net of the right type and shape out of the specified layers.

    Note that this method depends on the instance, not just the class: for example, an Autoencoder has to know which its 'central' Layer is.

    Definition Classes
    NeuralNet → NeuralNetLike
  11. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  12. def compose[A](g: (A) ⇒ Mat): (A) ⇒ Mat

    Definition Classes
    Function1
    Annotations
    @unspecialized()
  13. var dataSample: Option[Mat]

    Definition Classes
    NeuralNetLike
  14. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  16. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  17. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  18. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  19. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  20. val layers: List[Layer]

    Enumerates the layers of this (linear) neural net.

    TODO: generalize to arbitrary directed acyclic graphs; nothing here is specific to lists.

    Definition Classes
    NeuralNet → NeuralNetLike
  21. def layersToString(layers: List[Layer]): String

    Attributes
    protected
  22. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  23. final def notify(): Unit

    Definition Classes
    AnyRef
  24. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  25. def optimize(input: Mat, output: Mat, errorFunctionFactory: DifferentiableErrorFunctionFactory[Mat] = SquareErrorFunctionFactory, relativeValidationSetSize: Double, maxEvals: Int, trainingObservers: List[TrainingObserver]): NeuralNet

    Optimizes all parameters of the neural network using the specified input and output and the specified error function factory (defaults to SquareErrorFunctionFactory).

    The standard feed-forward algorithm is used to evaluate the function; backpropagation is used to compute the gradient.

    Definition Classes
    NeuralNetLike
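A hedged training sketch. SquareErrorFunctionFactory appears in the signature above; the training data and the remaining argument values are placeholders:

```scala
// Train on (trainInputs, trainTargets); all names other than the parameters
// shown in the signature above are placeholders.
val trained: NeuralNet = net.optimize(
  input = trainInputs,                      // Mat of training inputs
  output = trainTargets,                    // Mat of desired outputs
  errorFunctionFactory = SquareErrorFunctionFactory,
  relativeValidationSetSize = 0.1,          // hold out 10% for validation
  maxEvals = 1000,                          // cap on error-function evaluations
  trainingObservers = Nil                   // no progress observers
)
```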
  26. def prependAffineLinearTransformation(factor: Mat, offset: Mat): NeuralNet

    Assumes that this is a "usual" neural net with alternating unit and connection layers and prepends an affine linear transformation to it.

    TODO: consider moving the bias handling out of the layers into a dedicated "AffineLinearTransform" component.

    Definition Classes
    NeuralNetLike
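One common use of this method is to bake input normalization into the net itself. A sketch; `factor` and `offset` are placeholder Mat values (e.g. computed from training-data statistics):

```scala
// Prepend the affine map x => factor * x + offset, e.g. standardization
// (x - mean) / stddev expressed as factor and offset matrices.
val normalized: NeuralNet = net.prependAffineLinearTransformation(factor, offset)
```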
  27. def reverse(output: Mat): Mat

    Propagates the output from the top layer down to the visible layer.

    Definition Classes
    NeuralNetLike
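Together with apply, this gives the encode/decode round trip for an autoencoder-style net. A sketch, with `input` a placeholder Mat:

```scala
val code: Mat = net(input)                    // visible layer -> top layer
val reconstruction: Mat = net.reverse(code)   // top layer -> visible layer
```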
  28. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  29. def toImage: BufferedImage

    Definition Classes
    NeuralNetLike → Visualizable
  30. def toImage(w: Int, h: Int): BufferedImage

    Definition Classes
    Visualizable
  31. def toImage(colormap: (Double) ⇒ Int): BufferedImage

    Definition Classes
    Visualizable
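A sketch of rendering the net to an image file, using the standard javax.imageio API:

```scala
import java.io.File
import javax.imageio.ImageIO

// Render the net (e.g. its weights) to a BufferedImage and write it to disk.
val img: java.awt.image.BufferedImage = net.toImage
ImageIO.write(img, "png", new File("net.png"))
```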
  32. def toString(): String

    Definition Classes
    NeuralNet → Function1 → AnyRef → Any
  33. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  34. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  35. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from Serializable

Inherited from Serializable

Inherited from NeuralNetLike[NeuralNet]

Inherited from Visualizable

Inherited from (Mat) ⇒ Mat

Inherited from AnyRef

Inherited from Any
