Class Adam


  • public class Adam
    extends Optimizer
    The Adam (Adaptive Moment Estimation) optimizer.
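
    A minimal usage sketch, assuming this library's Matrix type and the Optimizer contract documented below, where apply(Matrix) turns raw backpropagation gradients into adjusted gradients (the variable names and the final weight update are illustrative, not part of this API):

        Adam adam = new Adam(0.001);               // learning rate only; beta1/beta2 default to 0.9/0.999
        Adam tuned = new Adam(0.001, 0.9, 0.999);  // or set the decay rates explicitly
        Matrix step = adam.apply(gradients);       // gradients from backpropagation -> adjusted gradients
        // The caller applies `step` to the weights, e.g. weights = weights - step.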
    • Constructor Summary

      Constructors
      Constructor                                             Description
      Adam(double learningRate)                               Initialize the optimizer.
      Adam(double learningRate, double beta1, double beta2)   Initialize the optimizer.
    • Method Summary

      Modifier and Type   Method                  Description
      Matrix              apply(Matrix g)         Apply the optimizer.
      Optimizer           cloneSettings()         Return a clone of this optimizer's settings.
      void                multiplyLR(double d)    Multiply the learning rate.
      • Methods inherited from class java.lang.Object

        clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
    • Constructor Detail

      • Adam

        public Adam(double learningRate)
        Initialize the optimizer.
        Parameters:
        learningRate - the learning rate
      • Adam

        public Adam(double learningRate,
                    double beta1,
                    double beta2)
        Initialize the optimizer.
        Parameters:
        learningRate - the learning rate
        beta1 - the exponential decay rate of the first moment estimate (default is 0.9)
        beta2 - the exponential decay rate of the second moment estimate (default is 0.999)
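
        For context, beta1 and beta2 are the exponential decay rates of Adam's first- and second-moment estimates (Kingma & Ba, 2014). A self-contained sketch of the standard Adam step on plain double arrays, shown for reference only and not taken from this class's internals (eps is the usual small stabilizing constant, which this API does not expose):

            // Reference implementation of one standard Adam step (t starts at 1).
            static void adamStep(double[] w, double[] g, double[] m, double[] v,
                                 int t, double lr, double beta1, double beta2) {
                double eps = 1e-8;
                for (int i = 0; i < w.length; i++) {
                    m[i] = beta1 * m[i] + (1 - beta1) * g[i];         // first moment: running mean of gradients
                    v[i] = beta2 * v[i] + (1 - beta2) * g[i] * g[i];  // second moment: running mean of squared gradients
                    double mHat = m[i] / (1 - Math.pow(beta1, t));    // bias-corrected first moment
                    double vHat = v[i] / (1 - Math.pow(beta2, t));    // bias-corrected second moment
                    w[i] -= lr * mHat / (Math.sqrt(vHat) + eps);      // parameter update
                }
            }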
    • Method Detail

      • cloneSettings

        public Optimizer cloneSettings()
        Description copied from class: Optimizer
        Return a clone of this optimizer's settings.
        Specified by:
        cloneSettings in class Optimizer
        Returns:
        the clone
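
        One plausible use, assuming the clone shares the learning rate and beta settings but none of the per-parameter moment state (an assumption about this API, not a documented guarantee), is keeping a template and cloning a fresh optimizer per weight matrix:

            // Hedged sketch: one optimizer instance per parameter matrix.
            Optimizer template   = new Adam(0.001, 0.9, 0.999);
            Optimizer forWeights = template.cloneSettings();  // same settings, fresh state (assumed)
            Optimizer forBiases  = template.cloneSettings();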
      • multiplyLR

        public void multiplyLR(double d)
        Description copied from class: Optimizer
        Multiply the learning rate.

        Used to scale the learning rate for different batch sizes.

        Specified by:
        multiplyLR in class Optimizer
        Parameters:
        d - the factor to multiply the learning rate by
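
        The batch-size note suggests the common linear-scaling heuristic: when the batch size changes by some factor, multiply the learning rate by the same factor. A sketch of that heuristic (a convention, not a guarantee documented by this library):

            // Hedged sketch: going from batch size 32 to 128 scales the learning rate by 4.
            Optimizer opt = new Adam(0.001);
            int oldBatch = 32, newBatch = 128;
            opt.multiplyLR((double) newBatch / oldBatch);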
      • apply

        public Matrix apply(Matrix g)
        Description copied from class: Optimizer
        Apply the optimizer.
        Specified by:
        apply in class Optimizer
        Parameters:
        g - the error gradients calculated from backpropagation
        Returns:
        the adjusted gradients produced by the optimizer
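
        A sketch of one training step built around apply; `backprop` and `sub` are hypothetical stand-ins for however gradients are computed and matrices subtracted in this library:

            // Hedged sketch: apply() transforms raw gradients into the actual update.
            Matrix g = backprop(batch);   // hypothetical: dLoss/dWeights from backpropagation
            Matrix step = adam.apply(g);  // Adam-adjusted gradients
            weights = weights.sub(step);  // hypothetical Matrix subtraction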