Package me.yixqiao.jlearn.optimizers
Class Adam
java.lang.Object
    me.yixqiao.jlearn.optimizers.Optimizer
        me.yixqiao.jlearn.optimizers.Adam
public class Adam extends Optimizer
The Adam (adaptive moment estimation) optimizer.
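For reference, the standard Adam update (Kingma and Ba, 2015) is given below. This page does not show the class's internal arithmetic, so this is the conventional formulation rather than a guaranteed match:

    m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t
    v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2
    \hat{m}_t = m_t / (1 - \beta_1^t), \quad \hat{v}_t = v_t / (1 - \beta_2^t)
    \theta_t = \theta_{t-1} - \eta \, \hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)

where g_t is the gradient, \eta the learning rate, and \epsilon a small constant for numerical stability.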
Method Summary

Modifier and Type    Method                  Description
Matrix               apply(Matrix g)         Apply the optimizer.
Optimizer            cloneSettings()         Return a clone of this optimizer's settings.
void                 multiplyLR(double d)    Multiply the learning rate.
Constructor Detail
Adam
public Adam(double learningRate)
Initialize the optimizer.
Parameters:
    learningRate - the learning rate
Adam
public Adam(double learningRate, double beta1, double beta2)
Initialize the optimizer.
Parameters:
    learningRate - the learning rate
    beta1 - beta 1 (default is 0.9)
    beta2 - beta 2 (default is 0.999)
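A hedged usage sketch follows. The Matrix class, its import path (me.yixqiao.jlearn.matrix.Matrix), and its constructor are assumptions for illustration, since this page does not show them:

    import me.yixqiao.jlearn.matrix.Matrix; // assumed import path
    import me.yixqiao.jlearn.optimizers.Adam;

    public class AdamExample {
        public static void main(String[] args) {
            // Default betas of 0.9 and 0.999.
            Adam opt = new Adam(0.001);

            // Explicit betas.
            Adam optCustom = new Adam(0.001, 0.9, 0.999);

            // Apply the optimizer to a gradient matrix; the
            // Matrix constructor used here is hypothetical.
            Matrix gradient = new Matrix(3, 3);
            Matrix update = opt.apply(gradient);
        }
    }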
Method Detail
cloneSettings
public Optimizer cloneSettings()
Description copied from class: Optimizer
Return a clone of this optimizer's settings.
Specified by:
    cloneSettings in class Optimizer
Returns:
    the clone
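A brief sketch of a likely use: because an optimizer accumulates per-parameter state, a configured instance is typically cloned once per parameter matrix. The per-matrix wiring below is illustrative, not part of this page:

    Adam template = new Adam(0.001, 0.9, 0.999);
    // Copies settings (learning rate, betas) only, not any
    // accumulated moment estimates, so each parameter matrix
    // can track its own state independently.
    Optimizer forWeights = template.cloneSettings();
    Optimizer forBiases = template.cloneSettings();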
multiplyLR
public void multiplyLR(double d)
Description copied from class: Optimizer
Multiply the learning rate. Used to scale the learning rate for different batch sizes.
Specified by:
    multiplyLR in class Optimizer
Parameters:
    d - the amount to multiply the learning rate by
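For example, a caller training with a larger batch might scale the learning rate by the batch-size ratio. The linear-scaling rule below is a common heuristic, not something this page prescribes:

    Adam opt = new Adam(0.001);
    int baseBatchSize = 32;
    int actualBatchSize = 128;
    // Linear scaling heuristic: multiply the learning rate by the
    // ratio of the actual batch size to the base batch size.
    opt.multiplyLR((double) actualBatchSize / baseBatchSize);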