Package me.yixqiao.jlearn.optimizers

Class Adam

java.lang.Object
    me.yixqiao.jlearn.optimizers.Optimizer
        me.yixqiao.jlearn.optimizers.Adam

public class Adam
extends Optimizer

Adam optimizer (adaptive moment estimation).
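For reference, the update rule that gives Adam its name (Kingma & Ba, 2015) is sketched below. This page does not state which variant the class implements, so the bias-correction terms and the epsilon constant should be treated as assumptions:

    \begin{aligned}
    m_t &= \beta_1\, m_{t-1} + (1-\beta_1)\, g_t \\
    v_t &= \beta_2\, v_{t-1} + (1-\beta_2)\, g_t^2 \\
    \hat{m}_t &= m_t / (1-\beta_1^t), \qquad \hat{v}_t = v_t / (1-\beta_2^t) \\
    \theta_t &= \theta_{t-1} - \alpha\, \hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)
    \end{aligned}

Here \alpha corresponds to the learningRate parameter, \beta_1 to beta1 (0.9 by default), and \beta_2 to beta2 (0.999 by default).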
Method Summary

Modifier and Type    Method                  Description
Matrix               apply(Matrix g)         Apply the optimizer.
Optimizer            cloneSettings()         Return a clone of this optimizer's settings.
void                 multiplyLR(double d)    Multiply the learning rate.
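A minimal sketch of how apply(Matrix g) might fit into a training step. Only the signature is documented here; the Matrix import path and the interpretation of the return value as the adjusted update are assumptions.

    import me.yixqiao.jlearn.matrix.Matrix; // import path assumed
    import me.yixqiao.jlearn.optimizers.Optimizer;

    public class ApplySketch {
        // Hypothetical training-step fragment: feed the raw gradient to the
        // optimizer and get back the (assumed Adam-adjusted) weight update.
        static Matrix adamStep(Optimizer opt, Matrix grad) {
            return opt.apply(grad);
        }
    }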
Constructor Detail

Adam

public Adam(double learningRate)

Initialize the optimizer.

Parameters:
    learningRate - learning rate

Adam

public Adam(double learningRate, double beta1, double beta2)

Initialize the optimizer.

Parameters:
    learningRate - learning rate
    beta1 - beta 1 (default is 0.9)
    beta2 - beta 2 (default is 0.999)
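A construction sketch using only the two documented constructors; the hyperparameter values shown are illustrative, not recommendations from this page.

    import me.yixqiao.jlearn.optimizers.Adam;
    import me.yixqiao.jlearn.optimizers.Optimizer;

    public class AdamConstruction {
        public static void main(String[] args) {
            // Learning rate only; beta1 and beta2 take their defaults (0.9, 0.999).
            Optimizer defaults = new Adam(0.001);

            // All three hyperparameters given explicitly.
            Optimizer explicit = new Adam(0.001, 0.9, 0.999);
        }
    }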
Method Detail

cloneSettings

public Optimizer cloneSettings()

Description copied from class: Optimizer
Return a clone of this optimizer's settings.

Specified by:
    cloneSettings in class Optimizer
Returns:
    the clone
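A sketch of one plausible use of cloneSettings(): giving each layer of a model its own optimizer instance with the same hyperparameters. That the clone carries settings but not accumulated moment state is an assumption suggested by the word "settings", not something this page confirms.

    // One instance serves as a hyperparameter template; each layer
    // receives an independent clone so per-layer optimizer state
    // (assumed to include moment estimates) is not shared.
    Optimizer template = new Adam(0.001);
    Optimizer optForLayer1 = template.cloneSettings();
    Optimizer optForLayer2 = template.cloneSettings();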
multiplyLR

public void multiplyLR(double d)

Description copied from class: Optimizer
Multiply the learning rate. Used to scale the learning rate for different batch sizes.

Specified by:
    multiplyLR in class Optimizer
Parameters:
    d - amount to multiply the learning rate by
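A sketch of the batch-size scaling the description alludes to. The linear-scaling heuristic (learning rate proportional to batch size) is an assumption here, not something this page prescribes.

    Optimizer opt = new Adam(0.001);
    int oldBatch = 32;
    int newBatch = 128;
    // Scale the learning rate in proportion to the batch-size change
    // (linear-scaling heuristic, assumed): 0.001 becomes 0.004.
    opt.multiplyLR((double) newBatch / oldBatch);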