ADAM

ADAM stochastic optimizer.

See Kingma and Ba, "Adam: A Method for Stochastic Optimization" (ICLR 2015).

class ADAM : SGDOptimizer {
    float[][] W;       // weights being optimized
    float[][] grad;    // current gradient of the loss w.r.t. W
    float eps;         // small constant added to the denominator for numerical stability
    float beta1_0;     // base exponential decay rate for the first-moment estimates
    float beta2_0;     // base exponential decay rate for the second-moment estimates
    float beta1;       // running power of beta1_0 (likely used for bias correction)
    float beta2;       // running power of beta2_0 (likely used for bias correction)
    float[][] M;       // first-moment (mean) estimates of the gradient
    float[][] S;       // second-moment (uncentered variance) estimates
}
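The fields above map directly onto the Adam update rule. A minimal scalar sketch of the update applied to a single weight is shown below; `adamStep` is a hypothetical helper (not part of this API), and the assumption that `beta1`/`beta2` hold the running powers of `beta1_0`/`beta2_0` for bias correction follows the original paper, not this implementation:

```d
import std.math : sqrt, pow, abs;

// Hypothetical per-weight Adam update. m and s correspond to one entry
// of the M (first moment) and S (second moment) arrays above; t is the
// 1-based step count.
float adamStep(float w, float g, ref float m, ref float s,
               float lr, float beta1_0, float beta2_0, float eps, int t)
{
    m = beta1_0 * m + (1.0f - beta1_0) * g;       // first-moment EMA
    s = beta2_0 * s + (1.0f - beta2_0) * g * g;   // second-moment EMA
    auto mHat = m / (1.0f - pow(beta1_0, t));     // bias-corrected mean
    auto sHat = s / (1.0f - pow(beta2_0, t));     // bias-corrected variance
    return w - lr * mHat / (sqrt(sHat) + eps);    // parameter update
}
```

At t = 1 the bias correction makes mHat equal to the raw gradient, so the first step moves each weight by roughly lr in the direction of -sign(g).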

Examples

auto nn = NeuralNet()
    .stack(DenseData(200))
    .stack(Linear(10));

// ADAM learning with 5 epochs, learning rate 0.1, mini-batch size of 100:
nn.learn(data, "multinomial", ADAM(5, 0.1, 100));
