AdaGrad

Adaptive-gradient (AdaGrad) flavour of stochastic gradient descent: each parameter's step size is scaled down by the square root of the squared gradients accumulated for that parameter, so frequently-updated coordinates take progressively smaller steps.

Constructors

this
this(ulong num_epochs, float lr, ulong mini_batch_sz, float eps_)
Undocumented in source. Judging by the parameter names: num_epochs is the number of passes over the training data, lr the base learning rate, mini_batch_sz the number of examples per gradient update, and eps_ a small stability constant added in the adaptive denominator.

Members

Functions

current_lr
float current_lr(size_t k, size_t j)
Undocumented in source. Be warned that the author may not have intended to support it.
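Though the implementation is undocumented, under the textbook AdaGrad schedule the effective learning rate for coordinate (k, j) is the base rate divided by the root of that coordinate's accumulated squared gradients. A minimal sketch of that rule, using stand-ins for the lr, eps and acc_grad members of the same names (all assumptions, not the library's actual code):

```d
import std.math : sqrt;

// Stand-ins for the class members of the same names (assumed roles).
float lr = 0.1f;     // base learning rate
float eps = 1e-8f;   // numerical-stability term
float[][] acc_grad;  // acc_grad[k][j]: running sum of squared gradients

// Hypothetical per-coordinate learning rate under the textbook AdaGrad rule.
float current_lr_sketch(size_t k, size_t j)
{
    return lr / (sqrt(acc_grad[k][j]) + eps);
}
```

Coordinates that have seen large gradients thus learn more slowly over time, which is the defining behaviour of AdaGrad.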
dup
AdaGrad dup()
Undocumented in source. Be warned that the author may not have intended to support it.
params_to_optimize
void params_to_optimize(float[][] W_, float[][] grad_)
Undocumented in source. Be warned that the author may not have intended to support it.
register
void register(NeuralLayer layer)
Undocumented in source. Be warned that the author may not have intended to support it.
toString
string toString()
Undocumented in source. Be warned that the author may not have intended to support it.
update
void update(NeuralLayer layer, float[] ext_grad)
Undocumented in source. Be warned that the author may not have intended to support it.
update
void update(NeuralLayer layer, SparseF[] ext_grad)
Undocumented in source. Be warned that the author may not have intended to support it.
update_general
void update_general(NeuralLayer layer, V[] ext_grad)
Undocumented in source. Be warned that the author may not have intended to support it.

Mixins

__anonymous
mixin opCallNew
Undocumented in source.

Static functions

ada_op1
void ada_op1(float[] row, float[] g)
Undocumented in source. Be warned that the author may not have intended to support it.
ada_op2
void ada_op2(float[] row, float[] g, float[] sg, float lr_, float eps_)
Undocumented in source. Be warned that the author may not have intended to support it.
update_matrix
void update_matrix(float[][] w, float[][] g, float[][] sum_g_sq, float lr_, float eps_)
Undocumented in source. Be warned that the author may not have intended to support it.
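The static helpers above are presumably the vectorized pieces of the AdaGrad step that update_matrix applies row by row. A hedged sketch of what a routine with ada_op2's signature would compute under the standard rule, where sg accumulates squared gradients (an assumption; the real implementation is undocumented):

```d
import std.math : sqrt;

// Sketch of the textbook AdaGrad step for one row of the weight matrix.
// row: parameters, g: current gradient, sg: running sum of squared
// gradients, lr_: base learning rate, eps_: numerical-stability term.
void adagrad_row_step(float[] row, float[] g, float[] sg,
                      float lr_, float eps_)
{
    foreach (i; 0 .. row.length)
    {
        sg[i]  += g[i] * g[i];                        // accumulate g^2
        row[i] -= lr_ * g[i] / (sqrt(sg[i]) + eps_);  // scaled descent step
    }
}
```

update_matrix would then apply such a step to every (w[k], g[k], sum_g_sq[k]) triple; ada_op1, taking only a row and a gradient, plausibly performs the accumulation half on its own.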

Variables

W
float[][] W;
Undocumented in source.
_update
void delegate(NeuralLayer, float[]) _update;
Undocumented in source.
acc_grad
float[][] acc_grad;
Undocumented in source.
eps
float eps;
Undocumented in source.
grad
float[][] grad;
Undocumented in source.

Mixed In Members

From mixin opCallNew

opCall
auto opCall(T args)
Undocumented in source. Be warned that the author may not have intended to support it.

Inherited Members

From SGDOptimizer

num_epochs
ulong num_epochs;
Undocumented in source.
mini_batch_sz
ulong mini_batch_sz;
Undocumented in source.
lr
float lr;
Undocumented in source.
cnt
ulong cnt;
Undocumented in source.
priors
AdditiveLinearPrior[] priors;
Undocumented in source.
prox
ProxyLinearPrior prox;
Undocumented in source.
learn
void learn(NeuralNet nn, D data, S delegate(R net_out, ref T ex, ref V[] grad) grad_f, bool verbose, uint num_cores)
Undocumented in source. Be warned that the author may not have intended to support it.
current_lr
float current_lr(size_t k, size_t j)
Undocumented in source.
register
void register(NeuralLayer layer)
Undocumented in source.
update
void update(NeuralLayer layer, float[] ext_grad)
Undocumented in source.
update
void update(NeuralLayer layer, SparseF[] ext_grad)
Undocumented in source.
dup
Optimizer dup()
Undocumented in source.
toString
string toString()
Undocumented in source. Be warned that the author may not have intended to support it.

Examples

auto nn = NeuralNet()
    .stack(SparseData(1000))
    .stack(Linear(1));

// AdaGrad learning with 5 epochs, learning rate 0.1, mini-batch size of 100:
nn.learn(data, "square", AdaGrad(5, 0.1, 100));

Meta