DropOut

DropOut layer accepting either all-dense or all-sparse parents.

Feature rescaling happens automatically at training time, so outputs need no adjustment at prediction time.
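
This train-time rescaling is the standard inverted-dropout scheme: kept units are scaled by 1 / (1 - drop_rate) so that their expected value matches prediction time. A minimal standalone sketch of that logic, not the library's internal implementation (the function name and RNG choice are assumptions):

import std.random : uniform01;

// Inverted-dropout forward pass over a dense activation vector.
// Kept units are scaled by 1 / (1 - drop_rate) so the expected
// activation matches prediction time, where nothing is applied.
void dropout_forward(float[] acts, float drop_rate, bool training)
{
    if (!training)
        return; // predict: identity, thanks to train-time rescaling
    immutable scale = 1.0f / (1.0f - drop_rate);
    foreach (ref a; acts)
        a = uniform01!float() < drop_rate ? 0.0f : a * scale;
}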

class DropOut : NeuralLayer {
    float _drop_rate;   // probability of dropping a unit at training time
    float _scale_ratio; // rescaling factor for kept units (presumably 1 / (1 - _drop_rate))
    void delegate() _predict;         // forward pass, bound for dense or sparse parents
    void delegate(float[]) _acc_grad; // gradient accumulation, bound likewise
}
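
The two delegates suggest the layer binds its forward and backward passes once, depending on whether its parents are all dense or all sparse. A hedged, self-contained sketch of that dispatch pattern (all names here are assumptions, not the library's code):

// Toy illustration of delegate dispatch on parent type.
class Toy
{
    void delegate() _predict;

    this(bool parents_dense)
    {
        // Bind the forward-pass delegate once, at wiring time.
        _predict = parents_dense ? &predict_dense : &predict_sparse;
    }

    void predict_dense()  {} // would walk the parents' dense out_d vectors
    void predict_sparse() {} // would walk the parents' sparse out_s vectors
}

void main()
{
    auto t = new Toy(true);
    t._predict(); // runs predict_dense, no per-call branching
}

Binding once avoids testing the parents' type on every prediction call.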

Inherited Members

From NeuralLayer

name
string name;
type
LayerT type;
dim_in
size_t dim_in;

total input dimension of this layer (sum of output dimensions of its parents)

dim_out
size_t dim_out;

total output dimension of this layer

children
NeuralLayer[] children;

array referencing all the children of this layer

parents
NeuralLayer[] parents;

array referencing all the parents of this layer

learnable
bool learnable [@property getter]

whether or not this layer has any parameters to be learnt

out_d
float[] out_d;

dense output vector of this layer (might be unused)

out_s
SparseF[] out_s;

sparse output vector of this layer (might be unused)

backgrads
float[][] backgrads;

array of gradients to backpropagate to parents

Examples

auto l = DropOut(0.3); // Drop 30% of the input neurons at random.
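
In practice a DropOut layer is stacked between other layers of a network. A usage sketch, assuming the NeuralNet, DenseData and Linear types documented elsewhere in this library:

auto nn = NeuralNet()
    .stack(DenseData(50))  // 50 dense input features
    .stack(DropOut(0.3))   // drop 30% of them at random during training
    .stack(Linear(1));     // single linear output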
