- _init
void _init()
Undocumented in source. Be warned that the author may not have intended to support it.
- deserialize
void deserialize(Serializer s)
Undocumented in source. Be warned that the author may not have intended to support it.
- dup
NeuralLayer dup()
Undocumented in source. Be warned that the author may not have intended to support it.
- expand
void expand(T buff)
Undocumented in source. Be warned that the author may not have intended to support it.
- monitor
void monitor(ushort id, uint hash, float val)
Undocumented in source. Be warned that the author may not have intended to support it.
- predict
void predict()
Undocumented in source. Be warned that the author may not have intended to support it.
- reset
void reset()
Undocumented in source. Be warned that the author may not have intended to support it.
- serialize
void serialize(Serializer s)
Undocumented in source. Be warned that the author may not have intended to support it.
- share_params
void share_params(NeuralLayer l)
Undocumented in source. Be warned that the author may not have intended to support it.
- input
Variant input;
Undocumented in source.
- forward_prop
void forward_prop(T obs)
Undocumented in source. Be warned that the author may not have intended to support it.
- predict
void predict()
Undocumented in source.
- accumulate_grad
void accumulate_grad(V[] grad)
Undocumented in source. Be warned that the author may not have intended to support it.
- backward_prop
void backward_prop(V[] grad)
Undocumented in source. Be warned that the author may not have intended to support it.
- num_params
ulong num_params [@property getter]
Undocumented in source. Be warned that the author may not have intended to support it.
- recompute_topology
void recompute_topology()
Undocumented in source. Be warned that the author may not have intended to support it.
- allocate_interface
void allocate_interface()
Undocumented in source. Be warned that the author may not have intended to support it.
On-the-fly polynomial kernel expansion of sparse input.
This layer performs polynomial kernel expansion of a set of sparse features based on a group attribute of those features. The features fed to this layer must be a SparseFG[]. It assumes that the feature ids are uniform random numbers (hashes), so that a cross-feature hash can be generated efficiently by simply XOR-ing together the individual hashes of the monomial. This layer is meant to be used as part of a NeuralNet() topology at test time; at training time it is preferable to run the expansion outside the net, so that it runs only once while building the dataset rather than rebuilding the cross-features at every pass.
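The XOR trick described above can be sketched as follows. This is a minimal illustration, not part of the library's API; the `cross_hash` name and the example hash values are hypothetical. It assumes only what the paragraph states: each single feature carries a uniform random hash, and the hash of a cross-feature (monomial) is the XOR of its components.

```d
import std.stdio;

// Hypothetical helper: combine the uniform-random hashes of a
// monomial's single features into one cross-feature hash by XOR.
// XOR of independent uniform hashes is itself uniform, and the
// combination is cheap: no string concatenation or re-hashing.
uint cross_hash(const uint[] single_hashes)
{
    uint h = 0;
    foreach (s; single_hashes)
        h ^= s;
    return h;
}

void main()
{
    // e.g. hashes of two single features forming a degree-2 monomial
    uint a = 0xDEADBEEFu;
    uint b = 0x12345678u;
    writeln(cross_hash([a, b]));
}
```

Because XOR is associative and commutative, the same cross-feature hash is produced regardless of the order in which the single features appear.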