Input layer representing a dense float[].
DropOut layer accepting either all-dense or all-sparse parents.
Linear layer accepting sparse or dense parents and outputting a dense vector.
ReLU activation layer accepting dense parents.
Scaled Exponential Linear Unit (SELU) activation layer accepting dense parents. See Self-Normalizing Neural Networks (Klambauer et al., 2017) for details.
Input layer representing a sparse array SparseF[] of (uint, float) index/value pairs.
On-the-fly polynomial kernel expansion of sparse input.
TanH activation layer accepting dense parents.
Implementation of the various computational layers. Minimal, illustrative sketches of the dense, dropout, and sparse/polynomial forward passes are given below.
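
To make the dense layers concrete, the following sketch shows plausible forward passes for the Linear, ReLU, TanH, and SELU layers. The class and method names are hypothetical and not this library's API; the Linear layer is assumed to compute W x + b, and only the SELU constants (taken from the Self-Normalizing Neural Networks paper) are treated as fixed.

```csharp
using System;

// Hypothetical helper showing the forward passes of the dense layers; not the library's API.
public static class DenseLayerSketch
{
    // Linear layer: y = W x + b, producing a dense output vector.
    public static float[] Linear(float[,] weights, float[] bias, float[] input)
    {
        int rows = weights.GetLength(0), cols = weights.GetLength(1);
        var output = new float[rows];
        for (int i = 0; i < rows; i++)
        {
            float sum = bias[i];
            for (int j = 0; j < cols; j++)
                sum += weights[i, j] * input[j];
            output[i] = sum;
        }
        return output;
    }

    // ReLU: max(0, x) applied element-wise to a dense parent.
    public static float[] ReLU(float[] x)
    {
        var y = new float[x.Length];
        for (int i = 0; i < x.Length; i++) y[i] = Math.Max(0f, x[i]);
        return y;
    }

    // TanH: hyperbolic tangent applied element-wise to a dense parent.
    public static float[] TanH(float[] x)
    {
        var y = new float[x.Length];
        for (int i = 0; i < x.Length; i++) y[i] = (float)Math.Tanh(x[i]);
        return y;
    }

    // SELU, with the constants from "Self-Normalizing Neural Networks" (Klambauer et al., 2017):
    // lambda * x for x > 0, lambda * alpha * (exp(x) - 1) otherwise.
    public static float[] SELU(float[] x)
    {
        const float alpha = 1.6732632f;
        const float lambda = 1.0507010f;
        var y = new float[x.Length];
        for (int i = 0; i < x.Length; i++)
            y[i] = x[i] > 0f ? lambda * x[i] : lambda * alpha * ((float)Math.Exp(x[i]) - 1f);
        return y;
    }
}
```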
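
The DropOut layer can be sketched in the same spirit. The version below applies inverted dropout to a dense parent; the exact scaling scheme, and how the layer masks all-sparse parents, are assumptions rather than a description of the actual implementation.

```csharp
using System;

// Hypothetical sketch of inverted dropout over a dense parent; the real layer
// also accepts all-sparse parents, where masking would drop (index, value) entries.
public static class DropOutSketch
{
    public static float[] Forward(float[] input, float dropRate, Random rng, bool training)
    {
        // At inference time dropout is the identity.
        if (!training || dropRate <= 0f) return (float[])input.Clone();

        float keepRate = 1f - dropRate;
        var output = new float[input.Length];
        for (int i = 0; i < input.Length; i++)
            // Keep a unit with probability keepRate and rescale it so the
            // expected activation is unchanged (inverted dropout).
            output[i] = rng.NextDouble() < keepRate ? input[i] / keepRate : 0f;
        return output;
    }
}
```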
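
For the sparse side, the sketch below assumes SparseF is a plain (uint index, float value) pair as described above and illustrates a degree-2 polynomial expansion computed on the fly; the restriction to degree 2 and the index-combination (hashing) scheme are illustrative assumptions, not the library's scheme.

```csharp
using System;
using System.Collections.Generic;

// Assumed layout of a sparse feature: a (uint index, float value) pair.
public readonly struct SparseF
{
    public readonly uint Index;
    public readonly float Value;
    public SparseF(uint index, float value) { Index = index; Value = value; }
}

public static class PolynomialExpansionSketch
{
    // Every pair of active features (a, b) yields a crossed feature whose value is
    // the product of the two input values and whose index is derived from the pair
    // of input indices (here via a simple hash into the output dimension).
    public static SparseF[] ExpandDegree2(SparseF[] input, uint outputDimension)
    {
        var expanded = new List<SparseF>(input.Length * (input.Length + 1) / 2);
        for (int a = 0; a < input.Length; a++)
        {
            for (int b = a; b < input.Length; b++)
            {
                uint crossed = (input[a].Index * 2654435761u + input[b].Index) % outputDimension;
                expanded.Add(new SparseF(crossed, input[a].Value * input[b].Value));
            }
        }
        return expanded.ToArray();
    }
}
```

Computing the crossed features on the fly in this way only touches the features that are actually active in each example, which is what makes a polynomial expansion of a sparse input practical compared with materializing the full quadratic feature space in preprocessing.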