Name and add a layer to the net, without wiring it.
Add a layer to the net, without wiring it.
Name and add a root to the net.
Add a root to the net.
Remove any optimizer defined on layers of the net.
Return a copy of the net.
Initialize at random all the parameters of the net.
Train neural network on some data, using specified gradient callback and optimizer.
Train neural network on a dataset, using a predefined loss.
Train neural network on some data, using a gradient callback.
Train neural network on some data, using a predefined loss.
Compute the prediction of the net for $(PARAM v): run forward propagation and output the predicted vector.
Compute the prediction of the net when passing the arguments to the root(s) of the net.
Reset any internal state variables of the net.
Dump the neural net (topology and weight values) to the specified path.
Discard local weights and use those of the target net instead; the net keeps its own internal state. Useful for Hogwild!-style SGD implementations.
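A minimal sketch of the lock-free SGD pattern this enables. The method names `dup` and `share_params` are assumptions inferred from the one-line descriptions in this section, not confirmed signatures:

```d
auto master = NeuralNet()
    .stack(DenseData(400))
    .stack(Linear(10));

// Each worker trains its own copy of the topology, but every copy
// points at the master's weights, so gradient updates land in the
// shared parameters without locking (Hogwild!-style).
foreach(worker_id; 0 .. 4)
{
    auto worker = master.dup;      // copy of the net (assumed API)
    worker.share_params(master);   // drop local weights, use the master's
    // ... run SGD on this worker's shard of the data in its own thread ...
}
```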
Stack a layer on top of the former leaf of the net.
Stack a layer on top of the former leaf of the net.
Create a directed edge between parent and child nodes.
Create a directed edge between parent and child nodes.
Return the total number of learnable parameters in the net.
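As a worked example of the count this should return: a 400-dimensional dense input feeding a 10-output linear layer holds a 400x10 weight matrix plus, assuming the layer carries a bias vector (an assumption about `Linear`), 10 bias terms:

```d
auto nn = NeuralNet()
    .stack(DenseData(400))
    .stack(Linear(10));
// Weights: 400 * 10 = 4000; bias: 10 (assuming Linear includes one).
// The net should therefore report 400 * 10 + 10 = 4010 learnable parameters.
```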
Return a reference to the leaf of the net.
Return a reference to the dense output vector of the leaf of the net.
Deserialize the neural net from the specified path.
Edges of the graph: src -> [dst1, ..., dstk].
All nodes of the computational graph.
Map: name -> layer.
Array of all the leaves.
Array of all the roots.
auto nn = NeuralNet()
    .stack(DenseData(400))
    .stack(Linear(10));
// nn is a network working on 400-dimensional dense vectors
// and predicting a 10-dimensional vector
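For a topology that is not a simple stack, the docstrings above describe adding roots and layers without wiring them, then creating directed edges explicitly. A hedged sketch of that flow; the exact names and signatures of `add_root`, `add` and `wire` are assumptions based on the one-line descriptions, not a confirmed API:

```d
auto nn = NeuralNet();
nn.add_root("input", DenseData(400)); // named root, not wired yet
nn.add("hidden", Linear(50));         // named layer, not wired yet
nn.add("output", Linear(10));
nn.wire("input", "hidden");           // directed edge parent -> child
nn.wire("hidden", "output");
```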
Neural-network abstraction.