vectorflow.optimizers

Implementations of various stochastic optimizers.

The default parallelization strategy across cores is Hogwild!, a lock-free strategy in which concurrent parameter updates are allowed to race. As a result, training a network is non-deterministic whenever more than one core is used. Hogwild! works well as long as the data access pattern is sparse enough, i.e. each update touches only a small fraction of the parameters. If the model has few, densely updated parameters and many cores are used, update collisions become frequent and the optimization can fail to converge.
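As a minimal sketch, the number of cores is selected when calling learn(). The net construction, the Obs field names and the learn() signature below follow the usage shown in the project README; treat the exact values and labels as placeholders:

import vectorflow;

struct Obs {
    float label;       // logistic loss conventionally expects labels in {-1, 1}
    float[] features;  // dense input features, matching the DenseData root
}

void main()
{
    auto net = NeuralNet()
        .stack(DenseData(3))   // 3 dense input features
        .stack(Linear(1));     // single linear output

    auto data = [
        Obs( 1.0f, [0.2f, 0.5f, 0.1f]),
        Obs(-1.0f, [0.9f, 0.1f, 0.4f])
    ];

    // Last argument is the number of cores. With more than one core,
    // Hogwild! updates race lock-free and results vary run to run.
    net.learn(data, "logistic", new ADAM(10, 0.1, 1), true, 4);
}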

Members

Classes

ADAM
class ADAM

ADAM stochastic optimizer.
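
A hypothetical construction sketch, assuming the three-argument constructor shown in the project README (number of passes, learning rate, mini-batch size):

auto opt = new ADAM(
    10,   // number of passes over the data
    0.1,  // learning rate
    500   // mini-batch size
);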

AdaGrad
class AdaGrad

AdaGrad stochastic optimizer.
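
A sketch assuming AdaGrad takes the same (passes, learning rate, mini-batch size) arguments as ADAM; verify the exact parameters against the source:

auto opt = new AdaGrad(
    10,   // number of passes over the data
    0.05, // learning rate
    500   // mini-batch size
);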

SGDOptimizer
class SGDOptimizer
Undocumented in source.

ShadowSGDOptimizer
class ShadowSGDOptimizer
Undocumented in source.

Interfaces

Optimizer
interface Optimizer
Undocumented in source.
