Base class of all user-defined learners. To implement your own learning
algorithm, derive from this class and override the update() method.
UserLearner(parameters, lr_schedule, as_matrix = TRUE)
Argument | Description
---|---
parameters | list of network parameters
lr_schedule | learning rate schedule (output of learning_rate_schedule())
as_matrix | logical; defaults to TRUE
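For illustration, below is a minimal sketch of a custom learner written against the CNTK Python API, which these bindings wrap; there the analogous constructor argument is named as_numpy rather than as_matrix, and the class name MySgd is made up for this example. The subclass forwards the parameters and learning-rate schedule to the base constructor and overrides update() to take a plain SGD step.

```python
import cntk as C
from cntk.learners import UserLearner

# Sketch only: a plain-SGD learner derived from UserLearner (Python API).
class MySgd(UserLearner):
    def __init__(self, parameters, lr_schedule):
        # With the default as_numpy=True, gradients arrive in update()
        # as NumPy arrays keyed by Parameter.
        super(MySgd, self).__init__(parameters, lr_schedule)

    def update(self, gradient_values, training_sample_count, sweep_end):
        # Current learning rate, normalized by the number of samples
        # in the minibatch.
        eta = self.learning_rate() / training_sample_count
        for p, g in gradient_values.items():
            p.value = p.value - eta * g  # plain SGD step
        return True  # returning False would signal that learning has stopped

# Hypothetical usage with a Trainer:
# lr_schedule = C.learning_rate_schedule(0.1, C.UnitType.minibatch)
# trainer = C.Trainer(model, (loss, metric), MySgd(model.parameters, lr_schedule))
```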
Certain optimizers (such as AdaGrad) require additional storage. This can be allocated and initialized during construction.
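As a sketch of that pattern (again in the Python API, with illustrative names such as MyAdagrad and epsilon), the constructor below allocates one squared-gradient accumulator per parameter, and update() then reads and modifies that storage on every call.

```python
import numpy as np
from cntk.learners import UserLearner

# Sketch only: an AdaGrad-like learner whose extra per-parameter storage
# is allocated and initialized during construction.
class MyAdagrad(UserLearner):
    def __init__(self, parameters, lr_schedule, epsilon=1e-8):
        super(MyAdagrad, self).__init__(parameters, lr_schedule)
        self.epsilon = epsilon
        # One zero-initialized squared-gradient accumulator per parameter,
        # keyed by the parameter's uid so lookup does not rely on object identity.
        self.accumulators = {p.uid: np.zeros_like(p.value) for p in parameters}

    def update(self, gradient_values, training_sample_count, sweep_end):
        eta = self.learning_rate()
        for p, g in gradient_values.items():
            acc = self.accumulators[p.uid]
            acc += g * g  # accumulate squared gradients in place
            p.value = p.value - eta * g / (np.sqrt(acc) + self.epsilon)
        return True
```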
See also: update_user_learner()