Base class for all user-defined learners. To implement your own learning algorithm, derive from this class and override the update() method.

UserLearner(parameters, lr_schedule, as_matrix = TRUE)

Arguments

parameters

– list of network parameters

lr_schedule

– learning rate schedule (output of learning_rate_schedule())

as_matrix

– whether gradient values are passed to update() as matrices (default TRUE)

Details

Certain optimizers (such as AdaGrad) require additional storage. This can be allocated and initialized during construction.
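As a rough illustration of this pattern, the plain-Python sketch below (assumed names such as UserLearnerSketch and AdaGradLearner; this is not the library's actual API) shows a derived learner that allocates its extra storage, the per-parameter accumulator of squared gradients, during construction and then uses it in the overridden update():

```python
import math

class UserLearnerSketch:
    """Minimal stand-in for the base class: holds parameters and a
    learning-rate schedule; derived classes override update()."""
    def __init__(self, parameters, lr_schedule):
        self.parameters = parameters      # dict: name -> list of floats
        self.lr_schedule = lr_schedule    # callable: epoch -> learning rate

    def update(self, gradients, epoch):
        raise NotImplementedError("derive from this class and override update()")

class AdaGradLearner(UserLearnerSketch):
    def __init__(self, parameters, lr_schedule, eps=1e-8):
        super().__init__(parameters, lr_schedule)
        self.eps = eps
        # Additional storage required by AdaGrad, allocated and
        # initialized at construction time: one accumulator of
        # squared gradients per parameter element.
        self.accum = {name: [0.0] * len(p) for name, p in parameters.items()}

    def update(self, gradients, epoch):
        lr = self.lr_schedule(epoch)
        for name, p in self.parameters.items():
            acc = self.accum[name]
            for i, g in enumerate(gradients[name]):
                acc[i] += g * g
                p[i] -= lr * g / (math.sqrt(acc[i]) + self.eps)
        return True  # signal that the update succeeded

# Usage: one AdaGrad step on a single scalar parameter.
params = {"w": [1.0]}
learner = AdaGradLearner(params, lambda epoch: 0.1)
learner.update({"w": [2.0]}, epoch=0)
print(round(params["w"][0], 6))  # accumulator = 4.0, step ≈ 0.1 → 0.9
```

The key point is that the squared-gradient accumulator lives for the learner's whole lifetime, so the constructor (not update()) is the right place to size and zero it.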

See also

update_user_learner