Layer factory function to create an activation layer. Activation functions can be used directly in CNTK, so there is no difference between y = op_relu(x) and y = Activation(op_relu)(x). This layer is useful if one wants to configure the activation function with default options, or when its invocation should be named.

Activation(activation = activation_identity, name = "")

Arguments

activation

(Function, optional, defaults to activation_identity) – the function to apply at the end, e.g. op_relu

name

(string, optional) – the name of the Function instance in the network

Value

A function that accepts one argument and applies the operation to it

Examples

(model <- Dense(500) %>% Activation(op_relu)())
#> Composite(x: Sequence[tensor]) -> Sequence[tensor]
# is the same as
(model <- Dense(500) %>% op_relu)
#> Composite(x: Sequence[tensor]) -> Sequence[tensor]
# and also the same as
(model <- Dense(500, activation = op_relu))
#> Dense(x: Sequence[tensor]) -> Sequence[tensor]
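
As described above, wrapping the activation in Activation() also lets its invocation be named. A minimal sketch, reusing the Dense and op_relu bindings from the examples above (the name "f_relu" is illustrative, not part of the API):

# name the activation's invocation in the network via the name argument
(model <- Dense(500) %>% Activation(op_relu, name = "f_relu")())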