This operation computes the cross entropy between the target_vector and the softmax of the output_vector. The elements of target_vector must be non-negative and should sum to 1. The output_vector can contain any values; the function internally applies the softmax to output_vector before computing the cross entropy.

loss_cross_entropy_with_softmax(output_vector, target_vector, axis = -1,
  name = "")

Arguments

output_vector

the unscaled output values (logits) computed by the network

target_vector

the target values as a probability distribution over the labels, usually a one-hot encoded vector

axis

integer (optional): the axis along which the cross entropy is computed

name

string (optional): the name of the Function instance in the network
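
Example

The following is a minimal NumPy sketch of the computation this operation performs, not the library implementation itself; the helper name reference_cross_entropy_with_softmax and the sample values are illustrative only. It applies a numerically stable softmax to the unscaled outputs and then takes the cross entropy against the target distribution along the chosen axis.

import numpy as np

def reference_cross_entropy_with_softmax(output_vector, target_vector, axis=-1):
    # Illustrative helper (not the library function): numerically stable
    # log-softmax of the unscaled outputs.
    shifted = output_vector - np.max(output_vector, axis=axis, keepdims=True)
    log_softmax = shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))
    # Cross entropy: -sum(target * log(softmax(output))) along the chosen axis.
    return -np.sum(target_vector * log_softmax, axis=axis)

output = np.array([1.0, 2.0, 3.0])   # unscaled network outputs
target = np.array([0.0, 0.0, 1.0])   # one-hot target
print(reference_cross_entropy_with_softmax(output, target))  # approximately 0.4076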

References

https://www.cntk.ai/pythondocs/cntk.losses.html#cntk.losses.cross_entropy_with_softmax