Layer factory function to create an embedding layer.

Embedding(shape = NULL, init = init_glorot_uniform(), weights = NULL,
  name = "")

Arguments

shape

list of ints representing the tensor shape of the embedding vectors

init

scalar, matrix, or initializer (defaults to init_glorot_uniform()): initial value of the embedding matrix E

name

string (optional): the name of the Function instance in the network

Details

An embedding is conceptually a lookup table. For every input token (e.g. a word or any category label), the corresponding entry in the lookup table is returned.

In CNTK, discrete items such as words are represented as one-hot vectors. The table lookup is realized as a matrix product, with a matrix whose rows are the embedding vectors. Note that multiplying a matrix from the left by a one-hot vector is the same as copying out the row for which the input vector is 1. CNTK has special optimizations to make this operation as efficient as an actual table lookup if the input is sparse.
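The equivalence of the matrix product and a table lookup can be checked with a minimal NumPy sketch (the table values and vocabulary size here are illustrative, not CNTK code):

```python
import numpy as np

# Hypothetical embedding table: 5-token vocabulary, 3-dimensional
# embeddings. Each row of E is one token's embedding vector.
E = np.array([[0.1, 0.2, 0.3],
              [0.4, 0.5, 0.6],
              [0.7, 0.8, 0.9],
              [1.0, 1.1, 1.2],
              [1.3, 1.4, 1.5]])

# One-hot encoding of token index 2.
x = np.zeros(5)
x[2] = 1.0

# Multiplying E from the left by the one-hot vector copies out row 2,
# which is exactly the table lookup E[2].
assert np.allclose(x @ E, E[2])
```

With a sparse one-hot input, a framework can skip the full product and read the single nonzero row directly, which is what the CNTK optimization mentioned above amounts to.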

The lookup table in this layer is learnable, unless a user-specified one is supplied through the weights parameter. For example, to use an existing embedding table from a file in numpy format, use this:

Embedding(weights = np.load('PATH.npy'))

To initialize a learnable lookup table with a given numpy array that is to be used as the initial value, pass that array to the init parameter (not weights).
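A short NumPy-only sketch of how such a .npy table can be produced and read back; the file name and table contents are illustrative. The loaded array is what would then be passed as weights (fixed table) or init (initial value of a learnable table):

```python
import numpy as np
import os
import tempfile

# Build a small embedding table (5 tokens x 3 dimensions) and save it
# in the format the docs assume: a .npy file readable by np.load.
table = np.random.default_rng(0).normal(size=(5, 3))
path = os.path.join(tempfile.mkdtemp(), 'embedding.npy')
np.save(path, table)

# np.load recovers the identical array from disk.
loaded = np.load(path)
assert np.allclose(loaded, table)
```

The difference between the two parameters is only in what happens afterwards: a table passed via weights stays fixed during training, while one passed via init is just the starting point of a learnable parameter.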

An Embedding instance owns its weight parameter tensor E, and exposes it as an attribute .E.