MaximilianSchreff opened a new pull request, #2237:
URL: https://github.com/apache/systemds/pull/2237

   This PR adds the embedding layer as a built-in operator in our nn/layers 
library. The functionality is similar to torch.nn.Embedding 
(https://pytorch.org/docs/stable/generated/torch.nn.Embedding.html).
   
   The layer receives a vector of indices into an embedding dictionary and 
returns an embedding matrix in which row i is the embedding vector at 
position indices[i] of the dictionary.
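   
   For illustration only, a minimal NumPy sketch of this lookup semantics (the 
names here are hypothetical and not the actual DML API of the new layer):
   
   ```python
   import numpy as np
   
   # Embedding dictionary E: one row per vocabulary entry (vocab_size x d).
   # The forward pass gathers rows of E according to the input indices, so
   # row i of the output equals E[indices[i]].
   def embedding_forward(indices, E):
       return E[indices]  # shape: (len(indices), d)
   
   E = np.arange(12, dtype=np.float64).reshape(4, 3)  # vocab_size=4, d=3
   indices = np.array([2, 0, 2])
   out = embedding_forward(indices, E)
   # out[0] == E[2], out[1] == E[0], out[2] == E[2]
   ```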
   
   This layer is used in every transformer architecture: the indices typically 
come from a tokenizer, and the resulting embedding matrix is the input to the 
actual transformer model.
   
   ### Testing
   - Tested the forward and backward pass for correctness
   - Implemented as a component test in NNComponentTest.java
   - Forward pass checked against manually calculated test cases
   - Backward pass compared against PyTorch's autograd module (see the sketch 
below)
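   
   As a rough illustration of how reference gradients for the backward pass 
can be obtained from PyTorch's autograd (hypothetical shapes and values, not 
the actual test code):
   
   ```python
   import torch
   
   # Reference embedding with a small vocabulary; the gradient w.r.t. the
   # embedding weight is what the nn-layer's backward pass is checked against.
   emb = torch.nn.Embedding(num_embeddings=4, embedding_dim=3)
   indices = torch.tensor([2, 0, 2])
   out = emb(indices)                # forward: (3, 3) embedding matrix
   upstream = torch.ones_like(out)   # some upstream gradient dout
   out.backward(upstream)
   print(emb.weight.grad)            # reference gradient for the weights
   ```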

