Word Embeddings for Neural Networks
As we've now seen, sequence models often use word embeddings to convert word tokens into numeric vectors.
Word embeddings are vectors that numerically encode semantic information about the underlying words. Words that are close to each other in the embedding space tend to have similar meanings or a relatively direct relationship with each other.
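To make this concrete, here is a minimal sketch of an embedding lookup. The vocabulary, embedding size, and vector values below are toy, hand-picked examples (not trained embeddings), chosen only to illustrate that related words end up close together in the vector space:

```python
import numpy as np

# Toy vocabulary mapping word tokens to row indices (illustrative only).
vocab = {"king": 0, "queen": 1, "apple": 2}

# Embedding matrix: one row of floats per word. In a real model these
# rows are learned during training; here they are hand-picked so that
# "king" and "queen" sit close together while "apple" is far away.
embeddings = np.array([
    [0.90, 0.80, 0.10, 0.00],  # king
    [0.85, 0.75, 0.15, 0.05],  # queen
    [0.00, 0.10, 0.90, 0.80],  # apple
])

def vectorize(word):
    """Look up the embedding vector for a word token."""
    return embeddings[vocab[word]]

def cosine_similarity(a, b):
    """Cosine similarity: close to 1.0 for vectors pointing the same way."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Related words have high similarity; unrelated words have low similarity.
print(cosine_similarity(vectorize("king"), vectorize("queen")))  # high, near 1.0
print(cosine_similarity(vectorize("king"), vectorize("apple")))  # low
```

In an actual sequence model, the embedding matrix is a trainable parameter updated alongside the rest of the network, so the geometry of the vector space emerges from the training data rather than being hand-specified as above.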