Deep Learning Course - Level: Intermediate
As we've now seen, sequence models often use word embeddings to map word tokens to numeric vectors.
Word embeddings are vectors that numerically encode semantic meaning about the underlying words. Words that are close to each other in the vector space tend to have similar meanings or a relatively direct relationship with each other.
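To make this concrete, here's a minimal sketch using Keras's `Embedding` layer (one common implementation, assumed here; any embedding lookup would illustrate the same idea). The vocabulary size, embedding dimension, and token indices are hypothetical placeholders, and the cosine similarity function just shows how "closeness" between word vectors can be measured.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Embedding

# Hypothetical setup: a 10-word vocabulary, embeddings of dimension 4
vocab_size = 10
embedding_dim = 4

# The Embedding layer is a lookup table mapping integer word indices
# to dense vectors; its weights are learned during training
embedding = Embedding(input_dim=vocab_size, output_dim=embedding_dim)

# A "sentence" represented as word indices (placeholder values)
tokens = tf.constant([[2, 5, 7]])
vectors = embedding(tokens)
print(vectors.shape)  # (1, 3, 4): one sequence, 3 tokens, 4-dim vectors

# Cosine similarity measures how close two word vectors are in the
# vector space; after training, semantically similar words tend to
# score closer to 1.0
def cosine_similarity(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

v = vectors.numpy()[0]
print(cosine_similarity(v[0], v[1]))  # similarity of the first two tokens
```

Note that the vectors printed here are untrained, so their similarities are meaningless until the layer's weights are learned as part of a model.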