
Deep Learning Course - Level: Intermediate
Now that we've been introduced to them, we know that sequence models, in contrast to N-gram models, do indeed make use of word order.
As the name suggests, sequence models accept sequences of ordered tokens as input. If we consider the tokens as words in a sample, then we can pass the tokenized input in its original order to the model.
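To make this concrete, here is a minimal sketch (not taken from the lesson) of ordered tokens flowing into a sequence model. The toy vocabulary, the sample sentence, and the Keras Embedding + LSTM stack with its layer sizes are all illustrative assumptions, not the course's actual code.

```python
import tensorflow as tf

# Toy vocabulary mapping each word to an integer ID (0 is reserved for padding).
vocab = {"the": 1, "cat": 2, "sat": 3, "on": 4, "mat": 5}

# Tokenize a sample in its original order - the order of the IDs carries the word order.
sample = "the cat sat on the mat"
token_ids = [vocab[word] for word in sample.split()]
print(token_ids)  # [1, 2, 3, 4, 1, 5]

# A small sequence model: the Embedding layer turns each token ID into a vector,
# and the LSTM reads those vectors in order, one timestep at a time.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=len(vocab) + 1, output_dim=8),
    tf.keras.layers.LSTM(16),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# The model expects a batch of token-ID sequences, so add a batch dimension.
inputs = tf.constant([token_ids])
outputs = model(inputs)
print(outputs.shape)  # (1, 1)
```

Because the LSTM processes one token per timestep, shuffling the token IDs would change its output, which is exactly the order sensitivity that N-gram bag-of-words style models lack.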