Bidirectional LSTM
Create and run a bidirectional LSTM model.
Chapter Goals:
Learn about the bidirectional LSTM and why it's used
A. Forwards and backwards
The language model from the Language Model section of this course used a regular LSTM, which read each input sequence in the forwards direction. This meant that the recurrent connections went from left to right, i.e. from time step $t$ to time step $t + 1$.
While regular LSTMs work well for most NLP tasks, they are not always the best option. Specifically, when we have access to the complete text sequence up front (e.g. in text classification), it can be beneficial to read the sequence in both the forwards and backwards directions. This is what a bidirectional LSTM does: it combines one LSTM that processes the sequence from left to right with another that processes it from right to left, then merges their outputs at each time step.
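As a rough illustration of the idea (not the exact model built later in this chapter), here is a minimal sketch of a bidirectional LSTM text classifier using TensorFlow's Keras API. The vocabulary size, embedding dimension, and number of LSTM units are hypothetical placeholders chosen for the example.

```python
# A minimal sketch of a bidirectional LSTM for binary text classification.
# The hyperparameter values below are illustrative, not from this course.
import tensorflow as tf

vocab_size = 10000   # hypothetical vocabulary size
embed_dim = 64       # hypothetical embedding dimension
lstm_units = 32      # hypothetical LSTM hidden size

model = tf.keras.Sequential([
    # Map token IDs to dense embedding vectors.
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    # The Bidirectional wrapper runs one LSTM left-to-right and a second
    # LSTM right-to-left, then concatenates their final outputs.
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(lstm_units)),
    # Single sigmoid unit for a binary classification decision.
    tf.keras.layers.Dense(1, activation='sigmoid'),
])

model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])
model.summary()
```

Note that because the backwards LSTM needs the entire sequence before it can start, this architecture suits tasks where the full text is available at once (like classification), rather than left-to-right generation tasks like language modeling.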