9.4. Recurrent Neural Networks — Dive into Deep Learning 1.0.3 documentation
d2l.ai

Top Highlights

  • However, the number of model parameters would also increase exponentially with it, as we need to store $|\mathcal{V}|^n$ numbers for a vocabulary set $\mathcal{V}$.
  • Hence, rather than modeling $P(x_t \mid x_{t-1}, \ldots, x_1)$ it is preferable to use a latent variable model $P(x_t \mid h_{t-1})$, where $h_{t-1}$ is a hidden state that stores the sequence information up to time step $t-1$ (see the sketch after this list).
  • Recall that we have discussed hidden layers with hidden units in Section 5. It is noteworthy that hidden layers and hidden states refer to two very different concepts. Hidden layers are, as explained, layers that are hidden from view on the path from input to output. Hidden states are technically speaking inputs to whatever we do at a given step, a...
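
The contrast drawn in these highlights, an n-gram table with $|\mathcal{V}|^n$ entries versus a fixed-size hidden state updated one step at a time, can be made concrete with a small sketch. The snippet below is a minimal NumPy illustration rather than the book's implementation; the sizes (vocab_size, n, num_inputs, num_hiddens), the tanh nonlinearity, and the helper update_hidden_state are assumptions made for this example.

```python
import numpy as np

# Assumed sizes, chosen only for illustration.
vocab_size = 10_000   # |V|, the vocabulary size
n = 4                 # order of an n-gram model
num_inputs = 16       # size of the input vector x_t
num_hiddens = 32      # size of the hidden state h_t

# An n-gram table over V needs |V|**n entries: exponential in n.
print(f"n-gram table entries: {vocab_size ** n:.0e}")   # 1e+16

# A latent variable model P(x_t | h_{t-1}) instead carries a fixed-size
# hidden state forward. A minimal RNN-style update of that state:
rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.01, size=(num_inputs, num_hiddens))
W_hh = rng.normal(scale=0.01, size=(num_hiddens, num_hiddens))
b_h = np.zeros(num_hiddens)

def update_hidden_state(x_t, h_prev):
    """h_t = tanh(x_t @ W_xh + h_{t-1} @ W_hh + b_h)."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

h = np.zeros(num_hiddens)              # h_0: no history yet
for t in range(5):                     # a toy sequence of 5 steps
    x_t = rng.normal(size=num_inputs)  # stand-in for an embedded token
    h = update_hidden_state(x_t, h)    # h now summarizes steps 1..t
print(h.shape)                         # (32,): size does not grow with t
```

The point of the sketch is only that the hidden state has a fixed size regardless of how long the sequence gets, which is exactly what the latent variable model $P(x_t \mid h_{t-1})$ buys over storing $|\mathcal{V}|^n$ numbers.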
