9.1. Working with Sequences — Dive into Deep Learning 1.0.3 documentation
d2l.ai

Top Highlights

  • Such models that regress the value of a signal on the previous values of that same signal are naturally called autoregressive models.
  • Previously, when dealing with individual inputs, we assumed that they were sampled independently from the same underlying distribution. While we still assume that entire sequences (e.g., entire documents or patient trajectories) are sampled independently, we cannot assume that the data arriving at each time step are independent of each other.
  • For most sequence models, we do not require independence, or even stationarity, of our sequences. Instead, we require only that the sequences themselves are sampled from some fixed underlying distribution over entire sequences.
  • Unsupervised density modeling (also called sequence modeling): here, given a collection of sequences, our goal is to estimate the probability mass function that tells us how likely we are to see any given sequence.
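The first highlight describes autoregressive models: regressing a signal's value on its own previous values. A minimal sketch of that idea, assuming an illustrative synthetic signal and a fixed look-back window of length tau (both are choices made here, not taken from the excerpt):

```python
import numpy as np

# Illustrative signal: a slow sine wave plus a little noise.
rng = np.random.default_rng(0)
T = 200
time = np.arange(T)
x = np.sin(0.1 * time) + 0.1 * rng.normal(size=T)

# Autoregressive setup: predict x[t] from its tau previous values.
tau = 4
# features[i] holds the window (x[i], ..., x[i + tau - 1]); the label is x[i + tau].
features = np.stack([x[i : T - tau + i] for i in range(tau)], axis=1)
labels = x[tau:]

# Fit the autoregressive coefficients by ordinary least squares.
coeffs, *_ = np.linalg.lstsq(features, labels, rcond=None)

# One-step-ahead predictions: regress x[t] on (x[t-tau], ..., x[t-1]).
preds = features @ coeffs
print(preds.shape)  # one prediction per usable time step: (196,)
```

Because each example is built from adjacent time steps, the examples are not independent of one another, which is exactly the point of the second highlight: independence is assumed across whole sequences, not across time steps within one.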
