4.5. Concise Implementation of Softmax Regression — Dive into Deep Learning 1.0.3 documentation
d2l.ai

Top Highlights

  • larger than the largest number we can have for certain data types
  • every argument is a very large negative number, we will get underflow
  • pass the logits and compute the softmax and its log all at once
  • backpropagation
  • numerical underflow and overflow in the exponentiation.
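
These highlights all concern the numerical stability of softmax: exponentiating large logits can overflow (produce numbers larger than the largest representable value), exponentiating very negative logits can underflow to zero, and the fix is to pass the raw logits to a routine that computes the softmax and its log all at once. Below is a minimal sketch, assuming PyTorch, of the LogSumExp trick the highlighted passage refers to; the function name `stable_log_softmax` and the example values are illustrative, not from the source.

```python
import torch
from torch import nn

def stable_log_softmax(logits):
    """Numerically stable log-softmax via the LogSumExp trick."""
    # Subtract the largest logit in each row so exp() never overflows;
    # the largest shifted term is exp(0) = 1, so the sum's log is safe too.
    shifted = logits - logits.max(dim=1, keepdim=True).values
    # log softmax(o)_j = (o_j - max o) - log sum_k exp(o_k - max o)
    return shifted - shifted.exp().sum(dim=1, keepdim=True).log()

# Logits chosen (hypothetically) to break the naive computation.
logits = torch.tensor([[1000.0, -1000.0, 0.0]])
labels = torch.tensor([0])

# Naive route: exp(1000) overflows to inf, so the result contains nan.
naive = logits.exp() / logits.exp().sum(dim=1, keepdim=True)
print(naive)

# Stable route, and the built-in loss that takes raw logits directly
# and fuses softmax, log, and cross-entropy in one numerically safe step.
print(stable_log_softmax(logits))
print(nn.functional.cross_entropy(logits, labels))
```

Working from logits rather than pre-computed probabilities also keeps backpropagation well behaved, since the gradient of the fused log-softmax never involves dividing by an underflowed probability.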
