How to use Soft-label for Cross-Entropy loss?
discuss.pytorch.org
This is probably late to answer this, and I am not sure it would work, but what if you try inserting a manual cross-entropy function inside the forward pass… soft_loss = -soft_label * log(predicted_prob)
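The suggestion amounts to computing the cross-entropy between the soft label distribution q and the model's predicted probabilities p, i.e. -sum_c q_c * log(p_c). A minimal PyTorch sketch of that manual loss (the function name, shapes, and dummy data below are illustrative, not from the thread):

```python
import torch
import torch.nn.functional as F

def soft_cross_entropy(logits, soft_targets):
    """Cross-entropy against soft (probability-distribution) targets.

    logits:       (batch, num_classes) raw model outputs
    soft_targets: (batch, num_classes) rows summing to 1
    """
    log_probs = F.log_softmax(logits, dim=1)               # log p_c
    return -(soft_targets * log_probs).sum(dim=1).mean()   # -sum_c q_c log p_c, averaged over batch

# Illustrative usage with random data
logits = torch.randn(4, 3, requires_grad=True)
soft_targets = torch.softmax(torch.randn(4, 3), dim=1)     # any valid probability rows
loss = soft_cross_entropy(logits, soft_targets)
loss.backward()
```

Note that PyTorch 1.10 and later also accept class-probability targets directly in F.cross_entropy / nn.CrossEntropyLoss, which is equivalent to this manual version.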
