Activation Functions | Fundamentals Of Deep Learning
archive.is
Top Highlights
ReLU does not activate all the neurons at the same time.
The softmax function can be used for multiclass classification problems.
The ReLU function should only be used in the hidden layers.
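The highlights above can be made concrete with a minimal NumPy sketch (not from the source article): ReLU zeroes out negative pre-activations, so only some neurons fire, while softmax converts output-layer logits into class probabilities.

```python
import numpy as np

def relu(x):
    # ReLU keeps positive inputs and zeroes negatives, so it
    # does not activate all neurons at the same time.
    return np.maximum(0.0, x)

def softmax(x):
    # Softmax maps logits to a probability distribution over
    # classes, which is why it suits multiclass output layers.
    shifted = x - np.max(x)   # subtract the max for numerical stability
    exps = np.exp(shifted)
    return exps / np.sum(exps)

z = np.array([-2.0, 0.5, 3.0])   # hypothetical hidden-layer pre-activations
print(relu(z))      # the negative entry is zeroed
print(softmax(z))   # entries are positive and sum to 1
```

Typical usage follows the third highlight: ReLU in the hidden layers, softmax only on the final layer of a multiclass classifier.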