soft weight sharing in neural nets
Summary
Soft weight sharing is a regularization method for neural networks in which groups of weights are encouraged to cluster around shared values. It penalizes the weights by the negative log-likelihood of a mixture of Gaussians over the weight values, with the mixture's means, variances, and mixing proportions learned jointly with the weights.
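To make the penalty concrete, here is a minimal NumPy sketch of the mixture-of-Gaussians penalty described above. The function and parameter names (soft_weight_sharing_penalty, pis, mus, sigmas) are illustrative choices, not from any particular library.

    import numpy as np

    def soft_weight_sharing_penalty(w, pis, mus, sigmas):
        """Negative log-likelihood of the weights w under a mixture of Gaussians.

        w:      (n,) array of flattened network weights
        pis:    (k,) mixture proportions (must sum to 1)
        mus:    (k,) component means
        sigmas: (k,) component standard deviations
        """
        w = np.asarray(w, dtype=float)[:, None]              # shape (n, 1)
        # Log-density of each weight under each component, shape (n, k)
        log_comp = (np.log(pis)
                    - 0.5 * np.log(2.0 * np.pi * sigmas ** 2)
                    - 0.5 * ((w - mus) / sigmas) ** 2)
        # Log-sum-exp over components, for numerical stability
        m = log_comp.max(axis=1, keepdims=True)
        log_mix = m + np.log(np.exp(log_comp - m).sum(axis=1, keepdims=True))
        return float(-log_mix.sum())

During training this penalty is scaled by a coefficient and added to the data-fitting loss, and the mixture parameters are optimized along with the weights, so the components migrate toward the clusters of weight values that actually form in the network.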
Context
This concept has the prerequisites:
- weight decay in neural networks (Soft weight sharing generalizes weight decay, which corresponds to the special case of a single zero-mean Gaussian component; see the check after this list.)
- mixture of Gaussians models (The soft weight sharing penalty is the negative log-likelihood of the weights under a mixture of Gaussians.)
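To verify the special case mentioned in the first prerequisite: with a single zero-mean component, the mixture penalty reduces to ordinary weight decay (an L2 penalty) plus an additive constant. A quick numerical check, reusing the soft_weight_sharing_penalty sketch from the summary:

    import numpy as np

    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=1000)   # stand-in for flattened network weights
    sigma = 0.5

    # Mixture with one zero-mean Gaussian component
    mog = soft_weight_sharing_penalty(w, pis=np.array([1.0]),
                                      mus=np.array([0.0]),
                                      sigmas=np.array([sigma]))

    # Weight decay (L2) term plus the Gaussian normalizing constant
    l2 = (w ** 2).sum() / (2.0 * sigma ** 2)
    const = len(w) * 0.5 * np.log(2.0 * np.pi * sigma ** 2)

    assert np.isclose(mog, l2 + const)

The constant comes from the Gaussian normalizer and does not affect the gradients, which is why minimizing the single-component penalty is equivalent to weight decay.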
Core resources (we're sorry, we haven't finished tracking down resources for this concept yet)
Supplemental resources (the following are optional, but you may find them useful)
-Paid-
→ Pattern Recognition and Machine Learning
A textbook for a graduate machine learning course, with a focus on Bayesian methods.
Location:
Section 5.5.7, pages 269-272