Soft weight sharing in neural nets


Soft weight sharing (Nowlan and Hinton, 1992) is a form of regularization for neural networks in which groups of weights are encouraged, rather than forced, to take similar values. Instead of tying weights exactly (as in convolutional networks), the weights are given a mixture-of-Gaussians prior whose means, variances, and mixing proportions are learned jointly with the network; the penalty term is the negative log-probability of the weights under this mixture, which pulls weights toward the nearest cluster center.
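As a concrete illustration, the penalty described above can be sketched as follows. This is a minimal NumPy sketch, not a full training loop: the function names and the two-component mixture are illustrative assumptions, and in practice the mixture parameters would be optimized along with the weights.

```python
import numpy as np

def soft_weight_sharing_penalty(weights, means, log_sigmas, mixing_logits):
    """Negative log-likelihood of the weights under a Gaussian mixture prior.

    weights       : 1-D array of network weights
    means         : (k,) component means
    log_sigmas    : (k,) log standard deviations (kept in log space for positivity)
    mixing_logits : (k,) unnormalized mixing proportions
    """
    w = weights.reshape(-1, 1)                       # (n_weights, 1)
    sigmas = np.exp(log_sigmas)                      # (k,)
    pis = np.exp(mixing_logits)
    pis = pis / pis.sum()                            # softmax -> mixing proportions
    # Density of each weight under each mixture component (broadcasts to (n_weights, k))
    dens = (pis / (np.sqrt(2.0 * np.pi) * sigmas)
            * np.exp(-0.5 * ((w - means) / sigmas) ** 2))
    # Sum over components, then negative log over all weights
    return -np.sum(np.log(dens.sum(axis=1) + 1e-12))

# Hypothetical example: a two-component prior with centers at 0 and 1.
means = np.array([0.0, 1.0])
log_sigmas = np.log(np.array([0.1, 0.1]))
mixing_logits = np.zeros(2)

clustered = np.array([0.01, -0.02, 0.98, 1.03])   # weights near the cluster centers
scattered = np.array([0.5, 0.5, 0.5, 0.5])        # weights far from both centers

# The penalty is lower when weights sit near a cluster center,
# so adding it to the training loss encourages weights to group.
assert (soft_weight_sharing_penalty(clustered, means, log_sigmas, mixing_logits)
        < soft_weight_sharing_penalty(scattered, means, log_sigmas, mixing_logits))
```

During training this penalty would be added to the data loss, and gradients taken with respect to both the weights and the mixture parameters.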


This concept has the following prerequisites:

Core resources (we're sorry, we haven't finished tracking down resources for this concept yet)

Supplemental resources (the following are optional, but you may find them useful)


See also

-No Additional Notes-