Bayes net parameter learning

(55 minutes to learn)


The parameters of a Bayes net can be estimated using maximum likelihood. In the most general (fully tabular) parameterization, when the data are fully observed, the maximum likelihood estimation problem decomposes into independent subproblems, one for each conditional probability table (CPT).
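As a minimal sketch of this decomposition (the network, data, and variable names below are hypothetical, not from any resource on this page): for a two-node net A → B with fully observed binary data, the log-likelihood splits into a term for P(A) and a term for P(B | A), so each CPT is fit independently by normalizing the relevant counts.

```python
# ML estimation of CPTs for a tiny Bayes net A -> B from fully
# observed data. Each CPT is estimated separately: the decomposed
# log-likelihood makes each estimate a ratio of counts.
from collections import Counter

# hypothetical fully observed samples of (A, B)
data = [(0, 0), (0, 1), (0, 0), (1, 1), (1, 1), (1, 0), (0, 0), (1, 1)]

n = len(data)
count_a = Counter(a for a, _ in data)   # N(a)
count_ab = Counter(data)                # N(a, b)

# CPT for A: P(A = a) = N(a) / N
p_a = {a: count_a[a] / n for a in (0, 1)}

# CPT for B given A: P(B = b | A = a) = N(a, b) / N(a)
p_b_given_a = {(b, a): count_ab[(a, b)] / count_a[a]
               for a in (0, 1) for b in (0, 1)}
```

Note that the estimate for P(B | A) never looks at the marginal counts of A alone beyond normalization, which is exactly the sense in which the two subproblems are independent.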


Goals:


  • Know how to determine the maximum likelihood estimate for the parameters of a Bayes net when all of the variables are fully observed.
  • In particular, understand why the problem decomposes into independent parameter learning subproblems associated with each CPT, and why the assumption of fully observed data is necessary.
    • The decomposition into independent terms isn't just used for maximum likelihood estimation -- it underlies a number of other algorithms for learning Bayes nets.
  • Understand how the maximum likelihood solution changes when parameters are shared between different CPTs.
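To illustrate the parameter-sharing case (a sketch under assumed data, not taken from any resource here): when the same CPT is tied across positions, as with the transition distribution P(X_t | X_{t-1}) of a stationary Markov chain, the ML estimate pools the counts from every position that uses the shared parameters and normalizes once.

```python
# ML estimation of a shared (tied) CPT: the transition distribution
# P(X_t | X_{t-1}) reused at every step of each sequence. Counts are
# pooled across all positions before normalizing.
from collections import Counter

# hypothetical observed binary sequences; every adjacent pair is
# generated by the same tied transition CPT
sequences = [[0, 0, 1, 1], [1, 1, 0], [0, 1, 1, 1]]

trans = Counter()   # N(x_prev, x), pooled over all t and sequences
prev = Counter()    # N(x_prev)
for seq in sequences:
    for x_prev, x in zip(seq, seq[1:]):
        trans[(x_prev, x)] += 1
        prev[x_prev] += 1

# shared CPT: P(X_t = x | X_{t-1} = x_prev) = N(x_prev, x) / N(x_prev)
p_trans = {(x, xp): trans[(xp, x)] / prev[xp]
           for xp in (0, 1) for x in (0, 1)}
```

If the transition CPTs were not tied, each time step would get its own counts; sharing simply merges those per-step subproblems into a single one.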

Core resources (read/watch one of the following)


Coursera: Probabilistic Graphical Models (2013)
An online course on probabilistic graphical models.
Author: Daphne Koller
Other notes:
  • Click on "Preview" to see the videos.


Supplemental resources (the following are optional, but you may find them useful)


Coursera: Machine Learning
An online machine learning course aimed at advanced undergraduates.
Author: Pedro Domingos
Other notes:
  • Click on "Preview" to see the videos.


See also