The content of this roadmap follows Prof. Jordan's lectures/textbook.
## Conditional Independence and Factorization
* Much of our early discussion focused on [[conditional independence]] in the context of [directed graphical models (Bayes nets)](Bayesian networks) and [undirected graphical models (Markov random fields - MRFs)](Markov random fields)
* We can use the [[Bayes Ball]] algorithm to determine conditional independencies in Bayes nets.
* We can use simple [reachability algorithms](http://en.wikipedia.org/wiki/Reachability) to determine conditional independencies in MRFs (a minimal sketch follows this list)
* We briefly discussed [[factor graphs]], which provide a more fine-grained representation of the independencies in an MRF
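To make the reachability test concrete, here is a minimal sketch in Python; the function name `mrf_cond_indep` and the toy chain graph are illustrative choices, not from the lectures. In an undirected model, A is conditionally independent of B given C exactly when deleting the observed nodes C disconnects A from B.

```python
from collections import deque

def mrf_cond_indep(adj, a, b, observed):
    """Test A _|_ B | C in an MRF by checking whether B is reachable
    from A once the observed nodes C are deleted from the graph.

    adj: dict mapping each node to a set of neighbours.
    a, b: node sets to test; observed: the conditioning set C.
    """
    blocked = set(observed)
    frontier = deque(n for n in a if n not in blocked)
    seen = set(frontier)
    while frontier:
        node = frontier.popleft()
        if node in b:
            return False          # unblocked path found: not independent
        for nbr in adj[node]:
            if nbr not in seen and nbr not in blocked:
                seen.add(nbr)
                frontier.append(nbr)
    return True                    # B unreachable from A given C

# Chain MRF x1 - x2 - x3: x1 and x3 are independent given x2.
chain = {1: {2}, 2: {1, 3}, 3: {2}}
print(mrf_cond_indep(chain, {1}, {3}, {2}))    # True
print(mrf_cond_indep(chain, {1}, {3}, set())) # False
```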
## Exact Inference
* The [[variable elimination]] algorithm is based on interchanging sums and products in the definitions of marginals or partition functions, but it repeats many computations when run separately for each marginal (a minimal sketch follows this list)
* [The sum-product algorithm](sum_product_on_trees) is a belief propagation algorithm based on dynamic programming. Its advantage over naive variable elimination is that it reuses computations to obtain marginals for all nodes in the graph (see the second sketch below)
* [[junction trees]] generalize the sum-product algorithm to arbitrary graphs by grouping variables together into cliques such that the cliques form a tree.
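To make the sum/product interchange concrete, here is a minimal variable elimination sketch on a three-node chain; the random potentials and binary state space are illustrative assumptions, not from the course.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy chain x1 - x2 - x3 over binary variables, specified by pairwise
# potentials; the potentials are random and purely illustrative.
psi12 = rng.random((2, 2))
psi23 = rng.random((2, 2))

# Unnormalized p(x3) by eliminating x1, then x2 -- i.e. pushing each sum
# inside the product: sum_{x2} psi23(x2,x3) * (sum_{x1} psi12(x1,x2)).
m2 = psi12.sum(axis=0)                   # eliminate x1: message over x2
p3 = (m2[:, None] * psi23).sum(axis=0)   # eliminate x2: unnormalized p(x3)
p3 /= p3.sum()

# Sanity check against brute-force summation of the full joint.
joint = psi12[:, :, None] * psi23[None, :, :]
assert np.allclose(p3, joint.sum(axis=(0, 1)) / joint.sum())
print(p3)
```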
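And here is a sketch of the sum-product recursions on a chain: one forward and one backward pass yield the marginals of every node at once, which is where the savings over rerunning variable elimination per node come from. The chain length, state count, and variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5, 3                       # 5-node chain, 3 states per node (illustrative)
psi = [rng.random((k, k)) for _ in range(n - 1)]   # pairwise potentials

# Forward messages mu_{i -> i+1} and backward messages mu_{i+1 -> i}.
fwd = [np.ones(k)]
for p in psi:
    fwd.append(fwd[-1] @ p)       # sum out everything to the left
bwd = [np.ones(k)]
for p in reversed(psi):
    bwd.append(p @ bwd[-1])       # sum out everything to the right
bwd = bwd[::-1]

# The marginal of node i is proportional to the product of its two messages.
marginals = [f * b / (f * b).sum() for f, b in zip(fwd, bwd)]
print(marginals[2])               # marginal of the middle node
```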
## Sampling-based inference
* [[rejection sampling]] is a Monte Carlo method for sampling from a potentially complex distribution p(x) by drawing from a simpler proposal q(x) and accepting with probability p(x)/(M q(x)), where M q(x) upper-bounds p(x) (a minimal sketch follows this list)
* [[importance sampling]] is a way of estimating expectations under an intractable distribution p by sampling from a tractable distribution q and reweighting the samples according to the ratio of the probabilities (see the second sketch below)
* We discussed some standard [[Markov chain Monte Carlo]] methods:
* [[Metropolis-Hastings algorithm]] is a very general method for approximately sampling from a distribution p by defining a Markov chain which has p as a stationary distribution (see the third sketch below)
* [[Gibbs sampling]] is an MCMC algorithm where each random variable is iteratively resampled from its conditional distribution given the remaining variables -- it can be viewed [as a special case of Metropolis-Hastings](gibbs_as_mh) (see the fourth sketch below)
* we also touched on determining [[MCMC convergence]]
* we can use [simulated annealing](simulated_annealing) to determine a [MAP estimate](map_parameter_estimation)
* Some fancier MCMC algorithms we touched on include:
* [hybrid Monte Carlo](Hamiltonian Monte Carlo)
* [slice sampling](slice_sampling)
* [[reversible_jump_mcmc]]
* [[sequential_monte_carlo]]
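Here is a minimal rejection sampling sketch, assuming a target density known only up to a constant and an envelope M q(x) >= p(x). The Beta(2, 5)-shaped target, uniform proposal, and bound M are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_unnorm(x):
    return x * (1 - x) ** 4        # Beta(2,5) density up to a constant

M = 0.09                            # any bound with M >= max_x p_unnorm(x)

samples = []
while len(samples) < 10_000:
    x = rng.uniform()               # draw from the simple proposal q = Uniform(0,1)
    u = rng.uniform()
    if u * M <= p_unnorm(x):        # accept with probability p(x) / (M q(x))
        samples.append(x)

print(np.mean(samples))             # should be near E[Beta(2,5)] = 2/7 ≈ 0.286
```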
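A second sketch, for importance sampling: we estimate an expectation under a target p known only up to a constant by sampling from a tractable q and reweighting; self-normalized weights absorb the unknown normalizer. The Gaussian target and proposal below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def p_unnorm(x):
    return np.exp(-0.5 * (x - 3.0) ** 2)   # N(3, 1), unnormalized

def q_pdf(x):
    return np.exp(-0.5 * x ** 2 / 4.0) / np.sqrt(2 * np.pi * 4.0)  # N(0, 2^2)

x = rng.normal(0.0, 2.0, size=100_000)     # samples from q
w = p_unnorm(x) / q_pdf(x)                  # unnormalized importance weights
w /= w.sum()                                # self-normalize
print(np.sum(w * x))                        # ≈ 3.0 = E_p[x]
```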
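A third sketch, for Metropolis-Hastings: with a symmetric random-walk proposal the Hastings correction cancels, leaving the min(1, p(x')/p(x)) acceptance rule. The two-component Gaussian mixture target and step size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_p(x):
    # log of an unnormalized equal-weight mixture of N(-2,1) and N(2,1)
    return np.logaddexp(-0.5 * (x + 2) ** 2, -0.5 * (x - 2) ** 2)

x, chain = 0.0, []
for _ in range(50_000):
    prop = x + rng.normal(scale=1.5)           # symmetric Gaussian proposal
    if np.log(rng.uniform()) < log_p(prop) - log_p(x):
        x = prop                               # accept; otherwise keep x
    chain.append(x)

print(np.mean(chain))                          # ≈ 0 by symmetry of the mixture
```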
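And a fourth sketch, for Gibbs sampling on a zero-mean bivariate Gaussian with correlation rho: each full conditional is itself Gaussian, x1 | x2 ~ N(rho x2, 1 - rho^2), so resampling the coordinates in turn leaves the joint invariant. The correlation value is illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

rho = 0.8
x1, x2, samples = 0.0, 0.0, []
for _ in range(20_000):
    x1 = rng.normal(rho * x2, np.sqrt(1 - rho ** 2))  # resample x1 | x2
    x2 = rng.normal(rho * x1, np.sqrt(1 - rho ** 2))  # resample x2 | x1
    samples.append((x1, x2))

samples = np.array(samples)
print(np.corrcoef(samples.T)[0, 1])    # ≈ rho = 0.8
```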
## Statistical Concepts
We discussed Bayesian vs frequentist inference; some topics we touched on include:
* [[maximum_likelihood]] and [[asymptotics_of_maximum_likelihood]] and the various prerequisites for these concepts
* [[bias_variance_decomposition]] (a simulation sketch follows this list)
* [Bayes' rule](bayes_rule)
* [[Bayesian model averaging]]
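As a concrete check of the decomposition MSE = bias^2 + variance, here is a simulation sketch using the (biased) maximum likelihood estimator of a Gaussian variance; the parameter values and sample sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# MLE of a Gaussian variance, sigma2_hat = mean((x - xbar)^2), is biased
# by a factor (n-1)/n; simulate many datasets to see the decomposition.
sigma2, n, trials = 4.0, 10, 200_000
estimates = np.var(rng.normal(0.0, 2.0, size=(trials, n)), axis=1)  # ddof=0: the MLE

bias = estimates.mean() - sigma2
variance = estimates.var()
mse = np.mean((estimates - sigma2) ** 2)
print(bias, (n - 1) / n * sigma2 - sigma2)   # empirical vs exact bias = -sigma2/n
print(mse, bias ** 2 + variance)             # MSE = bias^2 + variance
```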
## Linear Regression and the Least Mean Squares algorithm
* We discussed [[linear_regression]] and its [closed form solution](linear_regression_closed_form) and their various prerequisites (a minimal sketch follows)
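A minimal sketch of both the closed-form solution (the normal equations w = (X^T X)^{-1} X^T y, solved without an explicit inverse) and the per-sample LMS updates; the synthetic data, true weights, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic regression problem with a bias column.
n, d = 200, 3
X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, d))])
w_true = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ w_true + rng.normal(scale=0.1, size=n)

# Closed form: solve the normal equations X^T X w = X^T y.
w_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(w_hat)                             # ≈ w_true

# LMS / stochastic gradient: w <- w + eta * (y_i - x_i . w) x_i.
w_lms = np.zeros(d + 1)
eta = 0.01
for _ in range(20):                      # a few passes over the data
    for xi, yi in zip(X, y):
        w_lms += eta * (yi - xi @ w_lms) * xi
print(w_lms)                             # approaches the closed-form solution
```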