# conditional independence

## Summary

Two random variables X and Y are conditionally independent given a random variable Z if they are independent in the conditional distribution given Z. Conditional independence is a central notion in probabilistic modeling, because a model's conditional independence assumptions often lead to tractable algorithms for inference and learning in that model.
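For discrete variables, "independent in the conditional distribution given Z" can be written out as a factorization of the conditional joint distribution:

```latex
% X and Y are conditionally independent given Z (often written X \perp Y \mid Z)
% if, for every x, y, and every z with P(Z = z) > 0,
P(X = x, Y = y \mid Z = z) = P(X = x \mid Z = z)\, P(Y = y \mid Z = z)
```

The analogous statement for continuous variables replaces probabilities with conditional densities.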

## Context

This concept has the prerequisites:

## Goals

- Know the definition of conditional independence

- Be able to give examples showing that conditional independence does not imply independence, and vice versa

- It's likely that you're seeing this page because you want to learn about graphical models. If so, don't bother memorizing the rules of conditional independence; you'll get more intuition and practice with them when you learn about graphical models. Just convince yourself that the basic properties make intuitive sense.
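A classic example of the second goal is two flips of a coin whose bias is unknown: the flips are conditionally independent given the coin's bias, yet marginally dependent (seeing one head makes the other flip more likely to be a head). The sketch below (with made-up probabilities) verifies both facts by enumerating the joint distribution:

```python
from itertools import product

# Hypothetical setup: Z is the coin type, X and Y are two flips of that coin
# (1 = heads). The flips are independent given Z, but not marginally.
p_z = {"fair": 0.5, "biased": 0.5}     # P(Z)
p_heads = {"fair": 0.5, "biased": 0.9} # P(flip = 1 | Z)

def p_xyz(x, y, z):
    """Joint probability P(X=x, Y=y, Z=z); the flips factorize given Z."""
    ph = p_heads[z]
    px = ph if x == 1 else 1 - ph
    py = ph if y == 1 else 1 - ph
    return p_z[z] * px * py

# Conditional independence: P(X,Y|Z) = P(X|Z) P(Y|Z) for every value.
for z in p_z:
    for x, y in product([0, 1], repeat=2):
        p_xy_given_z = p_xyz(x, y, z) / p_z[z]
        p_x_given_z = sum(p_xyz(x, yy, z) for yy in [0, 1]) / p_z[z]
        p_y_given_z = sum(p_xyz(xx, y, z) for xx in [0, 1]) / p_z[z]
        assert abs(p_xy_given_z - p_x_given_z * p_y_given_z) < 1e-12

# Marginal dependence: P(X=1, Y=1) != P(X=1) P(Y=1).
p_xy = sum(p_xyz(1, 1, z) for z in p_z)
p_x = sum(p_xyz(1, yy, z) for z in p_z for yy in [0, 1])
p_y = sum(p_xyz(xx, 1, z) for z in p_z for xx in [0, 1])
print(p_xy, p_x * p_y)  # 0.53 vs 0.49: marginally dependent
```

Conditioning on Z removes the shared source of correlation; the reverse direction (independent but not conditionally independent) arises when Z is a common effect of X and Y.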

## Core resources (we're sorry, we haven't finished tracking down resources for this concept yet)

## Supplemental resources (the following are optional, but you may find them useful)

## Free

→ Mathematics StackExchange

Other notes:

- joriki provides several intuitive examples of conditional independence

→ Wikipedia

## See also

- Conditional independence is fundamental to probabilistic graphical models, including:
  - Markov models, memoryless random sequences where each state is independent of the past conditioned on the previous state
  - Bayesian networks, which (roughly) represent causal structure
  - Markov random fields, which model which random variables directly interact with each other