A more elucidating description of conditional probability can be found in my CS 188 notes.


Conditional Probability

The probability of an event $A$ given that another event $B$ has occurred.

$$ P(A \space| \space B) = \frac{P(A\cap B)}{P(B)} $$
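As a concrete sketch of this definition (the die example below is illustrative, not from the notes):

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die (illustrative choice).
# A = "roll is even", B = "roll is at least 4".
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}
B = {4, 5, 6}

def prob(event):
    # Uniform probability: |event| / |Omega|
    return Fraction(len(event), len(omega))

# P(A | B) = P(A ∩ B) / P(B)
p_A_given_B = prob(A & B) / prob(B)
print(p_A_given_B)  # 2/3
```

Here $A \cap B = \{4, 6\}$, so $P(A \space | \space B) = (1/3) / (1/2) = 2/3$, larger than the unconditional $P(A) = 1/2$: learning $B$ shifted our belief about $A$.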

Chain Rule

The chain rule provides a decomposition of the joint probability in terms of conditional probabilities.

$$ P\left(\bigcap _{i=1}^{n} A_i\right) = \prod _{i=1}^n P(A_i \space | \space A_{1:i-1}) $$
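A minimal numeric check of the chain rule for $n = 2$ (the card-drawing numbers are mine, not from the notes):

```python
from fractions import Fraction

# Draw two cards without replacement from a tiny 4-card deck (2 red, 2 black).
# A1 = "first card is red", A2 = "second card is red".
# Chain rule: P(A1 ∩ A2) = P(A1) * P(A2 | A1)
p_A1 = Fraction(2, 4)           # 2 red cards out of 4
p_A2_given_A1 = Fraction(1, 3)  # 1 red left out of 3 after a red draw
p_both_red = p_A1 * p_A2_given_A1

# Direct count agrees: ordered red pairs / all ordered pairs = (2*1)/(4*3)
assert p_both_red == Fraction(2 * 1, 4 * 3)
print(p_both_red)  # 1/6
```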

Total Probability Theorem

The total probability theorem states that the total probability of an event $B$ is given by the sum over its joint probabilities with respect to some partition $\{ A_i \} _{i=1}^n$ of $\Omega$.

$$ P(B) = \sum _{i=1}^n P(A_i \cap B) = \sum _{i=1}^n P(A_i) P(B \space | \space A_i) $$
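A short sketch of the theorem with a two-element partition (the urn numbers are hypothetical, chosen for illustration):

```python
from fractions import Fraction

# Pick one of two urns uniformly, then draw a ball; B = "ball is white".
# {A_1, A_2} = "which urn was picked" partitions the sample space.
p_A = {1: Fraction(1, 2), 2: Fraction(1, 2)}          # P(A_i)
p_B_given_A = {1: Fraction(3, 4), 2: Fraction(1, 4)}  # P(B | A_i)

# Total probability: P(B) = sum_i P(A_i) P(B | A_i)
p_B = sum(p_A[i] * p_B_given_A[i] for i in p_A)
print(p_B)  # 1/2
```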


Bayes Rule

Bayes rule relates the posterior $P(A \space | \space B)$ of an event $A$ to its prior $P(A)$ after observing event $B$.

$$ P(A \space| \space B) = \frac{P(B \space | \space A) P(A)}{P(B)} $$
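A worked sketch combining Bayes rule with the total probability theorem to compute the denominator (the screening-test numbers are hypothetical, not from the notes):

```python
from fractions import Fraction

# A = "has condition", B = "test is positive" (illustrative numbers).
p_A = Fraction(1, 100)             # prior P(A)
p_B_given_A = Fraction(9, 10)      # likelihood P(B | A)
p_B_given_not_A = Fraction(1, 10)  # false-positive rate P(B | A^c)

# Denominator via total probability over the partition {A, A^c}
p_B = p_A * p_B_given_A + (1 - p_A) * p_B_given_not_A

# Bayes rule: posterior P(A | B) = P(B | A) P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(p_A_given_B)  # 1/12
```

Despite the high likelihood $P(B \space | \space A) = 9/10$, the small prior keeps the posterior at $1/12$, which is the usual point of contrasting prior and posterior.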


Independence

Independence reflects whether observing an event $B$ informs our belief about an event $A$: the two are independent exactly when it does not.
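Formally, $A$ and $B$ are independent iff $P(A \cap B) = P(A)P(B)$, equivalently $P(A \space | \space B) = P(A)$. A quick check on a die roll (my example, not from the notes):

```python
from fractions import Fraction

# One roll of a fair die. A = "roll is even", B = "roll is at most 4".
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}
B = {1, 2, 3, 4}

def prob(event):
    return Fraction(len(event), len(omega))

# Independence test: P(A ∩ B) == P(A) * P(B)
independent = prob(A & B) == prob(A) * prob(B)
print(independent)  # True
```

Here $P(A \cap B) = 2/6 = 1/3$ equals $P(A)P(B) = (1/2)(2/3)$, so knowing the roll is at most 4 tells us nothing new about its parity.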