Expectation

The expectation $\mathbb{E}$ of a random variable $X$ is a measure of the central tendency of its distribution, computed as a weighted average over its values.

$$ \mathbb{E}[X] = \sum _{\omega \in \Omega} X(\omega) P(\omega) = \sum _{x \in X(\Omega)}x p_X(x) $$
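
As a quick numerical illustration, here is a minimal sketch (assuming a fair six-sided die as the example distribution) of the weighted-average definition:

```python
# Expectation of a fair six-sided die: E[X] = sum of x * p_X(x).
pmf = {x: 1 / 6 for x in range(1, 7)}  # p_X(x) = 1/6 for each face

expectation = sum(x * p for x, p in pmf.items())
print(expectation)  # 3.5
```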

<aside> <img src="/icons/castle_yellow.svg" alt="/icons/castle_yellow.svg" width="40px" /> Linearity of Expectation. The expectation behaves like a linear map.

$$ \mathbb{E}[aX+bY] = a\mathbb{E}[X] + b\mathbb{E} [Y] $$

</aside>
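
A sanity check of linearity on two hypothetical fair dice, enumerating the joint sample space directly; note that linearity requires no independence assumption:

```python
import itertools

# Enumerate all 36 equally likely outcomes (omega) of two fair dice.
outcomes = list(itertools.product(range(1, 7), repeat=2))
p = 1 / len(outcomes)  # P(omega) = 1/36 for each outcome

a, b = 2, 3
lhs = sum((a * x + b * y) * p for x, y in outcomes)  # E[aX + bY]
rhs = (a * sum(x * p for x, _ in outcomes)
       + b * sum(y * p for _, y in outcomes))        # aE[X] + bE[Y]
print(lhs, rhs)  # both 17.5
```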

<aside> <img src="/icons/castle_yellow.svg" alt="/icons/castle_yellow.svg" width="40px" /> Expectation of a Function. The expectation of a function of a random variable can be computed directly from the distribution of $X$, with no need to first derive the distribution of $g(X)$.

$$ \mathbb{E}[g(X)] = \sum _{\omega \in \Omega}g(X(\omega)) P(\omega) = \sum _{x \in X(\Omega)} g(x)p_X(x) $$

The expectation and variance of a linear function have a closed form.

$$ \mathbb{E}[aX+b] = a\mathbb{E}[X]+b \\ \text{var}(aX+b) = a^2 \text{var}(X) $$

</aside>
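
Both identities can be checked on the same hypothetical fair die; this sketch computes $\mathbb{E}[g(X)]$ for $g(x) = x^2$ and then verifies the closed forms for a linear function (variance is defined formally in the next section):

```python
pmf = {x: 1 / 6 for x in range(1, 7)}  # fair die

e_x = sum(x * p for x, p in pmf.items())      # E[X] = 3.5
e_x2 = sum(x**2 * p for x, p in pmf.items())  # E[X^2] = 91/6
var_x = e_x2 - e_x**2                         # var(X) = 35/12

a, b = 2, 3
e_lin = sum((a * x + b) * p for x, p in pmf.items())
var_lin = sum(((a * x + b) - e_lin) ** 2 * p for x, p in pmf.items())
print(e_lin, a * e_x + b)     # both 10.0
print(var_lin, a**2 * var_x)  # both 35/3 ≈ 11.667
```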

<aside> <img src="/icons/castle_yellow.svg" alt="/icons/castle_yellow.svg" width="40px" />

Expectation of Product of Independent Random Variables. The expectation of a product of independent random variables is the product of their expectations.

$$ X \perp Y \implies \mathbb{E}[XY] = \mathbb{E}[X]\mathbb{E}[Y] $$

</aside>
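
A sketch verifying the product rule on two independent fair dice, whose joint PMF assigns $1/36$ to every outcome:

```python
import itertools

# Two independent fair dice: the joint PMF factors, so E[XY] = E[X]E[Y].
outcomes = list(itertools.product(range(1, 7), repeat=2))
p = 1 / len(outcomes)

e_xy = sum(x * y * p for x, y in outcomes)
e_x = sum(x * p for x, _ in outcomes)
e_y = sum(y * p for _, y in outcomes)
print(e_xy, e_x * e_y)  # both 12.25
```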


Variance

The variance $\text{var}$ of a random variable $X$ is a measure of how spread out its distribution is about its mean.

$$ \text{var}(X) = \mathbb{E}[(X-\mathbb{E}[X])^2] = \mathbb{E}[X^2]-\mathbb{E}[X]^2 $$
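
A sketch computing the variance of a hypothetical fair die from both forms of the definition:

```python
pmf = {x: 1 / 6 for x in range(1, 7)}  # fair die

e_x = sum(x * p for x, p in pmf.items())
var_def = sum((x - e_x) ** 2 * p for x, p in pmf.items())  # E[(X - E[X])^2]
var_alt = sum(x**2 * p for x, p in pmf.items()) - e_x**2   # E[X^2] - E[X]^2
print(var_def, var_alt)  # both 35/12 ≈ 2.9167
```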

<aside> <img src="/icons/castle_yellow.svg" alt="/icons/castle_yellow.svg" width="40px" />

Variance of Sum of Independent Random Variables. The variance of a sum of independent random variables is the sum of their variances.

$$ \text{var}(X+Y) = \text{var}(X) + \text{var}(Y) $$

</aside>
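
A sketch verifying the identity on two independent fair dice:

```python
import itertools

# Two independent fair dice: var(X + Y) = var(X) + var(Y).
outcomes = list(itertools.product(range(1, 7), repeat=2))
p = 1 / len(outcomes)

def var(weighted):
    """Variance of a list of (value, probability) pairs."""
    mean = sum(v * q for v, q in weighted)
    return sum((v - mean) ** 2 * q for v, q in weighted)

var_sum = var([(x + y, p) for x, y in outcomes])
var_x = var([(x, p) for x, _ in outcomes])
var_y = var([(y, p) for _, y in outcomes])
print(var_sum, var_x + var_y)  # both 35/6 ≈ 5.8333
```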


Joint Distributions

The joint distribution $p_{XY}$ of random variables $X$ and $Y$ gives the probability of each pair of their realizations occurring together.

$$ p_{XY}(x, y) \triangleq P(X=x \cap Y=y) $$
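
A minimal sketch representing a joint PMF as a dictionary keyed by pairs of realizations; the probabilities here are made up for illustration and sum to one:

```python
# A hypothetical joint PMF p_XY over X in {0, 1} and Y in {0, 1, 2}.
p_xy = {
    (0, 0): 0.10, (0, 1): 0.25, (0, 2): 0.15,
    (1, 0): 0.20, (1, 1): 0.05, (1, 2): 0.25,
}
assert abs(sum(p_xy.values()) - 1.0) < 1e-12  # a valid PMF sums to 1
print(p_xy[(0, 1)])  # P(X=0 and Y=1) = 0.25
```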

The marginal distribution of either random variable can be recovered via the law of total probability.

$$ p_X(x) = \sum _{y \in Y(\Omega)} p_{XY} (x, y) $$
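
Summing the hypothetical joint PMF from above over $y$ recovers the marginal $p_X$:

```python
# Marginalize out Y: p_X(x) = sum over y of p_XY(x, y).
p_xy = {
    (0, 0): 0.10, (0, 1): 0.25, (0, 2): 0.15,
    (1, 0): 0.20, (1, 1): 0.05, (1, 2): 0.25,
}
ys = {y for _, y in p_xy}
p_x = {x: sum(p_xy[(x, y)] for y in ys) for x in {x for x, _ in p_xy}}
print(p_x)  # {0: 0.5, 1: 0.5}, up to float rounding
```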

Moreover, the expectation of a function of jointly distributed random variables takes the same form as in the univariate case:
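
$$ \mathbb{E}[g(X, Y)] = \sum _{x \in X(\Omega)} \sum _{y \in Y(\Omega)} g(x, y) p_{XY}(x, y) $$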