Jointly Gaussian Random Vectors

Consider a random vector $X \in \R^n$ with entries $X_1, \dots , X_n$. Below are two equivalent definitions of when $X$ is jointly Gaussian:

<aside> <img src="/icons/fireworks_yellow.svg" alt="/icons/fireworks_yellow.svg" width="40px" />

Affine Transformation of the Standard Normal. Let $Z \in \R^{\ell}$ be a standard normal vector, i.e. its entries $Z_i \sim N(0,1)$ are i.i.d. Then $X$ is jointly Gaussian if there exist $A \in \R^{n \times \ell}$ and $\mu \in \R^{n}$ such that $X = AZ + \mu$.

</aside>
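
This definition is straightforward to simulate. The sketch below (plain NumPy; $A$ and $\mu$ are arbitrary hypothetical choices) draws samples of $Z$ and checks that the empirical mean and covariance of $X = AZ + \mu$ approach $\mu$ and $AA^T$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical A in R^{3x2} and mu in R^3; any choice fits the definition.
A = np.array([[2.0, 0.0],
              [1.0, 1.0],
              [0.0, 3.0]])
mu = np.array([1.0, -2.0, 0.5])

# Rows of Z are i.i.d. samples of a standard normal vector in R^2.
Z = rng.standard_normal((100_000, 2))
X = Z @ A.T + mu                     # each row is one sample of X = AZ + mu

print(X.mean(axis=0))                # ~ mu
print(np.cov(X, rowvar=False))       # ~ A @ A.T
```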

<aside> <img src="/icons/fireworks_yellow.svg" alt="/icons/fireworks_yellow.svg" width="40px" />

Any Linear Combination is Gaussian. $X$ is jointly Gaussian if for every $u \in \R^n$, the scalar $u^T X$ is Gaussian.

</aside>
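
The two definitions connect through a short computation: if $X = AZ + \mu$, then $u^T X$ is a scalar Gaussian with mean $u^T \mu$ and variance $u^T \Sigma u$, where $\Sigma = AA^T$. A minimal numerical sketch (NumPy; $A$, $\mu$, and $u$ are hypothetical choices):

```python
import numpy as np

rng = np.random.default_rng(1)

A = np.array([[2.0, 0.0],
              [1.0, 1.0]])
mu = np.array([1.0, -2.0])
Sigma = A @ A.T

# Sample X = AZ + mu and project onto an arbitrary direction u.
X = rng.standard_normal((100_000, 2)) @ A.T + mu
u = np.array([0.5, -1.5])
proj = X @ u                          # samples of u^T X

print(proj.mean(), u @ mu)            # empirical vs. theoretical mean
print(proj.var(), u @ Sigma @ u)      # empirical vs. theoretical variance
```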

Suppose $X \in \R^n$ is jointly Gaussian with $X = AZ + \mu$, and suppose the covariance matrix $\Sigma = \text{cov}(X) = AA^T$ is positive definite (when $\Sigma$ is singular, $X$ is concentrated on a lower-dimensional affine subspace and has no PDF). With mean $\mathbb{E}[X] = \mu$, the PDF of $X$ is given below:

$$ X \sim N(\mu, \Sigma) \implies f_X(x) = \frac{1}{\sqrt{(2\pi)^n \det (\Sigma )}} \exp(-\frac{1}{2}(x-\mu)^T\Sigma^{-1}(x-\mu)) $$
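
As a sanity check on the formula, the sketch below evaluates the density directly and compares it against SciPy's `multivariate_normal` ($\mu$, $\Sigma$, and the query point $x$ are hypothetical choices):

```python
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([1.0, -2.0])
Sigma = np.array([[4.0, 2.0],
                  [2.0, 5.0]])        # positive definite
x = np.array([0.0, 0.0])

# Evaluate the density formula term by term.
n = len(mu)
diff = x - mu
norm_const = 1.0 / np.sqrt((2 * np.pi) ** n * np.linalg.det(Sigma))
pdf_manual = norm_const * np.exp(-0.5 * diff @ np.linalg.solve(Sigma, diff))

# Cross-check against SciPy's implementation.
pdf_scipy = multivariate_normal(mean=mu, cov=Sigma).pdf(x)
print(pdf_manual, pdf_scipy)          # should agree to floating-point error
```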

<aside> <img src="/icons/castle_yellow.svg" alt="/icons/castle_yellow.svg" width="40px" />

Uncorrelated iff Independent. If $X$ and $Y$ are jointly Gaussian, then $\text{cov}(X, Y) = 0$ iff $X$ and $Y$ are independent.

</aside>
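
A quick numerical illustration of the forward direction: with zero covariance, $\Sigma$ is diagonal, the exponent in the joint PDF splits, and the joint density factors into the two marginals. Sketch below (SciPy; the means and variances are hypothetical choices):

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

mu = np.array([1.0, -2.0])
Sigma = np.diag([4.0, 9.0])           # cov(X, Y) = 0: diagonal Sigma

joint = multivariate_normal(mean=mu, cov=Sigma)

# Independence of densities: f_{X,Y}(x, y) = f_X(x) * f_Y(y).
for x, y in [(0.0, 0.0), (1.5, -3.0), (-2.0, 4.0)]:
    lhs = joint.pdf([x, y])
    rhs = norm(mu[0], 2.0).pdf(x) * norm(mu[1], 3.0).pdf(y)
    print(lhs, rhs)                   # equal up to floating-point error
```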

<aside> <img src="/icons/castle_yellow.svg" alt="/icons/castle_yellow.svg" width="40px" />

Affine Transformations are Jointly Gaussian. If $X$ and $Y$ are jointly Gaussian, then for any $a, b, c, d\in \R$, the random variables $aX + b$ and $cY + d$ are jointly Gaussian, as shown below.

</aside>
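
One way to see this through the first definition: the pair is an affine map of $(X, Y)$, and an affine map of an affine map of $Z$ is again an affine map of $Z$:

$$ \begin{pmatrix} aX + b \\ cY + d \end{pmatrix} = \begin{pmatrix} a & 0 \\ 0 & c \end{pmatrix} \begin{pmatrix} X \\ Y \end{pmatrix} + \begin{pmatrix} b \\ d \end{pmatrix} $$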

<aside> <img src="/icons/castle_yellow.svg" alt="/icons/castle_yellow.svg" width="40px" />

MMSE $=$ LLSE. If $X$ and $Y$ are jointly Gaussian, then $\mathbb{E}[X \mid Y] = \mathbb{L}[ X \mid Y]$.

</aside>
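
Concretely, when $X$ and $Y$ are scalar with $\text{var}(Y) > 0$, both sides reduce to the standard linear estimator:

$$ \mathbb{E}[X \mid Y] = \mathbb{L}[X \mid Y] = \mathbb{E}[X] + \frac{\text{cov}(X, Y)}{\text{var}(Y)}\left(Y - \mathbb{E}[Y]\right) $$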