The probability density function $f_Z$ of a sum $Z = X + Y$ of independent random variables can be obtained by convolving their individual probability density functions.
$$ f_Z(z)=(f_X*f_Y)(z)=\int_{-\infty}^{\infty} f_Y(z-t) f_X(t)dt $$
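As a numerical sketch of this convolution formula (assuming NumPy), we can take $X, Y \sim N(0,1)$, approximate the integral with a discrete convolution on a grid, and compare the result to the known density of $Z \sim N(0,2)$:

```python
import numpy as np

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

dt = 0.01
t = np.arange(-10, 10, dt)

f_X = normal_pdf(t)
f_Y = normal_pdf(t)

# The discrete convolution approximates the integral; the dt factor
# plays the role of the differential in the Riemann sum.
f_Z = np.convolve(f_X, f_Y, mode="same") * dt

# For independent standard normals, X + Y ~ N(0, 2).
expected = normal_pdf(t, sigma=np.sqrt(2))
print(np.max(np.abs(f_Z - expected)))  # small discretization error
```

The agreement is limited only by the grid spacing `dt` and the truncation of the integration range.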

The covariance between two random variables $X$ and $Y$ is a measure of how they vary together.
$$ \text{cov}(X,Y) := \mathbb{E}[(X-\mathbb{E}[X])(Y-\mathbb{E}[Y])] = \mathbb{E}[XY]-\mathbb{E}[X]\mathbb{E}[Y] $$
One way to interpret this definition: if $X$ and $Y$ tend to fall on the same side of their respective means (both above or both below), the covariance is positive; if they tend to fall on opposite sides, it is negative. A covariance of $0$ means the two random variables have no consistent linear relationship with one another, though they may still be dependent.
The correlation between two random variables $X$ and $Y$ is a normalized measure of their covariance. This is sometimes called the Pearson correlation coefficient $\rho$.
$$ \rho = \text{corr}(X,Y) = \frac{\text{cov}(X,Y)}{\sqrt{\text{var}(X) \text{var}(Y)}} $$

$\rho$ values of exactly $1$ or $-1$ indicate a perfect linear relationship between $X$ and $Y$; values near those extremes indicate strong linear predictiveness.
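A quick sanity check of these formulas (assuming NumPy), using samples where $Y = 2X + \varepsilon$ with independent standard normal $X$ and $\varepsilon$, so that $\text{cov}(X,Y) = 2$, $\text{var}(Y) = 5$, and $\rho = 2/\sqrt{5} \approx 0.894$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 2.0 * x + rng.normal(size=100_000)   # cov(X, Y) = 2, var(Y) = 5

# Sample covariance via E[XY] - E[X]E[Y], then normalize to get rho.
cov_xy = np.mean(x * y) - x.mean() * y.mean()
rho = cov_xy / np.sqrt(x.var() * y.var())

print(cov_xy)                    # close to 2
print(rho)                       # close to 2 / sqrt(5)
print(np.corrcoef(x, y)[0, 1])   # NumPy's estimate agrees with rho
```

Note that the degrees-of-freedom convention cancels in the correlation ratio, so the hand-computed $\rho$ matches `np.corrcoef` directly.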
The law of total expectation (also called the tower rule) states that the expectation of a random variable $X$ equals the expectation of its conditional expectation given another random variable $Y$; the conditioning can also be iterated, as in the second equality.
$$ \mathbb{E}[X] = \mathbb{E}[\mathbb{E}[X \mid Y]] = \mathbb{E}[\mathbb{E} [\mathbb{E} [X \mid Y, Z] \mid Z]] $$
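A Monte Carlo sketch of the tower rule (assuming NumPy), using a simple hierarchical model chosen for illustration: $Y \sim \text{Uniform}(0,1)$ and, given $Y$, $X \sim N(Y, 1)$. Here $\mathbb{E}[X \mid Y] = Y$, so both sides of the identity should equal $\mathbb{E}[Y] = 0.5$:

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.uniform(0.0, 1.0, size=200_000)
x = rng.normal(loc=y, scale=1.0)   # X | Y ~ N(Y, 1)

lhs = x.mean()   # E[X], estimated directly from samples of X
rhs = y.mean()   # E[E[X | Y]] = E[Y] in this model

print(lhs, rhs)  # both near 0.5
```

The two estimates agree up to Monte Carlo error, illustrating that averaging the conditional mean over $Y$ recovers the unconditional mean.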