The Bernoulli distribution is a special case of the binomial distribution for which $n=1$. For this reason, trials are sometimes known as Bernoulli trials and a series of consecutive trials sometimes as a Bernoulli process.
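The reduction to $n=1$ can be checked numerically. A minimal sketch, assuming a success probability of $0.25$ (the helper `binomial_pmf` is illustrative, not from the text):

```python
from math import comb

def binomial_pmf(n, p, k):
    # Probability of exactly k successes in n independent trials,
    # each succeeding with probability p.
    return comb(n, k) * p**k * (1 - p)**(n - k)

p = 0.25
# With n = 1 the binomial pmf reduces to the Bernoulli pmf:
# Pr[X = 1] = p and Pr[X = 0] = 1 - p.
print(binomial_pmf(1, p, 1))  # 0.25
print(binomial_pmf(1, p, 0))  # 0.75
```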
A discrete random variable $X: \Omega \rightarrow \R$ on a countable sample space $\Omega$ is a function which maps each randomly observed outcome $\omega$ to a value in a countable set $\mathcal{A} \subseteq \R$.
The event $X = a$ that a random variable is realized as a particular value $a$ is the set of all sample points $\omega$ which $X$ maps to $a$.
$$ X=a \hspace{5pt} := \hspace{5pt} \{\omega \in \Omega \mid X(\omega) = a\} $$
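The event-as-preimage definition can be made concrete with a toy sample space. A minimal sketch, assuming two fair coin flips and $X$ counting heads (both the sample space and the variable are illustrative choices, not from the text):

```python
from fractions import Fraction

# Sample space: ordered outcomes of two fair coin flips.
omega = [("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")]

# Random variable X = number of heads in the outcome.
def X(w):
    return sum(1 for side in w if side == "H")

# The event {X = 1} is the preimage of 1 under X:
# the set of all sample points that X maps to 1.
event = {w for w in omega if X(w) == 1}
print(event)

# Under the uniform measure, Pr[X = 1] = |event| / |Omega|.
prob = Fraction(len(event), len(omega))
print(prob)  # 1/2
```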
A function $f : \mathcal{A} \rightarrow \R$ applied to a random variable $X : \Omega \rightarrow \R$ yields a new random variable $f(X)$, defined by composition:
$$ f(X)(\omega)=f(X(\omega)) $$
The expectation is a statistic giving the average value a random variable $X$ takes on, weighted by its distribution.
$$ \mathbb{E}[X] = \sum _{a \in \mathcal{A}} a \Pr[X=a] $$
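The sum above translates directly to code. A minimal sketch, assuming the distribution of the number of heads in two fair coin flips (the example distribution is an assumption for illustration):

```python
from fractions import Fraction

# Pr[X = a] for X = number of heads in two fair coin flips.
dist = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def expectation(dist):
    # E[X] = sum over all values a of a * Pr[X = a].
    return sum(a * p for a, p in dist.items())

print(expectation(dist))  # 1
```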
Expectation is a linear operator: the identities below hold regardless of any dependence between $X$ and $Y$.
$$ \mathbb{E}[X+Y] = \mathbb{E}[X] +\mathbb{E}[Y] \hspace{55pt} \mathbb{E}[cX] = c \mathbb{E}[X] $$
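Linearity can be verified directly on a sample space where the variables are plainly dependent. A minimal sketch, assuming three fair coin flips with $X$ the total number of heads and $Y$ the indicator of the first flip being heads (both variables are illustrative assumptions):

```python
from fractions import Fraction
from itertools import product

# Sample space: three fair coin flips, uniform measure.
omega = list(product("HT", repeat=3))
pr = Fraction(1, len(omega))

X = lambda w: w.count("H")              # total number of heads
Y = lambda w: 1 if w[0] == "H" else 0   # indicator of first flip = heads
# Y is determined by part of X's input, so X and Y are dependent,
# yet linearity of expectation still holds.

E = lambda f: sum(f(w) * pr for w in omega)

print(E(lambda w: X(w) + Y(w)))  # 2
print(E(X) + E(Y))               # 2
print(E(lambda w: 5 * X(w)) == 5 * E(X))  # True
```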
The law of the unconscious statistician states that the expectation of a function of random variables can be computed by applying the function to each combination of values they take on and weighting by their joint distribution, without first deriving the distribution of $f(X, Y, Z)$ itself.
$$ \mathbb{E}[f(X, Y, Z)] = \sum _{a \in \mathcal{A}} \sum _{b \in \mathcal{B}} \sum _{c \in \mathcal{C}} f(a, b, c)\Pr[X=a, Y=b, Z=c] $$
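A two-variable instance of this sum is easy to spell out. A minimal sketch, assuming $X$ and $Y$ are independent fair coin-flip indicators and $f(x, y) = xy$ (a toy joint distribution chosen for illustration):

```python
from fractions import Fraction

# Joint distribution Pr[X = a, Y = b] for two independent
# fair coin-flip indicators.
joint = {(a, b): Fraction(1, 4) for a in (0, 1) for b in (0, 1)}

def lotus(f, joint):
    # E[f(X, Y)] = sum over (a, b) of f(a, b) * Pr[X = a, Y = b].
    return sum(f(a, b) * p for (a, b), p in joint.items())

# Example: f(x, y) = x * y; only the outcome (1, 1) contributes.
print(lotus(lambda x, y: x * y, joint))  # 1/4
```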