In probability theory, multiplication expresses simultaneity. Peeling back the abstractions of conditional probability, there is a foundational correspondence between the measure of the intersection of two independent events and the product of their individual probabilities.
Random variables $X_1, \dots, X_n$ are mutually independent if their joint distribution is the product of their marginal distributions.
$$ \forall (x_1, \dots, x_n), \space \Pr[X_1 = x_1, \dots, X_n = x_n] = \prod _{i=1}^n \Pr[X_i=x_i] $$
A more intuitive, but equivalent, characterization of this definition is the following: random variables $X$ and $Y$ are independent if their posterior and prior distributions are identical.
$$ X \mathrel{\perp\!\!\!\perp} Y \iff \Pr[X\space | \space Y] = \Pr[X] \hspace{10pt} \text{ and } \hspace{10pt} \Pr[Y \space | \space X] = \Pr[Y] $$
This is a property of the joint distribution of the variables and not of the sample space itself; simply put, knowing one variable does not clear up any uncertainty about the other variable.
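The product definition can be checked mechanically: enumerate the joint distribution, compute the marginals by summing it out, and compare. The sketch below uses a hypothetical joint table over two binary variables; the distributions and names are illustrative, not from the original.

```python
from itertools import product

# Hypothetical joint distribution of two fair coin flips X and Y.
# Independence holds iff Pr[X=x, Y=y] = Pr[X=x] * Pr[Y=y] for all (x, y).
joint = {(x, y): 0.25 for x, y in product([0, 1], repeat=2)}

def marginal(joint, axis):
    """Sum the joint distribution over every variable except `axis`."""
    dist = {}
    for outcome, p in joint.items():
        dist[outcome[axis]] = dist.get(outcome[axis], 0.0) + p
    return dist

def independent(joint, tol=1e-12):
    """Check the product criterion pointwise over the joint table."""
    px, py = marginal(joint, 0), marginal(joint, 1)
    return all(abs(p - px[x] * py[y]) < tol for (x, y), p in joint.items())

print(independent(joint))   # two fair coin flips -> True

# A dependent pair: Y is forced to equal X, so knowing Y resolves X.
copied = {(0, 0): 0.5, (1, 1): 0.5, (0, 1): 0.0, (1, 0): 0.0}
print(independent(copied))  # -> False
```

The second table illustrates the remark above: its marginals are the same fair coins, so dependence lives in the joint distribution, not in the individual variables.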
Random variables $X_1, \dots, X_n$ are conditionally independent given evidence $E_1, \dots, E_m$ if their conditioned joint distribution is the product of their conditioned marginal distributions.
$$ \forall (x_1, \dots, x_n, e_1, \dots, e_m), \space \Pr[x_1, \dots, x_n \space | \space e_1, \dots, e_m] = \prod _{i=1}^n \Pr[x_i \space | \space e_1, \dots, e_m] $$
A more intuitive, but equivalent, characterization of this definition is the following: random variables $X$ and $Y$ are conditionally independent given $E$ if $Y$ reveals no information about $X$ beyond what $E$ already reveals.
$$ \Pr[X\space | \space E, Y] = \Pr[X \space | \space E] $$
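A common-cause model makes this concrete: if $X$ and $Y$ are both driven by $E$ but are drawn independently once $E$ is fixed, they are marginally dependent yet conditionally independent given $E$. The model below is a hypothetical sketch (the bias values are arbitrary), verifying $\Pr[X \mid E, Y] = \Pr[X \mid E]$ by brute-force enumeration.

```python
from itertools import product

# Hypothetical model: E is a fair coin; given E=e, X and Y are each
# independently 1 with probability bias[e].
p_e = {0: 0.5, 1: 0.5}
bias = {0: 0.2, 1: 0.9}   # Pr[X=1 | E=e] = Pr[Y=1 | E=e] = bias[e]

def p_given_e(v, e):
    return bias[e] if v == 1 else 1 - bias[e]

# Full joint Pr[x, y, e] built from the generative story above.
joint = {(x, y, e): p_e[e] * p_given_e(x, e) * p_given_e(y, e)
         for x, y, e in product([0, 1], repeat=3)}

def cond_indep(joint, tol=1e-12):
    """Check Pr[x | e, y] == Pr[x | e] for every assignment (x, y, e)."""
    for x, y, e in product([0, 1], repeat=3):
        pe   = sum(p for (_, _, e2), p in joint.items() if e2 == e)
        pxe  = sum(p for (x2, _, e2), p in joint.items() if x2 == x and e2 == e)
        pye  = sum(p for (_, y2, e2), p in joint.items() if y2 == y and e2 == e)
        pxye = joint[(x, y, e)]
        if abs(pxye / pye - pxe / pe) > tol:   # Pr[x|e,y] vs Pr[x|e]
            return False
    return True

print(cond_indep(joint))  # -> True by construction
```

Note that $X$ and $Y$ here are *not* unconditionally independent: observing $Y=1$ makes $E=1$ more likely, which in turn raises the probability of $X=1$.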
The chain rule re-expresses the joint distribution of random variables $X_1, \dots, X_n$ as a product of conditional probability distributions.
$$ \Pr[x_1, \dots, x_n] = \prod _{i=1}^n \Pr[x_i \space | \space x_{i-1}, \dots, x_1] $$
The intuition here is that the knowledge represented by the joint distribution can be built up one variable at a time, each factor refining our best guess in light of the variables already seen.
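The factorization holds for *any* joint distribution, which can be verified numerically: compute each conditional factor as a ratio of marginals and check that the telescoping product recovers the joint. The distribution below is an arbitrary hypothetical table over three binary variables.

```python
from itertools import product

# Hypothetical joint over three binary variables, defined by an arbitrary
# normalized weight table; the chain rule must hold for any joint.
weights = dict(zip(product([0, 1], repeat=3), [1, 2, 3, 4, 5, 6, 7, 8]))
total = sum(weights.values())
joint = {o: w / total for o, w in weights.items()}

def pr(fixed):
    """Marginal probability that outcome[i] == v for every (i, v) in fixed."""
    return sum(p for o, p in joint.items() if all(o[i] == v for i, v in fixed))

def chain_rule(outcome):
    """Pr[x_1, ..., x_n] via the product of conditionals Pr[x_i | x_1..x_{i-1}]."""
    prob = 1.0
    for i in range(len(outcome)):
        num = pr([(j, outcome[j]) for j in range(i + 1)])  # Pr[x_1, ..., x_i]
        den = pr([(j, outcome[j]) for j in range(i)])      # Pr[x_1, ..., x_{i-1}]
        prob *= num / den                                  # conditional factor
    return prob

# The telescoping product recovers the joint exactly on every outcome.
assert all(abs(chain_rule(o) - p) < 1e-12 for o, p in joint.items())
print("chain rule verified")
```

Each ratio `num / den` is $\Pr[x_i \mid x_{i-1}, \dots, x_1]$ by the definition of conditional probability, so successive numerators and denominators cancel, leaving exactly $\Pr[x_1, \dots, x_n]$.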