A continuous random variable is a random variable that maps outcomes from a continuous sample space. Since each individual outcome occurs with probability zero (assigning positive probability to uncountably many outcomes would violate the normalization axiom), probabilities are instead defined over neighborhoods of outcomes.
$$
\textbf{Probability Density Function } f_X : \R \rightarrow \R \\
\hspace{15pt} \forall x \in \R , f_X(x) \geq 0 \text{ and } \int_{-\infty} ^\infty f_X(x)dx = 1 \\ \fbox{$P(a \leq X \leq b) \triangleq \int _{a} ^b f_X(x)dx$} $$
$$ \textbf{Cumulative Distribution Function }F_X : \R \rightarrow \R \\ \forall x \in \R, F_X(x) \in [0,1] \text{ and } \forall \varepsilon \geq 0, F_X(x) \leq F_X(x +\varepsilon) \text{ and } F'_X=f_X\\ \fbox{$P(X\leq b) \triangleq F_X(b)$} $$
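For a concrete instance, take $X \sim \text{Exp}(\lambda)$ (this distribution reappears in the aside below); its density and CDF follow directly from the definitions above, for $0 \leq a \leq b$.

$$ f_X(x) = \lambda e^{-\lambda x} \text{ for } x \geq 0, \hspace{15pt} F_X(b) = \int_{0} ^b \lambda e^{-\lambda x}dx = 1 - e^{-\lambda b} \\ P(a \leq X \leq b) = F_X(b) - F_X(a) = e^{-\lambda a} - e^{-\lambda b} $$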
<aside> <img src="/icons/castle_yellow.svg" alt="/icons/castle_yellow.svg" width="40px" />
Exponential, Geometric, Poisson Distributions. As an interesting remark, the Exponential and Poisson distributions are closely related.
$$ p_Y(1) = \lambda e^{-\lambda} = f_X(1)\hspace{15pt}Y\sim \text{Poisson $(\lambda)$}, X\sim \text{Exp}(\lambda) $$
For this reason, we say that the Exponential distribution models the wait time before rare event occurrences: $f_X(1)$ corresponds to the density of the first event occurring at time $1$. Perhaps more immediate is that the Exponential distribution is the continuous analog of the Geometric distribution.
</aside>
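The identity in the aside is easy to check numerically; a minimal sketch, assuming scipy is available:

```python
# A minimal numerical check of p_Y(1) = f_X(1); assumes scipy is available.
from scipy.stats import expon, poisson

lam = 2.5  # arbitrary rate parameter

# Poisson pmf at k = 1: lam * exp(-lam)
p_Y_1 = poisson.pmf(1, mu=lam)

# Exponential density at x = 1: lam * exp(-lam).
# scipy parameterizes Exp(lam) via scale = 1 / lam.
f_X_1 = expon.pdf(1, scale=1 / lam)

print(p_Y_1, f_X_1)  # both ~0.20521
```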
Frequently, the laws of probability are applied to probability density functions, even though densities are not themselves probabilities. This works because we can take infinitesimal intervals as our events and thereby restate, in terms of densities, equalities that were originally defined for probabilities.
$$ f_X(x)=\lim _{\Delta x\rightarrow 0} \frac{P(x \leq X\leq x + \Delta x)}{\Delta x} $$
For this reason, we typically consider $f_X(x)dx = P(x \leq X \leq x + dx) \approx P(X=x)$.
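A quick numerical sketch of this limit, again assuming scipy; the standard normal and the point $x = 0.7$ are arbitrary choices:

```python
# Sketch: the difference quotient of the CDF approaches the density as the
# interval shrinks; the standard normal and x = 0.7 are arbitrary choices.
from scipy.stats import norm

x = 0.7
for dx in (1e-1, 1e-3, 1e-5):
    ratio = (norm.cdf(x + dx) - norm.cdf(x)) / dx  # P(x <= X <= x + dx) / dx
    print(dx, ratio)
print(norm.pdf(x))  # the limit f_X(x), ~0.31225
```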
The expectation of a function $g :\R\rightarrow \R$ of a continuous random variable $X$ is defined just as in the discrete case, with the sum replaced by an integral.
$$ \mathbb{E}[g(X)] = \int _{-\infty} ^{\infty} g(x)f_X(x) dx $$
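As a sketch, this integral can be evaluated numerically; here $g(x) = x^2$ and a standard normal $X$ are arbitrary choices, so the exact value is $\mathbb{E}[X^2] = 1$:

```python
# Sketch of E[g(X)] via numerical integration, assuming scipy; g(x) = x**2
# and X standard normal are arbitrary choices, so the exact value is 1.
from scipy.integrate import quad
from scipy.stats import norm

def g(x):
    return x ** 2

expectation, _ = quad(lambda x: g(x) * norm.pdf(x), -float("inf"), float("inf"))
print(expectation)  # ~1.0
```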
Conditional expectation also has a direct analog,
$$ \mathbb{E}[X \space | \space Y= y] = \int _{-\infty} ^{\infty} x f_{X \space | \space Y}(x \space | \space y)dx $$
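To make this concrete, here is an illustrative sketch with an assumed joint density $f_{XY}(x,y) = x + y$ on $[0,1]^2$ (chosen for illustration, not from the text), whose marginal is $f_Y(y) = \tfrac{1}{2} + y$:

```python
# Illustrative sketch with an assumed joint density f(x, y) = x + y on
# [0, 1]^2 (chosen here for illustration; not from the text).
from scipy.integrate import quad

def f_XY(x, y):
    return x + y

def cond_expectation(y):
    # Marginal: f_Y(y) = integral over x of f(x, y) = 1/2 + y
    f_Y, _ = quad(lambda x: f_XY(x, y), 0, 1)
    # E[X | Y = y] = integral over x of x * f_{X|Y}(x | y)
    num, _ = quad(lambda x: x * f_XY(x, y) / f_Y, 0, 1)
    return num

print(cond_expectation(0.5))  # (1/3 + 1/4) / 1 = 0.5833...
```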
The joint distribution of two continuous random variables $X$ and $Y$ is defined over ordered pairs of their realizations.
$$ P( (X, Y) \in B) = \iint_B f_{XY}(x,y)\,dx\,dy $$
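A sketch of evaluating such a probability as a double integral, reusing the hypothetical density from the previous example with $B = [0, 0.5]^2$:

```python
# Sketch: P((X, Y) in B) as a double integral, reusing the hypothetical
# density f(x, y) = x + y with B = [0, 0.5] x [0, 0.5]; assumes scipy.
from scipy.integrate import dblquad

# dblquad expects the integrand as func(y, x), with x the outer variable.
prob, _ = dblquad(lambda y, x: x + y, 0, 0.5, 0, 0.5)
print(prob)  # 0.125
```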