Matrix Exponential

A square matrix $A \in \R^{n \times n}$ can be exponentiated through its power series. If $A$ is diagonalizable as $A = V \Lambda V^{-1}$, where $V$ holds the eigenvectors and $\Lambda$ is the diagonal matrix of eigenvalues, this is equivalent to exponentiating its eigenvalues.

$$ e^A=\sum_{k = 0} ^{\infty} \frac{A^k}{k!}=V\begin{bmatrix} e^{\lambda_1} & \dots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \dots & e^{\lambda_n}\end{bmatrix}V^{-1}=Ve^{\Lambda}V^{-1} $$
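As a quick numerical check (a minimal sketch; the example matrix and the use of NumPy/SciPy are illustrative assumptions, not part of the notes), the eigendecomposition route can be compared against a general-purpose matrix exponential routine:

```python
import numpy as np
from scipy.linalg import expm

# Example diagonalizable matrix (chosen arbitrarily for illustration)
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

# Eigendecomposition A = V diag(lambda) V^{-1}
lam, V = np.linalg.eig(A)

# e^A = V diag(e^lambda) V^{-1}
eA_eig = V @ np.diag(np.exp(lam)) @ np.linalg.inv(V)

# Compare against SciPy's direct matrix exponential
print(np.allclose(eA_eig, expm(A)))  # True
```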


State Space Conversion

Any linear differential or difference equation can be modeled in state space, where the state vector $\mathbf{x} \in \R^{\ell}$ collects the function and its derivatives (or, for a difference equation, the sequence and its shifted terms).

$$ a\ddot y +b\dot y + cy = du \rightarrow \begin{cases}x_1 = y \\ x_2 = \dot y \rightarrow \dot x_2 = -\frac{b}{a}x_2 -\frac{c}{a}x_1 + \frac{d}{a} u \hspace{5pt} \text{ and } \hspace {5pt}\dot x_1 = x_2 \end{cases} \\ \begin{bmatrix} \dot x_1 \\ \dot x_2 \end{bmatrix} = \begin{bmatrix} 0 & 1 \\ -\frac{c}{a} & -\frac{b}{a} \end{bmatrix} \begin{bmatrix}x_1 \\ x_2 \end{bmatrix} + \begin{bmatrix} 0 \\ \frac{d}{a} \end{bmatrix}u $$

The same process can be applied to difference equations. When each state can be solved for separately from the rest of the states, the system is said to be uncoupled; otherwise it is coupled.
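The conversion above is mechanical, so it can be packaged as a small helper. This is a sketch under the state choice $\mathbf{x} = [y, \dot y]^T$ from the derivation; the function name and the example coefficients are hypothetical.

```python
import numpy as np

def second_order_to_state_space(a, b, c, d):
    """State-space (A, B) for a*y'' + b*y' + c*y = d*u with x = [y, y']^T."""
    A = np.array([[0.0, 1.0],
                  [-c / a, -b / a]])
    B = np.array([[0.0],
                  [d / a]])
    return A, B

# Example: y'' + 3y' + 2y = u  ->  a=1, b=3, c=2, d=1
A, B = second_order_to_state_space(1.0, 3.0, 2.0, 1.0)
print(A)  # [[ 0.  1.]  [-2. -3.]]
print(B)  # [[0.]  [1.]]
```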


Vector Differential Equations

Sets of continuous first-order equations which model dynamic change of the state, with dynamics governed by $A \in \R^{\ell \times \ell}$.

$$ \mathbf{\dot{x}} = A \mathbf{x} + B \mathbf{u} \\ \mathbf{x}(t) = e^{At}\mathbf{x}(0) + \int _{0} ^t e^{A(t- \tau)} B\mathbf{u}(\tau)d\tau $$
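The solution formula can be evaluated numerically by combining the matrix exponential with a quadrature of the convolution integral. Below is a rough sketch using a Riemann sum; the matrices, the unit-step input, and the step count are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
B = np.array([[0.0],
              [1.0]])
x0 = np.array([[1.0],
               [0.0]])
u = lambda t: np.array([[1.0]])   # unit-step input

def solve(t, n=500):
    """x(t) = e^{At} x(0) + integral_0^t e^{A(t-tau)} B u(tau) dtau,
    with the integral approximated by a Riemann sum."""
    taus = np.linspace(0.0, t, n)
    dtau = taus[1] - taus[0]
    integral = sum(expm(A * (t - tau)) @ B @ u(tau) for tau in taus) * dtau
    return expm(A * t) @ x0 + integral

print(solve(1.0).ravel())
```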

Solving these systems involves first decoupling the equations through a conversion to the eigenbasis of $A$, solving the equations in that basis, then converting back to the standard basis.

$$ \mathbf{Conversion} \space \mathbf{to} \space \mathbf{Eigenbasis} \\ \text{Substituting } \mathbf{x} = V\mathbf{z}, \\ \mathbf{\dot x} = A \mathbf{x} + B\mathbf{u} \rightarrow V\mathbf{\dot{z}} = AV\mathbf{z} +B\mathbf{u} \\\mathbf{\dot z} = V^{-1} A V\mathbf{z} + V^{-1}B\mathbf{u} \\ \mathbf{\dot z} = \Lambda \mathbf{z} +V^{-1}B\mathbf{u} $$
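A one-line numerical sanity check of this step (using the same illustrative matrix as above, which is an assumption) confirms that $V^{-1} A V$ is indeed the diagonal matrix $\Lambda$:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
lam, V = np.linalg.eig(A)

# V^{-1} A V should equal diag(lambda)
print(np.allclose(np.linalg.inv(V) @ A @ V, np.diag(lam)))  # True
```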

$$ \mathbf{Decoupled} \space \mathbf{Solutions} \\ \text{Note that this new system is now decoupled since } \Lambda \text{ is diagonal.} \\ \text{We can solve for each state separately,} \\ z_i(t) = z_i(0)e^{\lambda_it} + \int _0 ^t e^{\lambda_i(t-\tau)} (V^{-1}B\mathbf{u})_i(\tau) \space d\tau \\ \text{Compacting, this generalizes to} \\ \mathbf{z}(t) = e^{\Lambda t}\mathbf{z}(0) + \int _{0} ^t e^{\Lambda (t - \tau)}V^{-1}B\mathbf{u}(\tau) \space d\tau $$

$$ \mathbf{Conversion} \space \mathbf{to} \space \mathbf{Standard} \space \mathbf{Basis} \\ \text{Now that we have } \mathbf{z}(t) \text{, we simply convert back} \\ \text{to the standard basis to recover } \mathbf{x}(t) \\ V\mathbf{ z}(t) = Ve^{\Lambda t} V^{-1} \mathbf{x}(0) + \int _0 ^t Ve^{\Lambda (t - \tau)} V^{-1} B \mathbf{u}(\tau) \space d\tau \\ \mathbf{x}(t) = e^{At}\mathbf{x}(0) + \int _0 ^t e^{A(t-\tau)} B \mathbf{u}(\tau) \space d\tau = \sum_{k=1} ^{\ell}\mathbf{c}_k e^{\lambda _k t} + \int _0 ^t e^{A(t-\tau)} B \mathbf{u}(\tau) \space d\tau $$
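The whole pipeline can be checked numerically for the unforced case ($\mathbf{u} = 0$): convert to the eigenbasis, propagate each mode with a scalar exponential, and convert back. This is a sketch with an assumed example matrix, initial state, and time; it should match the direct matrix-exponential solution.

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
x0 = np.array([1.0, 0.0])
t = 1.5

lam, V = np.linalg.eig(A)          # A = V Lambda V^{-1}
z0 = np.linalg.solve(V, x0)        # z(0) = V^{-1} x(0)
z_t = np.exp(lam * t) * z0         # each z_i(t) = z_i(0) e^{lambda_i t}
x_t = V @ z_t                      # convert back to the standard basis

print(np.allclose(x_t, expm(A * t) @ x0))  # True
```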

Phase Portrait