Slater’s Condition

Consider the following convex optimization problem and take $D = \bigcap_{i=0}^{k} \text{dom}(f_i) \,\cap\, \bigcap_{j=1}^{m} \text{dom}(h_j)$, the set of points where every objective and constraint function is defined.

$$ \begin{align*} \min_{x \in \R^n} \quad & f_0(x) \\ \text{subject to} \quad & f_i(x) \le 0, \quad i \in [k] \\ & h_j(x) = 0, \quad j \in [m] \end{align*} $$

Slater’s condition is that there exists $y \in \text{relint}(D)$ such that the following hold:

$$ \begin{align*} \forall i \in \{1, \dots, k\} \quad & \begin{cases} f_i(y) \leq 0 & \text{if $f_i$ is affine} \\ f_i(y) < 0 & \text{if $f_i$ is not affine} \end{cases} \\ \forall j \in \{1, \dots, m\} \quad & h_j(y) = 0 \end{align*} $$

In other words, Slater’s condition requires a strictly feasible point: $y$ lies in the relative interior of the domain, is feasible, and satisfies every non-affine inequality constraint strictly.
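For example, consider the one-dimensional problem

$$ \min_{x \in \R} \quad x \qquad \text{subject to} \quad f_1(x) = x^2 - 1 \leq 0 $$

Here $D = \R$, and $y = 0$ satisfies $f_1(y) = -1 < 0$, so Slater’s condition holds.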


Karush-Kuhn-Tucker Optimality Conditions

Given a convex optimization problem, if Slater’s condition holds, then $x^*$ is a global minimum iff there exist Lagrange multipliers $\lambda^* = \begin{bmatrix} \lambda_1 & \dots & \lambda_k\end{bmatrix}^T$ and $\mu^* = \begin{bmatrix} \mu_1 & \dots & \mu_m \end{bmatrix}^T$ such that the following hold:

$$ \textbf{Primal Feasibility } \hspace{40pt} \begin{cases} \forall i \in \{1, \dots, k \}, \; f_i(x^*) \leq 0 \\ \forall j \in \{1, \dots , m\}, \; h_j(x^*) = 0 \end{cases} $$

$$ \textbf{Dual Feasibility } \hspace{75pt} \lambda_1, \dots, \lambda_k \geq 0 $$

$$ \textbf{Complementary Slackness } \hspace{25pt} \forall i \in \{1, \dots , k\}, \; \lambda_i f_i(x^*) = 0 $$

$$ \textbf{Stationarity } \hspace{55pt} \nabla f_0 (x^*) + \sum_{i=1}^k \lambda_i \nabla f_i(x^*) + \sum_{j=1}^m \mu_j \nabla h_j(x^*) = 0 $$

If the optimization problem is not convex, then KKT is only a necessary condition (under a suitable constraint qualification): if $x^*$ is a local or global minimum, then the KKT conditions hold at $x^*$, but a point satisfying them need not be a minimum.
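For example, consider the convex problem

$$ \min_{x \in \R^2} \quad x_1^2 + x_2^2 \qquad \text{subject to} \quad f_1(x) = 1 - x_1 - x_2 \leq 0 $$

Slater’s condition holds (e.g. $y = (1, 1)$ gives $f_1(y) = -1 < 0$), so KKT characterizes the minimum. Stationarity gives $2x_1^* - \lambda_1^* = 0$ and $2x_2^* - \lambda_1^* = 0$, so $x_1^* = x_2^* = \lambda_1^*/2$. Taking $\lambda_1^* = 0$ would give $x^* = (0, 0)$, which is infeasible, so $\lambda_1^* > 0$ and complementary slackness forces $x_1^* + x_2^* = 1$. Hence $\lambda_1^* = 1$ and $x^* = (1/2, 1/2)$, which also satisfies primal and dual feasibility and is therefore the global minimum.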


Lagrangian

The Lagrangian $L(x, \lambda, \mu)$ of an optimization problem is a new objective that incorporates the original constraints as penalty terms weighted by the multipliers $\lambda$ and $\mu$.
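Concretely, for the problem above it takes the form

$$ L(x, \lambda, \mu) = f_0(x) + \sum_{i=1}^k \lambda_i f_i(x) + \sum_{j=1}^m \mu_j h_j(x) $$

so that the stationarity condition in KKT is exactly $\nabla_x L(x^*, \lambda^*, \mu^*) = 0$.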