
Fundamentals of Optimization

Definition 23

The standard form of an optimization problem is

$$p^\star = \min_\mathbf{x} f_0(\mathbf{x}) \text{ such that } f_i(\mathbf{x}) \leq 0,\ i \in [1, m]$$

  • The vector $\mathbf{x}\in\mathbb{R}^n$ is known as the decision variable.

  • The function $f_0:\mathbb{R}^n\to\mathbb{R}$ is the objective.

  • The functions $f_i:\mathbb{R}^n\to\mathbb{R}$ are the constraints.

  • $p^\star$ is the optimal value, and the $\mathbf{x}^\star$ which achieves the optimal value is called the optimizer.
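A small numerical sketch of the standard form, using made-up problem data and `scipy.optimize.minimize` (note that SciPy encodes an inequality constraint $f_i(\mathbf{x}) \leq 0$ as a function that must be nonnegative):

```python
# Illustrative standard-form problem (data made up): minimize
# f0(x) = (x1 - 2)^2 + (x2 - 1)^2 subject to f1(x) = x1 + x2 - 2 <= 0.
import numpy as np
from scipy.optimize import minimize

f0 = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2              # objective
cons = [{"type": "ineq", "fun": lambda x: -(x[0] + x[1] - 2.0)}]  # f1(x) <= 0

res = minimize(f0, x0=np.zeros(2), constraints=cons)
x_star, p_star = res.x, res.fun
print(x_star, p_star)  # optimizer ~ (1.5, 0.5), optimal value ~ 0.5
```

The unconstrained minimizer $(2, 1)$ violates the constraint, so the optimizer lands on the boundary $x_1 + x_2 = 2$.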

Definition 24

The feasible set of an optimization problem is

$$\mathcal{X} = \{\mathbf{x}\in\mathbb{R}^n:\ f_i(\mathbf{x}) \leq 0,\ i \in [1, m]\}$$

Definition 25

A point $\mathbf{x}$ is $\epsilon$-suboptimal if it is feasible and satisfies

$$p^\star \leq f_0(\mathbf{x}) \leq p^\star + \epsilon$$
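Definition 25 can be written as a predicate; the function name and toy problem below are my own illustration:

```python
# Sketch of eps-suboptimality: x must be feasible (all fi(x) <= 0) and its
# objective value must lie within eps of the optimal value p*.
def is_eps_suboptimal(f0, constraints, x, p_star, eps):
    feasible = all(fi(x) <= 0 for fi in constraints)
    return feasible and p_star <= f0(x) <= p_star + eps

# Toy problem: min x^2 s.t. 1 - x <= 0 (i.e. x >= 1), so p* = 1.
f0 = lambda x: x ** 2
constraints = [lambda x: 1 - x]
print(is_eps_suboptimal(f0, constraints, 1.05, p_star=1.0, eps=0.2))  # True
```

Note that a point with a near-optimal objective value but an infeasible coordinate (e.g. $x = 0.9$ here) is not $\epsilon$-suboptimal.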

Definition 26

An optimization problem is strictly feasible if $\exists \mathbf{x}_0$ such that all constraints are strictly satisfied (i.e., inequality constraints hold with strict inequality, and equality constraints are satisfied).

Problem Transformations

Sometimes an optimization problem in its original formulation does not lend itself to being solved easily. In such cases, we can sometimes transform the problem into an easier one from which we can easily recover the solution to the original problem. In many cases, we introduce additional “slack” variables and constraints to massage the problem into a form which is easier to analyze.

Theorem 7 (Epigraphic Constraints)

$\min_\mathbf{x} f_0(\mathbf{x})$ is equivalent to the problem with epigraphic constraints

$$\min_{\mathbf{x}, t} t \quad : \quad f_0(\mathbf{x}) \leq t$$

Theorem 7 works because by minimizing $t$, we also limit how large $f_0(\mathbf{x})$ can be, since $f_0(\mathbf{x}) \leq t$; at the optimum, $f_0(\mathbf{x}) = t$. The transformation is helpful when the constraint $f_0(\mathbf{x}) \leq t$ can be massaged further into constraints that are easier to deal with.
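A classic use of the epigraph trick is turning a minimax problem over affine functions into a linear program. A sketch with made-up data, using `scipy.optimize.linprog` over the stacked variable $(x, t)$:

```python
# Epigraph reformulation (illustrative data): minimize max(x, 1 - x) by
# minimizing t subject to x <= t and 1 - x <= t.
from scipy.optimize import linprog

c = [0.0, 1.0]                       # objective: minimize t (coefficient on x is 0)
A_ub = [[1.0, -1.0],                 # x - t <= 0       (x <= t)
        [-1.0, -1.0]]                # -x - t <= -1     (1 - x <= t)
b_ub = [0.0, -1.0]

# Both x and t are free variables (linprog defaults to x >= 0 otherwise).
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 2)
x_opt, t_opt = res.x
print(x_opt, t_opt)  # x = 0.5, t = 0.5
```

The two affine pieces cross at $x = 0.5$, where the max is smallest, and the epigraph variable $t$ settles exactly on $f_0(x)$ at the optimum.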

Theorem 8 (Monotone Objective Transformation)

Let $\Phi:\mathbb{R}\to\mathbb{R}$ be a continuous and strictly increasing function on the values that $f_0$ takes over a feasible set $\mathcal{X}$. Then

$$\min_{\mathbf{x}\in\mathcal{X}} f_0(\mathbf{x}) \equiv \min_{\mathbf{x}\in\mathcal{X}} \Phi(f_0(\mathbf{x}))$$
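A quick numerical check of Theorem 8 with made-up data: composing the objective with a strictly increasing map (here $\Phi = \log$, valid since this $f_0$ is positive) leaves the minimizer unchanged.

```python
# Both objectives below are minimized at x = 3; only the optimal *values*
# differ (1 versus log(1) = 0), not the optimizer.
import math
from scipy.optimize import minimize_scalar

f0 = lambda x: (x - 3.0) ** 2 + 1.0       # positive objective, argmin at 3
phi_f0 = lambda x: math.log(f0(x))        # Phi = log is strictly increasing

x1 = minimize_scalar(f0).x
x2 = minimize_scalar(phi_f0).x
print(x1, x2)  # both approximately 3
```

This is why, e.g., minimizing $\|A\mathbf{x} - \mathbf{b}\|_2$ and minimizing $\|A\mathbf{x} - \mathbf{b}\|_2^2$ yield the same optimizer: $\Phi(y) = y^2$ is strictly increasing on $y \geq 0$.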

Robust Optimization

For a “nominal” problem

$$\min_\mathbf{x} f_0(\mathbf{x}) \quad : \quad \forall i\in[1,m],\ f_i(\mathbf{x}) \leq 0$$

uncertainty can enter through the data used to create $f_0$ and the $f_i$. It can also enter at decision time, when the $\mathbf{x}^\star$ which solves the optimization cannot be implemented exactly. These uncertainties can produce unstable solutions or degraded performance. To make the optimization more robust to uncertainty, we introduce a new variable $\mathbf{u}$ ranging over an uncertainty set $\mathcal{U}$.

Definition 27

For a nominal optimization problem $\min_\mathbf{x} f_0(\mathbf{x})$ subject to $f_i(\mathbf{x}) \leq 0$ for $i\in[1,m]$, the robust counterpart is

$$\min_\mathbf{x} \max_{\mathbf{u}\in\mathcal{U}} f_0(\mathbf{x}, \mathbf{u}) \quad : \quad \forall i\in[1,m],\ \forall \mathbf{u}\in\mathcal{U},\ f_i(\mathbf{x}, \mathbf{u}) \leq 0$$
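For simple uncertainty sets, the inner maximization often has a closed form. A sketch with made-up data: for a constraint $(\mathbf{a} + \mathbf{u})^\top \mathbf{x} \leq b$ with box uncertainty $|u_i| \leq \rho$, the worst case over $\mathbf{u}$ is $\mathbf{a}^\top \mathbf{x} + \rho \|\mathbf{x}\|_1 \leq b$, since each $u_i$ is chosen as $\rho\,\mathrm{sign}(x_i)$.

```python
# Verify the closed-form worst case by brute force over the corners of the
# box (the maximum is attained at a corner because (a + u)^T x is linear in u).
import itertools
import numpy as np

rho = 0.5
a = np.array([1.0, -2.0])   # nominal constraint data (made up)
x = np.array([3.0, 1.0])    # candidate decision (made up)

worst = max((a + np.array(u)) @ x
            for u in itertools.product([-rho, rho], repeat=2))
closed_form = a @ x + rho * np.abs(x).sum()
print(worst, closed_form)  # both equal 3.0
```

So this robust constraint needs no inner loop at all: it becomes the single deterministic constraint $\mathbf{a}^\top \mathbf{x} + \rho \|\mathbf{x}\|_1 \leq b$.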


Last updated 3 years ago
