Fundamentals of Optimization
For a generic problem of the form

$$p^* = \min_{x \in \mathbb{R}^n} f_0(x) \quad \text{subject to} \quad f_i(x) \le 0, \quad i = 1, \ldots, m,$$

The vector $x \in \mathbb{R}^n$ is known as the decision variable.
The function $f_0 : \mathbb{R}^n \to \mathbb{R}$ is the objective.
The functions $f_i : \mathbb{R}^n \to \mathbb{R}$ are the constraints.
$p^*$ is the optimal value, and the $x^*$ which achieves the optimal value is called the optimizer.
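To make these definitions concrete, here is a small numerical sketch (not from the notes; the objective, constraint, and numbers are illustrative assumptions) using `scipy.optimize.minimize`:

```python
import numpy as np
from scipy.optimize import minimize

# Objective f0(x) = (x1 - 1)^2 + (x2 - 2)^2  (hypothetical example)
f0 = lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2

# Constraint f1(x) = x1 + x2 - 2 <= 0; scipy's "ineq" convention is g(x) >= 0,
# so we pass g(x) = 2 - x1 - x2.
cons = [{"type": "ineq", "fun": lambda x: 2.0 - x[0] - x[1]}]

res = minimize(f0, x0=np.zeros(2), constraints=cons)
print(res.x)    # optimizer x*  ~ [0.5, 1.5]
print(res.fun)  # optimal value p* ~ 0.5
```

The unconstrained minimizer $(1, 2)$ violates the constraint, so the solver returns its projection onto the constraint boundary.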
Sometimes, an optimization problem in a particular formulation does not admit an easy solution. In such cases, we can transform the problem into an easier one from which the solution to the original problem is easily recovered. In many cases, we introduce additional “slack” variables and constraints to massage the problem into a form which is easier to analyze.
A “nominal” problem

$$\min_{x \in \Omega} f_0(x)$$

is equivalent to the problem with epigraphic constraints

$$\min_{x \in \Omega,\, t \in \mathbb{R}} t \quad \text{subject to} \quad f_0(x) \le t.$$
Theorem 7 works because by minimizing $t$, we are also minimizing how large $f_0(x)$ can get since $f_0(x) \le t$, so at the optimum, $f_0(x^*) = t^*$. It can be helpful when the constraint $f_0(x) \le t$ can be massaged further into constraints that are easier to deal with.
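A classic use of the epigraph trick is turning a min–max problem into a linear program. The sketch below (an illustrative example, not from the notes) minimizes $\max(-x,\, x - 2)$ by introducing the epigraph variable $t$ and calling `scipy.optimize.linprog`:

```python
from scipy.optimize import linprog

# min_x max(-x, x - 2)  -->  epigraph form:  min t  s.t.  -x <= t,  x - 2 <= t
# Decision vector z = [x, t]; the objective picks out t.
c = [0.0, 1.0]
A_ub = [[-1.0, -1.0],   # -x - t <= 0
        [ 1.0, -1.0]]   #  x - t <= 2
b_ub = [0.0, 2.0]

# linprog defaults to x >= 0, so free variables need explicit bounds.
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None), (None, None)])
x_opt, t_opt = res.x
print(x_opt, t_opt)  # x* = 1, optimal value t* = -1
```

At the optimum both pieces are active ($-x = x - 2 = -1$), which is exactly the $f_0(x^*) = t^*$ behavior described above.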
Let $\varphi : \mathbb{R} \to \mathbb{R}$ be a continuous and strictly increasing function over a feasible set $\Omega$. Then

$$\arg\min_{x \in \Omega} f_0(x) = \arg\min_{x \in \Omega} \varphi(f_0(x)).$$
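A quick numerical check of this (a hypothetical example with $\varphi(s) = \sqrt{s}$, which is continuous and strictly increasing on $[0, \infty)$, the range of the chosen $f_0$):

```python
import numpy as np

# f0(x) = (x - 3)^2 on a grid; phi(s) = sqrt(s) is strictly increasing
# on the range of f0, so both objectives share the same argmin.
xs = np.linspace(-10, 10, 2001)
f0 = (xs - 3) ** 2

x_star_f0 = xs[np.argmin(f0)]
x_star_phi = xs[np.argmin(np.sqrt(f0))]
print(x_star_f0, x_star_phi)  # both 3.0
```

The optimal values differ ($0$ vs. $\sqrt{0}$ here, but generally $p^*$ vs. $\varphi(p^*)$); only the optimizer is preserved.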
Uncertainty can enter through the data used to create the objective $f_0$ and the constraints $f_i$. It can also enter at decision time, when the $x^*$ which solves the optimization cannot be implemented exactly. These uncertainties can create unstable solutions or degraded performance. To make our optimization more robust to uncertainty, we add a new variable $u \in \mathcal{U}$, where $\mathcal{U}$ is the set of possible uncertainties.
For a nominal optimization problem $\min_{x} f_0(x)$ subject to $f_i(x) \le 0$ for $i = 1, \ldots, m$, the robust counterpart is

$$\min_{x} \max_{u \in \mathcal{U}} f_0(x, u) \quad \text{subject to} \quad f_i(x, u) \le 0 \quad \forall u \in \mathcal{U},\ i = 1, \ldots, m.$$