Non-differentiable optimization is a category of optimization that deals with objectives that, for a variety of reasons, are not differentiable everywhere; such objectives may still be convex (the absolute value function is a standard example) or non-convex. The functions in this class of optimization are generally non-smooth: although continuous, they often contain sharp points or corners that do not allow for the solution...
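For instance, the absolute value function is continuous everywhere but has a corner at the origin; a small sketch (our own illustration, not from the text) shows that the one-sided difference quotients disagree there, so no single derivative exists:

```python
# Illustrative sketch: the kink in f(x) = |x| at x = 0.
# The one-sided difference quotients approach different limits,
# so no derivative (and hence no gradient) exists at the corner.
f = abs
h = 1e-8

left_slope = (f(0.0) - f(-h)) / h    # slope from the left:  -1
right_slope = (f(h) - f(0.0)) / h    # slope from the right: +1

print(left_slope, right_slope)       # the two slopes disagree at the corner
```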
Geometric programming was introduced in 1967 by Duffin, Peterson, and Zener. It is useful in a wide variety of optimization applications and falls under the general class of signomial problems [1]. It can be used to solve large-scale practical problems by formulating them as a mathematical optimization...
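As a toy illustration (our own example data, not from the text), consider minimizing x + y subject to xy ≥ 1 with x, y > 0. The standard GP change of variables x = e^u, y = e^v turns the monomial constraint into the linear condition u + v ≥ 0, and since the constraint is active at the optimum, the problem reduces to a convex one-dimensional minimization:

```python
import math

# Toy geometric program: minimize x + y  s.t.  x*y >= 1,  x, y > 0.
# Log-transform x = e^u, y = e^v: the constraint becomes u + v >= 0.
# At the optimum the constraint is active (otherwise y could shrink),
# so v = -u and the objective reduces to phi(u) = e^u + e^(-u).
def phi(u):
    return math.exp(u) + math.exp(-u)

# Minimize the convex function phi by ternary search on [-5, 5].
lo, hi = -5.0, 5.0
for _ in range(200):
    m1 = lo + (hi - lo) / 3.0
    m2 = hi - (hi - lo) / 3.0
    if phi(m1) < phi(m2):
        hi = m2
    else:
        lo = m1

u = (lo + hi) / 2.0
x, y = math.exp(u), math.exp(-u)
print(x, y)  # both approach 1.0, the optimum of the toy GP
```

The log-transform is the essential GP trick: it converts posynomial objectives and constraints into convex (log-sum-exp) form.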
In this work, we will focus on the “at the same time,” or direct transcription, approach, which yields a simultaneous method for the dynamic optimization problem. In particular, we formulate the dynamic optimization model with orthogonal collocation methods. These methods can also be regarded as a special class of implicit...
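As a minimal sketch of the idea (our own toy example, not the formulation developed later), two-point Radau collocation on a single element can be applied to the test equation x′ = −x, x(0) = 1: the state is represented by an interpolating polynomial whose derivative is forced to match the ODE right-hand side at the collocation points.

```python
import numpy as np

# Toy orthogonal collocation: solve x'(t) = -x(t), x(0) = 1 on [0, 1]
# with a single element and two Radau collocation points c = 1/3, 1.
nodes = np.array([0.0, 1.0 / 3.0, 1.0])
n = len(nodes)

# Differentiation matrix D[i, j] = l_j'(nodes[i]) for the Lagrange basis.
D = np.zeros((n, n))
for j in range(n):
    p = np.poly1d([1.0])
    denom = 1.0
    for k in range(n):
        if k != j:
            p = p * np.poly1d([1.0, -nodes[k]])
            denom *= nodes[j] - nodes[k]
    dp = p.deriv()
    for i in range(n):
        D[i, j] = dp(nodes[i]) / denom

# Collocation equations at the two collocation nodes:
#   sum_j D[i, j] * x_j = -x_i   for i = 1, 2,  with x_0 = 1 fixed.
x0 = 1.0
A = D[1:, 1:] + np.eye(n - 1)
b = -D[1:, 0] * x0
x = np.linalg.solve(A, b)

print(x[-1])  # close to exp(-1), the exact solution at t = 1
```

This is exactly the sense in which collocation is a special class of implicit Runge-Kutta method: the two equations above are the stage equations of the 2-stage Radau IIA scheme.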
Subgradient optimization (or the subgradient method) is an iterative algorithm for minimizing convex functions, used predominantly in non-differentiable optimization for functions that are convex but non-differentiable. It is often slower than Newton's method when applied to convex differentiable functions, but it can be used on convex non-differentiable functions where Newton's method will...
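A minimal sketch (our own toy problem): minimizing f(x) = |x − 2| with the classic diminishing step size α_k = 1/k, keeping the best iterate seen, since a subgradient step is not guaranteed to decrease the objective:

```python
import numpy as np

# Subgradient method on the convex, non-differentiable f(x) = |x - 2|.
def f(x):
    return abs(x - 2.0)

def subgradient(x):
    # sign(x - 2) is a valid subgradient; at the kink x = 2,
    # any value in [-1, 1] would do (np.sign returns 0 there).
    return np.sign(x - 2.0)

x = 0.0
best_x, best_f = x, f(x)
for k in range(1, 20001):
    x = x - (1.0 / k) * subgradient(x)   # diminishing step size
    if f(x) < best_f:                    # track the best point seen,
        best_x, best_f = x, f(x)         # since iterates may oscillate

print(best_x)  # near the minimizer x* = 2
```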
Sequential quadratic programming (SQP) is a class of algorithms for solving non-linear optimization problems (NLP) arising in real-world applications. It is powerful enough for such problems because it can handle any degree of non-linearity, including non-linearity in the constraints. Its main disadvantage is that the method requires several derivatives, which...
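A bare-bones sketch of the SQP iteration (our own toy problem, equality constraints only, using the exact Hessian of the Lagrangian): each iteration solves a QP subproblem, which here is equivalent to a Newton step on the KKT conditions. This also shows the derivative burden mentioned above: gradients, a constraint Jacobian, and second derivatives are all needed.

```python
import numpy as np

# Toy equality-constrained NLP:
#   minimize   f(x) = x1^2 + x2^2
#   subject to c(x) = x1^2 + x2 - 1 = 0
def grad_f(x):
    return np.array([2.0 * x[0], 2.0 * x[1]])

def c(x):
    return x[0] ** 2 + x[1] - 1.0

def jac_c(x):
    return np.array([2.0 * x[0], 1.0])

def hess_lagrangian(x, lam):
    # Hessian of L(x, lam) = f(x) + lam * c(x)
    return np.diag([2.0 + 2.0 * lam, 2.0])

x, lam = np.array([1.0, 1.0]), 0.0
for _ in range(10):
    H, A = hess_lagrangian(x, lam), jac_c(x)
    # QP subproblem: min 0.5 d'Hd + grad_f'd  s.t.  A d + c = 0,
    # solved via its KKT linear system.
    K = np.block([[H, A.reshape(2, 1)],
                  [A.reshape(1, 2), np.zeros((1, 1))]])
    rhs = np.concatenate([-grad_f(x), [-c(x)]])
    step = np.linalg.solve(K, rhs)
    x, lam = x + step[:2], step[2]

print(x)  # converges to (1/sqrt(2), 1/2)
```

Production SQP codes add inequality handling, quasi-Newton Hessian approximations, and globalization (line search or trust region); none of that is sketched here.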
Quadratic programming (QP) is the problem of optimizing a quadratic objective function and is one of the simplest forms of non-linear programming. The objective function can contain bilinear or up to second-order polynomial terms, and the constraints are linear and can be both equalities and inequalities. QP is widely...
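A tiny equality-constrained example (our own): minimize ½(x₁² + x₂²) subject to x₁ + x₂ = 1. Because the objective is quadratic and the constraint linear, the KKT optimality conditions form a single linear system:

```python
import numpy as np

# QP: minimize 0.5 * x'Qx + c'x  subject to  A x = b
Q = np.eye(2)                 # quadratic term (positive definite)
cvec = np.zeros(2)            # linear term
A = np.array([[1.0, 1.0]])    # equality constraint x1 + x2 = 1
b = np.array([1.0])

# KKT system:  [Q  A'] [ x ]   [-c]
#              [A  0 ] [lam] = [ b]
K = np.block([[Q, A.T], [A, np.zeros((1, 1))]])
rhs = np.concatenate([-cvec, b])
sol = np.linalg.solve(K, rhs)
x, lam = sol[:2], sol[2]

print(x)  # [0.5, 0.5]
```

Inequality constraints make the problem harder (active-set or interior-point machinery is needed), but the equality-constrained case reduces entirely to linear algebra, which is why QP subproblems are the workhorse inside methods such as SQP.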
Quasi-Newton methods (QNMs) are a class of optimization methods used in non-linear programming when a full Newton's method is either too time-consuming or too difficult to use. More specifically, these methods are used to find a local minimum of a function f(x) that is twice-differentiable. There are distinct...
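A compact sketch of one such method, BFGS (our own toy objective; real implementations add stronger line searches and safeguards): it maintains an approximation H to the inverse Hessian and updates it from gradient differences, so second derivatives are never computed.

```python
import numpy as np

# BFGS on a smooth convex test function f(x) = (x1 - 1)^2 + 10*(x2 - 2)^2.
def f(x):
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] - 2.0) ** 2

def grad(x):
    return np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] - 2.0)])

x = np.zeros(2)
H = np.eye(2)                         # inverse-Hessian approximation
I = np.eye(2)
for _ in range(100):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:     # converged
        break
    p = -H @ g                        # quasi-Newton search direction
    t = 1.0                           # backtracking (Armijo) line search
    while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
        t *= 0.5
    s = t * p                         # step taken
    x_new = x + s
    y = grad(x_new) - g               # change in gradient
    rho = 1.0 / (y @ s)               # y's > 0 here since f is convex
    # BFGS update of the inverse-Hessian approximation
    H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
        + rho * np.outer(s, s)
    x = x_new

print(x)  # approaches the minimizer (1, 2)
```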
The conjugate gradient method is a mathematical technique useful for the optimization of both linear and non-linear systems. It is generally used as an iterative algorithm; however, for linear systems it can also be regarded as a direct method, since in exact arithmetic it produces the solution in a finite number of steps. Generally this method is...
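A minimal sketch of CG as a solver for a symmetric positive-definite linear system Ax = b (our own example data); in exact arithmetic it terminates in at most n iterations, which is the sense in which it doubles as a direct method:

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=50):
    """Solve A x = b for symmetric positive-definite A."""
    x = x0.astype(float)
    r = b - A @ x          # residual
    p = r.copy()           # initial search direction
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)   # exact line minimization
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p         # next A-conjugate direction
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b, np.zeros(2))
print(x)  # solves A x = b (here x = [1/11, 7/11])
```

The same recurrences, with the residual replaced by the objective gradient and a line search, give the non-linear conjugate gradient variants.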