As the global population grows, consumption of water, energy, and food will also increase, placing stress on these sectors and raising the importance of the Water-Energy-Food Nexus (WEFN). However, the operation of WEFN systems is currently not sustainable. It is thus crucial to design WEFN systems to be sustainable from local to...
Recovering three-dimensional (3D) structural information of a specimen from a single two-dimensional (2D) measurement remains an important but challenging task in microscopic imaging. A conventional 2D microscopic image has a shallow depth of focus (DoF). Thus, recovering 3D information usually requires sequentially z-scanning the focal planes. This process is time-consuming and...
Traditionally, robust optimization has solved problems based on static decisions predetermined by the
decision makers. Once the decisions were made, the problem was solved; whenever a new uncertainty was
realized, it was incorporated into the original problem and the entire problem was solved again to...
The chance-constrained method is one of the major approaches to solving optimization problems under various
uncertainties. It formulates the optimization problem so that each uncertain constraint is satisfied with at
least a specified probability. In other words, it restricts the feasible region so that the...
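For a single linear constraint with Gaussian uncertainty, the chance constraint admits a closed-form deterministic equivalent via the normal quantile function. A minimal sketch, assuming a scalar decision variable x ≥ 0 and an uncertain coefficient a ~ N(mu, sigma²); the numbers are illustrative, not from the article:

```python
import numpy as np
from scipy.stats import norm

# Chance constraint: P(a * x <= b) >= 1 - eps, with uncertain coefficient
# a ~ N(mu, sigma^2) and decision variable x >= 0 (illustrative values).
mu, sigma, b, eps = 1.0, 0.1, 2.0, 0.05

# Deterministic equivalent in the Gaussian case: (mu + z * sigma) * x <= b,
# where z = Phi^{-1}(1 - eps). The largest feasible x is therefore:
z = norm.ppf(1 - eps)
x_max = b / (mu + z * sigma)

# Monte Carlo check that the constraint holds with the required probability.
rng = np.random.default_rng(0)
a = rng.normal(mu, sigma, size=100_000)
prob = np.mean(a * x_max <= b)
print(round(x_max, 4), round(prob, 3))  # prob should be close to 0.95
```

The same quantile trick extends to vector constraints a^T x ≤ b with jointly Gaussian a, where the deterministic equivalent becomes a second-order cone constraint.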
In this work, we will focus on the "at the same time," or direct transcription, approach, which allows a simultaneous
solution of the dynamic optimization problem. In particular, we formulate the dynamic optimization model with
orthogonal collocation methods. These methods can also be regarded as a special class of implicit...
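As a concrete illustration of what collocation does, the sketch below discretizes the test problem dx/dt = -x, x(0) = 1 on a single finite element with three Radau points; the test problem and the code are illustrative, not taken from this work:

```python
import numpy as np

# One finite element on [0, 1] with three Radau collocation points
# (a common choice in simultaneous dynamic optimization).
t = np.array([0.0, 0.155051, 0.644949, 1.0])  # t[0] anchors the initial condition

def lagrange_deriv_matrix(t):
    """D[i, j] = dL_j/dtau evaluated at t[i], for Lagrange basis L_j over points t."""
    n = len(t)
    D = np.zeros((n, n))
    for j in range(n):
        # coefficients of L_j: the degree-(n-1) polynomial through the unit vector e_j
        coeffs = np.polyfit(t, np.eye(n)[j], n - 1)
        D[:, j] = np.polyval(np.polyder(coeffs), t)
    return D

D = lagrange_deriv_matrix(t)

# Collocate dx/dt = -x at the three collocation points:
#   sum_j D[i, j] * x_j = -x_i   for i = 1..3,   with x_0 = x(0) = 1.
A = D[1:, 1:] + np.eye(3)   # move -x_i to the left-hand side
b = -D[1:, 0] * 1.0         # known initial value x_0 = 1
x = np.linalg.solve(A, b)

print(x[-1])  # approximates the exact solution exp(-1) ≈ 0.36788
```

In a full dynamic optimization model these collocation equations become equality constraints of a nonlinear program, solved simultaneously with the objective.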
Subgradient Optimization (or the Subgradient Method) is an iterative algorithm
for minimizing convex functions, used predominantly in nondifferentiable
optimization. It is often slower
than Newton's Method when applied to convex differentiable functions, but it can
be used on convex nondifferentiable functions where Newton's Method will...
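A minimal sketch of the method with the classic diminishing step size 1/k (the example objective is mine, not from the article); note that a subgradient step need not decrease the objective, so the best iterate found so far is tracked:

```python
import numpy as np

def subgradient_descent(f, subgrad, x0, iters=2000):
    """Subgradient method with diminishing step size 1/k.

    Tracks the best iterate seen, since f need not decrease monotonically."""
    x, best_x, best_f = x0, x0, f(x0)
    for k in range(1, iters + 1):
        g = subgrad(x)
        x = x - (1.0 / k) * g
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f

# f(x) = |x - 3| + |x| is convex but nondifferentiable at x = 0 and x = 3;
# any x in [0, 3] is optimal with f* = 3, so Newton's Method does not apply.
f = lambda x: abs(x - 3) + abs(x)
g = lambda x: np.sign(x - 3) + np.sign(x)  # a valid subgradient everywhere

x_star, f_star = subgradient_descent(f, g, x0=10.0)
print(x_star, f_star)  # f_star approaches the optimum 3
```

The 1/k step satisfies the standard conditions (steps vanish but their sum diverges) under which the best objective value converges to the optimum.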
Optimization with absolute values is a special case of linear programming in which a problem made nonlinear
by the presence of absolute values is solved using linear programming methods.
Absolute value functions are difficult to handle with standard optimization procedures: they are
not continuously differentiable, are...
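The standard reformulation splits each variable into nonnegative parts, x_i = u_i - v_i with u_i, v_i ≥ 0, so that |x_i| = u_i + v_i at the optimum and the problem becomes a plain LP. A sketch with SciPy on an illustrative example (not from the article): minimize |x1| + |x2| subject to x1 + 2·x2 = 3.

```python
import numpy as np
from scipy.optimize import linprog

# Minimize |x1| + |x2|  subject to  x1 + 2*x2 = 3.
# Substitute x_i = u_i - v_i with u_i, v_i >= 0; minimizing sum(u_i + v_i)
# makes u_i + v_i = |x_i| tight at the optimum.
# Decision vector: [u1, u2, v1, v2].
c = np.ones(4)
A_eq = np.array([[1.0, 2.0, -1.0, -2.0]])  # (u1 - v1) + 2*(u2 - v2) = 3
b_eq = np.array([3.0])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 4)
x = res.x[:2] - res.x[2:]
print(res.fun, x)  # optimum 1.5 attained at x = (0, 1.5)
```

The same split also linearizes absolute values appearing in constraints, as long as they occur with the right sign for the relaxation to be tight.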
This article concerns the exponential transformation method for globally solving posynomial (or general
geometric/signomial) optimization problems with nonconvex objective functions or constraints. A discussion of the
method's development and use will be presented.
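The core idea is the substitution x_i = exp(y_i), under which every posynomial becomes a sum of exponentials of affine functions of y, hence convex, so a local solver finds the global minimum. A small sketch on an illustrative one-variable example (not from the article):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# The posynomial f(x) = x^{-1} + x^{1/2} is not convex in x (f'' < 0 for x > 4),
# but substituting x = exp(y) gives
#   g(y) = exp(-y) + exp(0.5 * y),
# a sum of exponentials of affine functions of y, which is convex. A local
# minimizer of g is therefore the global minimizer of f.
g = lambda y: np.exp(-y) + np.exp(0.5 * y)

res = minimize_scalar(g)
x_opt = np.exp(res.x)
print(x_opt, res.fun)  # x* = 2^(2/3) ≈ 1.5874, f* ≈ 1.8899
```

With multiple variables and posynomial constraints, the same substitution (plus a log of the objective and constraints) yields the standard convex form of geometric programming.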
Optimization and Game Theory have certain conceptual overlaps. It is even said that John von Neumann
conjectured the Duality Theorem based on insights from his work in game theory. This article discusses two
applications of optimization to game theory: a methodology for solving for the Nash Equilibrium and a decentralized model in...