The behavior of type-II superconductors is modeled using the time-dependent Ginzburg-Landau equations (TDGLE). Pinning centers (inclusions) and geometries that maximize the critical current that can be passed through a superconductor are obtained numerically. Previous analytical results are summarized and new results are obtained for the critical current in one...
With the aggressive scaling down of feature sizes in VLSI fabrication, process variations, crosstalk, and buffering have become critical issues in achieving timing closure in VLSI designs. Timing analysis and optimization techniques need to consider each of them and also their interactions. There is a large body of research on statistical timing analysis to handle...
Aerospace collectively represents one of the most sophisticated technological endeavors and largest markets in the
world. Because the industry carries substantial costs, nearly every aspect of it, from aircraft design to material
selection to operation, has been optimized in at least one way. A critical design consideration in any aircraft is...
Optimization and Game Theory have certain conceptual overlaps. It is even said that John von Neumann
conjectured the Duality Theorem of linear programming based on insights from his work in game theory. This
article discusses two applications of optimization to game theory: a methodology for computing the Nash
Equilibrium and a decentralized model in supply chain...
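The link between the two fields is concrete in zero-sum games, where von Neumann's minimax theorem mirrors LP duality. A minimal sketch (the function name and the closed-form 2x2 solution are illustrative, not taken from the article) that computes the mixed-strategy Nash equilibrium of a 2x2 zero-sum game with no pure-strategy saddle point:

```python
def solve_2x2_zero_sum(A):
    """Mixed-strategy Nash equilibrium of a 2x2 zero-sum game with payoff
    matrix A (row player's payoffs), assuming no pure-strategy saddle point
    so the denominator below is nonzero."""
    (a, b), (c, d) = A
    denom = a - b - c + d
    p = (d - c) / denom            # row player's probability of playing row 1
    q = (d - b) / denom            # column player's probability of column 1
    v = (a * d - b * c) / denom    # value of the game
    return (p, 1 - p), (q, 1 - q), v

# Matching pennies: the unique equilibrium mixes 50/50 and the value is 0.
row, col, value = solve_2x2_zero_sum([[1, -1], [-1, 1]])
```

For larger games the same equilibrium is the solution of a linear program, which is precisely the connection the minimax/duality correspondence formalizes.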
This article concerns the exponential transformation method for globally solving posynomial (or general
geometric/signomial) optimization problems with nonconvex objective functions or constraints. A discussion of the
method's development and use will be presented.
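As a brief illustration of the substitution the method is built on (a sketch, not the article's full development): for a posynomial with positive coefficients $c_k$, setting $x_i = e^{y_i}$ turns each monomial into an exponential of a linear function of $y$:

```latex
f(x) = \sum_{k=1}^{K} c_k \prod_{i=1}^{n} x_i^{a_{ik}}, \quad c_k > 0,
\;\xrightarrow{\;x_i = e^{y_i}\;}\;
f(y) = \sum_{k=1}^{K} c_k \exp\Big(\sum_{i=1}^{n} a_{ik}\, y_i\Big).
```

The transformed $f$ is convex in $y$ (a sum of exponentials of affine functions), so a posynomial objective with posynomial constraints becomes a convex program. Signomials, where some $c_k < 0$, are why the general problem remains nonconvex and calls for global techniques.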
Optimization with absolute values refers to problems that are nonlinear due to the presence of absolute-value
terms but can, under suitable conditions, be reformulated and solved using linear programming methods.
Absolute value functions themselves are difficult to handle with standard optimization procedures: they are
not continuously differentiable, are...
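A minimal sketch of the standard reformulation trick on a toy problem (the specific objective and constraint are illustrative assumptions): each term |x_i| is replaced by an auxiliary variable t_i with the pair of linear constraints t_i >= x_i and t_i >= -x_i, after which the problem is an ordinary LP.

```python
from scipy.optimize import linprog

# Toy problem: minimize |x1| + |x2| subject to x1 + 2*x2 >= 4.
# Epigraph trick: minimize t1 + t2 with t_i >= x_i and t_i >= -x_i.
# Variable order: [x1, x2, t1, t2]
c = [0, 0, 1, 1]                      # objective: t1 + t2
A_ub = [
    [ 1,  0, -1,  0],                 #  x1 - t1 <= 0   (t1 >=  x1)
    [-1,  0, -1,  0],                 # -x1 - t1 <= 0   (t1 >= -x1)
    [ 0,  1,  0, -1],                 #  x2 - t2 <= 0
    [ 0, -1,  0, -1],                 # -x2 - t2 <= 0
    [-1, -2,  0,  0],                 # x1 + 2*x2 >= 4, rewritten as <=
]
b_ub = [0, 0, 0, 0, -4]
bounds = [(None, None)] * 2 + [(0, None)] * 2   # x free, t nonnegative
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
# Optimum: x = (0, 2) with objective value 2
```

At the minimum each t_i is tight against |x_i|, so the LP optimum equals the original nonlinear optimum.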
Subgradient Optimization (or the Subgradient Method) is an iterative algorithm
for minimizing convex functions, used predominantly for functions that are
convex but nondifferentiable. It is often slower than Newton's Method when
applied to convex differentiable functions, but can be used on convex
nondifferentiable functions where Newton's Method will...
In this work, we will focus on the "at the same time" or direct transcription approach, which allows a
simultaneous solution of the dynamic optimization problem. In particular, we formulate the dynamic optimization
model with orthogonal collocation methods. These methods can also be regarded as a special class of implicit...
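As a sketch of the collocation idea (the notation is assumed, not taken from the article): on a single finite element of length $h$, the state is approximated by a Lagrange interpolating polynomial through points $\tau_0 = 0 < \tau_1 < \dots < \tau_K$, and the ODE $\dot{x} = f(x, t)$ is enforced only at the interior collocation points:

```latex
x(t) \approx \sum_{j=0}^{K} x_j \,\ell_j(\tau), \qquad
\ell_j(\tau) = \prod_{\substack{k=0 \\ k \neq j}}^{K}
  \frac{\tau - \tau_k}{\tau_j - \tau_k}, \qquad
\sum_{j=0}^{K} x_j \,\frac{d\ell_j}{d\tau}(\tau_i) = h\, f(x_i, t_i),
\quad i = 1, \dots, K.
```

The differential equation thus becomes a set of algebraic equations in the coefficients $x_j$, which are appended to the nonlinear program alongside the objective and path constraints; choosing the $\tau_i$ as roots of shifted Gauss or Radau polynomials gives the "orthogonal" collocation schemes.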
The chance-constrained method is one of the major approaches to solving optimization problems under various
uncertainties. It formulates the optimization problem so that the probability of satisfying a given constraint
is at least a specified level. In other words, it restricts the feasible region so that the...
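When the uncertainty is Gaussian, a scalar chance constraint has a well-known deterministic equivalent. A minimal sketch (the capacity/demand framing and function name are illustrative assumptions): requiring P(demand <= x) >= level with demand ~ N(mu, sigma^2) is the same as the single linear constraint x >= mu + sigma * Phi^{-1}(level).

```python
from statistics import NormalDist

def deterministic_equivalent(mu, sigma, level):
    """Smallest x satisfying the chance constraint P(demand <= x) >= level
    for Gaussian demand ~ N(mu, sigma^2): x = mu + sigma * Phi^{-1}(level)."""
    return mu + sigma * NormalDist().inv_cdf(level)

# P(demand <= x) >= 0.95 with demand ~ N(100, 15^2) reduces to the
# deterministic constraint x >= 100 + 15 * 1.6449 (about 124.7).
x_min = deterministic_equivalent(100.0, 15.0, 0.95)
```

This is the sense in which the chance constraint "restricts the feasible region": the probabilistic requirement is traded for an ordinary algebraic constraint that can be handed to a standard solver.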