Efficient Second-Order Methods for Second-Order Cone Programs and Continuous Nonlinear Two-Stage Optimization Problems

This dissertation presents novel advances in continuous nonlinear optimization, focusing on efficient second-order methods for second-order cone programs (SOCPs) and continuous nonlinear two-stage optimization problems. The emphasis is on the theory and computational aspects of Sequential Quadratic Programming (SQP) methods, which are widely used for solving nonlinear optimization problems. We introduce a new SQP method tailored to linear SOCPs. By exploiting the warm-start capabilities of active-set quadratic programming subproblem solvers and employing polyhedral outer approximations of the cones, the method achieves a local quadratic rate of convergence. It efficiently identifies the cones for which the optimal solution lies at an extreme point, enabling rapid local convergence. Numerical experiments confirm that the method can take advantage of good starting points and can achieve higher accuracy than a state-of-the-art interior-point solver.

The dissertation also includes two computational projects. First, we introduce RestartSQP, an open-source C++ software package that implements Fletcher's Sl1QP method and integrates a parametric active-set method for solving the SQP subproblems. RestartSQP offers unique warm-start capabilities, efficiently handling changes in constraint values and the dynamic addition of constraints. It also supports crossover from interior-point solvers such as Ipopt, so that subsequent NLPs can be solved with the SQP method. Numerical experiments on the CUTE benchmark NLP test set demonstrate the practical performance of RestartSQP.

Second, we develop a two-stage decomposition algorithm for continuous nonlinear two-stage optimization problems and discuss its C++ implementation in detail. The algorithm decomposes large-scale problems into a master problem and independent subproblems, enabling parallel processing and the resulting speedup. To address the non-differentiability of the subproblem objective functions, we incorporate a smoothing technique and propose a new approximation scheme for these objectives. An extrapolation strategy is also employed to enhance the computational efficiency of the algorithm. By combining RestartSQP and the interior-point solver Ipopt, the algorithm demonstrates parallel scalability and solves large-scale optimization problems efficiently.
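As a rough illustration of the polyhedral outer-approximation idea mentioned above, the minimal C++ sketch below generates a single linear cut for one second-order cone constraint ||u||_2 <= t at a trial point that violates it. This is a generic instance of the technique only; the function name, tolerance, and data are hypothetical and are not taken from the dissertation or from RestartSQP.

```cpp
// Illustrative sketch (not from the dissertation): one linear cut that
// tightens a polyhedral outer approximation of the second-order cone
//   K = { (u, t) in R^n x R : ||u||_2 <= t }.
// For any unit vector w, the inequality  w^T u <= t  is valid on all of K,
// and choosing w = u0 / ||u0|| cuts off a trial point (u0, t0) with ||u0|| > t0.
#include <cmath>
#include <cstdio>
#include <optional>
#include <vector>

struct Cut {
    std::vector<double> w;  // coefficients of u in the cut  w^T u <= t
};

// Returns a cut violated by (u0, t0), or std::nullopt if the point already
// satisfies the cone constraint (up to the tolerance) and no cut is needed.
std::optional<Cut> separate(const std::vector<double>& u0, double t0,
                            double tol = 1e-8) {
    double norm = 0.0;
    for (double ui : u0) norm += ui * ui;
    norm = std::sqrt(norm);
    if (norm <= t0 + tol) return std::nullopt;  // inside the cone
    Cut cut;
    cut.w.reserve(u0.size());
    for (double ui : u0) cut.w.push_back(ui / norm);  // w = u0 / ||u0||
    return cut;
}

int main() {
    // Trial point outside the cone: ||(3, 4)||_2 = 5 > t0 = 2.
    std::vector<double> u0 = {3.0, 4.0};
    double t0 = 2.0;
    if (auto cut = separate(u0, t0)) {
        std::printf("add cut: %.2f*u1 + %.2f*u2 <= t\n", cut->w[0], cut->w[1]);
    }
    return 0;
}
```

Cuts of this form are valid for the entire cone because w^T u <= ||u||_2 <= t whenever ||w||_2 = 1, so accumulating them yields linear constraints that outer-approximate the cone and can be handled by a warm-started active-set QP solver.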
