Abstract
In this work, we consider solving optimization problems with a stochastic objective and deterministic equality constraints. We propose a Trust-Region Sequential Quadratic Programming method to find both first- and second-order stationary points. Our method utilizes a random model to represent the objective function, which is constructed from stochastic observations of the objective and is designed to satisfy proper adaptive accuracy conditions with a high but fixed probability. To converge to first-order stationary points, our method computes a gradient step in each iteration, defined by minimizing a quadratic approximation of the objective subject to a (relaxed) linear approximation of the problem constraints and a trust-region constraint. To converge to second-order stationary points, our method additionally computes an eigen step to explore the negative curvature of the reduced Hessian matrix, as well as a second-order correction step to address the potential Maratos effect, which arises due to the nonlinearity of the problem constraints. Such an effect may prevent the method from moving away from saddle points. Both gradient and eigen step computations leverage a novel parameter-free decomposition of the step and the trust-region radius, accounting for the proportions among the feasibility residual, optimality residual, and negative curvature. We establish global almost sure first- and second-order convergence guarantees for our method, and present computational results on CUTEst problems, regression problems, and saddle-point problems to demonstrate its superiority over existing line-search-based stochastic methods.
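To make the gradient-step subproblem concrete, here is a minimal, illustrative sketch of one way to compute such a step: a normal step reduces the linearized feasibility residual, and a tangential step decreases a quadratic model of the objective in the null space of the constraint Jacobian, with both truncated to fit the trust-region radius. This is a simplified stand-in, not the paper's method; in particular, the `0.8` normal-step fraction and the reduced Newton solve are assumptions for illustration, and the parameter-free step/radius decomposition described in the abstract is not reproduced here.

```python
import numpy as np
from scipy.linalg import null_space

def gradient_step(g, H, c, J, delta):
    """Illustrative composite step for the subproblem
        min  g^T d + 0.5 d^T H d
        s.t. J d + c ~ 0 (relaxed),  ||d|| <= delta.
    Simplified sketch only; not the paper's parameter-free decomposition."""
    # Normal step: least-squares reduction of the linearized feasibility
    # residual, truncated to a fraction of the trust-region radius
    # (the 0.8 fraction is an arbitrary illustrative choice).
    v = -np.linalg.pinv(J) @ c
    nv = np.linalg.norm(v)
    if nv > 0.8 * delta:
        v *= 0.8 * delta / nv
    # Tangential step: reduced Newton step on the quadratic model in the
    # null space of J (a crude stand-in for the actual subproblem solver),
    # truncated to the radius remaining after the normal step.
    Z = null_space(J)                       # orthonormal basis of ker(J)
    gr = Z.T @ (g + H @ v)                  # reduced gradient
    Hr = Z.T @ H @ Z                        # reduced Hessian
    u = -np.linalg.solve(Hr + 1e-8 * np.eye(Hr.shape[0]), gr)
    w = Z @ u
    rem = np.sqrt(max(delta**2 - np.linalg.norm(v)**2, 0.0))
    nw = np.linalg.norm(w)
    if nw > rem and nw > 0:
        w *= rem / nw
    # v lies in the row space of J and w in its null space, so they are
    # orthogonal and ||v + w|| <= delta holds by construction.
    return v + w
```

Because the normal and tangential components are orthogonal, the combined step stays inside the trust region while never increasing the linearized feasibility residual.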
Publication
arXiv preprint arXiv:2409.15734

MS in Computational and Applied Math (2021-2023)
Yuchen Fang is a PhD student in the mathematics department at UC Berkeley. He was a master's student in the Computational and Applied Math program at UChicago, where he worked with Mladen Kolar and Sen Na on stochastic nonlinear optimization. His research interests include numerical linear algebra, high-dimensional statistics, statistical learning, and mathematical finance.

Assistant Professor in ISyE
Sen Na is an Assistant Professor in the School of Industrial and Systems Engineering at Georgia Tech. Prior to joining ISyE, he was a postdoctoral researcher in the statistics department and ICSI at UC Berkeley. His research interests broadly lie in the mathematical foundations of data science, with topics including high-dimensional statistics, graphical models, semiparametric models, optimal control, and large-scale and stochastic nonlinear optimization. He is also interested in applying machine learning methods to problems in biology, neuroscience, and engineering.

Professor in Statistics and ICSI, Amazon Scholar
Michael Mahoney is a Professor in the Statistics department and ICSI at UC Berkeley. He is also the director of the NSF/TRIPODS-funded Foundations of Data Analysis (FODA) Institute at UC Berkeley. He works on the algorithmic and statistical aspects of modern large-scale data analysis. Much of his recent research has focused on large-scale machine learning, including randomized matrix algorithms and randomized numerical linear algebra.

Professor in Data Sciences and Operations
Mladen Kolar is a professor in the Department of Data Sciences and Operations at the USC Marshall School of Business. His research focuses on high-dimensional statistical methods, probabilistic graphical models, and scalable optimization methods, driven by the need to uncover interesting and scientifically meaningful structures from observational data. Mladen was selected as a recipient of the 2024 Junior Leo Breiman Award for his outstanding contributions to these areas.