Statistical Inference of Constrained Model Estimation via Derivative-Free Stochastic Sequential Quadratic Programming

Abstract

We propose a derivative-free stochastic sequential quadratic programming (DF-SSQP) method for solving nonlinear equality-constrained stochastic optimization problems using only zeroth-order (function-value) information. Our algorithm estimates gradients and Hessians via randomized finite differences and introduces an online debiasing technique that aggregates past iterates to reduce bias without excessive memory cost. We establish global and local convergence, asymptotic normality of the averaged iterates, and a functional central limit theorem (FCLT) for the optimization path. Notably, we extend existing FCLT results to a double-averaging setting arising from the debiasing process, addressing technical challenges that do not appear in standard single-averaging schemes. Our method also relaxes restrictive stepsize conditions by allowing random stabilization, improving practical applicability. Finally, we discuss a sketching-based extension that reduces computational complexity, making DF-SSQP suitable for large-scale derivative-free inference and optimization tasks.

Publication
A short version has been accepted at the NeurIPS 2025 COML Workshop.
Sen Na
Assistant Professor in ISyE

Sen Na is an Assistant Professor in the School of Industrial and Systems Engineering at Georgia Tech. Prior to joining ISyE, he was a postdoctoral researcher in the Department of Statistics and the International Computer Science Institute (ICSI) at UC Berkeley. His research interests broadly lie in the mathematical foundations of data science, with topics including high-dimensional statistics, graphical models, semiparametric models, optimal control, and large-scale and stochastic nonlinear optimization. He is also interested in applying machine learning methods to problems in biology, neuroscience, and engineering.