Statistics and Its Interface, Volume 17 (2024), Number 3
Scalable and globally convergent algorithm for sufficient dimension reduction
Pages: 479 – 491
DOI: https://dx.doi.org/10.4310/23-SII798
Abstract
Sufficient dimension reduction (SDR) is a powerful approach for analyzing high-dimensional data: the goal is to represent the covariates by a minimal set of linear projections that retain all the information needed for regression analysis of the response. However, many existing SDR methods rely on a generalized eigendecomposition of a method-specific kernel matrix, a non-convex optimization problem that requires substantial computation in the form of large matrix products and decompositions. In this paper, we propose an iterative least squares formulation for SDR in which each least squares subproblem is solved only approximately. Combining this formulation with state-of-the-art stochastic gradient descent methods yields a scalable and globally convergent algorithm for SDR. To the best of our knowledge, this is the first stochastic algorithm proposed for SDR. Through extensive simulations, we demonstrate that our method achieves competitive empirical performance.
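To illustrate the flavor of the approach, the following is a minimal sketch, not the paper's algorithm: a single least squares subproblem for SDR solved by stochastic gradient descent. Under the linearity condition on the covariates, the population least squares direction Sigma^{-1} Cov(X, y) lies in the central subspace, so even this one approximate subproblem recovers an SDR direction on a toy single-index model. All names, the toy model, and the step-size choices are illustrative assumptions.

```python
import numpy as np

# Toy single-index model: y depends on X only through one direction beta.
rng = np.random.default_rng(0)
n, p = 5000, 8
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[0] = 1.0                              # true SDR direction (assumed for the demo)
y = np.exp(X @ beta) + 0.1 * rng.standard_normal(n)

# SGD on the least squares loss 0.5 * ||y - X w||^2 / n, solved only
# approximately with mini-batches, in the spirit of an iterative
# least-squares-plus-SGD scheme.
w = np.zeros(p)
lr, batch = 0.05, 64
for step in range(2000):
    idx = rng.integers(0, n, size=batch)
    grad = X[idx].T @ (X[idx] @ w - y[idx]) / batch
    w -= lr * grad

w_hat = w / np.linalg.norm(w)
print(abs(w_hat @ beta))                   # should be near 1 if the direction is recovered
```

Each pass touches only a mini-batch of rows, so the cost per step is O(batch * p) rather than the O(n p^2 + p^3) of forming and eigendecomposing a kernel matrix, which is the scalability argument the abstract makes.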
Keywords
convex optimization, dimension reduction, stochastic optimization
2010 Mathematics Subject Classification
Primary 62H86. Secondary 90C25.
Received 28 November 2022
Accepted 3 May 2023
Published 19 July 2024