Title: Learning From Gaussian Data: Single and Multi-Index Models
Abstract: In this work we consider generic Gaussian multi-index models, in which the labels depend on the (Gaussian) d-dimensional inputs only through their projection onto a low-dimensional subspace, and we study efficient agnostic estimation procedures for this hidden subspace. We introduce the generative leap exponent k*, a natural extension of the generative exponent from [DPVLB24] to the multi-index setting. We first show that a sample complexity of n = Θ(d^{1∨k*/2}) is necessary within the class of algorithms captured by the Low-Degree Polynomial framework. We then establish that this sample complexity is also sufficient, by giving an agnostic sequential estimation procedure (that is, one requiring no prior knowledge of the multi-index model) based on a spectral U-statistic over appropriate Hermite tensors. We further compute the generative leap exponent for several examples, including piecewise linear functions (deep ReLU networks with bias).
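To give a concrete feel for the spectral approach the abstract alludes to, here is a minimal illustrative sketch, not the paper's actual procedure: for a single-index model y = f(⟨w, x⟩) with Gaussian inputs and a nonzero second Hermite coefficient (e.g. f(t) = t²), the hidden direction can be recovered as the top eigenvector of the empirical second Hermite moment matrix (1/n) Σ yᵢ(xᵢxᵢᵀ − I). The paper's method generalizes this idea via a spectral U-statistic over higher-order Hermite tensors; the dimensions, sample size, and link function below are arbitrary choices for illustration, and a plain empirical average stands in for the U-statistic.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 30, 20000

# Hidden unit direction w and Gaussian inputs x_i ~ N(0, I_d)
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
X = rng.standard_normal((n, d))

# Single-index labels with a nonzero second Hermite coefficient
y = (X @ w) ** 2

# Empirical second Hermite moment matrix: (1/n) sum_i y_i (x_i x_i^T - I)
M = (X * y[:, None]).T @ X / n - y.mean() * np.eye(d)

# The top eigenvector of M aligns with the hidden direction w
eigvals, eigvecs = np.linalg.eigh(M)
v = eigvecs[:, -1]
alignment = abs(v @ w)
print(f"alignment |<v, w>| = {alignment:.3f}")
```

With these settings the alignment is close to 1; functions whose low-order Hermite coefficients vanish (higher generative leap exponent) require higher-order tensors and correspondingly more samples.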
3:30pm - Pre-talk meet-and-greet teatime - 219 Prospect Street, 13th floor; there will be light snacks and beverages in the kitchen area. For more details and upcoming events, visit our website at https://statistics.yale.edu/calendar.
