Webcast option: https://yale.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=a44b304c-d962-41d1-b25a-b34d00f978a8
TITLE: Wild refitting for black box prediction
Obtaining inferential guarantees on the performance of a prediction method is essential in practice. Modern predictive methods present several barriers: (a) they are opaque, so the statistician is limited to querying predicted values, with no further insight into the method's properties; (b) refits are severely limited in number due to computational expense; and (c) the data can be heterogeneous. We describe a novel procedure for estimating the excess risk of any black-box regression method that overcomes these challenges and avoids any use of hold-out data. Inspired by the wild bootstrap, it uses Rademacher symmetrization of the residuals to construct a synthetic dataset for refitting. Unlike the bootstrap, it requires only a single refit, and we give non-asymptotic guarantees on the resulting risk estimate. We illustrate its behavior on non-rigid structure-from-motion and on plug-and-play image denoising with deep-net priors.
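The procedure sketched in the abstract can be illustrated in code. The following is a minimal sketch only, not the paper's exact estimator: the black-box regressor (here a stand-in ridge fit), the noise scale `rho`, and the final covariance-style optimism quantity are all illustrative assumptions; consult the pre-print for the precise construction and guarantees.

```python
import numpy as np

def fit_predict(X_train, y_train, X_eval):
    # Stand-in "black box": ridge regression via the normal equations.
    # In the intended setting we may only query this fit/predict interface.
    lam = 1e-3
    d = X_train.shape[1]
    w = np.linalg.solve(X_train.T @ X_train + lam * np.eye(d),
                        X_train.T @ y_train)
    return X_eval @ w

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.5 * rng.normal(size=n)

# Step 1: a single fit on the observed data; compute residuals.
yhat = fit_predict(X, y, X)
resid = y - yhat

# Step 2: Rademacher symmetrization of the residuals at a noise
# scale rho (a tuning parameter; set to 1 here for illustration).
rho = 1.0
signs = rng.choice([-1.0, 1.0], size=n)
y_wild = yhat + rho * signs * resid

# Step 3: only one refit is needed, on the synthetic (wild) dataset.
yhat_wild = fit_predict(X, y_wild, X)

# Step 4 (illustrative): an optimism-style quantity coupling the wild
# refit to the symmetrized residuals, as a proxy for excess risk.
wild_optimism = float(np.mean(rho * signs * resid * (yhat_wild - yhat)))
print(f"wild optimism estimate: {wild_optimism:.4f}")
```

Note that, unlike the classical bootstrap, no resampling loop appears: steps 2 and 3 are performed once, which is what makes the method feasible when each refit is computationally expensive.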
Pre-print: https://arxiv.org/abs/2506.21460
3:30pm - Pre-talk meet-and-greet teatime - 219 Prospect Street, 13th floor; light snacks and beverages will be available in the kitchen area. For more details and upcoming events, visit our website at https://statistics.yale.edu/calendar.