Colloque des sciences mathématiques du Québec

November 13, 2020, 3:00 p.m. to 4:00 p.m. (Montreal/Miami time), Zoom meeting

Approximate Cross-Validation for Large Data and High Dimensions

Colloquium by Tamara Broderick (Massachusetts Institute of Technology, USA)

The error or variability of statistical and machine learning algorithms is often assessed by repeatedly re-fitting a model on different weighted versions of the observed data. The ubiquitous tools of cross-validation (CV) and the bootstrap are examples of this technique. These methods are powerful in large part due to their model agnosticism, but they can be slow to run on modern, large data sets because of the need to repeatedly re-fit the model. We use a linear approximation to the dependence of the fitting procedure on the weights, producing results that can be faster than repeated re-fitting by orders of magnitude. This linear approximation is sometimes known as the "infinitesimal jackknife" (IJ) in the statistics literature, where it has mostly been used as a theoretical tool to prove asymptotic results. We provide explicit finite-sample error bounds for the infinitesimal jackknife in terms of a small number of simple, verifiable assumptions. We note, though, that without further modification the IJ deteriorates in accuracy in high dimensions and incurs a running time roughly cubic in the dimension. We then show how dimensionality reduction can be used to run the IJ successfully in high dimensions when the data are sparse or low rank. Simulated and real-data experiments support our theory.
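To make the idea concrete, here is a minimal numerical sketch (not the speaker's code) of the IJ approximation to leave-one-out CV for ridge regression. Treating each data point's weight as a parameter of the fit, a first-order Taylor expansion in the weights gives the approximate leave-one-out solution theta_{-i} ≈ theta_hat + H^{-1} g_i, where H is the Hessian of the full objective and g_i is the gradient of point i's loss at the full-data fit. The problem sizes, regularization strength, and noise level below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 200, 5, 1.0  # illustrative sizes and ridge penalty
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

# Full-data ridge fit: minimize 0.5*sum_i (x_i.theta - y_i)^2 + 0.5*lam*|theta|^2
H = X.T @ X + lam * np.eye(d)            # Hessian of the objective
theta_hat = np.linalg.solve(H, X.T @ y)  # full-data estimate
H_inv = np.linalg.inv(H)
resid = X @ theta_hat - y                # per-point residuals

ij_loo_pred = np.empty(n)     # IJ approximation to leave-one-out predictions
exact_loo_pred = np.empty(n)  # exact re-fits, for comparison only
for i in range(n):
    # Gradient of point i's loss at theta_hat; dropping point i changes its
    # weight from 1 to 0, so theta_{-i} ≈ theta_hat + H^{-1} grad_i.
    grad_i = X[i] * resid[i]
    theta_ij = theta_hat + H_inv @ grad_i
    ij_loo_pred[i] = X[i] @ theta_ij

    # Exact leave-one-out re-fit (the slow baseline the IJ avoids)
    Xi, yi = np.delete(X, i, axis=0), np.delete(y, i)
    theta_exact = np.linalg.solve(Xi.T @ Xi + lam * np.eye(d), Xi.T @ yi)
    exact_loo_pred[i] = X[i] @ theta_exact

max_err = np.max(np.abs(ij_loo_pred - exact_loo_pred))
print(f"max |IJ - exact| over LOO predictions: {max_err:.6f}")
```

The exact baseline re-solves an n-by-d least-squares problem n times, while the IJ reuses one factorization of the d-by-d Hessian, which is the source of the speedup; the cubic-in-dimension cost mentioned in the abstract comes from forming and inverting that d-by-d matrix.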