Explore a 35-minute lecture on approximate cross-validation techniques for large datasets and high-dimensional problems. Delve into the challenges of assessing error and variability in statistical and machine learning algorithms on modern, large-scale data. Learn about the infinitesimal jackknife (IJ), a linear approximation that can dramatically speed up cross-validation and the bootstrap by avoiding repeated model refits. Examine finite-sample error bounds for the IJ and discover how dimensionality reduction can improve its accuracy in high-dimensional settings, particularly for leave-one-out cross-validation (LOOCV) with L1 regularization in generalized linear models. Gain insights into the theoretical foundations and practical performance of these techniques through simulated and real-data experiments, and see how they contribute to the growing intersection of statistics and computer science in machine learning.
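To make the core idea concrete, here is a minimal sketch of the IJ linear approximation to LOOCV for an L2-regularized logistic regression, a smooth stand-in for the L1-penalized GLMs the lecture focuses on. The data, penalty strength, and variable names are illustrative assumptions, not taken from the talk; only the first-order approximation beta_{-i} ≈ beta_hat + H^{-1} grad_i reflects the technique being described.

```python
import numpy as np
from scipy.optimize import minimize

# Toy data (illustrative assumption; not from the lecture's experiments).
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
beta_true = rng.normal(size=d)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

lam = 1.0  # L2 penalty strength (assumed; the lecture treats the L1 case)

def objective(beta):
    """Penalized negative log-likelihood of logistic regression."""
    z = X @ beta
    return np.sum(np.logaddexp(0.0, z) - y * z) + 0.5 * lam * beta @ beta

# One fit on the full data set; the IJ never refits the model.
beta_hat = minimize(objective, np.zeros(d), method="BFGS").x

# Per-observation gradients and the full-data Hessian at the optimum.
p = 1.0 / (1.0 + np.exp(-X @ beta_hat))      # fitted probabilities
grads = (p - y)[:, None] * X                 # gradient of each point's loss
H = (X * (p * (1.0 - p))[:, None]).T @ X + lam * np.eye(d)

# IJ linear approximation to leave-one-out:
#   beta_{-i} ≈ beta_hat + H^{-1} grad_i
# i.e. n linear solves against one Hessian instead of n full refits.
loo_betas = beta_hat + np.linalg.solve(H, grads.T).T

# Approximate LOOCV error: score each held-out point with its own beta_{-i}.
z_loo = np.einsum("ij,ij->i", X, loo_betas)
p_loo = 1.0 / (1.0 + np.exp(-z_loo))
print("IJ-approximate LOOCV error:", np.mean((p_loo > 0.5) != y))
```

In high dimensions the Hessian solve itself becomes the bottleneck, and an L1 penalty makes the objective non-smooth at zero, so this formula does not apply directly; the dimensionality-reduction ideas in the lecture address exactly this regime, broadly by working with a reduced set of coordinates rather than the full parameter vector.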