Approximate Matrix Eigenvalues, Subspace Iteration With Repeated Random Sparsification
Institute for Pure & Applied Mathematics (IPAM) via YouTube
Overview
Explore a conference talk on approximating matrix eigenvalues using subspace iteration with repeated random sparsification. Delve into this presentation by Robert Webber of the California Institute of Technology, delivered at IPAM's Monte Carlo and Machine Learning Approaches in Quantum Mechanics Workshop. Discover how iterative random sparsification can estimate multiple eigenvalues at reduced computational cost, a saving that is particularly valuable for high-dimensional problems in quantum chemistry. Follow the progression from traditional numerical methods to approaches that leverage random sampling and averaging. Gain insights into full configuration interaction, convergence, projective estimators, and the details of random sparsification techniques. Understand the impact of bias, population mixing, and random matrix multiplication on eigenvalue approximation. Examine the role of spectral gaps, orthogonalization, and the FRI algorithm in improving computational efficiency on quantum chemistry benchmark problems.
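For a concrete picture of the general idea, the sketch below shows subspace iteration in which each iterate is randomly sparsified with an unbiased compression and the resulting Rayleigh-Ritz eigenvalue estimates are averaged after a burn-in period. This is only an illustration of the approach described above, not the FRI algorithm from the talk; the function names (`sparsify`, `sparse_subspace_iteration`) and the parameters (block size k, sparsity budget m, iteration and burn-in counts) are placeholders chosen for this example.

```python
# Illustrative sketch only: subspace iteration with per-iteration random
# sparsification and averaged Rayleigh-Ritz eigenvalue estimates.
# All names and parameter values here are assumptions for demonstration.
import numpy as np

rng = np.random.default_rng(0)

def sparsify(x, m, rng):
    """Unbiased random sparsification of a vector x: keep entry x[i] with
    probability p[i] = min(1, m*|x[i]| / sum|x|) and rescale kept entries
    by 1/p[i], so the result equals x in expectation with ~m nonzeros."""
    absx = np.abs(x)
    total = absx.sum()
    if total == 0.0:
        return x
    p = np.minimum(1.0, m * absx / total)
    keep = rng.random(x.shape) < p
    out = np.zeros_like(x)
    out[keep] = x[keep] / p[keep]
    return out

def sparse_subspace_iteration(A, k=3, m=50, iters=500, burn_in=100, rng=rng):
    """Estimate the k dominant eigenvalues of a symmetric matrix A by
    subspace iteration with repeated random sparsification; Rayleigh-Ritz
    values are averaged after a burn-in period to damp the sampling noise."""
    n = A.shape[0]
    V, _ = np.linalg.qr(rng.standard_normal((n, k)))
    running = np.zeros(k)
    count = 0
    for t in range(iters):
        W = A @ V                                   # apply the matrix to the block
        W = np.column_stack([sparsify(w, m, rng) for w in W.T])
        V, _ = np.linalg.qr(W)                      # re-orthogonalize the block
        if t >= burn_in:
            ritz = np.linalg.eigvalsh(V.T @ A @ V)  # Rayleigh-Ritz estimates
            running += np.sort(ritz)[::-1]
            count += 1
    return running / count

# Small self-check on a random symmetric matrix.
n = 200
A = rng.standard_normal((n, n))
A = (A + A.T) / 2
print("sparsified estimate:", sparse_subspace_iteration(A, k=3, m=80))
print("exact top-3:        ", np.sort(np.linalg.eigvalsh(A))[::-1][:3])
```

In practice the iterates would be stored as sparse vectors so that the matrix is applied only to the surviving entries, and more careful sampling schemes and projective estimators are used to control bias and variance, as discussed in the talk; the sketch skips those refinements for brevity.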
Syllabus
Introduction
Background
Traditional methods
Full configuration interaction
Convergence
Projective estimator
Random sparsification
Bias
Sparsification
FRI algorithm
Population mixing
Random matrix multiplication
Spectral gap
Step 2: Random sparsification
Orthogonalization
Summary
Conclusion
Taught by
Institute for Pure & Applied Mathematics (IPAM)