Classroom Contents
Interplay of Linear Algebra, Machine Learning, and HPC - JuliaCon 2021 Keynote
- 1 Welcome!
- 2 Introduction by the speaker
- 3 Acknowledgments
- 4 Algebraic solvers are fundamental tools
- 5 Mathematical libraries we have been involved in developing
- 6 Two main themes of the talk
- 7 Kernel methods in ML
- 8 Kernel Ridge Regression (KRR)
- 9 Solving large dense linear systems
- 10 Low-rank compression
- 11 Classes of low-rank structured matrices
- 12 Cluster tree of a matrix
- 13 Fast algebraic algorithm: sketching
- 14 Problem: we don't know the target rank
- 15 Stochastic norm estimation
- 16 Example: compression of an HSS matrix
- 17 Fast geometric algorithm: approximate nearest neighbor
- 18 Approximate nearest neighbor with iterative merging
- 19 Comparison of algebraic and geometric algorithms
- 20 STRUMPACK (STRUctured Matrix PACKage)
- 21 Linear algebra and machine learning
- 22 Bayesian optimization
- 23 Modeling phase
- 24 Search phase
- 25 Parallelization of code execution
- 26 Examples of ML-improved linear algebra computations
- 27 Summary
- 28 Q&A: What do we need more: linear algebra code for new architectures or for new applications?
- 29 Q&A: How can we give users the ability to use ML to improve performance?
- 30 Q&A: What developments do you want to see in the Julia ecosystem?
- 31 Q&A: What high-performance algorithms can make use of specific code generation?
- 32 Q&A: Do you think that Julia can replace C++ as the language for linear algebra?
- 33 Q&A: Do you search for rank-revealing LU?
- 34 Announcements