Interplay of Linear Algebra, Machine Learning, and HPC - JuliaCon 2021 Keynote
The Julia Programming Language via YouTube
Overview
Syllabus
Welcome!
Introduction by the speaker.
Acknowledgments.
Algebraic solvers are fundamental tools.
Mathematical libraries whose development we were involved in.
Two main themes of the talk.
Kernel methods in ML.
Kernel Ridge Regression (KRR).
Solving large dense linear systems.
Low-rank compression.
Classes of low-rank structured matrices.
Cluster tree of a matrix.
Fast algebraic algorithm: sketching.
Problem: we don't know the target rank.
Stochastic norm estimation.
Example: compression of HSS matrix.
Fast geometric algorithm: approximate nearest neighbor.
Approximate nearest neighbor with iterative merging.
Comparison of algebraic and geometric algorithms.
STRUMPACK (STRUctured Matrix PACKage).
Linear algebra and machine learning.
Bayesian optimization.
Modeling phase.
Search phase.
Parallelization of code execution.
Examples of ML improved linear algebra computations.
Summary.
Q&A: What do we need more: linear algebra code for new architectures or for new applications?
Q&A: How can we give users the ability to use ML to get performance?
Q&A: What developments do you want to see in the Julia ecosystem?
Q&A: What high-performance algorithms can make use of specific code generation?
Q&A: Do you think that Julia can replace C++ as the language for linear algebra?
Q&A: Do you search for rank-revealing LU?
Announcements.
Taught by
The Julia Programming Language