
YouTube

Interplay of Linear Algebra, Machine Learning, and HPC - JuliaCon 2021 Keynote

The Julia Programming Language via YouTube

Overview

Explore the interplay between linear algebra, machine learning, and high-performance computing in this keynote address from JuliaCon 2021. Delve into the use of hierarchical matrix algebra for constructing low-complexity linear solvers and preconditioners, and learn how these fast solvers can accelerate large-scale PDE-based simulations and AI algorithms. Discover how statistical and machine learning methods can optimize solver selection and configuration. Examine recent developments in fast algebraic and geometric algorithms, including sketching and approximate nearest neighbor techniques. Gain insights into the STRUMPACK library and its applications. Investigate the use of Bayesian optimization in improving linear algebra computations. Engage with a Q&A session covering topics such as linear algebra code development, performance optimization using machine learning, and the potential of Julia in high-performance computing.
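To make the kernel-method connection concrete: Kernel Ridge Regression, one of the motivating applications mentioned in the overview, reduces to solving a large dense linear system with a kernel matrix. A minimal NumPy sketch (illustrative only; the talk's actual solvers use hierarchical low-rank structure to avoid the cubic cost of a dense solve):

```python
import numpy as np

# Kernel Ridge Regression: fit coefficients alpha by solving the dense
# linear system (K + lambda*I) alpha = y, where K is a kernel matrix.
# Gaussian kernel and all names here are illustrative assumptions.

def gaussian_kernel(X, Y, h=1.0):
    # Pairwise squared distances between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h * h))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))                   # training points
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

lam = 1e-3                                              # ridge regularization
K = gaussian_kernel(X, X)                               # dense 200 x 200 kernel matrix
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)    # O(n^3) dense solve

# Predict at new points by weighting kernel values with alpha.
X_new = rng.uniform(-1, 1, size=(5, 2))
y_pred = gaussian_kernel(X_new, X) @ alpha
```

For realistic data sizes the kernel matrix is too large to factor densely, which is exactly where the hierarchical (e.g. HSS) solvers discussed in the keynote come in.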

Syllabus

Welcome!
Introduction by the speaker.
Acknowledgments.
Algebraic solvers are fundamental tools.
Mathematical libraries in whose development we were involved.
Two main themes of the talk.
Kernel methods in ML.
Kernel Ridge Regression (KRR).
Solving large dense linear systems.
Low-rank compression.
Classes of low-rank structured matrices.
Cluster tree of matrix.
Fast algebraic algorithm: sketching.
Problem: we don't know the target rank.
Stochastic norm estimation.
Example: compression of HSS matrix.
Fast geometric algorithm: approximate nearest neighbor.
Approximate nearest neighbor with iterative merging.
Comparison of algebraic and geometric algorithms.
STRUMPACK (STRUctured Matrix PACKage).
Linear algebra and machine learning.
Bayesian optimization.
Modeling phase.
Search phase.
Parallelization of code execution.
Examples of ML-improved linear algebra computations.
Summary.
Q&A: What do we need more: linear algebra code for new architectures or for new applications?
Q&A: How can we give users the ability to use ML to get performance?
Q&A: What developments do you want to see in the Julia ecosystem?
Q&A: What high-performance algorithms can make use of specific code generation?
Q&A: Do you think that Julia can replace C++ as the language for linear algebra?
Q&A: Do you search for rank-revealing LU?
Announcements.
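The syllabus's "Fast algebraic algorithm: sketching" step can be illustrated with a randomized range finder: multiply the matrix by a small random test matrix to sample its range, then project onto that sampled range. A minimal NumPy sketch (an assumed toy setup, not STRUMPACK's actual API; the talk also covers adaptive variants for when the target rank is unknown):

```python
import numpy as np

# Randomized sketching for low-rank compression: approximate a numerically
# low-rank matrix A from a small number of random matrix-vector products.

rng = np.random.default_rng(1)
n, r = 500, 10
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # exactly rank-10

k = 20                                  # sketch size: slightly above the target rank
Omega = rng.standard_normal((n, k))     # random test matrix
Y = A @ Omega                           # sample the range of A (n x k)
Q, _ = np.linalg.qr(Y)                  # orthonormal basis for the sampled range

A_approx = Q @ (Q.T @ A)                # rank-k approximation: project A onto range(Q)
rel_err = np.linalg.norm(A - A_approx) / np.linalg.norm(A)
```

Because the sketch size exceeds the true rank, the relative error here is near machine precision; in practice the rank is not known in advance, motivating the stochastic norm estimation and adaptive compression steps listed above.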

Taught by

The Julia Programming Language

