
Designing Losses for Data-Free Training of Normalizing Flows on Boltzmann Distributions

Valence Labs via YouTube

Overview

Explore a comprehensive lecture on designing losses for data-free training of normalizing flows on Boltzmann distributions. Delve into the challenges of generating samples from Boltzmann distributions in high dimensions using normalizing flows, and discover strategies for training models with incomplete data or no data at all. Learn about the limitations of standard losses based on Kullback-Leibler divergences, including their tendency toward mode collapse on high-dimensional distributions. Examine a new loss function that is grounded in theory and designed for high-dimensional tasks. See these ideas applied to generating 3D molecular configurations, showing how imperfect pre-trained models can be improved without any training data. Gain insights from speakers Jérôme Hénin and Guillaume Charpiat as they cover Boltzmann generators, the collapse problem, experimental results on molecules, L2 losses, and L2+ on dialanine. Conclude with a summary of contributions and a Q&A session to deepen your understanding of this research in AI for drug discovery.
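As a rough sketch of the kind of objective the talk discusses (not the speakers' actual method), the reverse Kullback-Leibler divergence between a flow q and a Boltzmann density p ∝ exp(−U) can be estimated from the energy function alone, with no training data. The toy example below uses a hypothetical 1-D affine flow and a harmonic energy; all parameter values are illustrative:

```python
import numpy as np

# Data-free training sketch: a 1-D affine "flow" x = s*z + m with z ~ N(0,1),
# trained by minimizing the reverse (mode-seeking) KL divergence to a
# Boltzmann density p(x) ∝ exp(-U(x)) with harmonic energy U(x) = x^2 / 2.
# Up to an additive constant, KL(q || p) = E_z[U(s*z + m)] - log s.

rng = np.random.default_rng(0)
s, m = 3.0, 2.0   # initial flow parameters (hypothetical starting point)
lr = 0.05

for step in range(500):
    z = rng.standard_normal(4096)
    x = s * z + m
    # Monte Carlo gradients of E[U(x)] - log s w.r.t. s and m:
    # d/ds E[(s*z + m)^2 / 2] = E[x * z],  d/ds (-log s) = -1/s
    grad_s = np.mean(x * z) - 1.0 / s
    grad_m = np.mean(x)
    s -= lr * grad_s
    m -= lr * grad_m

print(s, m)  # should approach s ~ 1, m ~ 0 (the target Gaussian)
```

Because the gradient only requires evaluating the energy U at samples drawn from the flow itself, no reference configurations are needed; the mode-collapse issue the lecture addresses arises because this reverse-KL objective can score well while covering only some modes of a multimodal U.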

Syllabus

- Intro
- Boltzmann generators
- The collapse problem + experimental results on molecules
- L2 losses
- L2+ on dialanine
- Summary of contributions
- Q+A

Taught by

Valence Labs

