Overview
Explore an innovative approach to protein structure modeling in this 56-minute talk by Ben Murrell, hosted by Valence Labs. Delve into the idea of treating protein backbones as a continuous language, drawing parallels between linguistic structures and protein components. Learn about a generative autoregressive language model that operates in the continuous space of protein backbones, sampling the placement of each successive amino acid. Discover how this method can generate diverse and realistic protein chains, with the potential to advance in silico protein design. Gain insights into the main components of the model, including autoregressive backbone generation, distributions over angles, invariant point attention, and the Generative Invariant Angle Transformer. Examine practical examples, consider the implications of the approach, and follow the Q&A session to deepen your understanding of this line of research in AI-driven drug discovery.
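To make the autoregressive idea concrete, below is a minimal, illustrative Python sketch of sampling backbone dihedral angles one residue at a time, factorizing the joint distribution as a product of per-residue conditionals. It is not the talk's actual model: the `predict_von_mises_params` function is a hypothetical stand-in for the network (the talk's Generative Invariant Angle Transformer conditions on previous residues via invariant point attention), and the von Mises distribution is used here only as a simple example of a distribution over angles.

```python
# Illustrative sketch only: a toy autoregressive sampler over backbone
# dihedral angles. The real model discussed in the talk conditions each
# step on all previous residues through invariant point attention; here
# "predict_von_mises_params" is a hypothetical placeholder.
import numpy as np

rng = np.random.default_rng(0)

def predict_von_mises_params(history):
    """Hypothetical stand-in for the network: map the angle history to
    von Mises (mu, kappa) parameters for the next residue's (phi, psi)."""
    if len(history) == 0:
        mu = np.array([-1.0, -0.8])      # rough alpha-helix-like prior (radians)
    else:
        mu = 0.8 * history[-1]           # toy dependence on the previous residue
    kappa = np.array([8.0, 8.0])         # concentration: higher = less angular noise
    return mu, kappa

def sample_backbone_angles(n_residues):
    """Sample (phi, psi) for each residue autoregressively:
    p(a_1, ..., a_N) = prod_i p(a_i | a_1, ..., a_{i-1})."""
    history = []
    for _ in range(n_residues):
        mu, kappa = predict_von_mises_params(history)
        angles = rng.vonmises(mu, kappa)  # one draw per angle
        history.append(angles)
    return np.stack(history)              # shape (n_residues, 2)

if __name__ == "__main__":
    chain = sample_backbone_angles(10)
    print(np.degrees(chain).round(1))
```

The key point the sketch illustrates is the sequential factorization: each residue's angles are drawn from a distribution whose parameters depend on everything generated so far, which is what lets the model place amino acids one after another like tokens in a language model.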
Syllabus
- Intro + Background
- Autoregressive Backbone Generation
- Distributions Over Angles
- Invariant Point Attention
- Generative Invariant Angle Transformer
- Inference
- Examples
- Conclusions
- Q+A
Taught by
Valence Labs