

Nonparametric Bayesian Methods - Models, Algorithms, and Applications IV

Simons Institute via YouTube

Overview

Explore nonparametric Bayesian methods in this lecture from the Foundations of Machine Learning Boot Camp. Delve into probabilistic models for graphs, contrasting traditional node-based approaches with newer edge-based techniques. Learn about exchangeability in several contexts, including node exchangeability, edge exchangeability, and the Aldous-Hoover theorem. Discover graph paintboxes and their role in characterizing edge-exchangeable graph sequences. Investigate feature allocation and its connection to exchangeability through feature paintbox representations. Gain insight into proving sparsity of graph sequences and the current state of knowledge in nonparametric Bayesian methods for graph modeling.
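
Since the overview names node exchangeability, edge exchangeability, and the Aldous-Hoover theorem without stating them, here is a minimal sketch of these notions in standard notation; it is a paraphrase for orientation, not a transcription of the lecture slides.

```latex
% Node exchangeability: the law of the adjacency array is invariant under
% relabeling the vertices by any permutation \sigma of \mathbb{N}:
(X_{ij})_{i,j \ge 1} \overset{d}{=} (X_{\sigma(i)\,\sigma(j)})_{i,j \ge 1}.
% The Aldous--Hoover theorem represents such arrays as mixtures of graphon
% models, whose graph sequences are dense or empty almost surely.

% Edge exchangeability: the graph grows by a sequence of edges E_1, E_2, \ldots,
% and its law is invariant under reordering the edges:
(E_1, \ldots, E_n) \overset{d}{=} (E_{\pi(1)}, \ldots, E_{\pi(n)})
% for every n and every permutation \pi of \{1, \ldots, n\}.
```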

Syllabus

Intro
Probabilistic models for graphs
Sequence of graphs
The Old Way: Nodes
The Old Way: Exchangeability
The Old Way: Node exchangeability
Aldous-Hoover
A New Way: Edges
Edge exchangeability
Exchangeable probability functions
Feature allocation is exchangeable if it has a feature paintbox representation
Edge-exchangeable graph
Cor (CCB). A graph sequence is edge-exchangeable iff it has a graph paintbox
How to prove sparsity? (see the sketch after this syllabus)
What we know so far
Nonparametric Bayes
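
As a concrete illustration of the "Edge-exchangeable graph" and "How to prove sparsity?" items above, here is a short Python sketch of one edge-exchangeable construction: each edge's endpoints are drawn from a Pitman-Yor-style urn over vertices, in the spirit of Hollywood-type models. The function name and the parameter values (alpha, discount) are illustrative choices, and this stand-in is not necessarily the construction analyzed in the lecture.

```python
import random
from collections import Counter

def edge_exchangeable_edges(n_edges, alpha=1.0, discount=0.7, seed=0):
    """Draw a sequence of edges whose endpoints come from a Pitman-Yor-style
    urn over vertices. The law of the edge sequence is invariant under
    reordering the edges, i.e. the construction is edge-exchangeable."""
    rng = random.Random(seed)
    degree = Counter()   # vertex label -> number of edge ends attached to it
    total_ends = 0       # total number of endpoints drawn so far
    edges = []
    for _ in range(n_edges):
        endpoints = []
        for _ in range(2):                      # two endpoints per edge
            k = len(degree)                     # number of vertices so far
            p_new = (alpha + discount * k) / (alpha + total_ends)
            if rng.random() < p_new:
                v = k                           # open a brand-new vertex
            else:
                # pick an existing vertex with probability proportional to
                # (its degree - discount), the usual Pitman-Yor urn weights
                r = rng.uniform(0.0, total_ends - discount * k)
                acc, v = 0.0, 0
                for u, d in degree.items():
                    acc += d - discount
                    if r <= acc:
                        v = u
                        break
            degree[v] += 1
            total_ends += 1
            endpoints.append(v)
        edges.append(tuple(endpoints))
    return edges

if __name__ == "__main__":
    edges = edge_exchangeable_edges(5000)
    vertices = {v for e in edges for v in e}
    # With discount > 1/2 the vertex count grows roughly like a power of the
    # number of edge ends, so edges are subquadratic in vertices (sparsity).
    print(f"{len(edges)} edges on {len(vertices)} vertices")
```

In urn models of this type, a discount between 1/2 and 1 makes the number of vertices grow quickly enough relative to the number of edges that the resulting graph sequence is sparse, which is the kind of behavior the sparsity question above refers to.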

Taught by

Simons Institute
