Overview
Explore nonparametric Bayesian methods in this lecture from the Foundations of Machine Learning Boot Camp. Delve into probabilistic models for graphs, contrasting traditional node-based approaches with newer edge-based techniques. Learn about node exchangeability, edge exchangeability, and the Aldous-Hoover representation theorem for node-exchangeable arrays. Discover graph paintboxes and their role in characterizing edge-exchangeable graph sequences. Investigate feature allocation and its connection to exchangeability through feature paintbox representations. Gain insights into proving sparsity of graph sequences and understand the current state of knowledge in nonparametric Bayesian methods for graph modeling.
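As a rough illustration of the edge-exchangeable viewpoint covered in the lecture, the minimal Python sketch below (assuming NumPy; the truncated stick-breaking prior on vertex weights is an illustrative choice, not necessarily the construction used in the lecture) draws edges i.i.d. from a fixed distribution over vertex pairs, so the active vertex set grows with the number of edges rather than being fixed in advance. Whether the resulting graph sequence is sparse depends on the tail behavior of the vertex weights, which is the kind of question the lecture's sparsity discussion addresses.

    import numpy as np

    rng = np.random.default_rng(0)

    def stick_breaking_weights(alpha, truncation):
        # Truncated stick-breaking (GEM(alpha)) weights, renormalized to sum to 1.
        betas = rng.beta(1.0, alpha, size=truncation)
        leftover = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
        w = betas * leftover
        return w / w.sum()

    def sample_edge_exchangeable_graph(n_edges, alpha=5.0, truncation=10_000):
        # Draw n_edges i.i.d. vertex pairs from a fixed weight distribution;
        # given the weights, the edge sequence is exchangeable by construction.
        w = stick_breaking_weights(alpha, truncation)
        return rng.choice(truncation, size=(n_edges, 2), p=w)

    for n_edges in (100, 1_000, 10_000):
        edges = sample_edge_exchangeable_graph(n_edges)
        print(f"{n_edges:6d} edges touch {np.unique(edges).size:5d} distinct vertices")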
Syllabus
Intro
Probabilistic models for graphs
Sequence of graphs
The Old Way: Nodes
The Old Way: Exchangeability
The Old Way: Node exchangeability
Aldous-Hoover
A New Way: Edges
Edge exchangeability
Exchangeable probability functions
Feature allocation is exchangeable if it has a feature paintbox representation
Edge-exchangeable graph
Cor (CCB). A graph sequence is edge-exchangeable iff it has a graph paintbox
How to prove sparsity?
What we know so far
Nonparametric Bayes
Taught by
Simons Institute