Bayesian Statistics: Mixture Models
University of California, Santa Cruz via Coursera
Overview
Bayesian Statistics: Mixture Models introduces you to an important class of statistical models. The course is organized into five modules, each of which contains lecture videos, short quizzes, background reading, discussion prompts, and one or more peer-reviewed assignments. Statistics is best learned by doing, not just by watching videos, so the course is structured to help you learn through application.
Some exercises require the use of R, a freely available statistical software package. A brief tutorial is provided, but we encourage you to take advantage of the many other online resources for learning R if you are interested.
This is an intermediate-level course, and it was designed to be the third in UC Santa Cruz's series on Bayesian statistics, after Herbie Lee's "Bayesian Statistics: From Concept to Data Analysis" and Matthew Heiner's "Bayesian Statistics: Techniques and Models." To succeed in the course, you should have some knowledge of and comfort with calculus-based probability, principles of maximum-likelihood estimation, and Bayesian estimation.
Syllabus
- Basic concepts on Mixture Models
- This module defines mixture models, discusses their properties, and develops the likelihood function for a random sample from a mixture model, which is the basis for statistical learning (see the sketch following this syllabus).
- Maximum likelihood estimation for Mixture Models
- Bayesian estimation for Mixture Models
- Applications of Mixture Models
- Practical considerations
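For reference, a mixture model with K components combines the component densities through a set of weights, and the likelihood for an i.i.d. sample is the corresponding product of sums. The notation below is a generic sketch, not necessarily the course's own:

\[ f(x \mid \omega, \theta) = \sum_{k=1}^{K} \omega_k \, g_k(x \mid \theta_k), \qquad \omega_k \ge 0, \quad \sum_{k=1}^{K} \omega_k = 1, \]

\[ L(\omega, \theta) = \prod_{i=1}^{n} \sum_{k=1}^{K} \omega_k \, g_k(x_i \mid \theta_k). \]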
Taught by
Abel Rodriguez