The world is full of uncertainty: accidents, storms, unruly financial markets, and noisy communications. The world is also full of data. Probabilistic modeling and the related field of statistical inference are the keys to analyzing data and making scientifically sound predictions.
This is Part 1 of a 2-part sequence on the basic tools of probabilistic modeling. Part 1 introduces the general framework of probability models, multiple discrete or continuous random variables, expectations, conditional distributions, and various powerful tools of general applicability. Part 2 will then continue into further topics that include laws of large numbers, the main tools of Bayesian inference methods, and an introduction to random processes (Poisson processes and Markov chains).
The contents of the two parts of the course are essentially the same as those of the corresponding MIT class, which has been offered and continuously refined over more than 50 years. It is a challenging class, but will enable you to apply the tools of probability theory to real-world applications or your research.
Probabilistic models use the language of mathematics. But instead of relying on the traditional "theorem–proof" format, we develop the material in an intuitive -- but still rigorous and mathematically precise -- manner. Furthermore, while the applications are many and evident, we emphasize the basic concepts and methodologies that are universally applicable.
Photo by Pablo Ruiz Múzquiz on Flickr (CC BY-NC-SA 2.0).
Introduction to Probability: Part 1 - The Fundamentals
Massachusetts Institute of Technology via edX
This course may be unavailable.
Syllabus
- Probability models and axioms
- Conditioning, Bayes’ rule, independence
- Counting methods in discrete probability
- Discrete random variables (distributions, mean, variance, conditioning, etc.)
- Continuous random variables (including general forms of Bayes’ rule)
- Further topics (derived distributions; covariance & correlation, etc.)
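As a small taste of the syllabus topics above, Bayes' rule combined with the total probability theorem can be applied directly with a few lines of code. This is an illustrative sketch only — the numbers (a test's sensitivity, false-positive rate, and the prior prevalence) are hypothetical and not taken from the course:

```python
from fractions import Fraction

# Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B), where the total
# probability theorem gives P(B) = P(B|A)P(A) + P(B|not A)P(not A).
# Hypothetical numbers: a diagnostic test with 99% sensitivity and a
# 5% false-positive rate, for a condition with 1% prevalence.
p_a = Fraction(1, 100)               # prior P(A)
p_b_given_a = Fraction(99, 100)      # P(B|A), sensitivity
p_b_given_not_a = Fraction(5, 100)   # P(B|not A), false-positive rate

p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)  # total probability
p_a_given_b = p_b_given_a * p_a / p_b                  # Bayes' rule

print(p_a_given_b)  # posterior P(A|B) = 1/6
```

Despite the accurate test, the posterior is only 1/6 — a classic illustration, covered in the conditioning unit, of why the prior matters.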
Taught by
John Tsitsiklis, Patrick Jaillet, Zied Ben Chaouch, Dimitri Bertsekas, Qing He, Jimmy Li, Jagdish Ramakrishnan, Katie Szeto and Kuang Xu
Reviews
5.0 rating, based on 5 Class Central reviews
- This course is simply close to perfection: the content is superb, challenging, and rigorous, and every question or doubt that came up in my mind was answered straight away by the lecturer. How did he know that I had that precise question? Exercises a…
- Very good course, I loved it — a fantastic teaching team with very good knowledge, and it upscaled my career very much. Thanks a lot, team; I'm very thankful to everyone out there.
- This is one of the best online courses that I have taken so far. The course is really hard and challenging, and that makes it fun! I got very good support on the discussion forums, and that really helped me think through some of the tough problems. Really thankful to edX and MIT for this awesome course.
- Tsitsiklis is an absolutely superb teacher, especially when you can speed him up to 1.5x, and he presents the most clear and thorough elementary probability class sequence that I have encountered. It's in my top three MOOCs of all time.
- This was one of the best online courses I have taken. I learned so much from the topics covered and enjoyed solving the tough problems!