
Logistic Regression and Ensemble Learning - Bagging and Boosting - AdaBoost

Software Engineering Courses - SE Courses via YouTube

Overview

Dive into a comprehensive 44-minute lecture on Logistic Regression and Ensemble Learning techniques, focusing on Bagging, Boosting, and AdaBoost. Explore the fundamentals of probability, classification, and the differences between regression and classification. Gain insights into Ensemble Learning methods, understanding their benefits and applications. Examine independent classifiers, their pros and cons, and the role of randomness in bagging. Discover when bagging is most effective and delve into boosting techniques, comparing strong and weak learners. Learn about the basic algorithm training process, weighted voting, and normalizing constants. Conclude with an in-depth look at AdaBoost and its application as a strong non-linear classifier using Decision Stumps. This lecture is part of a broader Artificial Intelligence and Machine Learning course, suitable for those with programming knowledge or experience with AI and ML tools.
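To make the closing topics more concrete, below is a minimal NumPy sketch of AdaBoost built on decision stumps. It is not code from the lecture; the function names, the exhaustive stump search, and the tiny OR-style toy dataset are illustrative assumptions, chosen only to show the mechanics named in the syllabus: the weighted vote, the re-weighting of misclassified samples, and the normalizing constant.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=10):
    """Train AdaBoost on labels y in {-1, +1} using one-feature threshold stumps."""
    n_samples, n_features = X.shape
    w = np.full(n_samples, 1.0 / n_samples)   # start with uniform sample weights
    ensemble = []                             # list of (feature, threshold, polarity, alpha)

    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # Exhaustively pick the stump with the lowest weighted error (the weak learner).
        for j in range(n_features):
            for thresh in np.unique(X[:, j]):
                for polarity in (1, -1):
                    pred = np.where(polarity * (X[:, j] - thresh) >= 0, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, thresh, polarity)

        err = min(max(best_err, 1e-10), 1 - 1e-10)   # clamp to avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)        # this stump's vote weight
        j, thresh, polarity = best
        pred = np.where(polarity * (X[:, j] - thresh) >= 0, 1, -1)

        # Re-weight samples: misclassified points gain weight, correct ones lose it,
        # then divide by the normalizing constant so the weights stay a distribution.
        w = w * np.exp(-alpha * y * pred)
        w = w / w.sum()
        ensemble.append((j, thresh, polarity, alpha))
    return ensemble

def predict_adaboost(ensemble, X):
    """Combine the stumps' votes, each weighted by its alpha (weighted majority vote)."""
    scores = np.zeros(X.shape[0])
    for j, thresh, polarity, alpha in ensemble:
        scores += alpha * np.where(polarity * (X[:, j] - thresh) >= 0, 1, -1)
    return np.sign(scores)

# Illustrative toy set (an OR-style problem): no single axis-aligned stump
# classifies every point correctly, but the boosted ensemble does.
X = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
y = np.array([-1, 1, 1, 1])
model = train_adaboost(X, y, n_rounds=10)
print(predict_adaboost(model, X))   # expected output: [-1.  1.  1.  1.]
```

On this toy set each individual stump misclassifies at least one point, yet their weighted combination fits all four, which is the sense in which boosting turns weak learners into a strong non-linear classifier.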

Syllabus

Introduction
Probability
Classification
Regression vs Classification
Ensemble Learning
Benefits of Ensemble Learning
Independent Classifiers
Pros and Cons
Randomness
When Does Bagging Work
Boosting
Strong vs Weak Learners
Basic Algorithm Training
Weighted Vote
Normalizing Constant
AdaBoost
Strong Non-Linear Classifier
Decision Stumps

Taught by

Software Engineering Courses - SE Courses
