
Gradient Descent: Building Optimization Algorithms from Scratch

via CodeSignal

Overview

Explore optimization techniques in this hands-on course focused on implementing algorithms from scratch. Bypassing high-level libraries, you will build Stochastic Gradient Descent, Mini-Batch Gradient Descent, and advanced optimization methods such as Momentum, RMSProp, and Adam.
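To give a flavor of the from-scratch approach described above, here is a minimal sketch of stochastic gradient descent for a toy linear-regression problem. All names and data are illustrative assumptions, not material from the course itself:

```python
import numpy as np

# Illustrative data: fit y = 2x + 1 plus a little noise (hypothetical example)
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=100)
y = 2.0 * X + 1.0 + rng.normal(0.0, 0.1, size=100)

w, b = 0.0, 0.0   # parameters to learn
lr = 0.1          # learning rate

for epoch in range(50):
    # "Stochastic": update on one randomly ordered sample at a time
    for i in rng.permutation(len(X)):
        err = (w * X[i] + b) - y[i]
        # Gradients of the squared error 0.5 * err**2 w.r.t. w and b
        w -= lr * err * X[i]
        b -= lr * err

print(w, b)  # w should end up near 2.0 and b near 1.0
```

Mini-batch gradient descent (the second lesson's topic) differs only in that each update averages the gradient over a small batch of samples instead of using a single one.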

Syllabus

  • Lesson 1: Stochastic Gradient Descent: Theory and Implementation in Python
  • Lesson 2: Optimizing Machine Learning with Mini-Batch Gradient Descent
  • Lesson 3: Accelerating Convergence: Implementing Momentum in Gradient Descent Algorithms
  • Lesson 4: Understanding and Implementing RMSProp in Python
  • Lesson 5: Advanced Optimization: Understanding and Implementing ADAM
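As a taste of what the later lessons build toward, an Adam-style update combines the momentum and RMSProp ideas from Lessons 3 and 4 with bias correction. The sketch below follows the standard textbook formulation; variable names and the toy usage example are my own assumptions, not code from the course:

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum + RMSProp-style scaling, with bias correction."""
    m = beta1 * m + (1 - beta1) * grad       # first moment (momentum term)
    v = beta2 * v + (1 - beta2) * grad**2    # second moment (RMSProp term)
    m_hat = m / (1 - beta1**t)               # bias-corrected estimates
    v_hat = v / (1 - beta2**t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Toy usage: minimize f(x) = x**2, whose gradient is 2x
x, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    x, m, v = adam_step(x, 2.0 * x, m, v, t, lr=0.05)
print(x)  # approaches the minimizer at 0
```

Setting beta1 = 0 recovers an RMSProp-like update, while dropping the second moment entirely leaves plain momentum, which is why Adam is usually taught after both.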

