

Calculus through Data & Modeling: Applying Differentiation

Johns Hopkins University via Coursera

Overview

As rates of change, derivatives give us information about the shape of a graph. In this course, we will apply the derivative to find linear approximations for single-variable and multivariable functions. This gives us a straightforward way to estimate functions that may be complicated or difficult to evaluate. We will also use the derivative to locate the maximum and minimum values of a function. These optimization techniques are important in many fields, including the natural sciences and data analysis. The topics in this course lend themselves to many real-world applications, such as machine learning, minimizing costs, and maximizing profits.
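
To make the idea concrete, here is an illustrative sketch of the single-variable linear approximation near a point a (an illustration, not drawn from the course materials):

L(x) = f(a) + f'(a)(x - a)

For example, taking f(x) = \sqrt{x} and a = 4 gives L(x) = 2 + \tfrac{1}{4}(x - 4), so \sqrt{4.1} \approx L(4.1) = 2.025, close to the true value of about 2.0248.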

Syllabus

  • Linear Approximations and Tangent Planes
    • In single-variable calculus, the derivative computes the slope of the tangent line wherever it is defined. This slope is then used to write the equation of the tangent line at a point, which serves as an accurate estimation tool for complicated functions. The theory generalizes to tangent lines in space and to tangent planes for surfaces (a sketch of the tangent plane formula appears after this syllabus). In this module, we work through the formulas and applications of these notions, using our developed theory of derivatives and partial derivatives.
  • Maxima and Minima of Single-Variable Functions
    • Some of the most important applications of differential calculus are optimization problems, in which the goal is to find the optimal (best) solution. For example, problems in marketing, economics, inventory analysis, machine learning, and business are all concerned with finding the best solution. These problems can be reduced to finding the maximum or minimum values of a function using the derivative (the first and second derivative tests are sketched after this syllabus).
  • Maxima and Minima of Multivariable Functions
    • As models become more complicated, the functions used to describe them do as well. Many functions require more than one input to describe their output. These multivariable functions also have maximum and minimum values that we seek to find using the tools of calculus. In this module, we will extend our optimization techniques to multivariable functions (the second partials test is sketched after this syllabus).
  • Lagrange Multipliers
    • In mathematical optimization, the method of Lagrange multipliers is a strategy for finding the local maxima and minima of a function subject to equality constraints. It is named after the mathematician Joseph-Louis Lagrange. In this module, we develop the theory and work through examples of this powerful tool, which converts a constrained problem into a form to which the derivative test of an unconstrained problem can still be applied. The relationship between the gradient of the function and the gradients of the constraints leads naturally to a reformulation of the original problem that is usually easier to solve (the gradient condition is sketched after this syllabus).
  • Final Project - Optimization
    • We now put all our theory and practice to use on a real-world problem: modeling the costs associated with a construction project in an effort to find the best possible price point. This project is challenging, and answers may vary slightly based on the assumptions you use. Be thoughtful and clear in your report about any assumptions you make along the way.
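
As referenced in the first module, here is a sketch of the tangent plane formula (an illustration, not drawn from the course materials): for a differentiable function z = f(x, y), the tangent plane at the point (a, b) is

z = f(a, b) + f_x(a, b)(x - a) + f_y(a, b)(y - b)

and f(x, y) is approximately equal to this plane for (x, y) near (a, b), the two-variable analogue of the tangent line.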
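
For the single-variable optimization module, the standard recipe (again an illustrative sketch) is: candidates for local extrema occur at critical points c, where

f'(c) = 0 \quad \text{or} \quad f'(c) \text{ is undefined}

and the second derivative test classifies them: f''(c) > 0 indicates a local minimum, f''(c) < 0 a local maximum. For example, f(x) = x^2 - 4x has f'(x) = 2x - 4 = 0 at x = 2 with f''(2) = 2 > 0, so x = 2 is a minimum.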
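
For the multivariable module, the corresponding sketch: critical points of f(x, y) satisfy f_x(a, b) = 0 and f_y(a, b) = 0, and the second partials test uses the discriminant

D = f_{xx} f_{yy} - (f_{xy})^2

evaluated at the critical point: D > 0 with f_{xx} > 0 gives a local minimum, D > 0 with f_{xx} < 0 gives a local maximum, and D < 0 gives a saddle point.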
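
For the Lagrange multipliers module, the gradient condition mentioned above reads: to optimize f(x, y) subject to the constraint g(x, y) = c, solve

\nabla f = \lambda \nabla g, \qquad g(x, y) = c

For instance, maximizing f(x, y) = xy subject to x + y = 10 gives y = \lambda and x = \lambda, so x = y = 5 and the maximum value is 25.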

Taught by

Joseph W. Cutrone, PhD

Reviews

4.7 rating at Coursera based on 50 ratings
