
ChatGPT Prompt Engineering for Developers

DeepLearning.AI via Coursera

Overview

Go beyond the chat box: use API access to bring LLMs into your own applications, and learn to build a custom chatbot. In ChatGPT Prompt Engineering for Developers, you will learn how to use a large language model (LLM) to quickly build new and powerful applications. Using the OpenAI API, you'll be able to quickly build capabilities that innovate and create value in ways that were cost-prohibitive, highly technical, or simply impossible until now.

This short course, taught by Isa Fulford (OpenAI) and Andrew Ng (DeepLearning.AI), describes how LLMs work, provides best practices for prompt engineering, and shows how LLM APIs can be used in applications for a variety of tasks, including:

- Summarizing (e.g., summarizing user reviews for brevity)
- Inferring (e.g., sentiment classification, topic extraction)
- Transforming text (e.g., translation, spelling and grammar correction)
- Expanding (e.g., automatically writing emails)

In addition, you'll learn two key principles for writing effective prompts, how to systematically engineer good prompts, and how to build a custom chatbot. All concepts are illustrated with numerous examples, which you can run directly in the course's Jupyter notebook environment to get hands-on experience with prompt engineering.
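As a rough illustration of the kind of API call the course works with, here is a minimal sketch of a summarization prompt sent through the OpenAI Python package (v1-style client). The model name, the get_completion helper, and the review text are illustrative assumptions, not material from the course.

```python
# A minimal sketch (not course code) of calling an LLM through the OpenAI API.
# Assumes the openai Python package (v1+ client) and OPENAI_API_KEY set in the
# environment; the model name, helper, and review text are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

def get_completion(prompt: str, model: str = "gpt-3.5-turbo") -> str:
    """Send a single-turn prompt and return the model's text response."""
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # low temperature for more repeatable output
    )
    return response.choices[0].message.content

review = ("The plush toy arrived a day early and my daughter loves it, "
          "but it feels a little small for the price.")

prompt = f"""Summarize the product review below, delimited by <review> tags,
in at most 20 words.

<review>{review}</review>"""

print(get_completion(prompt))
```

Each call here is stateless; the chatbot sketch in the Syllabus section below shows how conversation history is carried across turns.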

Syllabus

  • Project Overview
    • Learn two key principles for writing effective prompts and how to systematically engineer good prompts; apply the OpenAI API to summarizing, inferring, transforming, and expanding tasks; and build a custom chatbot (a rough sketch of the underlying message format follows this list). All concepts are illustrated with runnable examples in the Jupyter notebook environment.
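For orientation, here is a minimal sketch of the multi-turn message format a custom chatbot of this kind is typically built on. It assumes the OpenAI v1-style Python client; the restaurant persona and the example user turns are illustrative assumptions, not course content.

```python
# A minimal sketch (not course code) of a multi-turn chatbot built on the OpenAI
# chat message format. Assumes the openai v1+ Python client; the persona and the
# example user turns are illustrative assumptions.
from openai import OpenAI

client = OpenAI()

# Conversation history: a system message sets the bot's persona, then user and
# assistant turns accumulate so the model sees the full context on every call.
messages = [
    {"role": "system", "content": "You are a friendly assistant for a small pizza restaurant."},
]

def chat(user_input: str, model: str = "gpt-3.5-turbo") -> str:
    """Append the user turn, call the model with the full history, and record the reply."""
    messages.append({"role": "user", "content": user_input})
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0.7,  # a little variety in conversational replies
    )
    reply = response.choices[0].message.content
    messages.append({"role": "assistant", "content": reply})
    return reply

print(chat("Hi, what's on the menu today?"))
print(chat("Which of those are vegetarian?"))
```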

Taught by

Isa Fulford and Andrew Ng

Reviews

4.7 rating at Coursera based on 1186 ratings

