

Generative AI and LLMs on AWS

Pragmatic AI Labs via edX

Overview

Master deploying generative AI models like GPT on AWS through hands-on labs. Learn architecture selection, cost optimization, monitoring, CI/CD pipelines, and compliance best practices. Gain skills in operationalizing LLMs using Amazon Bedrock, auto-scaling, spot instances, and differential privacy techniques. Ideal for ML engineers, data scientists, and technical leaders.

Course Highlights:

  • Choose optimal LLM architectures for your applications
  • Optimize cost, performance and scalability with auto-scaling and orchestration
  • Monitor LLM metrics and continuously improve model quality
  • Build secure CI/CD pipelines to train, deploy and update LLMs
  • Ensure regulatory compliance via differential privacy and controlled rollouts
  • Real-world, hands-on training for production-ready generative AI

Unlock the power of large language models on AWS. Master operationalization using cloud-native services through this comprehensive, practical training program.

Syllabus

Week 1: Getting Started with Developing on AWS for AI


  • Introduction to AWS Cloud Computing for AI, including the AWS Cloud Adoption Framework

  • Setting up AI-focused development environments using AWS services like Cloud9, SageMaker, and Lightsail

  • Developing serverless solutions for data, ML, and AI using Amazon Bedrock and Rust

Week 2: AI Pair Programming from CodeWhisperer to Prompt Engineering

  • Learning prompt engineering techniques to guide large language models

  • Using Amazon CodeWhisperer as an AI pair programming assistant

  • Leveraging the CodeWhisperer CLI to automate tasks and build efficient Bash scripts
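As a flavor of the prompt-engineering techniques covered here, one common pattern is assembling a few-shot prompt: a task description followed by worked examples and the new query. The sketch below is illustrative only and is not taken from the course materials; the function name and format are assumptions.

```python
def few_shot_prompt(task: str, examples: list[tuple[str, str]], query: str) -> str:
    """Assemble a few-shot prompt: task description, worked examples, then the query.

    Illustrative sketch; the exact layout (Input:/Output: labels) is a common
    convention, not a requirement of any specific model.
    """
    lines = [task, ""]
    for example_input, example_output in examples:
        lines.append(f"Input: {example_input}")
        lines.append(f"Output: {example_output}")
        lines.append("")
    # The trailing "Output:" cues the model to complete the answer.
    lines.append(f"Input: {query}")
    lines.append("Output:")
    return "\n".join(lines)
```

The resulting string can be sent as the prompt to any text-completion model; showing a few input/output pairs usually steers the model toward the desired format far more reliably than instructions alone.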

Week 3: Amazon Bedrock

  • Key capabilities and components of Amazon Bedrock

  • Accessing and invoking Bedrock foundation models using the AWS CLI, the Boto3 Python SDK, and the Rust SDK

  • Prompt engineering and model evaluation to optimize Bedrock model performance

  • Customizing models with fine-tuning and knowledge bases
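To give a sense of what invoking a Bedrock foundation model with the Boto3 SDK looks like, here is a minimal sketch. The region, model ID, and request format are illustrative assumptions (an Anthropic Claude model using the Bedrock Messages format); running the `invoke` step requires AWS credentials and Bedrock model access enabled in your account.

```python
import json


def build_claude_request(prompt: str, max_tokens: int = 256) -> str:
    """Build the JSON request body for an Anthropic model on Bedrock (Messages API)."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })


def invoke(prompt: str) -> str:
    """Call a Bedrock model. Requires AWS credentials and Bedrock model access."""
    import boto3  # imported here so the helper above works without boto3 installed

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
        body=build_claude_request(prompt),
    )
    return json.loads(response["body"].read())["content"][0]["text"]


if __name__ == "__main__":
    print(invoke("Summarize what Amazon Bedrock does."))
```

The same `invoke_model` call works for other Bedrock model families; only the request body format and model ID change, which is why the course covers the AWS CLI and Rust SDK routes as well.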

Week 4: Project Challenges

  • Applying course concepts to build an end-to-end AI workflow

  • Developing Rust functions for Bedrock agents and integrating into an orchestration flow

  • Debugging, benchmarking, and prompt engineering to optimize a deployed AI application on AWS


By the end of this course, you will have gained hands-on experience with cutting-edge AWS AI/ML tools like Amazon Bedrock and CodeWhisperer, along with Rust, and you'll be able to build and deploy efficient, serverless AI applications in production.

Taught by

Noah Gift and Alfredo Deza
