
edX

Large Language Models with Azure

Pragmatic AI Labs via edX

Overview

Master Large Language Model Operations on Azure

  • Unlock Azure's full potential for deploying & optimizing Large Language Models (LLMs)
  • Build robust LLM applications leveraging Azure Machine Learning & OpenAI Service
  • Implement architectural patterns & GitHub Actions workflows for streamlined MLOps

Course Highlights:

  • Explore Azure AI services and LLM capabilities
  • Mitigate risks with foundational strategies
  • Leverage Azure ML for model deployment & management
  • Optimize GPU quotas for performance & cost-efficiency
  • Craft advanced queries for enriched LLM interactions
  • Implement Semantic Kernel for enhanced query results
  • Dive into architectural patterns like retrieval-augmented generation (RAG) for scalable applications
  • Build end-to-end LLM apps using Azure services & GitHub Actions

Ideal for data professionals, AI enthusiasts & Azure users looking to harness cutting-edge language AI capabilities. Gain practical MLOps skills through tailored modules & hands-on projects.

Syllabus

Week 1: Introduction to LLMOps with Azure

- Discover pre-trained LLMs in Azure and deploy basic LLM endpoints

- Identify strategies for mitigating risks when using LLMs

- Explain how large language models work and their potential benefits and risks

- Describe the core Azure services and tools for working with AI solutions like Azure ML and the Azure OpenAI Service


Week 2: LLMs with Azure

- Use Azure Machine Learning, including GPU quota management, compute resource creation, model deployment, and utilization of the inference API

- Use the Azure OpenAI Service and its playground, creating the required resources and deploying models

- Apply your understanding of keys, endpoints, and Python examples to integrate the Azure OpenAI APIs, monitor usage, and ensure proper resource cleanup (see the sketch below)
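
As a rough illustration of the Week 2 objectives above, here is a minimal sketch of calling a chat model deployed in the Azure OpenAI Service from Python with the `openai` package (v1+). The endpoint, key, API version, and deployment name are placeholders for your own resource's values, not anything prescribed by the course.

```python
import os

from openai import AzureOpenAI

# Keys and endpoints come from the Azure OpenAI resource; reading them from
# environment variables keeps them out of source control.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com/
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # use an API version supported by your resource
)

response = client.chat.completions.create(
    model="my-gpt-deployment",  # the deployment name you chose, not the base model name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what the Azure OpenAI Service provides."},
    ],
)

print(response.choices[0].message.content)
```

When you finish experimenting, deleting the deployment and its resource group is the simplest way to ensure the resource cleanup this module emphasizes.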


Week 3: Extending with Functions and Plugins

- Use Semantic Kernel to create advanced, context-aware prompts for large language models

- Define custom functions to extend system capabilities

- Build a microservice for reusable functions to streamline system extensions (see the sketch below)

- Implement functions using external APIs and microservices to customize model behavior
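
The Week 3 modules use Semantic Kernel; as a framework-agnostic illustration of the "microservice for reusable functions" idea, here is a minimal FastAPI sketch. The `/convert` route, payload shape, and static rate table are hypothetical, chosen only to keep the example self-contained.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Hypothetical static exchange rates, used only to keep the sketch self-contained.
RATES = {"USD": 1.0, "EUR": 0.92, "GBP": 0.79}


class ConversionRequest(BaseModel):
    amount: float
    from_currency: str
    to_currency: str


@app.post("/convert")
def convert(req: ConversionRequest) -> dict:
    """Convert an amount between two currencies via USD using the rate table."""
    in_usd = req.amount / RATES[req.from_currency]
    return {"converted": round(in_usd * RATES[req.to_currency], 2)}
```

A Semantic Kernel function (or any other plugin mechanism) could call this endpoint over HTTP and return the result to the model, which is how external APIs and microservices end up customizing model behavior.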


Week 4: Building an End-to-End LLM application in Azure

- Understand architectural patterns like RAG for building LLM applications

- Use Azure AI Search to create search indexes and embeddings to power RAG

- Build GitHub Actions workflows to automate testing and deployment of LLM apps

- Deploy an end-to-end LLM application leveraging RAG, Azure, and GitHub Actions (a minimal RAG sketch follows below)
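
To make the RAG pattern from Week 4 concrete, here is a minimal retrieval-plus-generation sketch using the `azure-search-documents` and `openai` Python packages. The index name, the `content` field, and the deployment name are illustrative assumptions; a production pipeline would typically use vector (embedding) search and add evaluation and error handling.

```python
import os

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search_client = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name="docs-index",  # placeholder index name
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)
llm = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

question = "How do I request a GPU quota increase?"

# Retrieve: pull the top matching documents from the search index.
hits = search_client.search(search_text=question, top=3)
context = "\n\n".join(doc["content"] for doc in hits)  # assumes a "content" field

# Generate: answer the question grounded only in the retrieved context.
answer = llm.chat.completions.create(
    model="my-gpt-deployment",  # placeholder deployment name
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(answer.choices[0].message.content)
```

The GitHub Actions workflows mentioned above would then automate testing and deployment of an application built around code like this.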

Taught by

Noah Gift and Alfredo Deza
