What you'll learn:
- Set up Azure account structures and manage resource groups
- Manage files in the cloud with Blob Storage
- Compare Azure Cognitive Search and PgVector as vector databases
- Use PgVector and the Indexing API for data retrieval
- Manage container images using Azure Container Registry
- Deploy and monitor Azure App Services
- Build an event-driven architecture with Azure Functions and Event Grid
- Apply security measures to protect Azure App Services and databases
Dive into the depths of Azure and Large Language Model (LLM) applications with this comprehensive course. Starting with the initial setup of Azure account structures and resource groups and moving on to the practical management of Azure Blob Storage, this course equips you with the essential skills to navigate and use Azure's extensive offerings.
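To give a flavor of the Blob Storage work, here is a minimal sketch using the azure-storage-blob Python SDK; the connection string, container name, and file names are placeholders for your own values.

```python
from azure.storage.blob import BlobServiceClient

# Connect using the storage account's connection string
# (found under "Access keys" in the Azure portal) -- placeholder value here.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("documents")

# Upload a local file as a blob, overwriting any existing blob with the same name.
with open("report.pdf", "rb") as data:
    container.upload_blob(name="report.pdf", data=data, overwrite=True)

# List what is currently stored in the container.
for blob in container.list_blobs():
    print(blob.name)
```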
We then delve into different vector stores, such as Azure Cognitive Search and PgVector, and compare their advantages and disadvantages. You will learn how to chunk raw data, embed the chunks, and insert them into the vector store, and then run a typical Retrieval-Augmented Generation (RAG) workflow against it. This part of the course takes place primarily in Jupyter notebooks.
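As a rough sketch of that chunk-embed-insert flow, assuming the langchain-text-splitters, langchain-openai, and langchain-community packages, an Azure OpenAI embedding deployment, and a pgvector-enabled Postgres instance (all names, files, and connection details below are placeholders):

```python
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_openai import AzureOpenAIEmbeddings
from langchain_community.vectorstores.pgvector import PGVector

# Split raw text into overlapping chunks that fit the embedding model's context.
raw_text = open("report.txt").read()
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
docs = splitter.create_documents([raw_text])

# Embed the chunks and write them into a pgvector-backed collection.
# AzureOpenAIEmbeddings reads endpoint, key, and API version from environment
# variables; the deployment name below is a placeholder.
embeddings = AzureOpenAIEmbeddings(azure_deployment="text-embedding-ada-002")
store = PGVector.from_documents(
    documents=docs,
    embedding=embeddings,
    collection_name="course_docs",
    connection_string="postgresql+psycopg2://user:password@localhost:5432/vectordb",
)

# Retrieval step of a RAG pipeline: fetch the chunks most similar to the question,
# which would then be passed to the LLM as context.
results = store.similarity_search("What is covered in the report?", k=3)
```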
After covering the basics, we move from notebooks to docker-compose, spinning up the services locally, and take a close look at how these services work.
The next step is deploying these services to the cloud, where we learn about further Azure services such as Container Registry and App Service.
Once the web apps are set up, we implement an event-driven indexing process: blob triggers, Event Grid, and Azure Functions work together to index documents whenever they change in Blob Storage.
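A minimal sketch of such a function, using the Azure Functions Python v2 programming model; the container name, connection setting, and indexing logic are placeholders, and the Event Grid wiring is configured separately in Azure.

```python
import logging

import azure.functions as func

app = func.FunctionApp()

# Fires whenever a blob in the "documents" container is created or updated.
# "AzureWebJobsStorage" names an app setting holding the storage connection string.
@app.blob_trigger(arg_name="blob", path="documents/{name}", connection="AzureWebJobsStorage")
def index_document(blob: func.InputStream):
    content = blob.read()
    logging.info("Indexing %s (%d bytes)", blob.name, len(content))
    # Placeholder for the real work: chunk the document, embed it,
    # and upsert the vectors into the store.
```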
The final chapters cover basic security measures, such as configuring a firewall for the database and applying IP-based access restrictions.
This course is tailored to individuals with foundational knowledge of Python, Docker, and LangChain, and it is perfect for anyone looking to move beyond simple Streamlit playground apps and build real applications with a production-grade architecture.