

CPU Inference with ViT ONNX Model in Azure ML Managed Endpoint - AKS

The Machine Learning Engineer via YouTube

Overview

Learn how to perform CPU inference on Azure Kubernetes Service (AKS) by creating a Managed Endpoint in Azure Machine Learning Studio. The video walks through converting a Vision Transformer (ViT) model to ONNX format and serving it with onnxruntime using the Azure ML Python SDK v2. This 49-minute tutorial guides you through setting up and deploying a machine learning model for efficient inference in a cloud environment, demonstrating essential MLOps practices for data scientists and machine learning engineers.
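The overview describes two steps that are worth sketching. The first is exporting a ViT classifier to ONNX and checking it with onnxruntime on CPU. The snippet below is a minimal sketch, assuming a Hugging Face checkpoint (google/vit-base-patch16-224) and a local output file vit.onnx; the exact checkpoint and export settings used in the video may differ.

```python
import numpy as np
import torch
from transformers import ViTForImageClassification
import onnxruntime as ort

# Assumption: a standard Hugging Face ViT image classifier (the video's
# checkpoint may be different).
model = ViTForImageClassification.from_pretrained("google/vit-base-patch16-224")
model.eval()
model.config.return_dict = False  # return plain tuples so tracing stays simple

# Export to ONNX with a dynamic batch dimension.
dummy = torch.randn(1, 3, 224, 224)
torch.onnx.export(
    model,
    dummy,
    "vit.onnx",
    input_names=["pixel_values"],
    output_names=["logits"],
    dynamic_axes={"pixel_values": {0: "batch"}, "logits": {0: "batch"}},
    opset_version=14,
)

# Sanity-check the exported model with a CPU-only onnxruntime session.
session = ort.InferenceSession("vit.onnx", providers=["CPUExecutionProvider"])
logits = session.run(["logits"], {"pixel_values": dummy.numpy()})[0]
print("Predicted class id:", int(np.argmax(logits, axis=-1)[0]))
```

The second step is publishing the model behind a managed online endpoint with the Azure ML Python SDK v2. The sketch below shows the general shape of that deployment; the subscription, resource group, workspace, scoring script (score.py), conda file (conda.yaml), base image, and instance type are placeholders, not values taken from the video.

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import (
    CodeConfiguration,
    Environment,
    ManagedOnlineDeployment,
    ManagedOnlineEndpoint,
    Model,
)
from azure.identity import DefaultAzureCredential

# Placeholders: substitute your own subscription, resource group, and workspace.
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Create the endpoint, then attach a deployment that serves the ONNX model.
endpoint = ManagedOnlineEndpoint(name="vit-onnx-cpu", auth_mode="key")
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name="vit-onnx-cpu",
    model=Model(path="vit.onnx"),
    environment=Environment(
        conda_file="conda.yaml",  # assumed to pin onnxruntime and numpy
        image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
    ),
    code_configuration=CodeConfiguration(code="./score", scoring_script="score.py"),
    instance_type="Standard_DS3_v2",  # CPU SKU; the video may use a different size
    instance_count=1,
)
ml_client.online_deployments.begin_create_or_update(deployment).result()
```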

Syllabus

MLOps: CPU Inference with ViT ONNX Model in Azure ML Managed Endpoint (AKS)

Taught by

The Machine Learning Engineer

