MLOps MLFlow: Convert a T5 Large Model to ONNX and Quantize to INT8
The Machine Learning Engineer via YouTube
Overview
Learn how to convert a T5 Large model to ONNX format and quantize it to INT8 using Hugging Face's Optimum library in this Spanish-language video tutorial. Follow along with a practical demonstration that converts a fine-tuned text summarization model while tracking every step in MLFlow. A companion Jupyter notebook provides the complete implementation, covering the full workflow from ONNX export to INT8 quantization for more efficient inference in machine learning operations.
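The general workflow shown in the video can be sketched roughly as follows. This is a minimal example for orientation, not the notebook from the tutorial: the model ID, directory names, run name, and quantization settings are assumptions, and the exact ONNX file names Optimum produces can vary by version.

# Minimal sketch: export a fine-tuned T5 summarization model to ONNX with
# Hugging Face Optimum, apply dynamic INT8 quantization, and log the
# artifacts to MLflow. All names and paths below are illustrative.
import mlflow
from optimum.onnxruntime import ORTModelForSeq2SeqLM, ORTQuantizer
from optimum.onnxruntime.configuration import AutoQuantizationConfig

model_id = "my-finetuned-t5-large-summarizer"  # hypothetical checkpoint
onnx_dir = "t5_onnx"
quant_dir = "t5_onnx_int8"

with mlflow.start_run(run_name="t5-onnx-int8"):
    # 1. Export the fine-tuned checkpoint to ONNX (encoder/decoder graphs).
    model = ORTModelForSeq2SeqLM.from_pretrained(model_id, export=True)
    model.save_pretrained(onnx_dir)
    mlflow.log_param("base_model", model_id)

    # 2. Dynamic INT8 quantization, applied to each exported ONNX file
    #    (file names may differ depending on the Optimum version).
    qconfig = AutoQuantizationConfig.avx512_vnni(is_static=False, per_channel=False)
    for onnx_file in ["encoder_model.onnx", "decoder_model.onnx",
                      "decoder_with_past_model.onnx"]:
        quantizer = ORTQuantizer.from_pretrained(onnx_dir, file_name=onnx_file)
        quantizer.quantize(save_dir=quant_dir, quantization_config=qconfig)

    # 3. Track both the full-precision and the quantized ONNX models in MLflow.
    mlflow.log_artifacts(onnx_dir, artifact_path="onnx_fp32")
    mlflow.log_artifacts(quant_dir, artifact_path="onnx_int8")

Dynamic quantization is used here because it needs no calibration dataset; static quantization is also possible with Optimum but requires one.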
Syllabus
MLOps MLFlow: Convert to ONNX and Quantize to INT8 (Spanish) #datascience #machinelearning
Taught by
The Machine Learning Engineer