Explore Common Crawl data processing, comparing datasets like C4 and RefinedWeb. Learn about quality filters, deduplication strategies, and LLM-assisted filtering for creating high-quality datasets like FineWeb-Edu.
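Of the dedup strategies covered, the simplest is exact deduplication after light normalization. A minimal stdlib sketch (the function names and the normalization choices are illustrative, not from any particular pipeline):

```python
import hashlib

def normalize(text: str) -> str:
    # Lowercase and collapse whitespace so trivial variants hash identically.
    return " ".join(text.lower().split())

def exact_dedupe(docs: list[str]) -> list[str]:
    # Keep only the first occurrence of each normalized document.
    seen, kept = set(), []
    for doc in docs:
        digest = hashlib.sha256(normalize(doc).encode()).hexdigest()
        if digest not in seen:
            seen.add(digest)
            kept.append(doc)
    return kept

docs = ["Hello   World", "hello world", "Another page"]
print(exact_dedupe(docs))  # → ['Hello   World', 'Another page']
```

Production pipelines typically add near-duplicate detection (e.g. MinHash) on top of this, since web text rarely repeats byte-for-byte.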
Master data preparation techniques for AI models, including filtering, balancing, and synthetic dataset creation. Explore clustering methods, chat templates, and handling mixed-language data for improved model performance.
Learn techniques for anonymizing sensitive data in LLM prompts, including using libraries like Presidio and Outlines. Explore practical demos and gain insights on vLLM, TGI, and GGUF for Mac.
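Presidio handles detection with NER models plus pattern recognizers; the core idea can be sketched with stdlib regexes alone. The patterns below are simplified placeholders, far narrower than what Presidio ships:

```python
import re

# Hypothetical patterns for illustration; real PII detection needs far broader coverage.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def anonymize(prompt: str) -> str:
    # Replace each match with a typed placeholder before sending the prompt to an LLM.
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"<{label}>", prompt)
    return prompt

print(anonymize("Contact jane.doe@example.com or 555-123-4567"))
# → Contact <EMAIL> or <PHONE>
```

Typed placeholders (rather than blanket redaction) let the LLM still reason about the entity's role in the prompt.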
Explore LoRA fine-tuning techniques, from parameter selection to optimization strategies, enhancing your ability to customize language models effectively.
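The core of LoRA is a frozen weight plus a trainable low-rank update, scaled by alpha/r. A minimal NumPy sketch (shapes and hyperparameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r, alpha = 64, 64, 8, 16  # r and alpha are the key LoRA knobs

W = rng.normal(size=(d_in, d_out))      # frozen pretrained weight
A = rng.normal(size=(d_in, r)) * 0.01   # trainable down-projection
B = np.zeros((r, d_out))                # trainable up-projection, zero-initialized

def lora_forward(x):
    # Frozen path plus scaled low-rank update: x W + (alpha/r) x A B.
    return x @ W + (alpha / r) * (x @ A @ B)

x = rng.normal(size=(2, d_in))
assert np.allclose(lora_forward(x), x @ W)  # zero-init B: output matches base model
```

The zero-initialized B matrix means fine-tuning starts exactly at the pretrained model and only the small A/B factors (2 * d * r parameters instead of d * d) receive gradients.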
Learn to fine-tune Llama 3 on Wikipedia datasets for low-resource languages. Master dataset creation, LoRA setup, blending techniques, and parameter optimization for improved language model performance.
Explore fine-tuning techniques for multi-modal video and text models, covering dataset generation, clipping, querying, and evaluation using IDEFICS 2 and Jupyter Notebooks.
Dive deep into advanced transformer concepts, exploring encoder-decoder architectures, GPT-4o, and positional embeddings. Apply knowledge to weather prediction and analyze model performance.
Explore advanced multi-modal model fine-tuning and deployment techniques, focusing on IDEFICS 2 and LLaVA Llama 3 for enhanced image-text processing capabilities.
Explore IDEFICS 2 API, vLLM vs TGI, and fine-tuning techniques. Gain insights on deploying multimodal models, transformer architectures, and advanced training methods for AI development.
Explore the advanced fine-tuning techniques ReFT and LoRA for efficient parameter optimization in transformer models. Learn implementation, comparison, and practical tips for improved model performance.
Explore fine-tuning and API setup for tiny text and vision models, covering multi-modal architectures, LoRA adapters, and deployment strategies for custom APIs.
Explore full fine-tuning vs (Q)LoRA techniques, comparing VRAM usage, training time, and quality. Learn parameter selection, tips for QLoRA, and step-by-step guidance for TinyLlama and Mistral 7B fine-tuning.
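The VRAM gap between the two approaches follows from bytes-per-parameter arithmetic. A rough back-of-the-envelope sketch (the rules of thumb and the adapter fraction are assumptions, and activation memory is excluded):

```python
def full_ft_gib(params_b: float) -> float:
    # Mixed-precision AdamW rule of thumb: ~16 bytes/param
    # (2 weight + 2 grad + 12 optimizer state), activations excluded.
    return params_b * 1e9 * 16 / 2**30

def qlora_gib(params_b: float, adapter_frac: float = 0.01) -> float:
    # 4-bit base weights (~0.5 bytes/param) plus a small adapter
    # trained in 16-bit with its own gradients and optimizer state.
    base = params_b * 1e9 * 0.5
    adapter = params_b * adapter_frac * 1e9 * 16
    return (base + adapter) / 2**30

print(f"Mistral 7B: full FT ≈ {full_ft_gib(7):.0f} GiB, QLoRA ≈ {qlora_gib(7):.0f} GiB")
```

Even as a rough estimate, this shows why a 7B model needs multiple GPUs for full fine-tuning but fits on a single consumer card with QLoRA.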
Master multi-GPU fine-tuning techniques using DDP and FSDP. Explore VRAM optimization, distributed training strategies, and practical implementation with code examples and demos.
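DDP's core loop, stripped of GPUs and NCCL, is: each replica computes gradients on its own data shard, an all-reduce averages them, and every replica applies the identical update. A toy single-process sketch of that idea (a one-parameter model; all names are illustrative):

```python
def local_gradients(shard, w):
    # Gradient of mean squared error for the 1-parameter model y = w * x.
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def all_reduce_mean(grads):
    # Stand-in for the NCCL all-reduce that averages gradients across GPUs.
    return sum(grads) / len(grads)

shards = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0), (4.0, 8.0)]]  # data split across 2 "GPUs"
w = 0.0
for _ in range(50):
    grads = [local_gradients(s, w) for s in shards]
    w -= 0.05 * all_reduce_mean(grads)
print(round(w, 2))  # → 2.0, the true slope of the synthetic data
```

FSDP differs in that parameters and optimizer state are also sharded across ranks, not just the data, which is what unlocks the VRAM savings the lesson covers.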
Explore combined preference and supervised fine-tuning with ORPO. Learn about fine-tuning methods, loss functions, and performance improvements. Includes practical demos and evaluations.
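ORPO combines the supervised NLL on the chosen answer with an odds-ratio term that pushes the chosen completion's odds above the rejected one's. A minimal stdlib sketch of the loss on per-sequence average log-probs (the lambda value and inputs are illustrative):

```python
import math

def log_odds(logp: float) -> float:
    # odds(p) = p / (1 - p), with p = exp(average token log-prob of the sequence).
    p = math.exp(logp)
    return math.log(p / (1 - p))

def orpo_loss(logp_chosen: float, logp_rejected: float, lam: float = 0.1) -> float:
    # Supervised NLL on the chosen answer plus the odds-ratio preference term:
    # -log sigmoid(log-odds(chosen) - log-odds(rejected)).
    ratio = log_odds(logp_chosen) - log_odds(logp_rejected)
    l_or = -math.log(1 / (1 + math.exp(-ratio)))
    return -logp_chosen + lam * l_or

# The preference term shrinks as the gap between chosen and rejected widens.
print(orpo_loss(-0.5, -2.0) < orpo_loss(-0.5, -0.6))  # → True
```

Because the preference signal rides on the same forward pass as the SFT loss, ORPO needs no separate reference model, unlike DPO.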
Deploy serverless inference endpoints efficiently. Learn setup, use cases, and cost comparisons for API-based machine learning model deployment.