Explore groundbreaking research on BitNet b1.58, a ternary parameter model matching full-precision Transformers while offering improved cost-effectiveness and defining new LLM scaling laws.
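The "ternary parameter" idea behind BitNet b1.58 can be illustrated with a tiny absmean quantizer that maps weights to {-1, 0, +1} plus a per-tensor scale. This is a hedged sketch of the quantization step only, not the paper's training recipe; the function name and tolerance constant are ours.

```python
import numpy as np

def absmean_ternary_quantize(w):
    """Quantize a weight matrix to {-1, 0, +1}, scaled by its mean
    absolute value, in the spirit of BitNet b1.58's 1.58-bit weights.
    Illustrative sketch only."""
    gamma = np.mean(np.abs(w)) + 1e-8      # per-tensor scale (epsilon avoids /0)
    w_ternary = np.clip(np.round(w / gamma), -1, 1)
    return w_ternary, gamma

w = np.array([[0.9, -0.05, -1.3],
              [0.4,  1.10, -0.6]])
q, g = absmean_ternary_quantize(w)
# Each entry of q is -1, 0, or +1; w is approximated by g * q.
```

The scale `g` lets the dequantized product `g * q` approximate the original weights while each entry needs only ~1.58 bits.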
Explore SparQ Attention, a technique for increasing LLM inference throughput by reducing memory bandwidth in attention blocks through selective history fetching, applicable to off-the-shelf models without retraining.
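The "selective history fetching" mentioned above can be sketched as a two-step approximation: score keys cheaply using only the largest-magnitude query components, then fetch full keys and values for just the top-k candidates. This is a simplified, assumption-laden illustration, not the paper's exact algorithm; all names and defaults are ours.

```python
import numpy as np

def sparq_attention_sketch(q, K, V, r=4, k=8):
    """SparQ-flavoured attention sketch: approximate scores with r query
    components, then run full attention over only the top-k keys,
    reducing how much of K and V must be read from memory."""
    d = q.shape[0]
    # Step 1: approximate scores from the r largest-magnitude query dims
    idx = np.argsort(-np.abs(q))[:r]
    approx = K[:, idx] @ q[idx]
    # Step 2: fetch full K, V rows only for the top-k approximate scores
    top = np.argsort(-approx)[:k]
    scores = K[top] @ q / np.sqrt(d)
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ V[top]

rng = np.random.default_rng(0)
q_vec = rng.standard_normal(16)
K = rng.standard_normal((32, 16))
V = rng.standard_normal((32, 16))
out = sparq_attention_sketch(q_vec, K, V)
```

Only `r` columns of K are touched in step 1 and `k` full rows in step 2, which is the source of the claimed memory-bandwidth saving.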
Explore OpenMoE, an open-source Mixture-of-Experts language model series. Discover its cost-effectiveness, routing mechanisms, and key insights in comparison to dense LLMs.
Explore Teacher-Student architectures in knowledge distillation for AI model compression, expansion, adaptation, and enhancement. Gain insights into cutting-edge research and practical applications.
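The teacher-student objective at the core of knowledge distillation can be written as a blend of hard-label cross-entropy and KL divergence to the teacher's temperature-softened distribution (the classic Hinton-style loss, shown here as a hedged NumPy sketch; hyperparameters are illustrative).

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, labels_onehot,
                      T=2.0, alpha=0.5):
    """Teacher-student distillation sketch: alpha weights the KL term
    against the teacher's softened outputs; (1 - alpha) weights the
    usual cross-entropy on ground-truth labels."""
    p_t = softmax(teacher_logits / T)
    p_s = softmax(student_logits / T)
    kd = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))) * T * T
    ce = -np.sum(labels_onehot * np.log(softmax(student_logits) + 1e-12))
    return alpha * kd + (1 - alpha) * ce

s = np.array([2.0, 0.5, -1.0])   # student logits
t = np.array([2.2, 0.3, -1.1])   # teacher logits
y = np.array([1.0, 0.0, 0.0])    # one-hot label
loss = distillation_loss(s, t, y)
```

The `T * T` factor keeps gradient magnitudes comparable across temperatures; when student and teacher agree exactly, the KL term vanishes.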
Explore SliceGPT's innovative approach to compressing large language models by deleting rows and columns, maintaining high performance while reducing parameters significantly.
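The row-and-column deletion described above can be approximated in miniature: rotate a linear layer's input space onto the principal directions of its activations, then keep only the leading directions. This is a loose sketch of the intuition (the paper's actual method relies on a computational-invariance argument across transformer blocks); names and shapes here are ours.

```python
import numpy as np

def slice_layer(W, X, keep_dim):
    """SliceGPT-style compression sketch: project the layer's input
    space onto the top principal directions of activations X, deleting
    columns of W (and dimensions of X) beyond keep_dim."""
    # Principal directions of the incoming activations
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    Q = Vt.T[:, :keep_dim]      # rotation, shape (d_in, keep_dim)
    return W @ Q, X @ Q         # smaller weights and projected inputs

rng = np.random.default_rng(1)
X = rng.standard_normal((20, 6))   # activations (n_samples, d_in)
W = rng.standard_normal((4, 6))    # linear layer (d_out, d_in)
W_full, X_full = slice_layer(W, X, keep_dim=6)   # lossless rotation
W_small, X_small = slice_layer(W, X, keep_dim=4) # ~33% fewer columns
```

With `keep_dim` equal to the full input width the rotation is exact; shrinking it trades a small reconstruction error for a proportional cut in parameters.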
Explore Contrastive Preference Optimization, a novel approach to enhance LLM performance in translation tasks by guiding models towards producing superior translations.
Explore OpenAI's Sora and Google's Lumiere text-to-video models. Compare architectures, capabilities, and emerging simulation features for diverse, coherent video generation from text prompts.
Explore hardware accelerators enhancing LLM performance and efficiency, covering GPUs, FPGAs, and specialized designs in this comprehensive survey presentation.
Explore Distil-Whisper: a compact, efficient speech recognition model. Learn about its robust knowledge distillation, large-scale pseudo-labelling, and performance compared to the larger Whisper model.
Explore essential sparsity in large pre-trained models, examining efficient handling of complex transformers and the impact of weight removal on performance.
Explore how large language models can enhance compiler optimization, focusing on a custom 7B parameter Transformer model that generates optimal compiler options for minimizing code size.
Master Python coding best practices for clean, efficient, and maintainable machine learning solutions using Ivy. Learn essential principles for variables, functions, classes, and conventions.
Explore SAM, a versatile image segmentation model by Meta AI. Learn about its multitask capabilities, zero-shot transferability, and applications across various scenes and objects.
Explore device handling in Ivy, including policies, decorators, and framework comparisons. Learn efficient development and optimal deployment of machine learning solutions.
Explore Llama V2's safety features in this concluding session. Gain insights into Meta AI's latest language model iteration and its focus on secure AI development.