Explore how to distill commonsense knowledge from large language models to create high-quality knowledge graphs and train smaller, specialized models for improved performance in commonsense reasoning tasks.
Explore Topographic VAEs: a novel approach to deep generative models with organized latent variables, bridging topographic organization and equivariance in neural networks for improved feature learning and transformation handling.
Explores an innovative Transformer model with unbounded memory, enabling processing of arbitrarily long sequences. Discusses continuous attention mechanisms, sticky memories, and potential applications in language modeling.
ALiBi: A novel attention mechanism enabling transformers to process longer sequences than trained on. Uses linear biases instead of position encodings, improving efficiency and extrapolation capabilities.
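The linear-bias idea in this blurb can be sketched in a few lines: instead of adding position encodings to the inputs, ALiBi adds a distance-proportional penalty directly to the attention logits, with a different slope per head. A minimal NumPy sketch of that causal bias matrix (function name and shapes are illustrative, not from the course):

```python
import numpy as np

def alibi_bias(n_heads: int, seq_len: int) -> np.ndarray:
    """Causal ALiBi bias to add to attention logits, shape (n_heads, seq_len, seq_len)."""
    # Per-head slopes form a geometric sequence, as described in the ALiBi paper.
    slopes = 2.0 ** (-8.0 * np.arange(1, n_heads + 1) / n_heads)
    pos = np.arange(seq_len)
    # dist[i, j] = j - i is <= 0 for allowed (past/current) keys in causal attention
    dist = pos[None, :] - pos[:, None]
    bias = slopes[:, None, None] * dist      # penalty grows linearly with distance
    return np.where(dist <= 0, bias, -np.inf)  # mask future positions
```

Because the bias depends only on relative distance, the same matrix extends naturally to sequence lengths longer than those seen during training, which is the extrapolation property the blurb mentions.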
Explores underspecification in ML pipelines, where models with similar training performance behave differently in real-world deployment, leading to instability and poor outcomes across various domains.
Explore how pre-trained language models can be used to construct knowledge graphs without human supervision, uncovering new relations and outperforming existing methods in automated KG construction.
Explore a novel approach to efficient attention mechanisms in Transformers using random positive orthogonal features, enabling linear-time approximation of attention matrices for scalable deep learning architectures.
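The random-feature trick behind this course's topic (the Performer's FAVOR+ mechanism) can be sketched compactly: the softmax kernel is approximated with positive random features, so attention can be computed as two matrix products in linear time rather than materializing the full attention matrix. A simplified NumPy sketch, using plain Gaussian features where the paper uses orthogonal blocks (names and defaults here are illustrative):

```python
import numpy as np

def favor_attention(Q, K, V, n_features=256, seed=0):
    """Linear-time approximation of softmax attention via positive random features."""
    d = Q.shape[-1]
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_features, d))

    def phi(X):
        # Positive features for the softmax kernel: exp(w.x - |x|^2 / 2) / sqrt(m).
        Xs = X / d ** 0.25  # folds in the 1/sqrt(d) scaling of dot-product attention
        return np.exp(Xs @ W.T - 0.5 * (Xs ** 2).sum(-1, keepdims=True)) / np.sqrt(n_features)

    Qp, Kp = phi(Q), phi(K)
    # Linear time: form (K')^T V once, shape (m, d_v), then multiply by Q'.
    num = Qp @ (Kp.T @ V)
    den = Qp @ Kp.sum(axis=0)
    return num / den[:, None]
```

The cost is O(L * m * d) in the sequence length L, versus O(L^2 * d) for exact attention; more random features m trade compute for a tighter approximation.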
Explore LambdaNetworks, a novel approach to modeling long-range interactions in AI without traditional attention mechanisms, offering improved efficiency and performance in computer vision tasks.
Comprehensive analysis of popular deep learning optimizers, comparing performance across tasks and providing evidence-based recommendations for algorithm selection and parameter tuning.
Explore how Transformers outperform CNNs in image recognition, delving into Vision Transformer architecture, experimental results, and implications for AI research.
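The core architectural move the blurb refers to is simple: the Vision Transformer splits an image into fixed-size non-overlapping patches, flattens each into a vector, and feeds the resulting sequence to a standard Transformer. A minimal NumPy sketch of that patching step (function name and default patch size are illustrative):

```python
import numpy as np

def patchify(images: np.ndarray, patch: int = 16) -> np.ndarray:
    """Split images (N, H, W, C) into a sequence of flattened patches, as in ViT.

    Returns shape (N, num_patches, patch * patch * C).
    """
    n, h, w, c = images.shape
    assert h % patch == 0 and w % patch == 0, "image dims must be divisible by patch size"
    x = images.reshape(n, h // patch, patch, w // patch, patch, c)
    x = x.transpose(0, 1, 3, 2, 4, 5)  # (N, rows, cols, patch, patch, C)
    return x.reshape(n, (h // patch) * (w // patch), patch * patch * c)
```

For a standard 224x224 RGB input with 16x16 patches this yields a sequence of 196 tokens of dimension 768, which a linear projection then maps to the model width before the Transformer encoder.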
Explores a novel approach to machine learning optimization using neural networks, discussing its architecture, training process, and potential to revolutionize model training through self-improvement and generalization.
Explores the "hardware lottery" concept in AI research, discussing how hardware availability influences algorithmic success, with historical examples and future implications for machine learning progress.
Explore how AlphaZero assesses chess variants, analyzing game dynamics and strategic changes resulting from rule modifications. Gain insights into AI's potential for game design and balance evaluation.
Explore OpenAI's innovative approach to text summarization using human feedback and reinforcement learning, improving summary quality beyond traditional metrics and supervised learning methods.
Explore how neural cellular automata can achieve global consensus through local communication, applied to digit classification. Discover innovative approaches to modeling biological systems and machine learning tasks.