Discover how Large Language Models enhance Bayesian optimization through LLAMBO, improving hyperparameter tuning and black-box function optimization with natural language integration.
Explore groundbreaking research on autonomous AI systems capable of end-to-end scientific discovery in machine learning, from generating ideas to writing papers and performing peer review.
Explore neural architecture search through einspace, a novel approach using fundamental operations to discover diverse and competitive network architectures for machine learning tasks.
Explore cutting-edge research on Mixture-of-Supernets architecture that enhances neural architecture search efficiency and improves BERT and MT model performance through innovative weight-sharing techniques.
Discover how scaling Gaussian process lengthscale prior with dimensionality enables vanilla Bayesian optimization to excel in high-dimensional optimization problems, outperforming complex alternatives.
Explore advanced techniques in neural architecture optimization across multiple objectives and hardware devices, focusing on Pareto front profiling and hypernetwork-based solutions.
Explore advanced neural architecture search techniques for optimizing few-shot learning adaptation strategies, focusing on meta-training and meta-testing trade-offs in multi-domain contexts.
Explore how Mamba, a state space model, demonstrates in-context learning capabilities comparable to transformers while handling longer input sequences more efficiently in AI applications.
Discover a methodology for efficiently selecting and fine-tuning pretrained models through meta-learning, optimizing model selection and hyperparameter configurations for new datasets.
Discover how the MotherNet hypernetwork architecture revolutionizes tabular classification by generating optimized neural networks through in-context learning, offering faster training and performance competitive with XGBoost.
Discover advanced hyperparameter optimization techniques through Population Based Training (PBT) and its multi-objective evolution, focusing on real-world applications in precision, fairness, and robustness.
Discover how CAAFE leverages large language models to enhance automated feature engineering in data science, improving model performance through context-aware feature generation and semantic understanding.
Delve into efficient training algorithms for Transformer models, exploring dynamic architectures, batch selection, and optimizers while examining their effectiveness against baseline methods.
Explore the complexities of hyperparameter optimization in Reinforcement Learning, examining AutoML tools and methods to better understand and tune RL algorithms for improved performance.
Discover advanced Bayesian optimization techniques for biomedical data analysis, focusing on multi-objective approaches with heuristic objectives in unsupervised bioinformatics workflows.
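The Population Based Training listing above refers to an exploit-and-explore loop over a population of workers: underperformers copy the hyperparameters of top performers, then perturb them. A minimal sketch on a toy one-dimensional objective (the `evaluate` function, population size, and perturbation factors here are illustrative, not taken from any listed course):

```python
import random

def evaluate(h):
    # Toy objective for a single hyperparameter h; maximum at h = 0.5.
    return -(h - 0.5) ** 2

def pbt(pop_size=8, steps=20, seed=0):
    rng = random.Random(seed)
    # Population of candidate hyperparameter values in [0, 1].
    population = [rng.uniform(0, 1) for _ in range(pop_size)]
    for _ in range(steps):
        # Rank the population from worst to best.
        ranked = sorted(population, key=evaluate)
        quartile = max(1, pop_size // 4)
        for i in range(quartile):
            # Exploit: a bottom-quartile member copies a top-quartile member.
            h = ranked[-1 - rng.randrange(quartile)]
            # Explore: perturb the copied value, clamped back into [0, 1].
            ranked[i] = min(1.0, max(0.0, h * rng.choice([0.8, 1.2])))
        population = ranked
    return max(population, key=evaluate)

best = pbt()
```

Real PBT schedulers (e.g. in Ray Tune) run this loop asynchronously across parallel training jobs and copy model weights along with the hyperparameters; this sketch keeps only the selection-and-perturbation core.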