Generative AI Engineering with LLMs
IBM via Coursera Specialization
Overview
The Gen AI market is expected to grow 46% yearly through 2030 (source: Statista), and Gen AI engineers are in high demand. This program gives aspiring data scientists, machine learning engineers, and AI developers the essential skills in Gen AI, large language models (LLMs), and natural language processing (NLP) that employers need.
Gen AI engineers design systems that understand human language. They use LLMs and machine learning to build these systems.
During this program, you will develop skills to build apps using frameworks and pre-trained foundation models such as BERT, GPT, and LLaMA. You’ll use the Hugging Face Transformers library, the PyTorch deep learning library, retrieval-augmented generation (RAG), and the LangChain framework to develop and deploy NLP apps based on LLMs. Plus, you’ll explore tokenization, data loaders, language and embedding models, transformer techniques, attention mechanisms, and prompt engineering.
Through the series of short courses in this specialization, you’ll also gain practical experience in hands-on labs and a guided project, experience you can discuss in interviews.
This program is ideal for gaining the job-ready skills that Gen AI engineers, machine learning engineers, data scientists, and AI developers require. Note that you need a working knowledge of Python, machine learning, and neural networks; exposure to PyTorch is helpful.
Syllabus
Course 1: Generative AI and LLMs: Architecture and Data Preparation
- Offered by IBM. This IBM short course, a part of Generative AI Engineering Essentials with LLMs Professional Certificate, will teach you the basics of using generative AI and Large Language Models (LLMs).
Course 2: Gen AI Foundational Models for NLP & Language Understanding
- Offered by IBM. This IBM course will teach you how to implement, train, and evaluate generative AI models for natural language processing (NLP).
Course 3: Generative AI Language Modeling with Transformers
- Offered by IBM. This course provides you with an overview of how to use transformer-based models for natural language processing (NLP).
Course 4: Generative AI Engineering and Fine-Tuning Transformers
- Offered by IBM. The demand for technical gen AI skills is exploding. Businesses are hunting hard for AI engineers who can work with large language models (LLMs).
Course 5: Generative AI Advance Fine-Tuning for LLMs
- Offered by IBM. Fine-tuning a large language model (LLM) is crucial for aligning it with specific business needs, enhancing accuracy, and optimizing its performance.
Course 6: Fundamentals of AI Agents Using RAG and LangChain
- Offered by IBM. Business demand for technical gen AI skills is exploding and AI engineers who can work with large language models (LLMs) are in high demand.
Course 7: Project: Generative AI Applications with RAG and LangChain
- Offered by IBM. Get ready to put all your gen AI engineering skills into practice! This guided project will test and apply the knowledge and ...
Courses
Course 1: Generative AI and LLMs: Architecture and Data Preparation
This IBM short course, a part of Generative AI Engineering Essentials with LLMs Professional Certificate, will teach you the basics of using generative AI and Large Language Models (LLMs). This course is suitable for existing and aspiring data scientists, machine learning engineers, deep learning engineers, and AI engineers. You will learn about the types of generative AI and its real-world applications. You will gain the knowledge to differentiate between various generative AI architectures and models, such as Recurrent Neural Networks (RNNs), Transformers, Generative Adversarial Networks (GANs), Variational AutoEncoders (VAEs), and Diffusion Models. You will learn the differences in the training approaches used for each model. You will be able to explain the use of LLMs, such as Generative Pre-Trained Transformers (GPT) and Bidirectional Encoder Representations from Transformers (BERT). You will also learn about the tokenization process, tokenization methods, and the use of tokenizers for word-based, character-based, and subword-based tokenization. You will be able to explain how you can use data loaders for training generative AI models and list the PyTorch libraries for preparing and handling data within data loaders. The knowledge acquired will help you use the generative AI libraries in Hugging Face. It will also prepare you to implement tokenization and create an NLP data loader. For this course, a basic knowledge of Python and PyTorch and an awareness of machine learning and neural networks would be an advantage, though not strictly required.
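To make the tokenization and data-loader topics concrete, here is a minimal sketch pairing a Hugging Face tokenizer with a PyTorch DataLoader; the checkpoint name and sample sentences are illustrative, not taken from the course materials.

```python
from torch.utils.data import DataLoader
from transformers import AutoTokenizer

# Illustrative checkpoint: any Hugging Face model with a tokenizer works here.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

texts = [
    "Generative AI models create new content.",
    "Tokenizers split text into subword units.",
]

def collate(batch):
    # Subword-tokenize a batch of raw strings and pad to the longest
    # sequence, returning tensors ready for a model's forward pass.
    return tokenizer(batch, padding=True, truncation=True, return_tensors="pt")

loader = DataLoader(texts, batch_size=2, collate_fn=collate)
for batch in loader:
    print(batch["input_ids"].shape)  # e.g. torch.Size([2, 10])
```

Padding inside the collate function keeps each batch rectangular without padding the entire dataset to a single fixed length.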
Course 2: Gen AI Foundational Models for NLP & Language Understanding
This IBM course will teach you how to implement, train, and evaluate generative AI models for natural language processing (NLP). The course will help you acquire knowledge of NLP applications including document classification, language modeling, language translation, and fundamentals for building small and large language models. You will learn about converting words to features. You will understand one-hot encoding, bag-of-words, embedding, and embedding bags. You will also learn how Word2Vec embedding models are used for feature representation in text data. You will implement these capabilities using PyTorch. The course will teach you how to build, train, and optimize neural networks for document categorization. In addition, you will learn about the N-gram language model and sequence-to-sequence models. This course will help you evaluate the quality of generated text using metrics such as BLEU. You will practice what you learn using hands-on labs and perform tasks such as implementing document classification using torchtext in PyTorch. You will gain the skills to build and train a simple language model with a neural network to generate text and integrate pre-trained embedding models, such as Word2Vec, for text analysis and classification. In addition, you will apply your new skills to develop sequence-to-sequence models in PyTorch and perform tasks such as language translation.
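As an illustration of the embedding-bag approach to document classification described above, here is a minimal PyTorch sketch; the vocabulary size, embedding dimension, and class count are hypothetical.

```python
import torch
import torch.nn as nn

class TextClassifier(nn.Module):
    """Bag-of-embeddings classifier: average word embeddings, then a linear layer."""
    def __init__(self, vocab_size, embed_dim, num_classes):
        super().__init__()
        # EmbeddingBag averages the embeddings of each document's tokens.
        self.embedding = nn.EmbeddingBag(vocab_size, embed_dim, mode="mean")
        self.fc = nn.Linear(embed_dim, num_classes)

    def forward(self, token_ids, offsets):
        return self.fc(self.embedding(token_ids, offsets))

# Hypothetical sizes: a 10k-word vocabulary, 64-dim embeddings, 4 document classes.
model = TextClassifier(vocab_size=10_000, embed_dim=64, num_classes=4)

# Two documents packed into one flat tensor; offsets mark where each begins.
tokens = torch.tensor([1, 5, 9, 2, 7])  # doc 1: [1, 5, 9], doc 2: [2, 7]
offsets = torch.tensor([0, 3])
logits = model(tokens, offsets)          # shape: (2, 4)
```

Averaging embeddings discards word order, which is exactly the trade-off that motivates the N-gram and sequence-to-sequence models covered later in the course.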
Course 3: Generative AI Language Modeling with Transformers
This course provides you with an overview of how to use transformer-based models for natural language processing (NLP). In this course, you will learn to apply transformer-based models for text classification, focusing on the encoder component. You’ll learn about positional encoding, word embedding, and attention mechanisms in language transformers and their role in capturing contextual information and dependencies. Additionally, you will be introduced to multi-head attention and gain insights into decoder-based language modeling with generative pre-trained transformers (GPT) for language translation, training the models, and implementing them in PyTorch. Further, you’ll explore encoder-based models with bidirectional encoder representations from transformers (BERT) and train them using masked language modeling (MLM) and next sentence prediction (NSP). Finally, you will apply transformers for translation by gaining insight into the transformer architecture and implementing it in PyTorch. The course offers practical exposure with hands-on activities that enable you to apply your knowledge in real-world scenarios. This course is part of a specialized program tailored for individuals interested in generative AI engineering. This course requires a working knowledge of Python, PyTorch, and machine learning.
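The attention mechanism at the heart of these models is compact enough to sketch directly. Below is a minimal PyTorch implementation of scaled dot-product attention, the building block that multi-head attention repeats in parallel; the tensor shapes are illustrative.

```python
import math
import torch

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.size(-1)
    scores = Q @ K.transpose(-2, -1) / math.sqrt(d_k)  # pairwise query-key similarity
    weights = torch.softmax(scores, dim=-1)            # rows sum to 1: attention weights
    return weights @ V                                 # weighted sum of value vectors

# Illustrative shapes: batch of 1, sequence of 5 tokens, model dimension 8.
x = torch.randn(1, 5, 8)
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V
print(out.shape)  # torch.Size([1, 5, 8])
```

Because attention itself is order-agnostic, positional encodings are added to the input embeddings so the model can distinguish token positions.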
Course 4: Generative AI Engineering and Fine-Tuning Transformers
The demand for technical gen AI skills is exploding. Businesses are hunting hard for AI engineers who can work with large language models (LLMs). This Generative AI Engineering and Fine-Tuning Transformers course builds job-ready skills that will power your AI career forward. During this course, you’ll explore transformers, model frameworks, and platforms such as Hugging Face and PyTorch. You’ll begin with a general framework for optimizing LLMs and quickly move on to fine-tuning generative AI models. Plus, you’ll learn about parameter-efficient fine-tuning (PEFT), low-rank adaptation (LoRA), quantized low-rank adaptation (QLoRA), and prompting. Additionally, you’ll get valuable hands-on experience in online labs that you can talk about in interviews, including loading, pretraining, and fine-tuning models with Hugging Face and PyTorch. If you’re keen to take your AI career to the next level and boost your resume with in-demand gen AI competencies that catch the eye of an employer, ENROLL today and have job-ready skills you can use straight away within a week!
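As a taste of the PEFT techniques listed above, here is a minimal LoRA sketch using the Hugging Face peft library; the base checkpoint, rank, and target modules are illustrative choices, not the course's prescribed settings.

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForSequenceClassification

# Illustrative base model and task: binary sequence classification on BERT.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# LoRA freezes the pretrained weights and learns small low-rank update
# matrices inside the chosen attention projections, so only a tiny
# fraction of the parameters is trained.
config = LoraConfig(
    r=8,                                 # rank of the low-rank update
    lora_alpha=16,                       # scaling factor for the update
    lora_dropout=0.1,
    target_modules=["query", "value"],   # BERT attention projections
    task_type="SEQ_CLS",
)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # typically well under 1% of total parameters
```

QLoRA applies the same idea on top of a quantized base model, shrinking memory needs further at some cost in precision.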
Course 5: Generative AI Advance Fine-Tuning for LLMs
Fine-tuning a large language model (LLM) is crucial for aligning it with specific business needs, enhancing accuracy, and optimizing its performance. In turn, this gives businesses precise, actionable insights that drive efficiency and innovation. This course gives aspiring gen AI engineers valuable fine-tuning skills employers are actively seeking. During this course, you’ll explore different approaches to fine-tuning causal LLMs with human feedback and direct preference. You’ll look at LLMs as policies defining probability distributions over generated responses, and at the concepts of instruction-tuning with Hugging Face. You’ll learn to calculate rewards using human feedback and reward modeling with Hugging Face. Plus, you’ll explore reinforcement learning from human feedback (RLHF), proximal policy optimization (PPO) and the PPO Trainer, and optimal solutions for direct preference optimization (DPO) problems. As you learn, you’ll get valuable hands-on experience in online labs where you’ll work on reward modeling, PPO, and DPO. If you’re looking to add in-demand capabilities in fine-tuning LLMs to your resume, ENROLL TODAY and build the job-ready skills employers are looking for in just two weeks!
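Of the methods above, DPO is simple enough to sketch as a single loss function. Below is a minimal PyTorch version of the DPO objective over sequence log-probabilities; the function name and toy numbers are illustrative.

```python
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """Direct preference optimization: push the policy to prefer the chosen
    response over the rejected one, relative to a frozen reference model."""
    chosen_ratio = policy_chosen_logp - ref_chosen_logp        # log pi/pi_ref, preferred
    rejected_ratio = policy_rejected_logp - ref_rejected_logp  # log pi/pi_ref, rejected
    # -log sigmoid(beta * margin): minimized when the margin is large and positive.
    return -F.logsigmoid(beta * (chosen_ratio - rejected_ratio)).mean()

# Toy sequence log-probabilities for a batch of two preference pairs.
loss = dpo_loss(torch.tensor([-4.0, -3.5]), torch.tensor([-5.0, -4.8]),
                torch.tensor([-4.2, -3.6]), torch.tensor([-4.9, -4.7]))
```

Unlike RLHF with PPO, DPO needs no separate reward model or sampling loop; the preference data enters the loss directly.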
Course 6: Fundamentals of AI Agents Using RAG and LangChain
Business demand for technical gen AI skills is exploding, and AI engineers who can work with large language models (LLMs) are in high demand. This Fundamentals of AI Agents Using RAG and LangChain course builds job-ready skills that will fuel your AI career. During this course, you’ll explore retrieval-augmented generation (RAG), prompt engineering, and LangChain concepts. You’ll look at RAG, its applications, and its process, along with encoders, their tokenizers, and the FAISS library. Then, you’ll apply in-context learning and prompt engineering to design and refine prompts for accurate responses. Plus, you’ll explore LangChain tools, components, and chat models, and work with LangChain to simplify the application development process using LLMs. Additionally, you’ll get valuable hands-on practice in online labs developing applications using integrated LLM, LangChain, and RAG technologies. Plus, you’ll complete a real-world project you can discuss in interviews. If you’re keen to boost your resume and extend your generative AI skills to applying transformer-based LLMs, ENROLL today and build job-ready skills in just 8 hours.
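To ground the retrieval step, here is a minimal sketch of FAISS-based nearest-neighbour search, the core of RAG's retrieval stage; the documents and random stand-in embeddings are illustrative (a real pipeline would embed the text with an encoder model).

```python
import faiss
import numpy as np

docs = [
    "LangChain chains LLM calls and tools together.",
    "FAISS performs fast similarity search over dense vectors.",
    "RAG grounds an LLM's answer in retrieved passages.",
]

# Stand-in embeddings: a real pipeline would encode each document with an
# encoder; random vectors here only demonstrate the FAISS mechanics.
dim = 384
rng = np.random.default_rng(0)
doc_vecs = rng.random((len(docs), dim), dtype=np.float32)

index = faiss.IndexFlatL2(dim)  # exact L2 nearest-neighbour index
index.add(doc_vecs)             # one vector per document

query_vec = rng.random((1, dim), dtype=np.float32)
distances, ids = index.search(query_vec, 2)  # top-2 nearest documents
retrieved = [docs[i] for i in ids[0]]
# The retrieved passages are then inserted into the LLM prompt as context.
```

Everything downstream, such as prompt templates and chat models, is what LangChain orchestrates around this retrieval core.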
Taught by
Ashutosh Sagar, Fateme Akbari, Joseph Santarcangelo, Kang Wang, Roodra Pratap Kanwar and Wojciech 'Victor' Fulmyk