Overview
Schedule, monitor, and manage data workflows efficiently using tools like Apache Airflow. Build data pipelines by leveraging Airflow DAGs to organize tasks, and use AWS resources such as S3 and Redshift to process and move data between systems. Engage in hands-on projects to automate and maintain complex data pipelines, streamlining operations and improving data reliability. Gain expertise in workflow automation, data integration, and error handling, enabling you to construct efficient and scalable data pipelines in production environments. Ideal for data engineers and professionals aiming to advance their skills in managing and automating data workflows.
Syllabus
- Introduction to Automating Data Pipelines
- Welcome to Automating Data Pipelines. In this lesson, you'll be introduced to the topic, prerequisites for the course, and the environment and tools you'll be using to build data pipelines.
- Data Pipelines
- In this lesson, you'll learn about the components of a data pipeline, including Directed Acyclic Graphs (DAGs). You'll practice creating data pipelines with DAGs and Apache Airflow (a minimal DAG sketch appears after the syllabus).
- Airflow and AWS
- In this lesson, you'll create connections between Airflow and AWS: first creating credentials, then copying S3 data, leveraging connections and hooks, and finally building an S3-to-Redshift DAG (see the second sketch after the syllabus).
- Data Quality
- Students will learn how to track data lineage, set up data pipeline schedules, partition data to optimize pipelines, investigate data quality issues, and write tests to ensure data quality (see the data quality sketch after the syllabus).
- Production Data Pipelines
- In this last lesson, students will learn how to build pipelines with maintainability and reusability in mind. They will also learn about pipeline monitoring.
- Project: Data Pipelines
- Students work on a music streaming company’s data infrastructure by creating and automating a set of data pipelines with Airflow, then monitoring and debugging them in production.
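To make the lesson topics concrete, here is a minimal sketch of an Airflow DAG of the kind the Data Pipelines lesson covers, assuming Airflow 2.4 or later; the DAG id, task names, and task bodies are illustrative placeholders, not taken from the course materials.

```python
# A minimal sketch of an Airflow DAG, assuming Airflow 2.4+ is installed.
# The DAG id, task names, and task bodies are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Stand-in extract step; a real pipeline would pull data from a source.
    print("extracting data")


def load():
    # Stand-in load step; a real pipeline would write to a warehouse.
    print("loading data")


with DAG(
    dag_id="example_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # The >> operator defines the edge of the graph: extract runs before load.
    extract_task >> load_task
```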
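For the Airflow and AWS lesson, one common pattern is to use Airflow connections and hooks to read AWS credentials and issue a Redshift COPY from S3. The sketch below assumes the Amazon and Postgres provider packages are installed and that two hypothetical connections, aws_credentials and redshift, have been configured in Airflow; the bucket, table, and SQL are placeholders.

```python
# A hedged sketch of an S3-to-Redshift load step. It assumes the
# apache-airflow-providers-amazon and apache-airflow-providers-postgres
# packages are installed and that Airflow connections named "aws_credentials"
# and "redshift" exist; the table, bucket, and SQL are illustrative.
from airflow.providers.amazon.aws.hooks.s3 import S3Hook
from airflow.providers.postgres.hooks.postgres import PostgresHook


def copy_s3_to_redshift():
    # Read AWS credentials from the Airflow connection via a hook.
    aws_hook = S3Hook(aws_conn_id="aws_credentials")
    credentials = aws_hook.get_credentials()

    # Use a Postgres hook against Redshift to issue a COPY from S3.
    redshift = PostgresHook(postgres_conn_id="redshift")
    redshift.run(f"""
        COPY staging_events
        FROM 's3://example-bucket/log_data'
        ACCESS_KEY_ID '{credentials.access_key}'
        SECRET_ACCESS_KEY '{credentials.secret_key}'
        FORMAT AS JSON 'auto'
    """)
```

Wrapped in a PythonOperator (or a small custom operator), this callable becomes one task in a DAG like the sketch above.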
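For the Data Quality lesson, a check can be written as a task that raises when a rule is violated. A minimal sketch, assuming the same hypothetical redshift connection; the table name and has-rows rule are placeholders for the checks written in the course.

```python
# A hedged sketch of a data quality check, assuming a hypothetical "redshift"
# connection; the table name and the has-rows rule are placeholders.
from airflow.providers.postgres.hooks.postgres import PostgresHook


def check_table_has_rows(table: str = "staging_events"):
    redshift = PostgresHook(postgres_conn_id="redshift")
    records = redshift.get_records(f"SELECT COUNT(*) FROM {table}")
    if not records or not records[0] or records[0][0] == 0:
        # Raising fails the task, which surfaces the issue in the Airflow UI.
        raise ValueError(f"Data quality check failed: {table} returned no rows")
    print(f"Data quality check passed: {table} has {records[0][0]} rows")
```

Placed downstream of the load task, a failing check stops the run before bad data propagates further.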
Taught by
Sean Murdock