Cloud Composer is a fully managed workflow orchestration service for creating, scheduling, and monitoring pipelines. This course covers Composer's architecture and the creation of pipelines that run arbitrary shell scripts and Python code.
Cloud Composer is a pipeline orchestration service on Google Cloud Platform (GCP). Built on Apache Airflow, Composer launched in May 2018 and has quickly emerged as a popular and versatile service for building and executing system pipelines. In this course, Building Pipelines for Workflow Orchestration Using Google Composer, you'll learn how Composer allows cloud users to quickly create pipelines with complex interconnected tasks. First, you'll discover where Composer fits in the taxonomy of GCP services and how it compares to Dataflow, another GCP service for building and executing pipelines. Next, you'll explore what a Composer environment is and how pipelines are specified and run on these environments. Then, you'll develop an understanding of the powerful suite of operators available within Composer pipelines, using Airflow operators to execute shell scripts, run arbitrary Python code, and implement complex control flow. Finally, you'll learn how to use Airflow's GCP-specific operators to send email, work with BigQuery, and instantiate Dataproc clusters. When you're finished with this course, you'll have the skills and knowledge necessary to build and deploy complex Apache Airflow pipelines using Composer.