
Workflow orchestration with Airflow

Category

build data pipelines

Duration (fully-guided training)

8h

Flipped-classroom training duration

2h30min of videos and 6h of interactive workshop.

About the Course

Data transformation pipelines, like those built with dbt or Spark, rarely run by themselves. There are typically conditions that apply, such as "we first need the results from this API before we can upload the data to the database". Such workflows can be coded as part of your pipeline, but you risk creating a tangled mess that won't let you resume from the point where an error occurred.

In this workshop, you'll learn about Apache Airflow, one of the most popular tools for orchestrating such work. It also features a pleasant dashboard for following the daily progress of scheduled tasks, and lets you easily rerun tasks that didn't complete successfully.
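
To give a flavour of what that orchestration looks like, here is a minimal sketch of the API-then-database scenario described above, written with Airflow's TaskFlow API (assumes Airflow 2.x; the task names and logic are illustrative, not part of the course material):

```python
# Minimal illustrative Airflow DAG: fetch from an API, then upload to a database.
# Task names and bodies are placeholders, not from the course itself.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def api_to_database():
    @task
    def fetch_from_api():
        # Placeholder for calling the external API and returning its results.
        return {"rows": [1, 2, 3]}

    @task
    def upload_to_database(payload):
        # Placeholder for loading the fetched results into the database.
        print(f"Uploading {len(payload['rows'])} rows")

    # The upload task only runs once the API fetch has finished successfully,
    # and a failed run can be retried from the failed task rather than from scratch.
    upload_to_database(fetch_from_api())


api_to_database()
```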
