Workflow orchestration with Airflow
Create data transformation pipelines that scale & are easy to debug.
- Starts 17 Oct
- €500
- Data Minded office
Data transformation pipelines, like those built with dbt or Spark, rarely run on their own. There are usually conditions to satisfy, such as "we first need the results from this API before we can upload the data to the database". You can code such workflows into the pipeline itself, but you risk creating a tangled mess that can't resume from the point where an error occurred. In this workshop, you'll learn about Apache Airflow, one of the most popular tools for orchestrating this kind of work. It also features a pleasant dashboard for following the daily progress of your tasks, and makes it easy to rerun tasks that didn't complete successfully.
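To give a flavour of what you'll build: the dependency "fetch the API results before uploading to the database" can be expressed as an Airflow DAG. This is a minimal sketch using Airflow 2's TaskFlow API; the DAG and task names are illustrative, and the task bodies are placeholders.

```python
# Minimal sketch of an Airflow DAG (requires an Airflow 2.x installation).
# Expresses "fetch from API first, then upload to database".
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def api_to_database():
    @task
    def fetch_from_api():
        # Placeholder: call the upstream API here.
        return [{"id": 1, "value": 42}]

    @task
    def upload_to_database(records):
        # Placeholder: write the records to the database here.
        print(f"Uploading {len(records)} records")

    # Passing the output of one task into another creates the dependency:
    # upload_to_database only runs after fetch_from_api succeeds, and a
    # failed upload can be rerun from the dashboard without re-fetching.
    upload_to_database(fetch_from_api())


api_to_database()
```

Because each step is a separate task, Airflow records its state individually, which is exactly what lets you restart a pipeline from the task that failed rather than from the beginning.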
To cancel or reschedule, contact us at least 24h prior to the event. Cancellations received afterwards will be invoiced at 50% of the price.