Apache Airflow® Crash Course: From 0 to Running your Pipeline in the Cloud

With over 30 million monthly downloads, Apache Airflow® is the tool of choice for programmatically authoring, scheduling, and monitoring data pipelines. Airflow lets you define workflows as Python code, enabling dynamic, scalable pipelines suited to use cases ranging from ETL/ELT to running ML/AI operations in production.

This introductory tutorial is a crash course in writing and deploying your first Airflow pipeline. You will:

  • Get an overview of the foundational Airflow concepts you need to know
  • Create your first Airflow project in a local development environment
  • Write your first DAG
  • Deploy your DAG to the cloud

Get It Now!
