Live with Astronomer

How to Orchestrate Databricks Jobs Using Airflow

Watch On Demand

Hosted By

  • Daniel Imberman, Strategy Engineer
  • Kenten Danas, Lead Developer Advocate

This “Live with Astronomer” session covers how to use the Astro Databricks provider to orchestrate your Databricks Jobs from Airflow. This new provider allows you to monitor your Databricks Jobs from Airflow, write tasks that run Databricks Jobs using the DAG API you’re familiar with, and even send repair requests to Databricks when tasks fail.
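As a quick sketch of getting started (assuming the PyPI package name matches the official astronomer/astro-provider-databricks repo), installing the provider into an existing Airflow environment would look like:

```shell
# Install the Astro Databricks provider into an Airflow environment.
# Package name assumed from the astronomer/astro-provider-databricks repo.
pip install astro-provider-databricks
```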

Questions covered in this session include:

  • How is the Astro Databricks provider different from the original Airflow Databricks provider?
  • How do I install the Astro Databricks provider?
  • How can I use the DatabricksNotebookOperator to run and monitor Databricks Jobs from Airflow?
  • How does the Astro Databricks provider help me recover from failures in my Databricks Jobs?
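To make the questions above concrete, here is an illustrative sketch (not the session's exact code) of a DAG using the provider's DatabricksWorkflowTaskGroup and DatabricksNotebookOperator; the connection ID, notebook paths, and cluster spec are placeholder assumptions you would replace with your own workspace values:

```python
from datetime import datetime

from airflow.models.dag import DAG
from astro_databricks.operators.notebook import DatabricksNotebookOperator
from astro_databricks.operators.workflow import DatabricksWorkflowTaskGroup

# Placeholder job-cluster spec -- adjust to your Databricks workspace.
job_cluster_spec = [
    {
        "job_cluster_key": "example_cluster",
        "new_cluster": {
            "spark_version": "12.2.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 1,
        },
    }
]

with DAG(
    dag_id="example_databricks_workflow",
    start_date=datetime(2023, 1, 1),
    schedule=None,
) as dag:
    # Tasks in this group run together as a single Databricks Job, which
    # Airflow then monitors (and can send repair requests to on failure).
    with DatabricksWorkflowTaskGroup(
        group_id="databricks_workflow",
        databricks_conn_id="databricks_default",  # assumed connection ID
        job_clusters=job_cluster_spec,
    ) as workflow:
        extract = DatabricksNotebookOperator(
            task_id="extract",
            databricks_conn_id="databricks_default",
            notebook_path="/Shared/extract",  # placeholder notebook path
            source="WORKSPACE",
        )
        transform = DatabricksNotebookOperator(
            task_id="transform",
            databricks_conn_id="databricks_default",
            notebook_path="/Shared/transform",  # placeholder notebook path
            source="WORKSPACE",
        )
        extract >> transform
```

Because the whole task group maps to one Databricks Job, a failure in one notebook task can be retried from Airflow via a repair request rather than re-running the entire job.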

Learn more about the Astro Databricks Provider and see example code in the official repo.


Get Apache Airflow Certified

If you want to learn more about how to get started with Airflow, you can join the thousands of other data engineers who have earned the Astronomer Certification for Apache Airflow Fundamentals. The exam assesses your understanding of Airflow architecture basics and your ability to create simple data pipelines for scheduling and monitoring tasks.

Learn More About Certification