
Running Airflow Tasks in Isolated Environments

Watch On Demand

Hosted By

  • Tamara Fingerlin, Developer Advocate, Astronomer
  • Kenten Danas, Lead Developer Advocate, Astronomer

This webinar provides an extensive overview of the different options for running Airflow tasks in isolated environments, such as a Kubernetes pod or a Python virtual environment that is separate from your Airflow environment. Running tasks in isolation helps avoid common data pipeline problems, like dependency conflicts and resource contention, and gives DAG authors control over how and where their tasks run. It's useful, for example, if you need to run a task requiring Python packages that conflict with core Airflow.
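
For example, a task that needs a pandas version different from the one installed alongside Airflow could run in its own Python virtual environment with the @task.virtualenv decorator. The sketch below is illustrative rather than the webinar's sample code: the pandas pin and DAG settings are assumptions, and the syntax assumes Airflow 2.4+ with the TaskFlow API.

from pendulum import datetime

from airflow.decorators import dag, task


@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def virtualenv_sketch():

    @task.virtualenv(
        requirements=["pandas==1.5.3"],  # illustrative pin that may differ from the Airflow environment's pandas
        system_site_packages=False,      # build the venv without the Airflow environment's packages
    )
    def transform():
        # Runs in a virtual environment created just for this task,
        # so imports happen inside the function body.
        import pandas as pd

        df = pd.DataFrame({"a": [1, 2, 3]})
        return int(df["a"].sum())

    transform()


virtualenv_sketch()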

Questions covered in the webinar include:

  • What are the use cases for running Airflow tasks in isolated environments?
  • What operators should I use to run tasks in an isolated environment?
  • How do I use the suggested operators? (a short sketch follows this list)
  • How do I set up my Airflow infrastructure to support running tasks in isolated environments?
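
As a taste of the operator usage covered in the webinar, the sketch below runs a task in its own Kubernetes pod using the @task.kubernetes decorator, which wraps the KubernetesPodOperator from the cncf-kubernetes provider. The image, namespace, and in_cluster values are assumptions for illustration, not settings from the webinar, and the decorator requires a recent version of that provider.

from pendulum import datetime

from airflow.decorators import dag, task


@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def kubernetes_pod_sketch():

    @task.kubernetes(
        image="python:3.11-slim",  # assumption: any image containing the task's dependencies
        namespace="default",       # assumption: pods may be launched in the "default" namespace
        in_cluster=True,           # assumption: Airflow runs inside the same Kubernetes cluster
    )
    def run_in_pod():
        # Executes in its own pod, isolated from the Airflow workers'
        # Python environment and resource limits.
        print("Hello from an isolated pod!")

    run_in_pod()


kubernetes_pod_sketch()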

All of the sample code shown in this webinar can be found in this repo.




Get Apache Airflow Certified

If you want to learn more about how to get started with Airflow, you can join the thousands of other data engineers who have received the Astronomer Certification for Apache Airflow Fundamentals. This exam assesses an understanding of the basics of the Airflow architecture and the ability to create simple data pipelines for scheduling and monitoring tasks.

Learn More About Certification