This webinar provides an overview of three new Airflow features and supporting open source community projects that simplify your DAG authoring process. We cover how to use these tools to write less code and design more straightforward DAGs, as well as the use cases that benefit most from them. Questions covered in this session include:
- How can I use datasets to implement cross-DAG dependencies that have increased visibility?
- How can I use dynamic task mapping to create DAGs that dynamically update at runtime while also remaining scalable?
- How do I use the new Astro Python SDK to rapidly develop ELT workflows by turning my SQL queries and Python functions into Airflow tasks with no boilerplate code?
All of the code shown in this webinar can be found in this repo.
Get Apache Airflow Certified
If you want to learn more about how to get started with Airflow, you can join the thousands of other data engineers who have received the Astronomer Certification for Apache Airflow Fundamentals. This exam assesses an understanding of the basics of the Airflow architecture and the ability to create simple data pipelines for scheduling and monitoring tasks.

Learn More About Certification