Modern Data Orchestration
Build, run, and observe data pipelines-as-code with Astro, the cloud-native data orchestration platform powered by Apache Airflow™.

Focus on your pipelines, not on managing Apache Airflow
- Build Faster: Accelerate the development of data pipelines with tools that allow your entire data team to focus on code that impacts your business.
- Run With Confidence: Increase data availability with a reliable and efficient production runtime environment optimized for the cloud.
- See the Whole Picture: Make sense of your data universe with real-time visibility and actionable insights across environments.


Apache Airflow is the de facto standard for expressing data flows as code, with a robust and growing community of data engineers, scientists, and analysts around the world.
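For a taste of pipelines-as-code, here is a minimal sketch of a DAG using Airflow's TaskFlow API (the DAG and task names are illustrative, not from an Astronomer template):

# example_pipeline.py -- a minimal, illustrative Airflow 2.x DAG
from datetime import datetime

from airflow.decorators import dag, task

@dag(
    schedule_interval="@daily",      # run once per day
    start_date=datetime(2022, 1, 1),
    catchup=False,                   # skip backfilling past runs
)
def example_pipeline():
    @task
    def extract():
        # stand-in for pulling records from a source system
        return [1, 2, 3]

    @task
    def load(records):
        print(f"Loaded {len(records)} records")

    # TaskFlow infers the extract -> load dependency from the data handoff
    load(extract())

example_pipeline()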
Learn more about Airflow

Unified Data Flows
Bring order to your distributed data ecosystem with a modern orchestration platform.
Choose Your Deployment
Astro
Fully managed, deployed in your cloud or ours
Keep orchestration close to your data with a single-tenant data plane in your cloud or ours, no DevOps required. With a common control plane for data pipelines across clouds, you’ll sleep easy knowing your environment is managed by the core developers behind Apache Airflow.
Learn More About Astro

Software
Self managed, deployed in your private cloud
Launch, manage, and secure Airflow environments with an enterprise-ready software platform built for the most demanding settings.
Learn About Astronomer

Work Locally and Control Astro from Your Terminal
Write, test, and run DAGs in a lightweight local development environment.
astro dev init
> Airflow project initialized!
├── Dockerfile              # Base Airflow image
├── README.md
├── airflow_settings.yaml   # For local connections
├── dags
│   ├── example-dag-advanced.py
│   └── example-dag-basic.py
├── include                 # Scripts, helpers, etc.
├── packages.txt            # OS packages
├── plugins
└── requirements.txt        # Python packages
astro dev start
> Creating containers for Airflow Scheduler, Webserver, and Database...
> Airflow is starting up...
> Your local instance of Airflow is now running on localhost:8080 🚀
astro dev logs
> Printing logs from Airflow Scheduler, Webserver, and Workers...
================================================================================
DAG File Processing Stats
File Path                                        PID  Runtime  # DAGs  # Errors  Last Runtime  Last Run
-----------------------------------------------  ---  -------  ------  --------  ------------  -------------------
/usr/local/airflow/dags/example-dag-advanced.py                    1         0          0.48s  2022-02-17T16:32:14
/usr/local/airflow/dags/example-dag-basic.py                       1         0          1.22s  2022-02-17T16:32:15
================================================================================
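You can also test DAGs locally before they ever reach production. A minimal sketch of an import-integrity test (a hypothetical tests/test_dag_integrity.py, run with pytest from the project root):

# tests/test_dag_integrity.py -- hypothetical; parses dags/ the same way the scheduler does
from airflow.models import DagBag

def test_dags_import_cleanly():
    dag_bag = DagBag(dag_folder="dags", include_examples=False)
    # any syntax error or missing import in a DAG file shows up here
    assert not dag_bag.import_errors, f"DAG import failures: {dag_bag.import_errors}"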
Create, manage, and deploy to production from the comfort of your terminal or CI/CD processes.
astro deployment create my-airflow
NAME        NAMESPACE     DEPLOYMENT ID  TAG    AIRFLOW VERSION
my-airflow  my-namespace  my-uuid        4.0.8  2.2.2
Successfully created deployment. Deployment can be accessed at the following URLs:
Airflow Dashboard: https://my-org.astronomer.run/my-airflow/home
Deployment Dashboard: https://cloud.astronomer.io/my-workspace/deployments/my-airflow
astro deploy my-airflow
> Deploying updated DAGs to my-airflow
> Building image...
> Pushing image to Astronomer...
> Deploy Succeeded! Your DAGs are now running at https://my-org.astronomer.run/my-airflow/home
astro deployment list
> Listing Deployments available to modify...
* Prod Deployment
Stage Deployment
Dev Deployment