Getting Started with Declarative Orchestration
See how Kestra can simplify your data pipelines—and scale beyond them.
Book a demo

Airflow schedules Python pipelines for data engineering teams. Kestra orchestrates workflows across your entire stack. One serves Python-native data teams. The other changes what's possible for your business.
Declarative orchestration layer where YAML describes intent and any language executes. Orchestration is configuration, not code. Data pipelines, infrastructure automation, and business processes run from a single control plane without forcing every task into one language.
A Python-based workflow engine where DAGs are executable programs. Workflows are Python code that the scheduler runs. Everything—scheduling, dependencies, retries—is defined in Python and tied to the Airflow runtime.
Airflow's standalone command works for local development, but production deployments need an API server, DAG processor, metadata database, and executor configured separately. Kestra's single Docker Compose command stands up everything in a format that's already production-shaped.
```shell
curl -o docker-compose.yml \
  https://raw.githubusercontent.com/kestra-io/kestra/develop/docker-compose.yml
docker compose up
# Open localhost:8080
# Pick a Blueprint, run it. Done.
```

Download the Docker Compose file, spin it up, and you're ready (database and config included). Open the UI, pick a Blueprint, run it. No Python environment, no DAG library to configure.
```shell
pip install apache-airflow
pip install apache-airflow-providers-standard
# Dev: airflow standalone (SQLite, no parallelism)
# Production: configure API server, DAG processor,
# metadata DB, and executor separately
# Then write your DAGs in Python...
```

The standalone command gets a dev instance running quickly, but on SQLite with no parallelism. Production requires installing providers and configuring the API server, DAG processor, metadata DB, and executor separately.
YAML is readable on day one. Our docs are embedded in the UI for easy reference, the AI Copilot can write workflows for you, and our library of Blueprints gives you a ready-made starting point. No Python knowledge is required to understand or modify a workflow.
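As an illustration of that readability, here is a minimal Kestra flow (a sketch: the ids and namespace are placeholders, and `io.kestra.plugin.core.log.Log` is the core log task type):

```yaml
# A minimal Kestra flow: one task that logs a message.
# ids and namespace are placeholders.
id: hello_world
namespace: company.team

tasks:
  - id: say_hello
    type: io.kestra.plugin.core.log.Log
    message: Hello from Kestra
```

The whole definition is data: scheduling, tasks, and retries are properties you declare, not code you execute.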
DAG definitions, retry logic, and scheduling are all Python code. Airflow can run Bash and SQL via operators, but modifying orchestration logic requires Python. Non-Python contributors can view runs in the UI but can't author or edit workflows directly.
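For contrast, even a trivial Airflow DAG is a Python module. This sketch uses the TaskFlow decorators from `airflow.decorators` (Airflow 2.4+ signature) and only runs where Airflow itself is installed:

```python
# Sketch of a minimal Airflow TaskFlow DAG.
# Requires an apache-airflow installation to import and parse.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)
def hello_world():
    @task
    def say_hello():
        print("Hello from Airflow")

    say_hello()


hello_world()
```

Changing the schedule, retries, or dependencies here means editing Python, which is exactly the barrier non-Python contributors hit.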
Orchestrate across data pipelines, infrastructure operations, business processes, and AI workflows in one unified platform. Event-driven at its core, with native triggers for S3, webhooks, Kafka, and message queues.
Purpose-built for Python data pipeline scheduling. Airflow 3 adds event-driven Asset Watchers, but the core is schedule-driven. Infrastructure automation and business processes are possible via Python but not first-class use cases.
| Capability | Kestra | Airflow |
|---|---|---|
| Workflow definition | Declarative YAML | Python DAGs |
| Languages supported | Any (Python, SQL, R, Bash, Go, Node.js) | Python-first (Bash/SQL via operators; multi-language Task SDK experimental) |
| Architecture | Event-driven at core | Schedule-first (event-driven Asset Watchers added in v3) |
| Assets and lineage | Data and infrastructure assets | Data assets only |
| Multi-tenancy | Namespace isolation + RBAC out-of-box | Limited |
| Deployment | Single service (Docker Compose) | API server + DAG processor + metadata DB + executor |
| Self-service for non-engineers | Kestra Apps | UI is observability-focused, no self-service triggers |
| Human-in-the-loop | Native task types | Requires custom operator or external tooling |
| Infrastructure automation | Native support | Possible via Python operators, not a primary use case |
Bring existing Python scripts, SQL queries, and Shell commands. Kestra orchestrates them in YAML without wrapping them in decorators or operators. Your code stays portable and your team stays unblocked.
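As a sketch of what that looks like, assuming the Script task from Kestra's Python scripts plugin (`io.kestra.plugin.scripts.python.Script`; ids are placeholders):

```yaml
# Sketch: run an existing Python script as-is inside a Kestra flow.
# ids are placeholders; the script body is your unmodified code.
id: run_existing_script
namespace: company.team

tasks:
  - id: python_task
    type: io.kestra.plugin.scripts.python.Script
    script: |
      # your existing script, pasted or fetched unchanged
      print("no decorators, no operators")
```

The script body stays plain Python, so the same file can still run outside Kestra.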
Data pipelines, infrastructure automation, and business processes run from one platform. Airflow focuses on data pipeline orchestration. Infrastructure and business workflows are possible through Python operators but aren't first-class use cases.
Native triggers for S3, webhooks, Kafka, database changes, and API events. Airflow 3 added Asset Watchers for event-driven scheduling, but triggers in Kestra are first-class YAML declarations, not layered onto a schedule-first core. Unlimited event-driven workflows in open source.
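As a sketch of how such triggers are declared (the webhook trigger type shown here, `io.kestra.plugin.core.trigger.Webhook`, is the core webhook trigger; the ids and key are placeholders):

```yaml
# Sketch: a flow started by an inbound webhook rather than a schedule.
# ids and the key are placeholders.
id: on_event
namespace: company.team

tasks:
  - id: handle_event
    type: io.kestra.plugin.core.log.Log
    message: "Received payload: {{ trigger.body }}"

triggers:
  - id: webhook
    type: io.kestra.plugin.core.trigger.Webhook
    key: my-secret-key
```

The trigger is part of the flow definition itself, not a separate scheduling layer.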
Find answers to your questions right here, and don't hesitate to Contact us if you can't find what you're looking for.