The differences between Kestra and Airflow

How to Choose the Right Orchestration Platform

[Screenshot: Kestra's user interface]

About Kestra & Airflow

Kestra is an open-source data orchestrator that brings Infrastructure as Code best practices to data pipelines. Kestra empowers users to build data pipelines in an API-first way rather than tying them to a single Python client's implementation. Every action in Kestra is API-driven and declared in a simple YAML configuration file. Being API-first opens up your orchestration platform to more integrations and use cases without having to touch any client code.
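For illustration, here is a minimal sketch of such a declarative flow, assuming recent plugin type names; identifiers like io.kestra.plugin.core.log.Log may differ slightly between Kestra versions, so check the built-in documentation for the exact ones:

```yaml
# Illustrative Kestra flow definition (YAML).
# Task and trigger type identifiers may vary between Kestra versions.
id: hello-world
namespace: company.team

tasks:
  - id: say-hello
    type: io.kestra.plugin.core.log.Log          # simple logging task
    message: Hello from a declarative, API-first workflow!

triggers:
  - id: every-morning
    type: io.kestra.plugin.core.trigger.Schedule  # cron-based schedule
    cron: "0 8 * * *"
```

The same YAML can be created from the UI or pushed over the REST API, which is what API-first means in practice.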

On the other hand, Apache Airflow, born out of Airbnb in 2014, provides a Python client for authoring, scheduling, and monitoring workflows. Your workflows are tied to a specific Python client implementation, requiring you to redeploy code and rebuild infrastructure for every change.

Installation and Setup

Both tools can be installed with Docker. Kestra's setup is designed to be straightforward so you can start managing data workflows quickly: just two commands install Kestra together with all its dependencies. Setting up Airflow involves integrating additional components, such as a metadata database and a message broker.
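For context, those two commands typically come down to downloading Kestra's published Docker Compose file and running docker compose up -d. The sketch below approximates what such a Compose file contains, assuming a Postgres backend; service names, images, and configuration keys are illustrative and simplified, and the official docker-compose.yml remains the source of truth:

```yaml
# Simplified, illustrative Docker Compose sketch for Kestra backed by Postgres.
# The official docker-compose.yml published by Kestra is the authoritative version.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_DB: kestra
      POSTGRES_USER: kestra
      POSTGRES_PASSWORD: k3str4
    volumes:
      - postgres-data:/var/lib/postgresql/data

  kestra:
    image: kestra/kestra:latest
    command: server standalone
    ports:
      - "8080:8080"
    depends_on:
      - postgres
    environment:
      # Kestra can read its configuration from this environment variable (YAML).
      KESTRA_CONFIGURATION: |
        kestra:
          repository:
            type: postgres
          queue:
            type: postgres
        datasources:
          postgres:
            url: jdbc:postgresql://postgres:5432/kestra
            username: kestra
            password: k3str4

volumes:
  postgres-data:
```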

Python Architecture vs. Decoupled Event-Driven System

Airflow is built as a Python-based web framework, and this architecture can present challenges. Its reliance on Python can lead to dependency conflicts: integrating with third-party systems often requires additional Python packages, which clash with each other unless everything is containerized. Yet containerizing every small component adds complexity and counteracts the benefits of a lightweight Python-based framework.

Kestra, on the other hand, uses a decoupled, microservice-oriented architecture built on time-tested technologies such as Postgres, Java, Kafka, and Elasticsearch. This design offers significant performance advantages: the JVM handles a large number of concurrent threads, enabling robust data processing at scale, and Kestra can process large volumes of data and millions of concurrent workloads without performance degradation.

Workflow Definition

While Kestra uses YAML for workflow definitions, it also supports inline scripting in a variety of languages. The API-first design makes Kestra language-agnostic and accessible to stakeholders who might be familiar with SQL or other languages but not with Python or Docker. With Kestra, workflows can be divided into manageable subflows that can be triggered based on certain events.
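As a hedged sketch of how this looks in practice, the flow below mixes YAML orchestration with an inline Python task and a subflow call; the plugin type names and the load-to-warehouse flow referenced here are illustrative and may differ between Kestra versions:

```yaml
# Illustrative flow mixing YAML orchestration with an inline Python script
# and a subflow call. Plugin type names may differ between Kestra versions.
id: parent-flow
namespace: company.team

tasks:
  - id: transform
    type: io.kestra.plugin.scripts.python.Script   # inline Python, no separate deployment
    script: |
      data = [1, 2, 3]
      print(f"processed {len(data)} records")

  - id: load
    type: io.kestra.plugin.core.flow.Subflow        # delegate work to a reusable subflow
    namespace: company.team
    flowId: load-to-warehouse                       # hypothetical downstream flow
    wait: true
```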

Airflow's workflow definition is Python-based. This design choice restricts workflow authoring to Python developers and hinders wider adoption by business domain experts and engineers working on different technology stacks.

While Airflow can handle complex workflows, it does not come with an event-driven system, making decoupled workflow patterns difficult, if not impossible, to accomplish.
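For contrast, here is a rough sketch of the kind of decoupled, event-driven pattern Kestra expresses natively, where one flow reacts to the successful completion of another; the trigger and condition type names are assumptions based on recent Kestra releases and may differ in your version:

```yaml
# Illustrative decoupled pattern: this flow starts automatically whenever
# another flow finishes successfully. Trigger and condition type names
# are indicative and may differ between Kestra versions.
id: downstream-reporting
namespace: company.team

tasks:
  - id: refresh-dashboard
    type: io.kestra.plugin.core.log.Log
    message: Upstream data is ready, refreshing reports.

triggers:
  - id: on-upstream-success
    type: io.kestra.plugin.core.trigger.Flow        # event-driven: react to other executions
    conditions:
      - type: io.kestra.plugin.core.condition.ExecutionFlowCondition
        namespace: company.team
        flowId: parent-flow
      - type: io.kestra.plugin.core.condition.ExecutionStatusCondition
        in:
          - SUCCESS
```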

Workflow Deployment

Airflow's deployment strategy reflects the platform's Python-centric nature: it involves packaging Python applications, managing dependencies, and deploying code to the server. Despite the widespread adoption of Python, handling the underlying dependencies is challenging, especially at scale. Data movement usually involves communicating with various third-party services via API calls, and we believe that dealing with multiple Python environments only to perform simple integration tasks (e.g., triggering batch jobs or data ingestion) adds unnecessary complexity. The Workflows as Code paradigm is much easier to accomplish when your orchestration logic is decoupled from your business logic (written in languages such as Python).

With Kestra, you can create and modify workflows directly from the UI, the API, Terraform, and CI/CD tools, allowing immediate deployment. Kestra integrates seamlessly with tools like GitHub Actions and GitLab CI to automate the testing and deployment of workflows. You can manage your workflows with Infrastructure as Code (IaC) tools like Terraform, benefiting from version control, peer review, and the ability to automatically roll out or roll back workflows. Kestra's architecture is heavily focused on APIs, allowing users to manage workflows programmatically and integrate Kestra with other tools and systems.
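As one possible sketch of such a CI/CD setup, the GitHub Actions workflow below pushes every flow file to Kestra's HTTP API on merge to main. The API path, authentication method, folder layout, and secret names are assumptions made for this example; Kestra also ships dedicated CI/CD actions and a Terraform provider that wrap the same API:

```yaml
# Illustrative GitHub Actions workflow: deploy flow YAML files to Kestra on merge.
# The API path, auth method, and secret names below are assumptions for this sketch.
name: deploy-kestra-flows
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Push flows to the Kestra API
        run: |
          for flow in flows/*.yaml; do
            curl --fail -X POST "${{ secrets.KESTRA_URL }}/api/v1/flows" \
              -u "${{ secrets.KESTRA_USER }}:${{ secrets.KESTRA_PASSWORD }}" \
              -H "Content-Type: application/x-yaml" \
              --data-binary "@${flow}"
          done
```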

The code editor embedded in Kestra also comes with autocompletion, syntax validation, and task documentation displayed right next to your code.

Developer Experience

Kestra is known for its intuitive and visually appealing user interface, making workflow management simple and accessible. Airflow can deploy DAGs but doesn't offer an integrated code editor to help you write workflows, and an Airflow DAG is visible in the UI only after you deploy it. In contrast, Kestra's topology view updates live as you write your flow, making it much easier to understand workflow dependencies during development.

Regarding workflow visualization, both platforms provide DAG views. Kestra's topology view additionally displays the logos of the tools used in each task, allowing you to see at a glance which systems a given pipeline touches.

Collaboration and Accessibility

Kestra's design aims to be user-friendly, not just for developers and engineers but also for business stakeholders. The user interface allows SQL-savvy users to modify queries or parameters directly from the UI, enabling them to contribute without needing to delve into the codebase.

In contrast, Airflow only supports writing workflows in Python and offers no way to build them from the UI in collaboration with business stakeholders.

Modern Data Stack Integration

Kestra's flexible plugin ecosystem enables seamless integration with a wide range of popular data tools. All workflow components are also exposed via REST APIs, allowing third-party systems to interact with Kestra.
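For illustration, here is a hedged sketch of a flow using one of these plugins to run a SQL query against Postgres; the task type and its properties are indicative and may vary by plugin version, and credentials would normally come from secrets rather than being written inline:

```yaml
# Illustrative use of a Kestra plugin: run a SQL query against Postgres.
# Type name and properties are indicative; connection details are placeholders.
id: daily-orders-report
namespace: company.team

tasks:
  - id: aggregate-orders
    type: io.kestra.plugin.jdbc.postgresql.Query
    url: jdbc:postgresql://warehouse:5432/analytics
    username: analytics_user
    password: "{{ secret('WAREHOUSE_PASSWORD') }}"
    sql: |
      SELECT order_date, count(*) AS orders
      FROM orders
      GROUP BY order_date
      ORDER BY order_date DESC
      LIMIT 7;
    fetchType: FETCH
```

Because the query lives in the YAML definition rather than in deployed Python code, a SQL-savvy stakeholder can adjust it directly from the UI.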

In contrast, modifying workflow components in Airflow always requires redeploying code.

| | Kestra | Airflow |
|---|---|---|
| Installation | Easy with Docker in two commands. | Docker installation requires additional component setup. |
| Architecture | API-first, decoupled, microservice-oriented architecture capable of handling a large number of scheduled and event-driven workloads at any scale. | Orchestration configuration bound to a Python client and a monolithic architecture with scaling and dependency management issues. |
| Workflow Definition | Language-agnostic, declarative, API-first interface allowing you to build workflows in YAML or from the UI. | Python only; modifications require redeployments, leading to slow feedback loops. |
| Developer Experience | Intuitive UI, live topology view, integrated documentation next to the code editor with autocompletion, blueprints, and advanced features. | Deployable DAGs, but no integrated code editor. DAGs are only visible post-deployment. |
| Collaboration | Encourages cross-role collaboration. SQL users can modify queries and parameters from the UI. | Primarily developer-focused. Workflows are written in Python without UI-building options. |
| Data Integration | Flexible plugin ecosystem. All workflow components are accessible via REST APIs. | Supports many data tools but requires redeploying code and additional infrastructure work due to dependency management. |

Getting Started

Start building with Kestra — Automate Everything Everywhere All at Once.

Read the docs and get started!