
Run DDL queries and load data to Postgres

Source

```yaml
id: extract-load-postgres
namespace: company.team

variables:
  db: jdbc:postgresql://host.docker.internal:5432/postgres
  table: public.orders

tasks:
  - id: extract
    type: io.kestra.plugin.core.http.Download
    uri: https://huggingface.co/datasets/kestra/datasets/raw/main/csv/orders.csv

  - id: query
    type: io.kestra.plugin.jdbc.postgresql.Query
    url: "{{ vars.db }}"
    username: postgres
    password: "{{ secret('DB_PASSWORD') }}"
    sql: |
      create table if not exists {{ vars.table }}
      (
          order_id       integer,
          customer_name  varchar(50),
          customer_email varchar(50),
          product_id     integer,
          price          real,
          quantity       integer,
          total          real
      );

  - id: load_to_postgres
    type: io.kestra.plugin.jdbc.postgresql.CopyIn
    url: "{{ vars.db }}"
    username: postgres
    password: "{{ secret('DB_PASSWORD') }}"
    from: "{{ outputs.extract.uri }}"
    format: CSV
    header: true
    table: "{{ vars.table }}"

About this blueprint

Tags: Ingest, Postgres, SQL

This flow will extract data from a remote CSV file and load it into a Postgres database. It's recommended to store database credentials using secrets. For local testing, you can start a Postgres database using Docker:

```bash
docker run -d --name mypostgres \
  -v mypostgresdb:/var/lib/postgresql/data \
  -p 5432:5432 \
  -e POSTGRES_USER=postgres \
  -e POSTGRES_PASSWORD=topSecret42 \
  -e POSTGRES_DB=postgres \
  postgres:latest
```
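The flow reads the database password with secret('DB_PASSWORD') rather than hardcoding it. On the open-source edition, one way to provide that secret to a local Kestra instance is an environment variable named SECRET_DB_PASSWORD on the Kestra container, holding the base64-encoded password. The docker-compose fragment below is a minimal sketch of that approach, assuming you run Kestra with Docker Compose; the service name and image tag are illustrative.

```yaml
services:
  kestra:
    image: kestra/kestra:latest
    environment:
      # Environment variables prefixed with SECRET_ are exposed to flows
      # through the secret() function; their values must be base64-encoded.
      # "dG9wU2VjcmV0NDI=" is the output of `echo -n "topSecret42" | base64`,
      # matching the POSTGRES_PASSWORD used in the docker run command above.
      SECRET_DB_PASSWORD: dG9wU2VjcmV0NDI=
```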

Download

Query

Copy In
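The Download, Query, and Copy In tasks above cover the extract, DDL, and load steps. As an optional follow-up, a second Query task could verify the load by counting the rows in the target table. The task below is an illustrative sketch rather than part of the published blueprint: the task id is arbitrary, and the fetchType property assumes a recent version of the JDBC plugin (older releases used a fetchOne boolean instead).

```yaml
  - id: verify_load
    type: io.kestra.plugin.jdbc.postgresql.Query
    url: "{{ vars.db }}"
    username: postgres
    password: "{{ secret('DB_PASSWORD') }}"
    # Count the rows copied into the target table and expose the result
    # as an output row (accessible as outputs.verify_load.row.row_count).
    sql: select count(*) as row_count from {{ vars.table }}
    fetchType: FETCH_ONE
```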
