StorageWrite

Load a Kestra internal storage file into BigQuery using the BigQuery Storage API.

For more details, check out the BigQuery Storage API.

```yaml
type: "io.kestra.plugin.gcp.bigquery.StorageWrite"
```

```yaml
id: gcp_bq_storage_write
namespace: company.team

tasks:
  - id: read_data
    type: io.kestra.plugin.core.http.Download
    uri: https://dummyjson.com/products/1

  - id: storage_write
    type: io.kestra.plugin.gcp.bigquery.StorageWrite
    from: "{{ outputs.read_data.uri }}"
    destinationTable: "my_project.my_dataset.my_table"
    writeStreamType: DEFAULT
```

Properties

destinationTable

The table into which the data will be loaded.

The table must be created beforehand.

writeStreamType

Default: DEFAULT
Possible values: DEFAULT, COMMITTED, PENDING

The type of write stream to use.
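In the BigQuery Storage Write API, rows written through a DEFAULT or COMMITTED stream become visible as soon as they are written, while a PENDING stream buffers rows until they are committed at the end of the write. A minimal sketch reusing the flow from the example above (table name and task ids are illustrative):

```yaml
# Rows are buffered server-side and committed atomically when the task finishes.
- id: storage_write_pending
  type: io.kestra.plugin.gcp.bigquery.StorageWrite
  from: "{{ outputs.read_data.uri }}"
  destinationTable: "my_project.my_dataset.my_table"
  writeStreamType: PENDING
```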

bufferSize

Default: 1000

The number of records to send in each write request.
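For large files, a larger buffer means fewer round trips to the API at the cost of more memory per request. A sketch of tuning it (the value 5000 is illustrative, not a recommendation):

```yaml
# Send records in batches of 5000 instead of the default 1000.
- id: storage_write_batched
  type: io.kestra.plugin.gcp.bigquery.StorageWrite
  from: "{{ outputs.read_data.uri }}"
  destinationTable: "my_project.my_dataset.my_table"
  bufferSize: 5000
```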

from

The fully qualified URI that points to the source data in Kestra's internal storage.

impersonatedServiceAccount

The GCP service account to impersonate.

location

The geographic location where the dataset should reside.

This property is experimental and might be subject to change or be removed.

See Dataset Location.

projectId

The GCP project ID.

scopes

SubType: string
Default: ["https://www.googleapis.com/auth/cloud-platform"]

The GCP scopes to be used.

serviceAccount

The GCP service account.

Outputs

commitTime

Format: date-time

The commit time reported by BigQuery; set only when writeStreamType is PENDING.

rowsCount

The row count reported by BigQuery; set only when writeStreamType is PENDING.
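Downstream tasks can reference these values through Kestra's output expressions. A sketch assuming the storage_write task from the example above ran with writeStreamType set to PENDING (the log task id is illustrative):

```yaml
# Log the counts reported by BigQuery after the pending stream is committed.
- id: log_result
  type: io.kestra.plugin.core.log.Log
  message: "Loaded {{ outputs.storage_write.rowsCount }} rows, committed at {{ outputs.storage_write.commitTime }}"
```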