# StorageWrite

Load a Kestra internal storage file into BigQuery using the BigQuery Storage API.
```yaml
type: "io.kestra.plugin.gcp.bigquery.StorageWrite"
```
```yaml
id: gcp_bq_storage_write
namespace: company.team

tasks:
  - id: read_data
    type: io.kestra.plugin.core.http.Download
    uri: https://dummyjson.com/products/1

  - id: storage_write
    type: io.kestra.plugin.gcp.bigquery.StorageWrite
    from: "{{ outputs.read_data.uri }}"
    destinationTable: "my_project.my_dataset.my_table"
    writeStreamType: DEFAULT
```
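Because the destination table must be created before the write runs, a DDL step can precede it. A minimal sketch, assuming the `io.kestra.plugin.gcp.bigquery.Query` task and a hypothetical two-column schema; adjust the columns to match the data actually being loaded:

```yaml
tasks:
  - id: create_table
    type: io.kestra.plugin.gcp.bigquery.Query
    # Hypothetical DDL; the column list must match the source data.
    sql: |
      CREATE TABLE IF NOT EXISTS my_dataset.my_table (
        id INT64,
        title STRING
      )

  - id: storage_write
    type: io.kestra.plugin.gcp.bigquery.StorageWrite
    from: "{{ outputs.read_data.uri }}"
    destinationTable: "my_project.my_dataset.my_table"
```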
## Properties

### bufferSize
- **Dynamic:** YES
- **Default:** `1000`

### destinationTable
- **Dynamic:** YES

The table to load data into. The table must be created beforehand.

### writeStreamType
- **Dynamic:** YES
- **Default:** `DEFAULT`
- **Possible values:** `DEFAULT`, `COMMITTED`, `PENDING`

The type of write stream to use.

### from
- **Dynamic:** YES

The fully-qualified URIs that point to the source data.

### impersonatedServiceAccount
- **Dynamic:** YES

The GCP service account to impersonate.

### location
- **Dynamic:** YES

The geographic location where the dataset should reside. This property is experimental and might be subject to change or removal. See Dataset Location.

### projectId
- **Dynamic:** YES

The GCP project ID.

### scopes
- **SubType:** string
- **Dynamic:** YES
- **Default:** `["https://www.googleapis.com/auth/cloud-platform"]`

The GCP scopes to be used.

### serviceAccount
- **Dynamic:** YES

The GCP service account.
## Outputs

### commitTime
- **Format:** date-time

Commit time reported by BigQuery; only populated for the `PENDING` writeStreamType.

### rowsCount

Row count reported by BigQuery; only populated for the `PENDING` writeStreamType.
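For exactly-once delivery, a `PENDING` write stream buffers rows and commits them when the task completes, and the commit time and row count outputs are only reported in that mode. A hedged sketch extending the example flow above (the `log_result` task and its message are illustrative):

```yaml
  - id: storage_write_pending
    type: io.kestra.plugin.gcp.bigquery.StorageWrite
    from: "{{ outputs.read_data.uri }}"
    destinationTable: "my_project.my_dataset.my_table"
    writeStreamType: PENDING

  # commitTime and rowsCount are reported only for PENDING streams.
  - id: log_result
    type: io.kestra.plugin.core.log.Log
    message: "Committed {{ outputs.storage_write_pending.rowsCount }} rows at {{ outputs.storage_write_pending.commitTime }}"
```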