StorageWrite
Load a file from Kestra's internal storage into BigQuery using the BigQuery Storage Write API.
For more details, see the BigQuery Storage Write API documentation.
type: "io.kestra.plugin.gcp.bigquery.StorageWrite"
Examples
```yaml
id: gcp_bq_storage_write
namespace: company.team

tasks:
  - id: read_data
    type: io.kestra.plugin.core.http.Download
    uri: https://dummyjson.com/products/1

  - id: storage_write
    type: io.kestra.plugin.gcp.bigquery.StorageWrite
    from: "{{ outputs.read_data.uri }}"
    destinationTable: "my_project.my_dataset.my_table"
    writeStreamType: DEFAULT
```
Properties
destinationTable (string, required)
The table into which to load the data.
The table must already exist.
writeStreamType (string, required)
Default: DEFAULT
Possible values: DEFAULT, COMMITTED, PENDING
The type of write stream to use.
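As a sketch, a task using a PENDING write stream, where rows are buffered and only become visible once the stream is committed, might look like the following; the table name and the upstream task id `read_data` are placeholders:

```yaml
- id: storage_write_pending
  type: io.kestra.plugin.gcp.bigquery.StorageWrite
  from: "{{ outputs.read_data.uri }}"
  destinationTable: "my_project.my_dataset.my_table"
  writeStreamType: PENDING
```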
bufferSize (integer)
Default: 1000
The number of records to send in each request.
from (string)
The fully-qualified URIs that point to the source data.
impersonatedServiceAccount (string)
The GCP service account to impersonate.
location (string)
The geographic location where the dataset should reside.
This property is experimental and might change or be removed.
See Dataset Location.
projectId (string)
The GCP project ID.
scopes (array)
Default: ["https://www.googleapis.com/auth/cloud-platform"]
The GCP scopes to be used.
serviceAccount (string)
The GCP service account.
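For illustration, the authentication-related properties might be combined as follows; this is only a sketch, and the project ID, secret name, and table are placeholders (Kestra's `secret()` function is assumed to hold the service account key):

```yaml
- id: storage_write
  type: io.kestra.plugin.gcp.bigquery.StorageWrite
  from: "{{ outputs.read_data.uri }}"
  destinationTable: "my_project.my_dataset.my_table"
  writeStreamType: DEFAULT
  projectId: my_project
  serviceAccount: "{{ secret('GCP_SERVICE_ACCOUNT_JSON') }}"
  scopes:
    - https://www.googleapis.com/auth/cloud-platform
```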
Outputs
commitTime (string, date-time)
The commit time reported by BigQuery; only set when writeStreamType is PENDING.
rows (integer)
The number of rows processed.
rowsCount (integer)
The row count reported by BigQuery; only set when writeStreamType is PENDING.
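Downstream tasks can reference these outputs through Kestra's expression syntax. A minimal sketch, assuming the task id `storage_write` from the example above:

```yaml
- id: log_rows
  type: io.kestra.plugin.core.log.Log
  message: "Loaded {{ outputs.storage_write.rows }} rows"
```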
Metrics
rows (counter)
The number of rows processed.
rows_count (counter)
The row count reported by BigQuery; only set when writeStreamType is PENDING.