# StorageWrite

```yaml
type: "io.kestra.plugin.gcp.bigquery.StorageWrite"
```

Load a file from Kestra's internal storage into BigQuery using the BigQuery Storage Write API.
## Examples

```yaml
id: gcp_bq_storage_write
namespace: company.team

tasks:
  - id: read_data
    type: io.kestra.plugin.core.http.Download
    uri: https://dummyjson.com/products/1

  - id: storage_write
    type: io.kestra.plugin.gcp.bigquery.StorageWrite
    from: "{{ outputs.read_data.uri }}"
    destinationTable: "my_project.my_dataset.my_table"
    writeStreamType: DEFAULT
```
## Properties

### bufferSize

- Type: integer
- Dynamic: ❌
- Required: ✔️
- Default: `1000`

The number of records to send in each write request.
### destinationTable

- Type: string
- Dynamic: ✔️
- Required: ✔️
- Min length: 1

The table to load the data into. The table must already exist before the task runs; see the sketch below.
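This task does not create the destination table, so a flow typically provisions it first. A minimal sketch, assuming the `io.kestra.plugin.gcp.bigquery.Query` task from the same plugin and a hypothetical two-column schema matching the downloaded data:

```yaml
tasks:
  - id: create_table
    type: io.kestra.plugin.gcp.bigquery.Query
    sql: |
      CREATE TABLE IF NOT EXISTS `my_project.my_dataset.my_table` (
        id    INT64,
        title STRING
      )

  - id: storage_write
    type: io.kestra.plugin.gcp.bigquery.StorageWrite
    from: "{{ outputs.read_data.uri }}"
    destinationTable: "my_project.my_dataset.my_table"
```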
### writeStreamType

- Type: string
- Dynamic: ❌
- Required: ✔️
- Default: `DEFAULT`
- Possible Values: `DEFAULT`, `COMMITTED`, `PENDING`

The type of write stream to use; see the sketch below for a `PENDING` example.
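With `DEFAULT` or `COMMITTED`, rows become visible as soon as each append succeeds; with `PENDING`, rows are buffered and only become visible once the stream is committed, which is when the `commitTime` and `rowsCount` outputs below are populated. A minimal sketch reusing the example flow above:

```yaml
  - id: storage_write_pending
    type: io.kestra.plugin.gcp.bigquery.StorageWrite
    from: "{{ outputs.read_data.uri }}"
    destinationTable: "my_project.my_dataset.my_table"
    # rows stay invisible to queries until the stream is committed at the end of the task
    writeStreamType: PENDING
```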
### from

- Type: string
- Dynamic: ✔️
- Required: ❌

The fully-qualified URI that points to the source data.
### impersonatedServiceAccount

- Type: string
- Dynamic: ✔️
- Required: ❌

The GCP service account to impersonate.
### location

- Type: string
- Dynamic: ✔️
- Required: ❌

The geographic location where the dataset should reside. This property is experimental and might be subject to change or removal. See Dataset Location.
### projectId

- Type: string
- Dynamic: ✔️
- Required: ❌

The GCP project ID.
### scopes

- Type: array
- SubType: string
- Dynamic: ✔️
- Required: ❌
- Default: `[https://www.googleapis.com/auth/cloud-platform]`

The GCP scopes to be used.
### serviceAccount

- Type: string
- Dynamic: ✔️
- Required: ❌

The GCP service account; see the authentication sketch below.
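The authentication properties above are dynamic, so credentials can be pulled from Kestra's secret store rather than hard-coded in the flow. A minimal sketch, assuming a secret named `GCP_SERVICE_ACCOUNT` that holds the service account JSON key:

```yaml
  - id: storage_write
    type: io.kestra.plugin.gcp.bigquery.StorageWrite
    from: "{{ outputs.read_data.uri }}"
    destinationTable: "my_project.my_dataset.my_table"
    projectId: my_project
    # the JSON key comes from the secret store, keeping it out of the flow source
    serviceAccount: "{{ secret('GCP_SERVICE_ACCOUNT') }}"
```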
## Outputs

### commitTime

- Type: string
- Required: ❌
- Format: `date-time`

The commit time reported by BigQuery; only set when `writeStreamType` is `PENDING`.
### rows

- Type: integer
- Required: ❌

The number of rows sent.
### rowsCount

- Type: integer
- Required: ❌

The row count reported by BigQuery; only set when `writeStreamType` is `PENDING`.
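Downstream tasks can reference these outputs through templating. A minimal sketch, assuming the `storage_write` task id from the example above and a `PENDING` stream (otherwise `commitTime` and `rowsCount` are absent):

```yaml
  - id: log_result
    type: io.kestra.plugin.core.log.Log
    message: "Loaded {{ outputs.storage_write.rowsCount }} rows, committed at {{ outputs.storage_write.commitTime }}"
```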
## Metrics

### rows

- Type: counter

### rows_count

- Type: counter