StorageWrite

```yaml
type: "io.kestra.plugin.gcp.bigquery.StorageWrite"
```

Load a Kestra internal storage file into BigQuery using the BigQuery Storage API.

Examples

```yaml
id: "storage_write"
type: "io.kestra.plugin.gcp.bigquery.StorageWrite"
from: "{{ outputs.read.uri }}"
destinationTable: "my_project.my_dataset.my_table"
writeStreamType: DEFAULT
```
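The `from` property above references the output of an earlier task named `read`. A minimal end-to-end flow might look like the following sketch; the upstream `Download` task and its URI are assumptions chosen for illustration, not part of this plugin:

```yaml
id: bigquery_storage_write
namespace: company.team

tasks:
  # Hypothetical upstream task that places a file in Kestra internal storage.
  - id: read
    type: io.kestra.plugin.core.http.Download
    uri: https://example.com/data.avro

  # Load that file into an existing BigQuery table via the Storage API.
  - id: storage_write
    type: io.kestra.plugin.gcp.bigquery.StorageWrite
    from: "{{ outputs.read.uri }}"
    destinationTable: "my_project.my_dataset.my_table"
    writeStreamType: DEFAULT
```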

Properties

bufferSize

  • Type: integer
  • Dynamic: ❌
  • Required: ✔️
  • Default: 1000

The number of records to send in each request.

destinationTable

  • Type: string
  • Dynamic: ✔️
  • Required: ✔️
  • Min length: 1

The table to load data into.

The table must already exist.

writeStreamType

  • Type: string
  • Dynamic: ❌
  • Required: ✔️
  • Default: DEFAULT
  • Possible Values:
    • DEFAULT
    • COMMITTED
    • PENDING

The type of write stream to use
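With a `PENDING` stream, rows are buffered and only become visible in the table once the stream is committed, which is when the `commitTime` and `rowsCount` outputs are reported. A sketch of the same task using a pending stream (table and upstream task id as in the example above):

```yaml
- id: storage_write_pending
  type: io.kestra.plugin.gcp.bigquery.StorageWrite
  from: "{{ outputs.read.uri }}"
  destinationTable: "my_project.my_dataset.my_table"
  writeStreamType: PENDING
```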

from

  • Type: string
  • Dynamic: ✔️
  • Required: ❌

The fully-qualified URIs that point to source data

location

  • Type: string
  • Dynamic: ✔️
  • Required: ❌

The geographic location where the dataset should reside

This property is experimental and might be subject to change or removed.

See Dataset Location

projectId

  • Type: string
  • Dynamic: ✔️
  • Required: ❌

The GCP project id

scopes

  • Type: array
  • SubType: string
  • Dynamic: ✔️
  • Required: ❌
  • Default: [https://www.googleapis.com/auth/cloud-platform]

The GCP scopes to use.

serviceAccount

  • Type: string
  • Dynamic: ✔️
  • Required: ❌

The GCP service account key

Outputs

commitTime

  • Type: string

Commit time reported by BigQuery; only populated when `writeStreamType` is `PENDING`.

rows

  • Type: integer

Rows count

rowsCount

  • Type: integer

Rows count reported by BigQuery; only populated when `writeStreamType` is `PENDING`.
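Downstream tasks can reference these outputs through expressions. For example, a hypothetical logging task (assuming the task id `storage_write` from the example above) could report the row count; this is a sketch, not part of the plugin itself:

```yaml
- id: log_rows
  type: io.kestra.plugin.core.log.Log
  message: "Loaded {{ outputs.storage_write.rows }} rows into my_project.my_dataset.my_table"
```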