StorageWrite

```yaml
type: "io.kestra.plugin.gcp.bigquery.StorageWrite"
```

Load a Kestra internal storage file into BigQuery using the BigQuery Storage API

Examples

```yaml
id: "storage_write"
type: "io.kestra.plugin.gcp.bigquery.StorageWrite"
from: "{{ outputs.read.uri }}"
destinationTable: "my_project.my_dataset.my_table"
writeStreamType: DEFAULT
```
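
In a full flow, `from` typically references the output of an upstream task that writes a file to Kestra's internal storage. A minimal sketch (the upstream `read` task and its URI are illustrative; any task exposing a `uri` output of a file in Kestra's internal storage format would work):

```yaml
id: load_to_bigquery
namespace: company.team

tasks:
  # Hypothetical upstream task: anything that writes a file to Kestra's
  # internal storage and exposes it as `outputs.read.uri`.
  - id: read
    type: io.kestra.plugin.core.http.Download
    uri: https://example.com/data.ion

  - id: storage_write
    type: io.kestra.plugin.gcp.bigquery.StorageWrite
    from: "{{ outputs.read.uri }}"
    destinationTable: "my_project.my_dataset.my_table"
    writeStreamType: DEFAULT
```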

Properties

bufferSize

  • Type: integer
  • Dynamic: ❌
  • Required: ✔️
  • Default: 1000

The number of records to send in each request
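
Records are streamed in batches of `bufferSize`, so for large inputs the default of 1000 can be raised to reduce round-trips. A sketch (the value 5000 is illustrative, not a recommendation):

```yaml
id: "storage_write"
type: "io.kestra.plugin.gcp.bigquery.StorageWrite"
from: "{{ outputs.read.uri }}"
destinationTable: "my_project.my_dataset.my_table"
writeStreamType: DEFAULT
bufferSize: 5000  # illustrative: send 5000 records per request
```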

destinationTable

  • Type: string
  • Dynamic: ✔️
  • Required: ✔️
  • Min length: 1

The table to load data into

The table must already exist.

writeStreamType

  • Type: string
  • Dynamic: ❌
  • Required: ✔️
  • Default: DEFAULT
  • Possible Values:
    • DEFAULT
    • COMMITTED
    • PENDING

The type of write stream to use
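
The stream types follow the BigQuery Storage Write API semantics: with `DEFAULT` and `COMMITTED`, rows become visible as soon as they are written, while `PENDING` buffers rows until the stream is finalized and committed, making the load atomic. A sketch using `PENDING` (table name illustrative):

```yaml
id: "storage_write"
type: "io.kestra.plugin.gcp.bigquery.StorageWrite"
from: "{{ outputs.read.uri }}"
destinationTable: "my_project.my_dataset.my_table"
# PENDING buffers rows until the stream is committed, so the load is
# all-or-nothing; commitTime and rowsCount are then reported in the output.
writeStreamType: PENDING
```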

from

  • Type: string
  • Dynamic: ✔️
  • Required: ❌

The fully-qualified URIs that point to source data

location

  • Type: string
  • Dynamic: ✔️
  • Required: ❌

The geographic location where the dataset should reside

This property is experimental and might be subject to change or removal.

See Dataset Location

projectId

  • Type: string
  • Dynamic: ✔️
  • Required: ❌

The GCP project ID.

scopes

  • Type: array
  • SubType: string
  • Dynamic: ✔️
  • Required: ❌
  • Default: [https://www.googleapis.com/auth/cloud-platform]

The GCP scopes to be used.

serviceAccount

  • Type: string
  • Dynamic: ✔️
  • Required: ❌

The GCP service account key.

Outputs

commitTime

  • Type: string
  • Dynamic: ❓
  • Required: ❌
  • Format: date-time

Commit time reported by BigQuery; only set when writeStreamType is PENDING

rows

  • Type: integer
  • Dynamic: ❓
  • Required: ❌

Row count

rowsCount

  • Type: integer
  • Dynamic: ❓
  • Required: ❌

Row count reported by BigQuery; only set when writeStreamType is PENDING

Metrics

rows

  • Type: counter

rows_count

  • Type: counter
