ExtractToGcs
Extract data from a BigQuery table to GCS (Google Cloud Storage).
type: "io.kestra.plugin.gcp.bigquery.ExtractToGcs"
Extract a BigQuery table to a GCS bucket.
id: gcp_bq_extract_to_gcs
namespace: company.team

tasks:
  - id: extract_to_gcs
    type: io.kestra.plugin.gcp.bigquery.ExtractToGcs
    destinationUris:
      - "gs://bucket_name/filename.csv"
    sourceTable: "my_project.my_dataset.my_table"
    format: CSV
    fieldDelimiter: ';'
    printHeader: true
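BigQuery extract jobs also support compressed, multi-file exports. A hedged variant of the example above (the `GZIP` value and the `*` wildcard URI follow BigQuery's extract-job conventions, and the `compression` property name is an assumption based on the description below):

```yaml
tasks:
  - id: extract_to_gcs_gzip
    type: io.kestra.plugin.gcp.bigquery.ExtractToGcs
    destinationUris:
      # A wildcard URI lets BigQuery shard a large table into multiple files.
      - "gs://bucket_name/export/part-*.csv.gz"
    sourceTable: "my_project.my_dataset.my_table"
    format: CSV
    compression: GZIP  # assumed property name; omit to leave exported files uncompressed
```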
The compression value to use for exported files. If not set, exported files are not compressed.
The list of fully-qualified Google Cloud Storage URIs (e.g. gs://bucket/path) where the extracted table should be written.
The delimiter to use between fields in the exported data. By default "," is used.
The exported file format. If not set, the table is exported in CSV format.
The GCP service account to impersonate.
The labels associated with this job. You can use these to organize and group your jobs. Label keys and values can be no longer than 63 characters and can only contain lowercase letters, numeric characters, underscores, and dashes. International characters are allowed. Label values are optional. Label keys must start with a letter, and each label in the list must have a unique key.
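As a sketch of the labeling constraints above (assuming labels are a simple key/value map, as in the BigQuery API; the key names are illustrative only):

```yaml
labels:
  team: data-eng     # lowercase, starts with a letter, at most 63 characters
  environment: prod
  cost-center: ""    # label values are optional
```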
The geographic location where the dataset should reside. This property is experimental and might be subject to change or removal. See Dataset Location.
The GCP project ID.
The messages which would trigger an automatic retry. Each message is matched as a case-insensitive substring of the full error message.
The reasons which would trigger an automatic retry.
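A sketch of how the two retry filters might be configured together (the property names `retryMessages` and `retryReasons` are assumptions inferred from the descriptions above; `rateLimitExceeded` is a standard BigQuery error reason):

```yaml
retryReasons:
  - rateLimitExceeded   # retry when BigQuery reports this error reason
retryMessages:
  - "retry the job"     # case-insensitive substring match against the error message
```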
The GCP scopes to be used.
The GCP service account.
The table to export.
The destination URI file(s).
The number of extracted files.
The job ID.
The source table.
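The outputs above can be referenced from a downstream task using Kestra's expression syntax. A sketch (the output names `fileCounts` and `destinationUri` are assumptions inferred from the output descriptions):

```yaml
- id: log_extract_result
  type: io.kestra.plugin.core.log.Log
  # Assumed output names; adjust to the actual task outputs.
  message: "Extracted {{ outputs.extract_to_gcs.fileCounts }} file(s) to {{ outputs.extract_to_gcs.destinationUri }}"
```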