ExtractToGcs

Extract data from a BigQuery table to GCS.

```yaml
type: "io.kestra.plugin.gcp.bigquery.ExtractToGcs"
```

Extract a BigQuery table to a GCS bucket.

```yaml
id: gcp_bq_extract_to_gcs
namespace: company.team

tasks:
  - id: extract_to_gcs
    type: io.kestra.plugin.gcp.bigquery.ExtractToGcs
    destinationUris:
      - "gs://bucket_name/filename.csv"
    sourceTable: "my_project.my_dataset.my_table"
    format: CSV
    fieldDelimiter: ';'
    printHeader: true
```
Properties

compression
The compression value to use for exported files. If not set, exported files are not compressed.
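
For instance, a minimal sketch of a compressed CSV export (the task id and paths are illustrative; GZIP is one compression value BigQuery supports for CSV exports):

```yaml
- id: extract_compressed
  type: io.kestra.plugin.gcp.bigquery.ExtractToGcs
  sourceTable: "my_project.my_dataset.my_table"
  destinationUris:
    - "gs://bucket_name/filename.csv.gz"
  format: CSV
  compression: GZIP   # exported files are gzip-compressed
```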

destinationUris
The list of fully-qualified Google Cloud Storage URIs (e.g. gs://bucket/path) where the extracted table should be written.
SubType string

fieldDelimiter
The delimiter to use between fields in the exported data. By default "," is used.

format
The exported file format. If not set, the table is exported in CSV format.

impersonatedServiceAccount
The GCP service account to impersonate.

jobTimeout
Optional job timeout in milliseconds. If this time limit is exceeded, BigQuery may attempt to terminate the job.

labels
The labels associated with this job. You can use these to organize and group your jobs. Label keys and values can be no longer than 63 characters and can only contain lowercase letters, numeric characters, underscores, and dashes. International characters are allowed. Label values are optional. Label keys must start with a letter, and each label in the list must have a different key.
SubType string
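
A minimal sketch of a labels map that satisfies these constraints (the keys and values are illustrative):

```yaml
labels:
  team: data-platform   # lowercase letters, digits, underscores, dashes; must start with a letter
  env: prod             # each key in the map must be unique
```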

location
The geographic location where the dataset should reside. This property is experimental and might be subject to change or removal.
See Dataset Location

printHeader
Whether to print out a header row in the results. By default a header is printed.

projectId
The GCP project ID.

retryAuto
Automatic retry for retryable BigQuery exceptions.
Some exceptions (especially rate limits) are not retried by default by the BigQuery client, so a transparent retry (not the Kestra one) is applied by default to handle this case. The default is an exponential retry starting at 5 seconds, for a maximum of 15 minutes and ten attempts.

retryMessages
The messages which would trigger an automatic retry. Each message is tested as a case-insensitive substring of the full error message.
SubType string
Default ["due to concurrent update","Retrying the job may solve the problem","Retrying may solve the problem"]

retryReasons
The reasons which would trigger an automatic retry.
SubType string
Default ["rateLimitExceeded","jobBackendError","backendError","internalError","jobInternalError"]
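
A sketch overriding these retry settings, assuming retryAuto accepts a standard Kestra retry definition (the field names below follow Kestra's exponential retry shape and may differ across versions; the extra message is taken from the defaults above):

```yaml
retryAuto:
  type: exponential   # assumed: standard Kestra exponential retry definition
  interval: PT5S      # first wait of 5 seconds
  maxInterval: PT1M
  maxDuration: PT15M  # stop retrying after 15 minutes
  maxAttempt: 10
retryMessages:
  - "due to concurrent update"
  - "Retrying the job may solve the problem"
```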

scopes
The GCP scopes to be used.
SubType string
Default ["https://www.googleapis.com/auth/cloud-platform"]

serviceAccount
The GCP service account.
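
These authentication properties combine as in the sketch below (the secret name is an illustrative placeholder for a stored JSON service account key):

```yaml
projectId: my_project
serviceAccount: "{{ secret('GCP_SERVICE_ACCOUNT_JSON') }}"   # illustrative secret name
scopes:
  - "https://www.googleapis.com/auth/cloud-platform"
```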

sourceTable
The table to export.

useAvroLogicalTypes
Optional flag used when format is set to "AVRO". Indicates whether to enable extracting applicable column types (such as TIMESTAMP) to their corresponding AVRO logical types (timestamp-micros), instead of only using their raw types (avro-long).
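
For example, a sketch of an Avro export with logical types enabled (task id and paths are illustrative):

```yaml
- id: extract_avro
  type: io.kestra.plugin.gcp.bigquery.ExtractToGcs
  sourceTable: "my_project.my_dataset.my_table"
  destinationUris:
    - "gs://bucket_name/filename.avro"
  format: AVRO
  useAvroLogicalTypes: true   # TIMESTAMP columns become timestamp-micros instead of raw longs
```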

Outputs

destinationUris
The destination URI file.
SubType string

fileCounts
Number of extracted files.
SubType integer

jobId
The job id.

sourceTable
The source table.
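
Downstream tasks can read these values through Kestra's output expressions; a minimal sketch (the task id and message are illustrative, and the Log task type may differ across Kestra versions):

```yaml
- id: log_job
  type: io.kestra.plugin.core.log.Log   # core Log task; adjust to your Kestra version
  message: "Job {{ outputs.extract_to_gcs.jobId }} extracted {{ outputs.extract_to_gcs.fileCounts }} file(s)"
```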

Metrics

duration
The time it took for the job to run.

size
The number of files extracted to GCS.

Retry definitions
Each retry definition shares the same shape: behavior defaults to RETRY_FAILED_TASK (possible values: RETRY_FAILED_TASK, CREATE_NEW_EXECUTION), the attempt count must be at least 1, interval and maximum-duration fields use the duration format, and the warning-on-retry flag defaults to false.