DeleteTable

Delete a BigQuery table or a partition.

```yaml
type: "io.kestra.plugin.gcp.bigquery.DeleteTable"
```

Delete a partition

```yaml
id: gcp_bq_delete_table
namespace: company.team

tasks:
  - id: delete_table
    type: io.kestra.plugin.gcp.bigquery.DeleteTable
    projectId: my-project
    dataset: my-dataset
    table: my-table$20130908
```
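Omitting the `$YYYYMMDD` partition decorator deletes the whole table instead of a single partition. A minimal sketch, reusing the same project, dataset, and table names as the example above:

```yaml
id: gcp_bq_delete_whole_table
namespace: company.team

tasks:
  - id: delete_table
    type: io.kestra.plugin.gcp.bigquery.DeleteTable
    projectId: my-project
    dataset: my-dataset
    table: my-table  # no $YYYYMMDD suffix: the entire table is deleted
```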
Properties

The dataset's user-defined ID.

The table's user-defined ID.

The GCP service account to impersonate.

The geographic location where the dataset should reside.

This property is experimental and might be subject to change or removal.

See Dataset Location

The GCP project ID.

Automatic retry for retryable BigQuery exceptions.

Some exceptions (especially rate-limit errors) are not retried by default by the BigQuery client. To handle this case, we use a transparent retry by default (not the Kestra one): exponential backoff starting at 5 seconds, for a maximum of 15 minutes and ten attempts.

SubType: string
Default: ["due to concurrent update", "Retrying the job may solve the problem", "Retrying may solve the problem"]

The messages which would trigger an automatic retry.

Each message is tested as a case-insensitive substring of the full error message.

SubType: string
Default: ["rateLimitExceeded", "jobBackendError", "backendError", "internalError", "jobInternalError"]

The reasons which would trigger an automatic retry.
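Both lists above can be overridden in the task configuration. A hedged sketch — the property names `retryMessages` and `retryReasons` are assumptions inferred from the descriptions on this page, not confirmed by it:

```yaml
tasks:
  - id: delete_table
    type: io.kestra.plugin.gcp.bigquery.DeleteTable
    projectId: my-project
    dataset: my-dataset
    table: my-table
    # Assumed property names for the retry trigger lists described above
    retryMessages:
      - "due to concurrent update"
      - "Retrying the job may solve the problem"
    retryReasons:
      - "rateLimitExceeded"
      - "backendError"
```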

SubType: string
Default: ["https://www.googleapis.com/auth/cloud-platform"]

The GCP scopes to be used.

The GCP service account.
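The authentication-related properties above might be combined as follows; the property names `serviceAccount` and `scopes` are assumptions inferred from the descriptions, and the secret name is illustrative:

```yaml
tasks:
  - id: delete_table
    type: io.kestra.plugin.gcp.bigquery.DeleteTable
    projectId: my-project
    dataset: my-dataset
    table: my-table
    # Assumed property names for the GCP authentication options described above
    serviceAccount: "{{ secret('GCP_SERVICE_ACCOUNT_JSON') }}"
    scopes:
      - "https://www.googleapis.com/auth/cloud-platform"
```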

Outputs

The dataset's ID.

The project's ID.

The table name.

Format: duration
Default: RETRY_FAILED_TASK
Possible Values: RETRY_FAILED_TASK, CREATE_NEW_EXECUTION
Minimum: >= 1
Format: duration
Default: false
Format: duration
Format: duration