DeleteDataset

Delete a dataset.

type: "io.kestra.plugin.gcp.bigquery.DeleteDataset"
Example — delete a dataset:

```yaml
id: gcp_bq_delete_dataset
namespace: company.team

tasks:
  - id: delete_dataset
    type: io.kestra.plugin.gcp.bigquery.DeleteDataset
    name: "my-dataset"
    deleteContents: true
```
`name`: The dataset's user-defined id.
`deleteContents`: Whether to delete the dataset even if it is non-empty. If not provided, attempting to delete a non-empty dataset results in an exception being thrown.
`impersonatedServiceAccount`: The GCP service account to impersonate.
`location`: The geographic location where the dataset should reside. This property is experimental and may change or be removed. See Dataset Location.
`projectId`: The GCP project ID.
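The authentication and location-related properties above are set on the task like any other property. A minimal sketch — the project id, location value, and secret name below are placeholder assumptions, not values from this reference:

```yaml
id: gcp_bq_delete_dataset_eu
namespace: company.team

tasks:
  - id: delete_dataset
    type: io.kestra.plugin.gcp.bigquery.DeleteDataset
    # Placeholder values - replace with your own project, location, and secret.
    projectId: my-gcp-project
    location: EU
    serviceAccount: "{{ secret('GCP_SERVICE_ACCOUNT_JSON') }}"
    name: "my-dataset"
    deleteContents: true
```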
Automatic retry for retryable BigQuery exceptions. Some exceptions (especially rate-limit errors) are not retried by default by the BigQuery client, so the task applies a transparent retry of its own (separate from Kestra's retry mechanism) to handle these cases. The default policy is exponential, starting at 5 seconds, for a maximum of 15 minutes and ten attempts.
`retryMessages`: The messages which trigger an automatic retry. Each message is tested as a case-insensitive substring of the full exception message.
Default: `["due to concurrent update", "Retrying the job may solve the problem"]`
`retryReasons`: The reasons which trigger an automatic retry.
Default: `["rateLimitExceeded", "jobBackendError", "internalError", "jobInternalError"]`
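Both trigger lists can be overridden on the task to broaden or narrow the automatic retry. A sketch — the values shown are illustrative assumptions, not additional defaults:

```yaml
- id: delete_dataset
  type: io.kestra.plugin.gcp.bigquery.DeleteDataset
  name: "my-dataset"
  # Illustrative overrides - retries fire only on these reasons/messages.
  retryReasons:
    - rateLimitExceeded
    - internalError
  retryMessages:
    - "due to concurrent update"
```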
`scopes`: The GCP scopes to be used.
Default: `["https://www.googleapis.com/auth/cloud-platform"]`
`serviceAccount`: The GCP service account.
Output — `dataset`: The dataset's user-defined id.
Definitions — retry policies (`constant`, `random`, `exponential`). Each policy shares the common retry properties and adds its own interval settings:

- `behavior`: `RETRY_FAILED_TASK` (default) or `CREATE_NEW_EXECUTION`
- `maxAttempt`: integer, `>= 1`
- `maxDuration`: duration
- `warningOnRetry`: boolean, default `false`
- `type: constant` — a fixed `interval` (duration) between attempts
- `type: random` — an interval drawn between `minInterval` and `maxInterval` (durations)
- `type: exponential` — an `interval` (duration) that grows exponentially, up to `maxInterval` (duration)
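These are Kestra's standard task retry definitions, so a policy can be attached to the task via its `retry` property. A sketch, with illustrative values (ISO 8601 durations assumed, as elsewhere in Kestra):

```yaml
- id: delete_dataset
  type: io.kestra.plugin.gcp.bigquery.DeleteDataset
  name: "my-dataset"
  deleteContents: true
  # Illustrative retry policy: exponential backoff from 5s, up to 10 attempts.
  retry:
    type: exponential
    interval: PT5S
    maxInterval: PT1M
    maxDuration: PT15M
    maxAttempt: 10
    behavior: RETRY_FAILED_TASK
```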