DeletePartitions
Delete partitions between an interval.
type: "io.kestra.plugin.gcp.bigquery.DeletePartitions"
id: gcp_bq_delete_partitions
namespace: company.team

tasks:
  - id: delete_partitions
    type: io.kestra.plugin.gcp.bigquery.DeletePartitions
    projectId: my-project
    dataset: my-dataset
    table: my-table
    partitionType: DAY
    from: "{{ now() | dateAdd(-30, 'DAYS') }}"
    to: "{{ now() | dateAdd(-7, 'DAYS') }}"
Properties

dataset
Dynamic: YES
The dataset's user-defined ID.

from
Dynamic: YES
The inclusive starting date or integer. If the partition is a numeric range, it must be a valid integer; if it is a date, it must be a valid datetime such as {{ now() }}.

partitionType
Dynamic: YES
Possible values: DAY, HOUR, MONTH, YEAR, RANGE
The partition type of the table (see the RANGE sketch after this list).

table
Dynamic: YES
The table's user-defined ID.

to
Dynamic: YES
The inclusive ending date or integer. If the partition is a numeric range, it must be a valid integer; if it is a date, it must be a valid datetime such as {{ now() }}.

impersonatedServiceAccount
Dynamic: YES
The GCP service account to impersonate.

location
Dynamic: YES
The geographic location where the dataset should reside. This property is experimental and might be subject to change or be removed. See Dataset Location.

projectId
Dynamic: YES
The GCP project ID.

retryMessages
Dynamic: YES
Default: ["due to concurrent update", "Retrying the job may solve the problem"]
The messages which would trigger an automatic retry. Each message is tested as a case-insensitive substring of the full error message.

retryReasons
Dynamic: YES
Default: ["rateLimitExceeded", "jobBackendError", "internalError", "jobInternalError"]
The reasons which would trigger an automatic retry.

scopes
Dynamic: YES
Default: ["https://www.googleapis.com/auth/cloud-platform"]
The GCP scopes to be used.

serviceAccount
Dynamic: YES
The GCP service account.
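For range-partitioned tables, from and to take integer bounds rather than dates. A minimal sketch, assuming a table that is range-partitioned on an integer column and a service account key stored in a Kestra secret named GCP_SA (the table and secret names are illustrative, not part of the original example):

  - id: delete_range_partitions
    type: io.kestra.plugin.gcp.bigquery.DeletePartitions
    projectId: my-project
    dataset: my-dataset
    table: my-range-table
    # service account key read from a secret; name is an assumption for this sketch
    serviceAccount: "{{ secret('GCP_SA') }}"
    partitionType: RANGE
    from: "0"
    to: "100"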
Outputs

- The dataset's ID
- The partitions deleted
- The project's ID
- The table name
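Downstream tasks can read these values through Kestra's outputs expression. A minimal sketch, assuming the list of deleted partitions is exposed under an output field named partitions (the field name is an assumption, not confirmed by this page):

  - id: log_deleted_partitions
    type: io.kestra.plugin.core.log.Log
    # 'partitions' is assumed to be the output field holding the deleted partition list
    message: "Deleted partitions: {{ outputs.delete_partitions.partitions }}"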
Definitions

Constant
A retry policy with a fixed interval between attempts. All fields are non-dynamic.
- type: constant
- behavior: default RETRY_FAILED_TASK; possible values: RETRY_FAILED_TASK, CREATE_NEW_EXECUTION
- maxAttempt: integer, must be >= 1
- interval: duration
- maxDuration: duration
- warningOnRetry: default false

Random
A retry policy with a random interval between attempts. All fields are non-dynamic.
- type: random
- behavior: default RETRY_FAILED_TASK; possible values: RETRY_FAILED_TASK, CREATE_NEW_EXECUTION
- maxAttempt: integer, must be >= 1
- minInterval: duration
- maxInterval: duration
- maxDuration: duration
- warningOnRetry: default false

Exponential
A retry policy with an exponentially growing interval between attempts. All fields are non-dynamic.
- type: exponential
- behavior: default RETRY_FAILED_TASK; possible values: RETRY_FAILED_TASK, CREATE_NEW_EXECUTION
- maxAttempt: integer, must be >= 1
- interval: duration
- maxInterval: duration
- maxDuration: duration
- delayFactor: number
- warningOnRetry: default false
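A minimal sketch of attaching one of these retry policies to the task, assuming Kestra's standard task-level retry property; the interval and attempt values are illustrative:

  - id: delete_partitions_with_retry
    type: io.kestra.plugin.gcp.bigquery.DeletePartitions
    projectId: my-project
    dataset: my-dataset
    table: my-table
    partitionType: DAY
    from: "{{ now() | dateAdd(-30, 'DAYS') }}"
    to: "{{ now() | dateAdd(-7, 'DAYS') }}"
    retry:
      # constant retry: wait 30 seconds between up to 3 attempts
      type: constant
      interval: PT30S
      maxAttempt: 3
      behavior: RETRY_FAILED_TASK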