Available on: Enterprise Edition >= 0.18.0

Run tasks as containers on Google Cloud Run.

How to use the Google Cloud Run task runner

This runner will deploy the container for the task as a job on Google Cloud Run.

How the Google Cloud Run task runner works

To use inputFiles, outputFiles or namespaceFiles properties, make sure to set the bucket property. The bucket serves as an intermediary storage layer for the task runner. Input and namespace files will be uploaded to the cloud storage bucket before the task run.

Similarly, the task runner will store outputFiles in this bucket during the task run. When the run finishes, the task runner makes those files available for download and preview in the UI by sending them to Kestra's internal storage.
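For example, once a task declares outputFiles, a downstream task can reference the resulting files through the standard outputs expression. The sketch below assumes a task id of shell and an output file named out.txt; the project, region, and bucket values are placeholders:

```yaml
tasks:
  - id: shell
    type: io.kestra.plugin.scripts.shell.Commands
    taskRunner:
      type: io.kestra.plugin.ee.gcp.runner.CloudRun
      projectId: myProjectId              # placeholder
      region: europe-west2                # placeholder
      bucket: myBucketName                # placeholder
      serviceAccount: "{{ secret('GOOGLE_SA') }}"
    outputFiles:
      - out.txt
    commands:
      - echo "hello" > {{ workingDir }}/out.txt

  - id: log
    type: io.kestra.plugin.core.log.Log
    # out.txt is now in internal storage and addressable from other tasks
    message: "{{ outputs.shell.outputFiles['out.txt'] }}"
```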

To make it easier to track where all files are stored, the task runner will generate a folder for each task run. You can access that folder using the {{ bucketPath }} Pebble expression or the BUCKET_PATH environment variable.
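To illustrate, a command can print that per-task-run folder using either the Pebble expression or the environment variable. This is a minimal sketch; the project, region, and bucket values are placeholders:

```yaml
tasks:
  - id: show_bucket_path
    type: io.kestra.plugin.scripts.shell.Commands
    taskRunner:
      type: io.kestra.plugin.ee.gcp.runner.CloudRun
      projectId: myProjectId        # placeholder
      region: europe-west2          # placeholder
      bucket: myBucketName          # placeholder
      serviceAccount: "{{ secret('GOOGLE_SA') }}"
    commands:
      # Both commands print the same per-task-run folder in the bucket
      - echo "Pebble expression: {{ bucketPath }}"
      - echo "Environment variable: $BUCKET_PATH"
```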

Due to the ephemeral nature of Cloud Run, the task runner will not run the task in the working directory but in the root directory. Therefore, you have to use the {{ workingDir }} Pebble expression or the WORKING_DIR environment variable to access the inputFiles and namespaceFiles from the task's working directory.
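In practice this means prefixing file paths inside your commands. Both forms below point at the same directory (a sketch, assuming an input file named data.txt):

```yaml
commands:
  # Wrong on Cloud Run: data.txt is not in the current (root) directory
  # - cat data.txt
  # Correct: prefix the path with the working directory
  - cat {{ workingDir }}/data.txt
  - cat $WORKING_DIR/data.txt
```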

A full flow example

The following example runs a simple Shell command in a Cloud Run container:

```yaml
id: new-shell
namespace: company.team

variables:
  projectId: myProjectId
  region: europe-west2

tasks:
  - id: shell
    type: io.kestra.plugin.scripts.shell.Commands
    taskRunner:
      type: io.kestra.plugin.ee.gcp.runner.CloudRun
      projectId: "{{ vars.projectId }}"
      region: "{{ vars.region }}"
      serviceAccount: "{{ secret('GOOGLE_SA') }}"
    commands:
      - echo "Hello World"
```

The following example runs a Shell command in a Cloud Run container and passes input files to the task:

```yaml
id: new-shell-with-file
namespace: company.team

variables:
  projectId: myProjectId
  region: europe-west2
  bucket: myBucketName

inputs:
  - id: file
    type: FILE

tasks:
  - id: shell
    type: io.kestra.plugin.scripts.shell.Commands
    inputFiles:
      data.txt: "{{ inputs.file }}"
    outputFiles:
      - out.txt
    containerImage: centos
    taskRunner:
      type: io.kestra.plugin.ee.gcp.runner.CloudRun
      projectId: "{{ vars.projectId }}"
      region: "{{ vars.region }}"
      bucket: "{{ vars.bucket }}"
      serviceAccount: "{{ secret('GOOGLE_SA') }}"
    commands:
      - cp {{ workingDir }}/data.txt {{ workingDir }}/out.txt
```

How to Run Tasks on Google Cloud Run

Before you begin

Before you start, you need to have the following:

  1. A Google Cloud account.
  2. A Kestra instance running version 0.16.0 or later, with Google credentials stored as secrets or environment variables within the Kestra instance.

Google Cloud Console Setup

Create a project

If you don't already have a project, create one with a name of your choice.

project

Once you've done this, make sure your project is selected in the menu bar.

project_selection

Enable Cloud Run Admin API

In the search bar, search for and select APIs & Services. Select Enable APIs and Services, search for Cloud Run Admin API, and click Enable.

batchapi

Create the Service Account

Now that the Cloud Run Admin API is enabled, we can create credentials so we can access GCP directly from Kestra.

In the search bar, search and select Service Accounts. Now select Create Service Account.

After you've selected this, you'll need to give your service account a name. Choose something memorable, as we'll enter it into Kestra later.

sa-1

Once you've given it a name, make sure to select the following roles:

  • Cloud Run Developer
  • Logs Viewer
  • Storage Admin (to upload files to GCS and download files from GCS)

roles

Check out this guide on how to add your service account into Kestra as a secret.

We'll also need to make sure our service account can access the Compute Engine default service account so it can create jobs.

To do this, go to IAM & Admin, then Service Accounts. Select the Compute Engine default service account, open Permissions, and select Grant Access. Add your original service account with the Service Account User role, then select Save.

compute

Create Bucket

Head to the search bar and type "Buckets" to find Cloud Storage, then create a new bucket. You'll be prompted to set a name, region, and various other settings; for now, leave them all at their defaults.

bucket

Creating our Flow

Below is an example flow that runs a shell command to copy a file to a new file name using the GCP Cloud Run task runner. At the top of the io.kestra.plugin.scripts.shell.Commands task are the properties defining the task runner:

```yaml
containerImage: centos
taskRunner:
  type: io.kestra.plugin.ee.gcp.runner.CloudRun
  projectId: "{{ secret('GCP_PROJECT_ID') }}"
  region: "{{ vars.region }}"
  bucket: "{{ secret('GCP_BUCKET') }}"
  serviceAccount: "{{ secret('GOOGLE_SA') }}"
```

This is where we enter the details for GCP: the projectId, region, and bucket, as well as the serviceAccount. We can add all of these as secrets.

The containerImage property is required because Cloud Run executes tasks as containers. You can use any image from a public or private registry. In this example, we are going to use centos.

```yaml
id: new-shell-with-file
namespace: company.team

variables:
  projectId: myProjectId
  region: europe-west2

inputs:
  - id: file
    type: FILE

tasks:
  - id: shell
    type: io.kestra.plugin.scripts.shell.Commands
    inputFiles:
      data.txt: "{{ inputs.file }}"
    outputFiles:
      - out.txt
    containerImage: centos
    taskRunner:
      type: io.kestra.plugin.ee.gcp.runner.CloudRun
      projectId: "{{ secret('GCP_PROJECT_ID') }}"
      region: "{{ vars.region }}"
      bucket: "{{ secret('GCP_BUCKET') }}"
      serviceAccount: "{{ secret('GOOGLE_SA') }}"
    commands:
      - cp {{ workingDir }}/data.txt {{ workingDir }}/out.txt
```

When we press Execute, we can see in the Logs that our task runner has been created.

logs

We can also go to the GCP Console and see our task runner has been created:

jobs

Once the task has completed, the Cloud Run job shuts down automatically on Google Cloud.
