Use Dagger with GitHub Actions and Google Cloud

note

Watch a live demo of this tutorial in the Dagger Community Call (12 Jan 2023). For more demos, join the next Dagger Community Call.

Introduction

This tutorial teaches you how to use a Dagger pipeline to continuously build and deploy a Node.js application with GitHub Actions on Google Cloud Run. You will learn how to:

  • Configure a Google Cloud service account and assign it the correct roles
  • Create a Google Cloud Run service accessible at a public URL
  • Create a Dagger pipeline using a Dagger SDK
  • Run the Dagger pipeline on your local host to manually build and deploy the application on Google Cloud Run
  • Use the same Dagger pipeline with GitHub Actions to automatically build and deploy the application on Google Cloud Run on every repository commit

Requirements

This tutorial assumes that:

  • You have a basic understanding of the Go programming language
  • You have a Go development environment and the Dagger CLI installed on the local host
  • You have Docker installed and running on the local host
  • You have the gcloud CLI installed on the local host
  • You have a Google Cloud account and a Google Cloud project
  • You have a GitHub account and a GitHub repository containing a Node.js Web application. If not, follow the steps in Appendix A to create one.

Step 1: Create a Google Cloud service account

The Dagger pipeline demonstrated in this tutorial (re)builds a container image of an application every time a new commit is added to the application's repository. It then publishes the container image to Google Container Registry and deploys it at a public URL using Google Cloud infrastructure.

This requires the following:

  • A Google Cloud service account with all necessary privileges
  • A Google Cloud Run service with a public URL and defined resource/capacity/access rules
  • Access to various Google Cloud APIs
info

This step discusses how to create a Google Cloud service account. If you already have a Google Cloud service account and key for your project, skip to Step 2.

Create a Google Cloud service account, as follows:

  1. Log in to the Google Cloud Console and select your project.

  2. From the navigation menu, click IAM & Admin -> Service Accounts.

  3. Click Create Service Account.

  4. In the Service account details section, enter a string in the Service account ID field. This string forms the prefix of the unique service account email address.

    Create Google Cloud service account

  5. Click Create and Continue.

  6. In the Grant this service account access to project section, select the Service Account Token Creator and Editor roles.

    Create Google Cloud service account roles

  7. Click Continue.

  8. Click Done.

Once the service account is created, the Google Cloud Console displays it in the service account list, as shown below. Note the service account email address, as you will need it in the next step.

List Google Cloud service accounts

Next, create a JSON key for the service account as follows:

  1. From the navigation menu, click IAM & Admin -> Service Accounts.
  2. Click the newly-created service account in the list of service accounts.
  3. Click the Keys tab on the service account detail page.
  4. Click Add Key -> Create new key.
  5. Select the JSON key type.
  6. Click Create.

The key is created and automatically downloaded to your local host through your browser as a JSON file.

Create Google Cloud service account key

warning

Store the JSON service account key file safely as it cannot be retrieved again.
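
If you want to confirm that the downloaded key file is valid before using it, you can parse it locally with Google's OAuth2 library for Go. The short sketch below is optional and not part of the tutorial pipeline; the file name service-account-key.json is illustrative, so adjust it to wherever your browser saved the key.

    package main

    import (
        "context"
        "fmt"
        "os"

        "golang.org/x/oauth2/google"
    )

    func main() {
        // read the downloaded service account key file (adjust the path as needed)
        data, err := os.ReadFile("service-account-key.json")
        if err != nil {
            panic(err)
        }

        // parse the key, requesting the broad cloud-platform scope
        creds, err := google.CredentialsFromJSON(context.Background(), data,
            "https://www.googleapis.com/auth/cloud-platform")
        if err != nil {
            panic(err)
        }

        fmt.Println("Key parsed successfully for project:", creds.ProjectID)
    }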

Step 2: Configure Google Cloud APIs and a Google Cloud Run service

The next step is to enable access to the required Google Cloud APIs:

  1. From the navigation menu, select the APIs & Services -> Enabled APIs & services option.

  2. Select the Enable APIs and Services option.

  3. On the API Library page, search for and select the Cloud Run API entry.

  4. On the API detail page, click Enable.

    Enable Google Cloud API

  5. Repeat the previous two steps for the IAM Service Account Credentials API.

Once the APIs are enabled, the Google Cloud Console displays the updated status of the APIs.

The final step is to create a Google Cloud Run service and corresponding public URL endpoint. This service will eventually host the container deployed by the Dagger pipeline.

  1. From the navigation menu, select the Serverless -> Cloud Run product.

  2. Select the Create Service option.

  3. Select the Deploy one revision from an existing container image option. Click Test with a sample container to have a container image URL pre-filled.

  4. Continue configuring the service with the following inputs:

    • Service name: myapp (modify as needed)
    • Region: us-central1 (modify as needed)
    • CPU allocation and pricing: CPU is only allocated during request processing
    • Minimum number of instances: 0 (modify as needed)
    • Maximum number of instances: 1 (modify as needed)
    • Ingress: Allow all traffic
    • Authentication: Allow unauthenticated invocations

    Create Google Cloud Run service

  5. Click Create to create the service.

The new service is created. The Google Cloud Console displays the service details, including its public URL, on the service detail page, as shown below.

View Google Cloud Run service details
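
If you prefer to confirm the new service programmatically, the same Google Cloud Run client library used by the pipeline in Step 3 can fetch the service and print its public URL. The sketch below is optional; it assumes that your service account credentials are available via the GOOGLE_APPLICATION_CREDENTIALS environment variable (configured in Step 4) and that PROJECT, the region and the service name match the values you chose above.

    package main

    import (
        "context"
        "fmt"

        run "cloud.google.com/go/run/apiv2"
        runpb "cloud.google.com/go/run/apiv2/runpb"
    )

    func main() {
        ctx := context.Background()

        // create a Cloud Run client; credentials are resolved from
        // GOOGLE_APPLICATION_CREDENTIALS
        client, err := run.NewServicesClient(ctx)
        if err != nil {
            panic(err)
        }
        defer client.Close()

        // fetch the service created above; replace PROJECT with your project ID
        svc, err := client.GetService(ctx, &runpb.GetServiceRequest{
            Name: "projects/PROJECT/locations/us-central1/services/myapp",
        })
        if err != nil {
            panic(err)
        }

        fmt.Println("Service URL:", svc.Uri)
    }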

Step 3: Create the Dagger pipeline

The next step is to create a Dagger pipeline to do the heavy lifting: build a container image of the application, release it to Google Container Registry and deploy it on Google Cloud Run.

  1. In the application directory, install the Dagger Go SDK and the Google Cloud Run client library as dependencies:

    go get dagger.io/dagger@latest
    go get cloud.google.com/go/run/apiv2
  2. Create a new sub-directory named ci. Within the ci directory, create a file named main.go and add the following code to it. Replace the PROJECT placeholder with your Google Cloud project identifier and adjust the region (us-central1) and service name (myapp) if you specified different values when creating the Google Cloud Run service in Step 2.

    package main

    import (
        "context"
        "fmt"
        "os"

        run "cloud.google.com/go/run/apiv2"
        runpb "cloud.google.com/go/run/apiv2/runpb"
        "dagger.io/dagger"
    )

    const GCR_SERVICE_URL = "projects/PROJECT/locations/us-central1/services/myapp"
    const GCR_PUBLISH_ADDRESS = "gcr.io/PROJECT/myapp"

    func main() {
        // create Dagger client
        ctx := context.Background()
        daggerClient, err := dagger.Connect(ctx, dagger.WithLogOutput(os.Stderr))
        if err != nil {
            panic(err)
        }
        defer daggerClient.Close()

        // get working directory on host
        source := daggerClient.Host().Directory(".", dagger.HostDirectoryOpts{
            Exclude: []string{"ci", "node_modules"},
        })

        // build application
        node := daggerClient.Container(dagger.ContainerOpts{Platform: "linux/amd64"}).
            From("node:16")

        c := node.
            WithDirectory("/src", source).
            WithWorkdir("/src").
            WithExec([]string{"cp", "-R", ".", "/home/node"}).
            WithWorkdir("/home/node").
            WithExec([]string{"npm", "install"}).
            WithEntrypoint([]string{"npm", "start"})

        // publish container to Google Container Registry
        addr, err := c.Publish(ctx, GCR_PUBLISH_ADDRESS)
        if err != nil {
            panic(err)
        }

        // print ref
        fmt.Println("Published at:", addr)

        // create Google Cloud Run client
        gcrClient, err := run.NewServicesClient(ctx)
        if err != nil {
            panic(err)
        }
        defer gcrClient.Close()

        // define service request
        gcrRequest := &runpb.UpdateServiceRequest{
            Service: &runpb.Service{
                Name: GCR_SERVICE_URL,
                Template: &runpb.RevisionTemplate{
                    Containers: []*runpb.Container{
                        {
                            Image: addr,
                            Ports: []*runpb.ContainerPort{
                                {
                                    Name:          "http1",
                                    ContainerPort: 1323,
                                },
                            },
                        },
                    },
                },
            },
        }

        // update service
        gcrOperation, err := gcrClient.UpdateService(ctx, gcrRequest)
        if err != nil {
            panic(err)
        }

        // wait for service request completion
        gcrResponse, err := gcrOperation.Wait(ctx)
        if err != nil {
            panic(err)
        }

        // print ref
        fmt.Println("Deployment for image", addr, "now available at", gcrResponse.Uri)
    }

    This file performs the following operations:

    • It imports the Dagger and Google Cloud Run client libraries.
    • It creates a Dagger client with Connect(). This client provides an interface for executing commands against the Dagger engine.
    • It uses the client's Host().Directory() method to obtain a reference to the current directory on the host, excluding the node_modules and ci directories. This reference is stored in the source variable.
    • It uses the client's Container().From() method to initialize a new container from a base image. The additional Platform argument to the Container() method instructs Dagger to build for a specific architecture. In this example, the base image is the node:16 image and the architecture is linux/amd64, which is one of the architectures supported by Google Cloud. This method returns a Container representing an OCI-compatible container image.
    • It uses the previous Container object's WithDirectory() method to return the container image with the host directory written at the /src path, and the WithWorkdir() method to set the working directory in the container.
    • It chains the WithExec() method to copy the contents of the working directory to the /home/node directory in the container and then uses the WithWorkdir() method to change the working directory in the container to /home/node.
    • It chains the WithExec() method again to install dependencies with npm install and sets the container entrypoint using the WithEntrypoint() method.
    • It uses the container object's Publish() method to publish the container to Google Container Registry, and prints the SHA identifier of the published image.
    • It creates a Google Cloud Run client, updates the Google Cloud Run service defined in Step 2 to use the published container image, and requests a service update.
  3. Run the following command to update go.sum:

    go mod tidy
tip

Most Container object methods return a revised Container object representing the new state of the container. This makes it easy to chain methods together. Dagger evaluates pipelines "lazily", so the chained operations are only executed when required - in this case, when the container is published. Learn more about lazy evaluation in Dagger.
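
To see this lazy behaviour in isolation, here is a minimal, self-contained sketch that is separate from the tutorial pipeline: the chained calls only describe the container operations, and nothing executes until a result such as Stdout() is requested.

    package main

    import (
        "context"
        "fmt"
        "os"

        "dagger.io/dagger"
    )

    func main() {
        ctx := context.Background()
        client, err := dagger.Connect(ctx, dagger.WithLogOutput(os.Stderr))
        if err != nil {
            panic(err)
        }
        defer client.Close()

        // chaining only builds a description of the pipeline; nothing runs yet
        c := client.Container().
            From("alpine:3.18").
            WithExec([]string{"echo", "hello from Dagger"})

        // requesting a result triggers evaluation of the whole chain
        out, err := c.Stdout(ctx)
        if err != nil {
            panic(err)
        }
        fmt.Print(out)
    }

Because evaluation is deferred until a result is requested, Dagger can skip or cache work that is not needed for that result.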

Step 4: Test the Dagger pipeline on the local host

Configure credentials for the Google Cloud SDK on the local host, as follows:

  1. Configure Docker credentials for Google Container Registry on the local host using the following commands. Replace the SERVICE-ACCOUNT-ID placeholder with the service account email address created in Step 1, and the SERVICE-ACCOUNT-KEY-FILE placeholder with the location of the service account JSON key file downloaded in Step 1.

    gcloud auth activate-service-account SERVICE-ACCOUNT-ID --key-file=SERVICE-ACCOUNT-KEY-FILE
    gcloud auth configure-docker
    info

    This step is necessary because Dagger relies on the host's Docker credentials and authorizations when publishing to remote registries. (A sketch after this list shows an optional way to pass registry credentials to Dagger explicitly instead.)

  2. Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the location of the service account JSON key file, replacing the SERVICE-ACCOUNT-KEY-FILE placeholder in the following command. This variable is used by the Google Cloud Run client library during the client authentication process.

    export GOOGLE_APPLICATION_CREDENTIALS=SERVICE-ACCOUNT-KEY-FILE
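
As noted in the info box above, Dagger defaults to the Docker credentials already configured on the host when it publishes an image. The Dagger Go SDK also lets a pipeline pass registry credentials explicitly, using a Dagger secret together with the container's WithRegistryAuth() method. The fragment below is an optional, hypothetical variation of the Publish() call in ci/main.go from Step 3; GCR_ACCESS_TOKEN is an assumed environment variable that would need to contain a valid OAuth access token (for example, the output of gcloud auth print-access-token).

    // optional variation of the Publish() call in ci/main.go:
    // pass registry credentials to Dagger explicitly instead of
    // relying on the host's Docker credential store

    // GCR_ACCESS_TOKEN is a hypothetical environment variable holding
    // an OAuth access token for Google Container Registry
    token := daggerClient.SetSecret("gcr-token", os.Getenv("GCR_ACCESS_TOKEN"))

    addr, err := c.
        WithRegistryAuth("gcr.io", "oauth2accesstoken", token).
        Publish(ctx, GCR_PUBLISH_ADDRESS)
    if err != nil {
        panic(err)
    }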

Once credentials are configured, test the Dagger pipeline by running the command below:

dagger run go run ci/main.go

Dagger performs the operations defined in the pipeline script, logging each operation to the console. At the end of the process, the built container is deployed to Google Cloud Run and a message similar to the one below appears in the console output:

Deployment for image gcr.io/PROJECT/myapp@sha256:b1cf... now available at https://...run.app

Browse to the URL shown in the deployment message to see the running application.

If you deployed the example application from Appendix A, you should see a page similar to that shown below:

Result of running pipeline from local host

Step 5: Create a GitHub Actions workflow

Dagger executes your pipelines entirely as standard OCI containers. This means that the same pipeline runs consistently, whether on your local machine or a remote server.

This also means that it's very easy to move your Dagger pipeline from your local host to GitHub Actions - all that's needed is to commit and push the pipeline script from your local clone to your GitHub repository, and then define a GitHub Actions workflow to run it on every commit.

  1. Commit and push the pipeline script and related changes to the application's GitHub repository:

    git add .
    git commit -a -m "Added pipeline"
    git push
  2. In the GitHub repository, create a new workflow file at .github/workflows/main.yml with the following content:

    name: 'ci'

    on:
      push:
        branches:
          - master

    jobs:
      dagger:
        runs-on: ubuntu-latest
        steps:
          -
            name: Checkout
            uses: actions/checkout@v3
          -
            id: 'auth'
            name: 'Authenticate to Google Cloud'
            uses: 'google-github-actions/auth@v1'
            with:
              token_format: 'access_token'
              credentials_json: '${{ secrets.GOOGLE_CREDENTIALS }}'
          -
            name: Login to Google Container Registry
            uses: docker/login-action@v2
            with:
              registry: gcr.io
              username: oauth2accesstoken
              password: ${{ steps.auth.outputs.access_token }}
          -
            name: Setup Go
            uses: actions/setup-go@v4
            with:
              go-version: '>=1.20'
          -
            name: Install
            run: go get dagger.io/dagger@latest cloud.google.com/go/run/apiv2
          -
            name: Install Dagger CLI
            run: cd /usr/local && { curl -L https://dl.dagger.io/dagger/install.sh | sh; cd -; }
          -
            name: Release and deploy with Dagger
            run: dagger run go run ci/main.go

    This workflow runs on every commit to the repository master branch. It consists of a single job with seven steps, as below:

    • The first step uses the Checkout action to check out the latest source code from the master branch to the GitHub runner.
    • The second step uses the Authenticate to Google Cloud action to authenticate to Google Cloud. It requires a service account key in JSON format, which it expects to find in the GOOGLE_CREDENTIALS GitHub secret. This step sets various environment variables (including the GOOGLE_APPLICATION_CREDENTIALS variable required by the Google Cloud Run SDK) and returns an access token as output, which is used to authenticate the next step.
    • The third step uses the Docker Login action and the access token from the previous step to authenticate to Google Container Registry from the GitHub runner. This is necessary because Dagger relies on the host's Docker credentials and authorizations when publishing to remote registries.
    • The fourth and fifth steps download and install the programming language and required dependencies (such as the Dagger SDK and the Google Cloud Run SDK) on the GitHub runner.
    • The sixth step downloads and installs the Dagger CLI on the GitHub runner.
    • The seventh and final step executes the Dagger pipeline.

The Authenticate to Google Cloud action looks for a JSON service account key in the GOOGLE_CREDENTIALS GitHub secret. Create this secret as follows:

  1. Navigate to the Settings -> Secrets -> Actions page in the GitHub Web interface.
  2. Click New repository secret to create a new secret.
  3. Configure the secret with the following inputs:
    • Name: GOOGLE_CREDENTIALS
    • Secret: The contents of the service account JSON key file downloaded in Step 1.
  4. Click Add secret to save the secret.

Create GitHub secret

Step 6: Test the Dagger pipeline on GitHub

Test the Dagger pipeline by committing a change to the GitHub repository.

If you are using the example application described in Appendix A, the following commands modify and commit a simple change to the application's index page:

git pull
sed -i 's/Dagger/Dagger on GitHub/g' routes/index.js
git add routes/index.js
git commit -a -m "Update welcome message"
git push

The commit triggers the GitHub Actions workflow defined in Step 5. The workflow runs the various steps of the dagger job, including the pipeline script.

At the end of the process, a new version of the built container image is released to Google Container Registry and deployed on Google Cloud Run. A message similar to the one below appears in the GitHub Actions log:

Deployment for image gcr.io/PROJECT/myapp@sha256:h4si... now available at https://...run.app

Browse to the URL shown in the deployment message to see the running application. If you deployed the example application with the additional modification above, you see a page similar to that shown below:

Result of running pipeline from GitHub

Conclusion

This tutorial walked you through the process of creating a Dagger pipeline to continuously build and deploy a Node.js application on Google Cloud Run. It used the Dagger Go SDK and explained the key concepts, objects and methods available in the SDK to construct a Dagger pipeline.

Dagger executes your pipelines entirely as standard OCI containers. This means that pipelines can be tested and debugged locally, and that the same pipeline will run consistently on your local machine, a CI runner, a dedicated server, or any container hosting service. This portability is one of Dagger's key advantages, and this tutorial demonstrated it in action by using the same pipeline on the local host and on GitHub.

Use the API Key Concepts page and the Go, Node.js and Python SDK References to learn more about Dagger.

Appendix A: Create a GitHub repository with an example Express application

This tutorial assumes that you have a GitHub repository with a Node.js Web application. If not, follow the steps below to create a GitHub repository and commit an example Express application to it.

  1. Log in to GitHub using the GitHub CLI:

    gh auth login
  2. Create a directory for the Express application:

    mkdir myapp
    cd myapp
  3. Create a skeleton Express application:

    npx express-generator
  4. Make a minor modification to the application's index page:

    sed -i -e 's/Express/Dagger/g' routes/index.js
  5. Initialize a local Git repository for the application:

    git init
  6. Add a .gitignore file and commit the application code:

    echo node_modules >> .gitignore
    git add .
    git commit -a -m "Initial commit"
  7. Create a private repository in your GitHub account and push the changes to it:

    gh repo create myapp --push --source . --private