Use Dagger with GitHub Actions and Google Cloud

Introduction

This guide teaches you how to create and use a custom Dagger Module to continuously build and deploy a Node.js application with GitHub Actions on Google Cloud Run. You will learn how to:

  • Configure a Google Cloud service account and assign it the correct roles
  • Create a Google Cloud Run service accessible at a public URL
  • Create a Dagger module for your pipeline
  • Run the Dagger pipeline on your local host to manually build and deploy the application on Google Cloud Run
  • Use the same Dagger pipeline with GitHub Actions to automatically build and deploy the application on Google Cloud Run on every repository commit

Requirements

This guide assumes that:

  • You have a GitHub account and a GitHub repository containing a Node.js Web application. If not, follow the steps in Appendix A to create one.
  • You have a Google Cloud account and a Google Cloud project.
  • You have the Dagger CLI installed, and a container runtime such as Docker running, on your local host.

Step 1: Create a Google Cloud service account

The Dagger pipeline demonstrated in this tutorial (re)builds a container image of an application every time a new commit is added to the application's repository. It then publishes the container image to Google Artifact Registry and deploys it at a public URL using Google Cloud infrastructure.

This requires the following:

  • A Google Cloud service account with all necessary privileges
  • A Google Cloud Run service with a public URL and defined resource/capacity/access rules
  • A Google Artifact Registry repository
  • Access to various Google Cloud APIs
info

This step discusses how to create a Google Cloud service account. If you already have a Google Cloud service account and key for your project, you may skip this step.

Create a Google Cloud service account, as follows:

  1. Log in to the Google Cloud Console and select your project.

  2. From the navigation menu, click IAM & Admin -> Service Accounts.

  3. Click Create Service Account.

  4. In the Service account details section, enter a string in the Service account ID field. This string forms the prefix of the unique service account email address.

    Create Google Cloud service account

  5. Click Create and Continue.

  6. In the Grant this service account access to project section, select the Service Account Token Creator and Editor roles.

    Create Google Cloud service account roles

  7. Click Continue.

  8. Click Done.

Once the service account is created, the Google Cloud Console displays it in the service account list, as shown below. Note the service account email address, as you will need it in the next step.

List Google Cloud service accounts
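
If you prefer working from the command line, the gcloud commands below are a rough equivalent of the console steps above. The service account name my-service-account and the project ID myproject are placeholders; substitute your own values.

# create the service account (my-service-account is a placeholder)
gcloud iam service-accounts create my-service-account --project=myproject

# grant the Editor and Service Account Token Creator roles
gcloud projects add-iam-policy-binding myproject \
  --member="serviceAccount:my-service-account@myproject.iam.gserviceaccount.com" \
  --role="roles/editor"
gcloud projects add-iam-policy-binding myproject \
  --member="serviceAccount:my-service-account@myproject.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountTokenCreator"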

Next, create a JSON key for the service account as follows:

  1. From the navigation menu, click IAM & Admin -> Service Accounts.
  2. Click the newly-created service account in the list of service accounts.
  3. Click the Keys tab on the service account detail page.
  4. Click Add Key -> Create new key.
  5. Select the JSON key type.
  6. Click Create.

The key file is created and automatically downloaded to your local host through your browser as a JSON file.

Create Google Cloud service account key

warning

Store the JSON service account key file safely as it cannot be retrieved again.
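
If you created the service account with the gcloud CLI, you can also generate the JSON key from the command line; this sketch assumes the same placeholder account and project names used above.

gcloud iam service-accounts keys create service-account-key.json \
  --iam-account=my-service-account@myproject.iam.gserviceaccount.com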

Step 2: Configure Google Cloud APIs

info

This step discusses how to enable various Google Cloud APIs. If you already have these APIs enabled, you may skip this step.

The next step is to enable access to the required Google Cloud APIs:

  1. From the navigation menu, select the APIs & Services -> Enabled APIs & services option.

  2. Select the Enable APIs and Services option.

  3. On the API Library page, search for and select the Cloud Run API entry.

  4. On the API detail page, click Enable.

    Enable Google Cloud API

  5. Repeat the previous two steps for the IAM Service Account Credentials API and the Artifact Registry API.

Once the APIs are enabled, the Google Cloud Console displays the updated status of the APIs.
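
Alternatively, all three APIs can be enabled in a single command with the gcloud CLI; a sketch, assuming your project is myproject:

gcloud services enable \
  run.googleapis.com \
  iamcredentials.googleapis.com \
  artifactregistry.googleapis.com \
  --project=myproject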

Step 3: Configure a Google Cloud Run service

info

This step discusses how to create a Google Cloud Run service. If you already have a Google Cloud Run service for your project, you may skip this step.

The next step is to create a Google Cloud Run service and corresponding public URL endpoint. This service will eventually host the container deployed by the Dagger pipeline.

  1. From the navigation menu, select the Serverless -> Cloud Run product.

  2. Select the Create Service option.

  3. Select the Deploy one revision from an existing container image option. Click Test with a sample container to have a container image URL pre-filled.

  4. Continue configuring the service with the following inputs, modifying them as needed:

    • Authentication: Allow unauthenticated invocations
    • CPU allocation and pricing: CPU is only allocated during request processing
    • Minimum number of instances: 0
    • Maximum number of instances: 1
    • Ingress: Allow all traffic

    Create Google Cloud Run service

  5. Click Create to create the service.

The new service is created. The Google Cloud Console displays the service details, including its public URL, on the service detail page, as shown below.

View Google Cloud Run service details

note

This guide assumes a project named myproject and a service named myservice in the us-central1 region. These details are used as arguments to the Dagger Functions in this guide; remember to update them based on the inputs you provided when creating the service.
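
If you would rather create the service from the command line, a roughly equivalent gcloud invocation is shown below. It deploys Google's public sample container and uses the placeholder names from the note above; adjust the flags to match your requirements.

gcloud run deploy myservice \
  --project=myproject \
  --region=us-central1 \
  --image=us-docker.pkg.dev/cloudrun/container/hello \
  --allow-unauthenticated \
  --min-instances=0 \
  --max-instances=1 \
  --ingress=all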

Step 4: Configure a Google Artifact Registry repository

info

This step discusses how to create a Google Artifact Registry repository. If you already have a Google Artifact Registry repository for your project, you may skip this step.

The next step is to create a repository in Google Artifact Registry. This repository will host the built container image of the application.

  1. From the navigation menu, select the CI/CD -> Artifact Registry product.

  2. Select the Repositories category.

  3. Select the Create Repository option.

  4. Configure the repository with the following inputs, modifying them as needed:

    • Format: Docker
    • Mode: Standard
    • Location type: Region
    • Encryption: Google-managed encryption key
    • Cleanup policy: Delete artifacts

    Create Artifact Registry repository

  5. Click Create to create the repository.

The new repository is created. The Google Cloud Console displays the repository in the repository index page. Click the repository name to view details, including its public URL, as shown below:

View Artifact Registry repository URL

note

This guide assumes a project named myproject and a repository named myrepository in the us-central1 region. These repository details are used as arguments to the Dagger Functions in this guide; remember to update them based on the inputs you provided when creating the repository.
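
The repository can also be created from the command line; the following sketch uses the placeholder names from the note above.

gcloud artifacts repositories create myrepository \
  --project=myproject \
  --repository-format=docker \
  --location=us-central1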

Step 5: Create the Dagger Module

The next step is to create a Dagger Module to do the heavy lifting: build a container image of the application, publish it to Google Artifact Registry, and deploy it on Google Cloud Run.

  1. In the application directory, create a new Dagger Module.

    dagger init --name=my-module --sdk=go
  2. Install the Google Cloud Run module.

    dagger install github.com/vvaswani/daggerverse/google-cloud-run
  3. Update the generated dagger/main.go file with the following code.

    package main

    import (
        "context"
        "fmt"
    )

    type MyModule struct{}

    // build an image
    func (m *MyModule) Build(source *Directory) *Container {
        return dag.Container().
            From("node:21").
            WithDirectory("/home/node", source).
            WithWorkdir("/home/node").
            WithExec([]string{"npm", "install"}).
            WithEntrypoint([]string{"npm", "start"})
    }

    // publish an image
    // example: dagger call publish --source . --project PROJECT --location LOCATION --repository REPOSITORY/APPNAME --credential env:GOOGLE_JSON
    func (m *MyModule) Publish(ctx context.Context, source *Directory, project string, location string, repository string, credential *Secret) (string, error) {
        registry := fmt.Sprintf("%s-docker.pkg.dev/%s/%s", location, project, repository)
        return m.Build(source).
            WithRegistryAuth(fmt.Sprintf("%s-docker.pkg.dev", location), "_json_key", credential).
            Publish(ctx, registry)
    }

    // deploy an image to Google Cloud Run
    // example: dagger call deploy --source . --project PROJECT --registry-location LOCATION --repository REPOSITORY/APPNAME --service-location LOCATION --service SERVICE --credential env:GOOGLE_JSON
    func (m *MyModule) Deploy(ctx context.Context, source *Directory, project, registryLocation, repository, serviceLocation, service string, credential *Secret) (string, error) {
        // publish image
        addr, err := m.Publish(ctx, source, project, registryLocation, repository, credential)
        if err != nil {
            return "", err
        }

        // update service with new image
        return dag.GoogleCloudRun().UpdateService(ctx, project, serviceLocation, service, addr, 3000, credential)
    }

    This Dagger Module contains three Dagger Functions. Let's look at each in detail.

    The Build() function takes care of building the application image.

    • It accepts one argument: a Directory representing the directory containing the application source code.
    • It uses the client's Container().From() method to initialize a new container from a base image. In this example, the base image is the node:21 image. This method returns a Container representing an OCI-compatible container image.
    • It uses the Container.WithDirectory() method to return the container image with the application source code written at the /home/node path, and the Container.WithWorkdir() method to set the working directory in the container.
    • It uses the Container.WithExec() method to install dependencies with npm install, and the Container.WithEntrypoint() method to set the container entrypoint.

    The Publish() function takes care of publishing the container image to Google Artifact Registry.

    • It accepts six arguments: the Go context, a Directory representing the directory containing the application source code, and strings representing the Google Cloud project name, Google Artifact Registry location and Google Artifact Registry repository name, and a Secret representing Google Cloud credentials.
    • It invokes the Build() function to produce a Container representing the application image.
    • It uses the Container.Publish() method to publish the container to the Google Artifact Registry repository and returns the SHA identifier of the published image.

    The Deploy() function takes care of deploying the application to an existing Google Cloud Run service.

    • It accepts eight arguments: the Go context, a Directory representing the directory containing the application source code, and strings representing the Google Cloud project name, Google Artifact Registry location, Google Artifact Registry repository name, Google Cloud Run service location, Google Cloud Run service name, and a Secret representing Google Cloud credentials.
    • It invokes the Publish() function to build and publish a Container representing the application image.
    • It uses the Google Cloud Run module to deploy the container image to the named Google Cloud Run service.
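
    Before wiring the module into CI, you can optionally confirm that the Dagger Functions are registered as expected. For example, the following commands (run from the application directory) list the module's functions and display the arguments accepted by deploy; exact output varies by Dagger version.

    dagger functions
    dagger call deploy --help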

Step 6: Test the Dagger pipeline on the local host

With the service account key from Step 1 available on your local host, test the Dagger Functions by running the command below from the application directory. Remember to update the registry-location, repository, service, and service-location arguments if you specified different values when creating the various Google Cloud services in Step 3 and Step 4, and replace the SERVICE-ACCOUNT-KEY-FILE placeholder with the local filesystem path to the JSON service account key file downloaded in Step 1.

dagger call deploy --source . --project myproject --registry-location us-central1 --repository myrepository/myapp --service-location us-central1 --service myservice --credential file:SERVICE-ACCOUNT-KEY-FILE

Dagger performs the operations defined in the Dagger Function, logging each operation to the console. At the end of the process, the built container is deployed to Google Cloud Run and the function returns the deployment URL. Browse to the deployment URL to see the running application.
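
The example invocations in the module's code comments pass the credential as env:GOOGLE_JSON rather than a file. If you prefer that approach, export the key contents into an environment variable first; a minimal sketch, assuming a bash-compatible shell:

export GOOGLE_JSON=$(cat SERVICE-ACCOUNT-KEY-FILE)
dagger call deploy --source . --project myproject --registry-location us-central1 --repository myrepository/myapp --service-location us-central1 --service myservice --credential env:GOOGLE_JSON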

If you deployed the example application from Appendix A, you should see a page similar to that shown below:

Result of running pipeline from local host

Step 7: Create a GitHub Actions workflow

Dagger executes Dagger Functions in standard OCI containers. This means that a Dagger Function will always run the same way, whether on your local machine or a remote server.

This also means that it's very easy to move the workflow you just encapsulated in Dagger Functions from your local host to GitHub Actions - all that's needed is to define a GitHub Actions workflow to call your Dagger Functions on every commit.

  1. Commit and push the Dagger Module to the application's GitHub repository:

    git add .
    git commit -a -m "Added module"
    git push
  2. In the GitHub repository, create a new workflow file at .github/workflows/main.yml with the following content:

    name: 'ci'

    on:
      push:
        branches:
          - master

    jobs:
      dagger:
        runs-on: ubuntu-latest
        steps:
          -
            name: Checkout
            uses: actions/checkout@v4
          -
            name: Call Dagger Function
            uses: dagger/dagger-for-github@v5
            with:
              version: "0.11.1"
              verb: call
              args: |
                deploy \
                --source . \
                --project myproject \
                --registry-location us-central1 \
                --repository myrepository/myapp \
                --service-location us-central1 \
                --service myservice \
                --credential env:GOOGLE_CREDENTIALS
            env:
              GOOGLE_CREDENTIALS: ${{ secrets.GOOGLE_CREDENTIALS }}

    This workflow runs on every commit to the repository master branch. It consists of a single job with two steps, as below:

    • The first step uses the Checkout action to check out the latest source code from the master branch to the GitHub runner.
    • The second step uses the Dagger for GitHub action to download and install the Dagger CLI and call the Dagger Function with the required arguments.

One of the arguments to the Dagger Function is the Google Cloud JSON service account key, which is retrieved from a GOOGLE_CREDENTIALS environment variable. As shown in the GitHub Actions workflow above, this environment variable is itself populated from a GitHub Actions secret. Create this secret as follows:

  1. Navigate to the Settings -> Secrets -> Actions page in the GitHub Web interface.
  2. Click New repository secret to create a new secret.
  3. Configure the secret with the following inputs:
    • Name: GOOGLE_CREDENTIALS
    • Secret: The contents of the service account JSON key file downloaded in Step 1.
  4. Click Add secret to save the secret.

Create GitHub secret
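
If you use the GitHub CLI, the same secret can be created without the Web interface; this sketch assumes the JSON key file from Step 1 is available on your local filesystem.

gh secret set GOOGLE_CREDENTIALS < SERVICE-ACCOUNT-KEY-FILE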

Step 8: Test the Dagger pipeline on GitHub

Test the Dagger pipeline by committing a change to the GitHub repository.

If you are using the example application described in Appendix A, the following commands modify and commit a simple change to the application's index page:

git pull
sed -i 's/Dagger/Dagger on GitHub/g' routes/index.js
git add routes/index.js
git commit -a -m "Update welcome message"
git push

The commit triggers the GitHub Actions workflow defined in Step 7. The workflow runs the various steps of the dagger job, including the Dagger Function call.
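
You can follow the workflow run in the Actions tab of the GitHub repository, or from the command line with the GitHub CLI; for example (the CLI prompts you to select the run to watch):

gh run list --workflow=main.yml
gh run watch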

At the end of the process, a new version of the built container image is published to Google Artifact Registry and deployed on Google Cloud Run. A message containing the deployment URL appears in the GitHub Actions log.

Browse to the URL shown in the deployment message to see the running application. If you deployed the example application with the additional modification above, you see a page similar to that shown below:

Result of running pipeline from GitHub

Conclusion

This guide walked you through the process of creating a Dagger Module to continuously build and deploy a Node.js application on Google Cloud Run. It explained key concepts, objects and methods available in the Dagger SDKs to create Dagger Functions.

Dagger executes Dagger Functions entirely as standard OCI containers. This means that Dagger Functions can be tested and debugged locally, and that the same Dagger Function will run consistently on your local machine, a CI runner, a dedicated server, or any container hosting service. This portability is one of Dagger's key advantages, and this guide demonstrated it in action by using the same Dagger Functions on the local host and on GitHub.

Appendix A: Create a GitHub repository with an example Express application

This tutorial assumes that you have a GitHub repository with a Node.js Web application. If not, follow the steps below to create a GitHub repository and commit an example Express application to it.

  1. Create a directory for the Express application:

    mkdir myapp
    cd myapp
  2. Create a skeleton Express application:

    npx express-generator
  3. Make a minor modification to the application's index page:

    sed -i -e 's/Express/Dagger/g' routes/index.js
  4. Initialize a local Git repository for the application:

    git init
  5. Add a .gitignore file and commit the application code:

    echo node_modules >> .gitignore
    git add .
    git commit -a -m "Initial commit"
  6. Use the GitHub CLI to create a private repository in your GitHub account and push the changes to it:

    gh auth login
    gh repo create myapp --push --source . --private
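
Optionally, verify that the skeleton application starts on your local host before relying on the pipeline; by default the Express skeleton listens on port 3000, which matches the port the Deploy() function passes to Cloud Run.

npm install
npm start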