
Use Dagger with GitHub Actions and Google Cloud

note

Watch a live demo of this tutorial in the Dagger Community Call (12 Jan 2023). For more demos, join the next Dagger Community Call.

Introduction

This tutorial teaches you how to use a Dagger pipeline to continuously build and deploy a Node.js application with GitHub Actions on Google Cloud Run. You will learn how to:

  • Configure a Google Cloud service account and assign it the correct roles
  • Create a Google Cloud Run service accessible at a public URL
  • Create a Dagger pipeline using the Node.js SDK
  • Run the Dagger pipeline on your local host to manually build and deploy the application on Google Cloud Run
  • Use the same Dagger pipeline with GitHub Actions to automatically build and deploy the application on Google Cloud Run on every repository commit

Requirements

This tutorial assumes that:

  • You have a Google Cloud account and a Google Cloud project
  • You have a GitHub account and a GitHub repository containing a Node.js Web application (if not, follow Appendix A to create one)
  • You have Node.js, npm, Docker and the gcloud CLI installed on your local host

Step 1: Create a Google Cloud service account

The Dagger pipeline demonstrated in this tutorial (re)builds a container image of an application every time a new commit is added to the application's repository. It then publishes the container image to Google Container Registry and deploys it at a public URL using Google Cloud infrastructure.

This requires the following:

  • A Google Cloud service account with all necessary privileges
  • A Google Cloud Run service with a public URL and defined resource/capacity/access rules
  • Access to various Google Cloud APIs
info

This step discusses how to create a Google Cloud service account. If you already have a Google Cloud service account and key for your project, skip to Step 2.

Create a Google Cloud service account, as follows:

  1. Log in to the Google Cloud Console and select your project.

  2. From the navigation menu, click IAM & Admin -> Service Accounts.

  3. Click Create Service Account.

  4. In the Service account details section, enter a string in the Service account ID field. This string forms the prefix of the unique service account email address.

    Create Google Cloud service account

  5. Click Create and Continue.

  6. In the Grant this service account access to project section, select the Service Account Token Creator and Editor roles.

    Create Google Cloud service account roles

  7. Click Continue.

  8. Click Done.

Once the service account is created, the Google Cloud Console displays it in the service account list, as shown below. Note the service account email address, as you will need it in the next step.

List Google Cloud service accounts

Next, create a JSON key for the service account as follows:

  1. From the navigation menu, click IAM & Admin -> Service Accounts.
  2. Click the newly-created service account in the list of service accounts.
  3. Click the Keys tab on the service account detail page.
  4. Click Add Key -> Create new key.
  5. Select the JSON key type.
  6. Click Create.

The key is created and automatically downloaded to your local host through your browser as a JSON file.

Create Google Cloud service account key

danger

Store the JSON service account key file safely as it cannot be retrieved again.
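
For reference, the downloaded key file is a JSON document similar to the sketch below. The values shown here are placeholders, and the exact set of fields may vary:

```json
{
  "type": "service_account",
  "project_id": "PROJECT",
  "private_key_id": "...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "SERVICE-ACCOUNT-ID@PROJECT.iam.gserviceaccount.com",
  "client_id": "...",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```

The client_email field holds the service account email address referenced in later steps.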

Step 2: Configure Google Cloud APIs and a Google Cloud Run service

The next step is to enable access to the required Google Cloud APIs:

  1. From the navigation menu, select the APIs & Services -> Enabled APIs & services option.

  2. Select the Enable APIs and Services option.

  3. On the API Library page, search for and select the Cloud Run API entry.

  4. On the API detail page, click Enable.

    Enable Google Cloud API

  5. Repeat the previous two steps for the IAM Service Account Credentials API.

Once the APIs are enabled, the Google Cloud Console displays the updated status of the APIs.

The final step is to create a Google Cloud Run service and corresponding public URL endpoint. This service will eventually host the container deployed by the Dagger pipeline.

  1. From the navigation menu, select the Serverless -> Cloud Run product.

  2. Select the Create Service option.

  3. Select the Deploy one revision from an existing container image option. Click Test with a sample container to have a container image URL pre-filled.

  4. Continue configuring the service with the following inputs:

    • Service name: myapp (modify as needed)
    • Region: us-central1 (modify as needed)
    • CPU allocation and pricing: CPU is only allocated during request processing
    • Minimum number of instances: 0 (modify as needed)
    • Maximum number of instances: 1 (modify as needed)
    • Ingress: Allow all traffic
    • Authentication: Allow unauthenticated invocations

    Create Google Cloud Run service

  5. Click Create to create the service.

The new service is created. The Google Cloud Console displays the service details, including its public URL, on the service detail page, as shown below.

View Google Cloud Run service details

Step 3: Create the Dagger pipeline

The next step is to create a Dagger pipeline to do the heavy lifting: build a container image of the application, release it to Google Container Registry and deploy it on Google Cloud Run.

  1. In the application directory, install the Dagger SDK and the Google Cloud Run client library as development dependencies:

    npm install @dagger.io/dagger@latest @google-cloud/run --save-dev
  2. Create a new sub-directory named ci. Within the ci directory, create a file named main.mjs and add the following code to it. Replace the PROJECT placeholder with your Google Cloud project identifier and adjust the region (us-central1) and service name (myapp) if you specified different values when creating the Google Cloud Run service in Step 2.

    import { connect } from "@dagger.io/dagger"
    import { ServicesClient } from "@google-cloud/run"

    const GCR_SERVICE_URL = 'projects/PROJECT/locations/us-central1/services/myapp'
    const GCR_PUBLISH_ADDRESS = 'gcr.io/PROJECT/myapp'

    // initialize Dagger client
    connect(async (daggerClient) => {
      // get reference to the project directory
      const source = daggerClient.host().directory(".", { exclude: ["node_modules/", "ci/"] })

      // get Node image
      const node = daggerClient.container({ platform: "linux/amd64" }).from("node:16")

      // mount cloned repository into Node image
      // install dependencies
      const c = node
        .withMountedDirectory("/src", source)
        .withWorkdir("/src")
        .withExec(["cp", "-R", ".", "/home/node"])
        .withWorkdir("/home/node")
        .withExec(["npm", "install"])
        .withEntrypoint(["npm", "start"])

      // publish container to Google Container Registry
      const gcrContainerPublishResponse = await c.publish(GCR_PUBLISH_ADDRESS)

      // print ref
      console.log(`Published at: ${gcrContainerPublishResponse}`)

      // initialize Google Cloud Run client
      const gcrClient = new ServicesClient()

      // define service request
      const gcrServiceUpdateRequest = {
        service: {
          name: GCR_SERVICE_URL,
          template: {
            containers: [
              {
                image: gcrContainerPublishResponse,
                ports: [
                  {
                    name: "http1",
                    containerPort: 3000
                  }
                ]
              }
            ]
          }
        }
      }

      // update service
      const [gcrServiceUpdateOperation] = await gcrClient.updateService(gcrServiceUpdateRequest)
      const [gcrServiceUpdateResponse] = await gcrServiceUpdateOperation.promise()

      // print ref
      console.log(`Deployment for image ${gcrContainerPublishResponse} now available at ${gcrServiceUpdateResponse.uri}`)
    }, { LogOutput: process.stdout })

    This file performs the following operations:

    • It imports the Dagger and Google Cloud Run client libraries.
    • It creates a Dagger client with connect(). This client provides an interface for executing commands against the Dagger engine.
    • It uses the client's host().directory() method to obtain a reference to the current directory on the host, excluding the node_modules and ci directories. This reference is stored in the source variable.
    • It uses the client's container().from() method to initialize a new container from a base image. The additional platform argument to the container() method instructs Dagger to build for a specific architecture. In this example, the base image is the node:16 image and the architecture is linux/amd64, which is one of the architectures supported by Google Cloud. This method returns a Container representing an OCI-compatible container image.
    • It uses the previous Container object's withMountedDirectory() method to mount the host directory into the container at the /src mount point, and the withWorkdir() method to set the working directory in the container.
    • It chains the withExec() method to copy the contents of the working directory to the /home/node directory in the container and then uses the withWorkdir() method to change the working directory in the container to /home/node.
    • It chains the withExec() method again to install dependencies with npm install and sets the container entrypoint using the withEntrypoint() method.
    • It uses the container object's publish() method to publish the container to Google Container Registry, and prints the SHA identifier of the published image.
    • It creates a Google Cloud Run client, updates the Google Cloud Run service defined in Step 2 to use the published container image, and requests a service update.
tip

Most Container object methods return a revised Container object representing the new state of the container. This makes it easy to chain methods together. Dagger evaluates pipelines "lazily", so the chained operations are only executed when required - in this case, when the publish() method is called.
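
The lazy-chaining pattern can be illustrated with a toy builder in plain JavaScript (this is an illustration of the concept, not the Dagger SDK): each method returns a new object that merely records the requested operation, and nothing runs until a terminal call.

```javascript
// Toy illustration of lazy, chainable pipeline objects (not the Dagger SDK).
// Each with* call returns a NEW builder holding the accumulated operations;
// nothing executes until run() is called.
class ToyContainer {
  constructor(ops = []) {
    this.ops = ops
  }
  withWorkdir(dir) {
    // record the operation instead of running it
    return new ToyContainer([...this.ops, `workdir:${dir}`])
  }
  withExec(cmd) {
    return new ToyContainer([...this.ops, `exec:${cmd.join(" ")}`])
  }
  run() {
    // terminal method: only now are the recorded operations "executed"
    return this.ops
  }
}

const pipeline = new ToyContainer()
  .withWorkdir("/src")
  .withExec(["npm", "install"])

// nothing has run yet; calling run() triggers evaluation
console.log(pipeline.run()) // → [ 'workdir:/src', 'exec:npm install' ]
```

In the real SDK, the terminal call is an operation that needs a result, such as publish().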

Step 4: Test the Dagger pipeline on the local host

Configure credentials for the Google Cloud SDK on the local host, as follows:

  1. Configure Docker credentials for Google Container Registry on the local host using the following commands. Replace the SERVICE-ACCOUNT-ID placeholder with the service account email address created in Step 1, and the SERVICE-ACCOUNT-KEY-FILE placeholder with the location of the service account JSON key file downloaded in Step 1.

    gcloud auth activate-service-account SERVICE-ACCOUNT-ID --key-file=SERVICE-ACCOUNT-KEY-FILE
    gcloud auth configure-docker
    info

    This step is necessary because Dagger relies on the host's Docker credentials and authorizations when publishing to remote registries.

  2. Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the location of the service account JSON key file, replacing the SERVICE-ACCOUNT-KEY-FILE placeholder in the following command. This variable is used by the Google Cloud Run client library during the client authentication process.

    export GOOGLE_APPLICATION_CREDENTIALS=SERVICE-ACCOUNT-KEY-FILE

Once credentials are configured, test the Dagger pipeline by running the command below:

node ci/main.mjs

Dagger performs the operations defined in the pipeline script, logging each operation to the console. At the end of the process, the built container is deployed to Google Cloud Run and a message similar to the one below appears in the console output:

Deployment for image gcr.io/PROJECT/myapp@sha256:b1cf... now available at https://...run.app

Browse to the URL shown in the deployment message to see the running application.

If you deployed the example application from Appendix A, you should see a page similar to that shown below:

Result of running pipeline from local host

Step 5: Create a GitHub Actions workflow

Dagger executes your pipelines entirely as standard OCI containers. This means that the same pipeline will run consistently, whether on your local machine or a remote server.

This also means that it's very easy to move your Dagger pipeline from your local host to GitHub Actions - all that's needed is to commit and push the pipeline script from your local clone to your GitHub repository, and then define a GitHub Actions workflow to run it on every commit.
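
As an optional refinement before committing the script (not shown in the tutorial's version of main.mjs), the hard-coded project, region and service name could be read from environment variables so the same script runs unchanged locally and in CI. A sketch, assuming hypothetical GCP_PROJECT, GCP_REGION and GCP_SERVICE variable names:

```javascript
// Sketch: derive the Cloud Run service path and registry address from
// environment variables, with the tutorial's values as defaults.
// GCP_PROJECT, GCP_REGION and GCP_SERVICE are hypothetical variable names.
function gcrConfig(env = process.env) {
  const project = env.GCP_PROJECT || "PROJECT"
  const region = env.GCP_REGION || "us-central1"
  const service = env.GCP_SERVICE || "myapp"
  return {
    serviceUrl: `projects/${project}/locations/${region}/services/${service}`,
    publishAddress: `gcr.io/${project}/${service}`,
  }
}

const { serviceUrl, publishAddress } = gcrConfig({ GCP_PROJECT: "demo" })
console.log(serviceUrl)     // → projects/demo/locations/us-central1/services/myapp
console.log(publishAddress) // → gcr.io/demo/myapp
```

With this in place, the GitHub Actions workflow could set the variables in its env section while local runs fall back to the defaults.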

  1. Commit and push the pipeline script and related changes to the application's GitHub repository:

    git add .
    git commit -a -m "Added pipeline"
    git push
  2. In the GitHub repository, create a new workflow file at .github/workflows/main.yml with the following content:

    name: 'ci'

    on:
      push:
        branches:
          - master

    jobs:
      dagger:
        runs-on: ubuntu-latest
        steps:
          -
            name: Checkout
            uses: actions/checkout@v3
          -
            id: 'auth'
            name: 'Authenticate to Google Cloud'
            uses: 'google-github-actions/auth@v1'
            with:
              token_format: 'access_token'
              credentials_json: '${{ secrets.GOOGLE_CREDENTIALS }}'
          -
            name: Login to Google Container Registry
            uses: docker/login-action@v2
            with:
              registry: gcr.io
              username: oauth2accesstoken
              password: ${{ steps.auth.outputs.access_token }}
          -
            name: Setup node
            uses: actions/setup-node@v3
            with:
              node-version: 16.13.x
              cache: npm
          -
            name: Install
            run: npm install
          -
            name: Release and deploy with Dagger
            run: node ci/main.mjs

    This workflow runs on every commit to the repository master branch. It consists of a single job with six steps, as below:

    • The first step uses the Checkout action to check out the latest source code from the master branch to the GitHub runner.
    • The second step uses the Authenticate to Google Cloud action to authenticate to Google Cloud. It requires a service account key in JSON format, which it expects to find in the GOOGLE_CREDENTIALS GitHub secret. This step sets various environment variables (including the GOOGLE_APPLICATION_CREDENTIALS variable required by the Google Cloud Run SDK) and returns an access token as output, which is used to authenticate the next step.
    • The third step uses the Docker Login action and the access token from the previous step to authenticate to Google Container Registry from the GitHub runner. This is necessary because Dagger relies on the host's Docker credentials and authorizations when publishing to remote registries.
    • The fourth step uses the Setup Node.js environment action to download and install Node.js 16.x on the GitHub runner.
    • The fifth step downloads and installs the application's dependencies on the GitHub runner with npm install.
    • The sixth and final step executes the Dagger pipeline.

The Authenticate to Google Cloud action looks for a JSON service account key in the GOOGLE_CREDENTIALS GitHub secret. Create this secret as follows:

  1. Navigate to the Settings -> Secrets -> Actions page in the GitHub Web interface.
  2. Click New repository secret to create a new secret.
  3. Configure the secret with the following inputs:
    • Name: GOOGLE_CREDENTIALS
    • Secret: The contents of the service account JSON key file downloaded in Step 1.
  4. Click Add secret to save the secret.

Create GitHub secret

Step 6: Test the Dagger pipeline on GitHub

Test the Dagger pipeline by committing a change to the GitHub repository.

If you are using the example application described in Appendix A, the following commands modify and commit a simple change to the application's index page:

git pull
sed -i 's/Dagger/Dagger on GitHub/g' routes/index.js
git add routes/index.js
git commit -a -m "Update welcome message"
git push

The commit triggers the GitHub Actions workflow defined in Step 5. The workflow runs the various steps of the dagger job, including the pipeline script.

At the end of the process, a new version of the built container image is released to Google Container Registry and deployed on Google Cloud Run. A message similar to the one below appears in the GitHub Actions log:

Deployment for image gcr.io/PROJECT/myapp@sha256:h4si... now available at https://...run.app

Browse to the URL shown in the deployment message to see the running application. If you deployed the example application with the additional modification above, you should see a page similar to that shown below:

Result of running pipeline from GitHub

Conclusion

This tutorial walked you through the process of creating a Dagger pipeline to continuously build and deploy a Node.js application on Google Cloud Run. It used the Dagger Node.js SDK and explained key concepts, objects and methods available in the SDK to construct a Dagger pipeline.

Dagger executes your pipelines entirely as standard OCI containers. This means that pipelines can be tested and debugged locally, and that the same pipeline will run consistently on your local machine, a CI runner, a dedicated server, or any container hosting service. This portability is one of Dagger's key advantages, and this tutorial demonstrated it in action by using the same pipeline on the local host and on GitHub.

Use the API Key Concepts page and the Node.js SDK Reference to learn more about Dagger.

Appendix A: Create a GitHub repository with an example Express application

This tutorial assumes that you have a GitHub repository with a Node.js Web application. If not, follow the steps below to create a GitHub repository and commit an example Express application to it.

  1. Log in to GitHub using the GitHub CLI:

    gh auth login
  2. Create a directory for the Express application:

    mkdir myapp
    cd myapp
  3. Create a skeleton Express application:

    npx express-generator
  4. Make a minor modification to the application's index page:

    sed -i -e 's/Express/Dagger/g' routes/index.js
  5. Initialize a local Git repository for the application:

    git init
  6. Add a .gitignore file and commit the application code:

    echo node_modules >> .gitignore
    git add .
    git commit -a -m "Initial commit"
  7. Create a private repository in your GitHub account and push the changes to it:

    gh repo create myapp --push --source . --private
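
The skeleton application generated by express-generator listens on the port given by the PORT environment variable, defaulting to 3000, which is why the pipeline in Step 3 sets containerPort: 3000 in the Cloud Run service request. A minimal sketch of that default resolution (the function name is illustrative, not part of the generated app):

```javascript
// Sketch of how the generated app's bin/www resolves its listening port:
// the PORT environment variable wins, otherwise it falls back to 3000.
function resolvePort(env = process.env) {
  return parseInt(env.PORT || "3000", 10)
}

console.log(resolvePort({}))               // → 3000
console.log(resolvePort({ PORT: "8080" })) // → 8080
```

Cloud Run sets PORT for deployed containers, so the app also works if the service is configured with a different container port.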