Set up your local development environment to have a great developer experience while working on your serverless application.

Developing code for serverless platforms requires a different approach to your development flow. Since the platform provides and maintains the entire stack, from the compute infrastructure up to the function handler, your code runs in a fully managed and abstracted environment. This can make it time-consuming and inefficient to debug your code by deploying and invoking a Function in the cloud. Fortunately, Cloud Functions offers an alternative that lets you implement and debug your code much faster.

In this blog post you’ll learn how to do the following:

  • Run a Cloud Function locally
  • Invoke a Cloud Function with an Eventarc event locally
  • Use the same permissions as if it were deployed in the cloud
  • Fetch secrets stored remotely from Secret Manager
  • Set breakpoints in Visual Studio Code within a local Cloud Function

Cloud Functions builds upon the open source Functions Framework

Google Cloud is a strong proponent of open standards and open source, and Cloud Functions is no exception.

In fact, Google provides a fully open-sourced runtime environment, known as the Functions Framework, that is responsible for wrapping your function code in a persistent HTTP application. It enables developers to run the same runtime environment as Cloud Functions on their own machines or anywhere else. As a developer, you no longer need to make assumptions about how the serverless environment will behave or how to emulate the production experience.

As shown above, a 2nd gen Cloud Function actually represents a container hosted on Google’s serverless container infrastructure. Google fully provides the container’s operating system, the necessary language runtime, and the Functions Framework. All these layers are packaged together with your function code and its dependencies using Cloud Native Buildpacks during the deployment process.
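To see how little scaffolding this requires, here is a minimal sketch of an HTTP function registered with the Functions Framework for Node.js; the function name hello is an illustrative placeholder:

import { http } from "@google-cloud/functions-framework";

// The framework wraps this handler in a persistent HTTP server, just as
// Cloud Functions does in production.
http("hello", (req, res) => {
  res.send(`Hello, ${req.query.name ?? "world"}!`);
});

Started via the functions-framework CLI, this handler is served on http://localhost:8080 by default.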

A real-world example

Here is a typical example of a TypeScript application that processes thousands of images daily. This application derives insights from images and stores the resulting object labels. The following diagram illustrates this scenario as an event-driven microservice. 

For each image newly uploaded to a Cloud Storage bucket, Eventarc invokes a 2nd gen Cloud Function and passes along the image location together with other event metadata. The TypeScript application code accesses a secret stored in Secret Manager, uses the Vision API to extract object labels, and indexes the image in a Cloud Firestore document database.

The following code snippet shows the function code. The full example is available in this GitHub repository.

cloudEvent("index", async (cloudevent: CloudEvent<StorageObjectData>) => {
  console.log("-----\nProcessing for ", cloudevent.subject, "\n-----");
  if (!cloudevent.data) {
    throw "CloudEvent does not contain data."
  }
  const filePath = `${cloudevent.data.bucket}/${cloudevent.data.name}`;
  //Get labes for Image via the Vision API
  const [result] = await imageAnnotatorClient.labelDetection(`gs://${filePath}`);
  const labelValues = result.labelAnnotations?.flatMap((o) => o.description);
  //hash file name with secret
  const hashedFilePath = createHmac('sha256', secret)
    .update(filePath)
    .digest('hex');
  //Store with filePath as key
  await firestore
    .doc(`${cloudevent.data.bucket}/${hashedFilePath}`)
    .set(
      {
        name: cloudevent.data.name,
        labels: labelValues,
        updated: cloudevent.time
      }
    )
  console.log(`Successfully stored ${cloudevent.data.name} to Firestore`)
});

This Function code uses multiple dependencies from the Google Cloud ecosystem:

  • Eventarc to trigger an invocation 
  • Secret Manager to provide a secret used to hash the filename
  • Cloud Vision API to detect and fetch labels for the uploaded image
  • Cloud Firestore for storing the result

The README file in the repository contains all necessary steps to deploy this example application in your Google Cloud project.

In the following section, you will learn how to set up your local development environment to execute this TypeScript function outside of Cloud Functions. This setup allows you to iterate faster on your code by testing locally before deploying.   

How to run a Cloud Function locally 

The Functions Framework can be installed and run on any platform that supports the language, including your local machine or remote development servers.

As shown in this example function’s directory, you can install the Functions Framework library for TypeScript with the following CLI command. If your language of choice is Python, Go, Java, C#, Ruby, or PHP, you can check out the documentation on how to install the Functions Framework for your preferred language.

npm install --save @google-cloud/functions-framework

Since the application is written in TypeScript, you need to compile the source code and point the Functions Framework at the compiled output folder. The following snippet from the package.json enables a convenient hot reload: it uses tsc with the --watch flag to recompile on every change, and nodemon to watch for newly compiled files and automatically restart the local Functions Framework.

"scripts": {
    ...
    "compile": "tsc",
    "debug": "node --inspect node_modules/.bin/functions-framework --source=build/src/ --target=index",
    "watch": "concurrently \"npm run compile -- --watch\" \"nodemon --watch ./build/ --exec npm run debug\"",
    ...
}

After installing all application dependencies with npm install, you can use the command npm run watch to start a local endpoint at http://localhost:8080/.

Let’s investigate how to use this local endpoint of your event-triggered Cloud Function.

How to invoke a Cloud Function with an Eventarc event locally

The function code expects to be invoked by Eventarc. Eventarc structures its events according to the open CloudEvents specification. The direct integration with Cloud Functions uses the HTTP protocol binding, in which the HTTP request carries event-specific headers (like type, time, source, subject, and specification version) and a body with the event data.

To invoke the Cloud Functions code locally with the expected HTTP request structure and payload, you can use a simple curl command. The following curl command implements the HTTP protocol binding for a google.cloud.storage.object.v1.finalized event, representing an image file CF_debugging_architecture.png uploaded to a bucket called image_bucket, and sends it to your Cloud Function listening on localhost port 8080.

curl localhost:8080 -v \
  -X POST \
  -H "Content-Type: application/json" \
  -H "ce-id: 123451234512345" \
  -H "ce-specversion: 1.0" \
  -H "ce-time: 2022-12-31T00:00:00.0Z" \
  -H "ce-type: google.cloud.storage.object.v1.finalized" \
  -H "ce-source: //storage.googleapis.com/projects/_/buckets/image_bucket" \
  -H "ce-subject: objects/CF_debugging_architecture.png" \
  -d '{
        "bucket": "image_bucket",
        "contentType": "text/plain",
        "kind": "storage#object",
        "md5Hash": "...",
        "metageneration": "1",
        "name": "CF_debugging_architecture.png",
        "size": "352",
        "storageClass": "MULTI_REGIONAL",
        "timeCreated": "2022-12-31T00:00:00.0Z",
        "timeStorageClassUpdated": "2022-12-31T00:00:00.0Z",
        "updated": "2022-12-31T00:00:00.0Z"
      }'
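
For reference, this is roughly the parsed event object that the Functions Framework hands to your handler for the request above; the ce-* headers become CloudEvent attributes and the JSON body becomes data (values abbreviated for illustration):

import { CloudEvent } from "@google-cloud/functions-framework";
import { StorageObjectData } from "@google/events/cloud/storage/v1/StorageObjectData";

// Illustrative only: how the binary-mode headers and body map onto the
// CloudEvent object your handler receives.
const received: CloudEvent<StorageObjectData> = {
  id: "123451234512345",
  specversion: "1.0",
  time: "2022-12-31T00:00:00.0Z",
  type: "google.cloud.storage.object.v1.finalized",
  source: "//storage.googleapis.com/projects/_/buckets/image_bucket",
  subject: "objects/CF_debugging_architecture.png",
  data: {
    bucket: "image_bucket",
    name: "CF_debugging_architecture.png",
    contentType: "text/plain",
  },
};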

Alternatively, CloudEvents provides a CLI tool called CloudEvents Conformance for testing CloudEvents receivers. It helps format the HTTP request by using a YAML file to define the expected Eventarc event of an uploaded image to a Cloud Storage bucket. You can find an example of such a YAML file in the GitHub repository.

The example application assumes that the file CF_debugging_architecture.png actually exists in the image_bucket bucket. If the file does not exist, the local execution will exit with an error.

However, since this example Cloud Function relies on external services like Cloud Storage and the Vision API, the invocation immediately throws an UNAUTHENTICATED error. You’ll see how to fix this in the next section.

To recap, a simple curl command can be used to send an Eventarc event for an uploaded image using the HTTP protocol binding to invoke your code locally. The HTTP body structure for other Eventarc events is available on the CloudEvents GitHub repository.
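
If you prefer to stay in TypeScript, recent versions of the Functions Framework for Node.js also include an in-process testing helper. The following is a rough sketch, assuming the helper getTestServer and the supertest package are available:

import { getTestServer } from "@google-cloud/functions-framework/testing";
import supertest from "supertest";
import "./index"; // importing the source file registers the "index" function

async function invokeLocally() {
  // Send the same binary-mode CloudEvent as the curl call, in-process.
  const server = getTestServer("index");
  const response = await supertest(server)
    .post("/")
    .set("ce-id", "123451234512345")
    .set("ce-specversion", "1.0")
    .set("ce-time", "2022-12-31T00:00:00.0Z")
    .set("ce-type", "google.cloud.storage.object.v1.finalized")
    .set("ce-source", "//storage.googleapis.com/projects/_/buckets/image_bucket")
    .set("ce-subject", "objects/CF_debugging_architecture.png")
    .send({ bucket: "image_bucket", name: "CF_debugging_architecture.png" });
  console.log(`Function responded with status ${response.statusCode}`);
}

invokeLocally();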

Next, you’ll see how to authenticate the local Cloud Function. 

How to use the same permissions as if it were deployed in the cloud

The example Cloud Function uses the Node.js client libraries to interact with Google Cloud services, like the Vision API or Firestore. Authentication for the client libraries happens automatically when deployed on Google Cloud. However, when you’re working locally, you’ll need to configure authentication.  

Google Cloud client libraries handle authentication automatically because they support Application Default Credentials (ADC). ADC automatically finds credentials based on the application environment and uses those credentials to authenticate to Google Cloud APIs.
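
In practice, this means the example code never handles credentials explicitly. Here is a minimal sketch of what that looks like for the clients used in this example:

import { ImageAnnotatorClient } from "@google-cloud/vision";
import { Firestore } from "@google-cloud/firestore";

// No key files or tokens are passed in: both clients look up their
// credentials via Application Default Credentials at runtime.
const imageAnnotatorClient = new ImageAnnotatorClient();
const firestore = new Firestore();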

When you deploy a Cloud Function, Google Cloud provides the Service Account credentials via the metadata server. Locally, gcloud allows you to set up ADC via gcloud auth application-default login. This command acquires your user credentials and makes them available to ADC.

The --impersonate-service-account flag additionally allows you to impersonate a Service Account and use its identity for authentication against Google Cloud APIs. To impersonate a Service Account, your user account needs the Service Account Token Creator role (roles/iam.serviceAccountTokenCreator) on it. The following command sets up ADC for a specific Service Account locally.

gcloud auth application-default login \
--impersonate-service-account=[YOUR_SERVICE_ACCOUNT_EMAIL]

For this example application, you can use the same service account that the Cloud Function uses (i.e., the Function’s identity) when deployed on Google Cloud. If you are not sure which service account is associated with your provisioned Cloud Function, you can retrieve its service account email with the following command:

gcloud functions describe YOUR_CLOUD_FUNCTION_NAME \
--format="value(serviceConfig.serviceAccountEmail)"

Using the same service account allows you to execute the code locally with the same permissions as the deployed Function. Therefore, you can properly test changes to the service account permissions during local execution.

After executing this command and restarting the local dev server with npm run watch, the function code authenticates against the Vision API and Firestore. Now the code can run and use Google Cloud services as if deployed on Cloud Functions directly, without any code changes. This makes local development very convenient. 

When a Cloud Function is executed in the cloud, any configured secrets from Secret Manager are made available to the Function’s container. In the next section, you’ll see how you can reference those secrets in your local development environment.

How to fetch secrets stored remotely from Secret Manager

Cloud Functions allows you to use Secret Manager to securely store API keys, passwords, and other sensitive information, and to pass those secrets as environment variables or mount them as a volume. In a local execution environment, however, this automatic mechanism does not exist, so you need to connect to Secret Manager in a different way. In this section, you can see how the example application uses such a secret and accesses it via the environment variable SECRET_API_KEY.

For a local development environment, a couple of options are available to use secrets:

  • You can use the Secret Manager client library to directly request the secret at runtime (see the sketch at the end of this section).
  • You can use the gcloud command gcloud secrets versions access to fetch a secret at runtime and inject it as an environment variable, securely and in an automated way. The package.json illustrates this:
"scripts": {
    ...
    "compile": "tsc",
     "debug": "export SECRET_API_KEY=$(gcloud secrets versions access 1 --secret='apikey' --impersonate-service-account=[YOUR_SERVICE_ACCOUNT]) && node --inspect node_modules/.bin/functions-framework --source=build/src/ --target=index",
    "watch": "concurrently \"npm run compile -- --watch\" \"nodemon --watch ./build/ --exec npm run debug\"",
    ...
 }

When executing npm run watch, the API key secret from Secret Manager is set as the SECRET_API_KEY environment variable for this node process only. The --impersonate-service-account flag again allows for realistic secret permission handling in your local development environment.
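
And for the first option, here is a minimal sketch using the Secret Manager client library directly; the project ID and secret name are illustrative placeholders:

import { SecretManagerServiceClient } from "@google-cloud/secret-manager";

const secretManager = new SecretManagerServiceClient();

// Fetch the secret at runtime instead of injecting it as an
// environment variable.
async function getApiKey(): Promise<string> {
  const [version] = await secretManager.accessSecretVersion({
    name: "projects/YOUR_PROJECT_ID/secrets/apikey/versions/1",
  });
  return version.payload?.data?.toString() ?? "";
}

This keeps the secret out of the process environment entirely, at the cost of an extra API call at runtime.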

How to set breakpoints in Visual Studio Code within a local Cloud Function

Debugging is a core feature of the IDE Visual Studio Code and is also very useful when developing a Cloud Function. The IDE has built-in debugging support for the Node.js runtime and can debug JavaScript, TypeScript, and many other languages.

The npm debug script in the package.json starts the Functions Framework with Node’s --inspect flag.

"scripts": {
    ...
     "debug": "export SECRET_API_KEY=$(gcloud secrets versions access 1 --secret='apikey' --impersonate-service-account=[YOUR_SERVICE_ACCOUNT]) && node --inspect node_modules/.bin/functions-framework --source=build/src/ --target=index",
...
 }

This flag allows Visual Studio Code to attach its JavaScript debugger, which supports TypeScript source maps. Source maps make it possible to single-step through your code or set breakpoints directly in your uncompiled TypeScript files. Using these built-in Visual Studio Code debugging features gives you a much better debugging experience during local development.

After configuring Visual Studio Code to use the debugger’s Auto Attach functionality (via the “Debug: Toggle Auto Attach” command), you can set breakpoints in the editor margin and then run npm run watch from the integrated terminal to activate the debugger. When you invoke the local endpoint using either curl or the CloudEvents Conformance tool, the function code pauses execution at the specified breakpoints. This allows you to inspect current variable assignments, examine the call stack, and much more.

Wrapping up

In this article, you saw how to develop and debug a TypeScript Cloud Function locally that contains external dependencies and is triggered by Eventarc events.

You learned how to set up hot reloading in combination with the Functions Framework to speed up debugging and to test changes in just a few seconds. Additionally, you saw how to use local and impersonated Application Default Credentials to execute the local function code with the same permissions as if it were deployed on Cloud Functions. Lastly, the ability to retrieve secrets from Secret Manager and to use breakpoints in Visual Studio Code greatly supports local development, since you do not have to make any modifications to your code.

The open cloud approach of Google Cloud enables a familiar and productive development environment, even when programming serverless applications. Just stay local, and develop and debug fast with Google Cloud Serverless.