Deploying to Cloud Functions with Terraform
In this tutorial you are going to deploy a simple Node.js API to Google Cloud Functions using Terraform.
Cloud Functions is a compute solution from Google Cloud Platform (GCP). It provides functions as a service (FaaS), which is a way to run your code "on-demand", without managing any servers.
For deployment, we'll use Terraform, a command-line tool to build and deploy infrastructure using code. Terraform will help create a predictable and reproducible environment to run your code.
Since it's not the main focus of this tutorial, we will go with a super simple Node.js API using Fastify. Feel free to use any other language supported by Cloud Functions for this part.
Once finished, you will have an up-and-running API with a URL you can make requests to.
Prerequisites
To follow this guide you will need:
- Terraform 0.13 or later. You can find the installation instructions here;
- Google Cloud SDK. Any recent version should be fine. Installation instructions here;
- Node.js 12 or later. If you don’t have Node.js installed, I recommend using nvm for that.
1. Setting up the GCP account
If you are using the Google Cloud SDK for the first time, you will need to authenticate with your Google Account. You can run the following command:
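```sh
# Authenticate the gcloud CLI with your Google account
gcloud auth login
```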
Now create the project on GCP:
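```sh
gcloud projects create PROJECT_ID
```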
Note: replace `PROJECT_ID` with a unique project identifier (e.g. "my-app-2847162").
Set the project you just created as the default one. This will make it easier to run the subsequent commands.
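```sh
gcloud config set project PROJECT_ID
```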
Many features on GCP require a billing account linked to the project, and Cloud Functions is one of them. For this step, you will need to visit the dashboard:
Create a billing account on GCP.
Note: even though Google asks for your credit card, this tutorial should not cost any money to run. The first 2 million invocations of a Cloud Function are free. You will also learn how to shut down and destroy all the resources created here.
After setting up billing, the account will be listed when you run the following command:
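```sh
# On recent versions of the SDK this may also be available without "beta"
gcloud beta billing accounts list
```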
The output will look something like this:
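The values below are placeholders; the exact columns can vary with your SDK version:

```
ACCOUNT_ID            NAME                OPEN  MASTER_ACCOUNT_ID
0X0X0X-0X0X0X-0X0X0X  My Billing Account  True
```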
Copy the account ID and run the following command to link the billing account to your project:
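```sh
gcloud beta billing projects link PROJECT_ID --billing-account=ACCOUNT_ID
```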
Now you are going to structure the project.
2. Structuring the project
Create the files listed below so your repository looks like this:
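```
.
├── src
│   └── index.js
└── terraform
    ├── backend.tf
    ├── main.tf
    ├── variables.tf
    ├── outputs.tf
    └── modules
        └── function
            ├── main.tf
            └── variables.tf
```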
Don’t worry about adding any content now. We will do that on the next step.
Note: This is an opinionated but very common structure for Terraform projects. Terraform doesn't require your files to be in a certain disposition or have a specific name. Although not recommended, you can build an entire system within a single `main.tf` file.
The `terraform/` folder contains files related to Terraform.
The `src/` folder hosts the code for the Node.js API. Remember, the API code will be very simple; a single `index.js` file is enough.
3. Writing the API
Let’s write the API using Fastify.
If you are following this tutorial with a different language, you can add your custom code at this point.
Note: If you've never heard of Fastify before, it has a very similar API to other Node.js frameworks like Express.
First, initialize the project with npm and install `fastify` as a dependency:
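```sh
npm init -y
npm install fastify
```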
Add this content to the `src/index.js` file:
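A minimal version could look like this. The route and the message are just examples; what matters is the `exports.app` at the bottom, which is what Cloud Functions will invoke:

```js
const fastify = require("fastify")({ logger: true })

// A single route that returns a friendly JSON message
fastify.get("/", async (request, reply) => {
  return { message: "Hello from Cloud Functions!" }
})

// Cloud Functions calls the exported function with Node's native
// (req, res) objects, so we forward them to Fastify's HTTP server.
exports.app = async (req, res) => {
  await fastify.ready()
  fastify.server.emit("request", req, res)
}
```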
Update the entry point for your code in the `package.json` file:
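Assuming `package.json` sits at the root of the repository, only the `main` field needs to change; leave the rest of the file as npm generated it:

```json
{
  "main": "src/index.js"
}
```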
This will tell Cloud Functions where your API is located. Now let's jump to the `terraform/` folder and start writing the infrastructure code.
4. Writing the infrastructure code
At this point, you already have all the files and folders created inside your `terraform/` folder.
Before you start adding code to them, let's take a look at each file's responsibility:
- `backend.tf`. Declares which Terraform backend you will use.
- `main.tf`. Where you will write the logic for creating resources or invoking modules.
- `variables.tf`. Lists the variables and their values that will be used in `main.tf`.
- `outputs.tf`. Lists the values your Terraform code will return.
- `modules/`. A place for your Terraform modules. In this case, there will be just one, named `function`.
Note: The name `function` was chosen for the module because it creates a Cloud Function. Feel free to use a different name.
Note: If you are not familiar with modules in Terraform, think of them as a function or component. It's something that you write once and can then reuse in other parts of your code.
Begin by declaring which Terraform backend you want to use - where you want to store your Terraform state files.
Let's choose the "local" backend for now, meaning the state files will be stored in your local repository.
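In `terraform/backend.tf`:

```hcl
terraform {
  backend "local" {}
}
```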
Now add the following variables to your `terraform/variables.tf` file:
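```hcl
variable "project" {
  type        = string
  description = "The ID of the project you created earlier"
  default     = "PROJECT_ID" # replace with your own project ID
}

variable "region" {
  type        = string
  description = "The region where the resources will be created"
  default     = "us-central1" # any GCP region you prefer
}
```

The exact names and defaults are up to you; the rest of this tutorial assumes `project` and `region`.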
In `terraform/main.tf`, declare the provider Terraform will connect to. In your case, the Google Cloud Platform provider (named `google`).
The Google provider has two required parameters, `project` and `region`. We can reference the values declared in the step above by accessing the properties of the `var` object.
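```hcl
provider "google" {
  project = var.project
  region  = var.region
}
```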
You will go back to this file soon to add more configuration.
Creating the `function` module
To create a Cloud Function on GCP, you will need to combine a few resources:
- A storage bucket, to store the code that will be executed by the function
- The function itself, to run the code you wrote
- An IAM policy, to allow users to invoke the function
These resources will be grouped in the Terraform module you are about to create.
This will make it easier if you want to deploy a second environment (e.g. development & staging) or create multiple functions - you can just invoke the module again with different parameters.
In the `variables.tf` file inside `terraform/modules/function`, add the arguments needed by the module. All arguments are required, so don't add default values.
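One possible set of arguments looks like this; the names are a convention used through the rest of this tutorial:

```hcl
variable "project" {
  type        = string
  description = "The project where the function will be created"
}

variable "region" {
  type        = string
  description = "The region where the function will be deployed"
}

variable "function_name" {
  type        = string
  description = "The name of the Cloud Function"
}

variable "function_entry_point" {
  type        = string
  description = "The name of the exported function that Cloud Functions will invoke"
}
```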
Proceeding to `terraform/modules/function/main.tf`, add the logic to create the function and all the resources needed.
⚠️ This file is a bit dense. Follow through the comments in it to get a better idea of what’s happening.
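Here is a sketch of what it can look like. The bucket name, the Node.js runtime, and the relative path to the source code are assumptions you may need to adjust to your own project:

```hcl
locals {
  # Directory that holds package.json and the src/ folder.
  # Adjust this path if your repository layout is different.
  root_dir  = abspath("${path.module}/../../..")
  timestamp = formatdate("YYMMDDhhmmss", timestamp())
}

# Compress the source code into a zip archive
data "archive_file" "source" {
  type        = "zip"
  source_dir  = local.root_dir
  output_path = "/tmp/function-${local.timestamp}.zip"
  excludes    = ["terraform", "node_modules", ".git"]
}

# Bucket that stores the zipped source code
resource "google_storage_bucket" "function_bucket" {
  name     = "${var.project}-function"
  location = var.region
}

# Upload the archive to the bucket. Including the file hash in the object
# name forces a new upload (and a new function version) when the code changes.
resource "google_storage_bucket_object" "zip" {
  name   = "source-${data.archive_file.source.output_md5}.zip"
  bucket = google_storage_bucket.function_bucket.name
  source = data.archive_file.source.output_path
}

# The Cloud Function that runs the code
resource "google_cloudfunctions_function" "function" {
  name        = var.function_name
  runtime     = "nodejs14" # pick any Node.js runtime currently supported
  entry_point = var.function_entry_point

  source_archive_bucket = google_storage_bucket.function_bucket.name
  source_archive_object = google_storage_bucket_object.zip.name

  trigger_http = true
}

# IAM policy that allows anyone to invoke the function over HTTP
resource "google_cloudfunctions_function_iam_member" "invoker" {
  project        = google_cloudfunctions_function.function.project
  region         = google_cloudfunctions_function.function.region
  cloud_function = google_cloudfunctions_function.function.name

  role   = "roles/cloudfunctions.invoker"
  member = "allUsers"
}

# The module's single output: the URL used to call the function.
# (This could also live in a separate outputs.tf inside the module.)
output "function_url" {
  value = google_cloudfunctions_function.function.https_trigger_url
}
```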
This file deals with all the logic of compressing the source code, storing it in a bucket, creating the Cloud Function, and setting the necessary permissions on it.
Using your module
Now that you have your `function` module ready, you can invoke it in other parts of your Terraform code.
Go back to the entry point file at `terraform/main.tf` and add the following:
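```hcl
module "function" {
  source = "./modules/function"

  project               = var.project
  region                = var.region
  function_name         = "my-api" # any valid Cloud Function name
  function_entry_point  = "app"    # matches exports.app in src/index.js
}
```

The module label and the function name here are just examples; pick whatever fits your project.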
When running the file above, Terraform will look for a `main.tf` file in the path declared in the `source` parameter and run the code there along with the other variables.
Note: The `function_entry_point` must match the name of the exported variable in your Node.js code. You will find `exports.app = …` at the bottom of `src/index.js`.
In the `terraform/outputs.tf` file, add the return values from the module you want to use. Since the module only returns one output value, your file should look like this:
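```hcl
output "function_url" {
  value = module.function.function_url
}
```

This assumes the module exposes a `function_url` output, as in the sketch above.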
Now let’s see how to deploy all the resources with the Terraform CLI.
5. Deploying
The hard work is already done! Creating the infrastructure should be the easier part.
Run the following commands from the root of your repository to create all the resources and deploy your code:
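```sh
cd terraform
terraform init   # download the providers and set up the backend
terraform apply  # review the plan, then type "yes" to confirm
```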
If everything works well, you will see a similar output in your terminal:
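```
Apply complete! Resources: 4 added, 0 changed, 0 destroyed.

Outputs:

function_url = "https://us-central1-PROJECT_ID.cloudfunctions.net/my-api"
```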
You can verify that it works with a simple `curl` command. Remember to replace the URL with your own URL.
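```sh
curl https://us-central1-PROJECT_ID.cloudfunctions.net/my-api
# {"message":"Hello from Cloud Functions!"}
```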
Updating the function
Your first deploy is never final. Eventually, you will want to deploy new versions of the code that runs in the Cloud Function.
After changing and testing your code, you can simply run `terraform apply` in your terminal. Terraform will compress your source files, store them in the Cloud Storage bucket, and update the function with the new code.
Destroying the function
You can clean up all the resources created by running `terraform destroy`.
The project won’t be deleted this way (it wasn’t created by Terraform). For that, you can run:
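```sh
gcloud projects delete PROJECT_ID
```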
6. Going further
This tutorial provides a quick way to get started. Many other good practices can be incorporated to build a more robust application:
Remote Terraform backend. If you check your repository, you will notice that a state file was created by Terraform. It's a good practice to store this file in remote storage. You can change the backend from "local" to a Cloud Storage bucket, for example. See the list of available backends here.
Multiple environments. You might want to deploy the same infrastructure under different environments (e.g. development & production). There are many ways to do this with Terraform, and you will find lots of tutorials around.
Continuous deployment. Ideally, you shouldn't be running `terraform plan` and `terraform apply` from your local machine. This should be done as part of the automation process of a CI/CD solution such as Cloud Build or GitHub Actions.
This whole tutorial, plus some of the improvements above, is implemented in this repository on GitHub. Check it out!