Serverless Functions Explained: Setting Up for Local Development and Testing



In Part 1 of this article, Serverless Functions Explained: Fundamentals, we talked about serverless architecture and functions. Now, we are ready to apply the knowledge we gathered. Part two aims to demonstrate how we can run a local environment to develop and test our serverless functions, using a simple application as an example.

We will go through:

  • Application Description
  • Local Environment
  • How It Works
  • Time to Execute

Application Description

The serverless application we will build has three features: to create and save a message in a database, to get all messages, and to get the first message from the database. In other words, we are going to have three functions.

Each function will be triggered by an HTTP request. The “getFirstMessage” function will invoke the “getMessages” function and return only the first message. Finally, each HTTP request will go through an API Gateway, which redirects it to the right serverless function. Let’s check the following image to better understand the application.

A stick figure interacts with the cloud through an API Gateway, which routes requests to the createMessage, getMessages, and getFirstMessage functions; createMessage and getMessages interact with the database.
Figure 1: Application Draft

Now that we know what the application should do, we need to choose a cloud provider and decide which of its services we need to use. For this example, we aren’t going to deploy the application to the cloud; the goal is to create a local environment like the one the cloud provider offers.

Let’s use AWS as the cloud provider. AWS offers a service named AWS Lambda, which allows developers to upload their functions (lambdas). We can use Amazon DynamoDB, a NoSQL database, and finally, for the API gateway, we can use Amazon API Gateway (yeah… no fancy name for this one).

A stick figure interacts with the AWS cloud, which hosts the createMessage, getMessages, and getFirstMessage Lambda functions; these interact with Amazon DynamoDB.
Figure 2: Application Draft with AWS services

Great! Now that we’ve defined what to build, let’s explore the tools we’ll use to create the local environment and run these serverless functions.

Local Environment

In the article, Serverless Functions Explained: Fundamentals, we mentioned techniques to help us develop serverless applications locally. One of these techniques was emulation, which enables us to create a local environment that mimics the cloud environment as closely as possible. Since the purpose of this article is to create a local environment for developing and testing serverless applications using multiple cloud services, emulation will be the only technique we’ll use.

To emulate the AWS services, we will use LocalStack. This tool can emulate numerous AWS services under its free tier, including the specific services required for this application.

With the emulation tool chosen, we still need something to define and configure our serverless application, including specifying which functions to deploy and the services we will need. The Serverless Framework is an open-source tool that helps us build and deploy cloud applications. With a YAML file, we can define and configure our desired setup. Additionally, we will use the serverless-localstack plugin, which integrates the Serverless Framework with LocalStack. This plugin allows developers to deploy and test AWS services locally without an internet connection, avoiding potential costs.

The same architecture as Figure 2, now running locally, with the serverless-localstack plugin plugged into the emulated AWS environment.
Figure 3: Local Environment Setup

Additionally, we’ll need Docker and Docker Compose to run a LocalStack container, and Node.js (version 20.x or higher) to install our dependencies and compile our functions. The functions’ code won’t be shown in the article, but if you want to follow along, you can clone the repository.

How It Works

LocalStack

First, we need to have LocalStack running on our machine. We can either install the LocalStack CLI or run a LocalStack Docker image. In this example, we’ll run it using a Docker image.

The docker-compose file, defining the image, ports, environment, and volumes.
Figure 4: Docker Compose YAML file

In this docker-compose YAML file, we are configuring a single service named “localstack” with its latest available image. The exposed port, 4566, serves as the entry point for interacting with the LocalStack container. The volume configuration mounts the Docker socket from the host into the container. This setup is essential for LocalStack to manage the Docker containers that execute Lambda functions. In other words, it allows LocalStack to start a container to run a function.

The SERVICES environment variable defines which services LocalStack will emulate. Lambda, DynamoDB, and API Gateway are expected, as we need those for our application architecture. The other three services are required by the Serverless Framework during deployment.

The “logs” service corresponds to the AWS CloudWatch Logs service, which is required because, when deploying Lambda functions, each Lambda is associated with a CloudWatch log group. This log group captures the output of the Lambda function, which is crucial for debugging and monitoring. By default, the Serverless Framework creates a log group for each Lambda function we deploy.

The AWS Identity and Access Management (IAM) service is necessary for Lambda functions and other AWS services to operate correctly. The Serverless Framework automatically creates the required IAM roles and policies during the deployment process. For example, we have to configure an IAM role to read from and write to our DynamoDB table.

Finally, the Serverless Framework translates our serverless.yaml configuration into a CloudFormation template. This template describes the infrastructure and resources (e.g., Lambda functions, API Gateway endpoints, DynamoDB tables) that need to be created, updated, or deleted. Therefore, the Serverless Framework won’t be able to deploy or manage anything without the CloudFormation service.
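
Since Figure 4 is only shown as an image, here is a minimal docker-compose.yml sketch consistent with the description above; the image tag and the exact spelling of the SERVICES list are assumptions, so adjust them to what the repository uses:

services:
  localstack:
    image: localstack/localstack:latest   # assumed tag
    ports:
      - "4566:4566"                       # LocalStack edge port
    environment:
      - SERVICES=lambda,dynamodb,apigateway,logs,iam,cloudformation
    volumes:
      - "/var/run/docker.sock:/var/run/docker.sock"   # lets LocalStack spawn Lambda containers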

Serverless Framework

With the Serverless Framework, we need to deploy our Lambda functions and DynamoDB table to LocalStack and configure the API Gateway endpoints. Let’s look at how we can do this with a serverless.yaml file.

The service, provider, environment, and IAM role sections.
Figure 5: Serverless YAML file – service and provider configuration

We start by defining a service name, which is used to group all functions, resources, and configurations associated with this service. Then, we describe our provider, which corresponds to the cloud provider we are developing for, in this case, AWS. The environment variable set at the provider level holds the table name we’ll use, making it accessible to every function. The same goes for the IAM role statement, which gives each function full permissions to interact with the ‘Messages’ table. Each function can have its own custom environment variables and IAM roles, but, for simplicity, defining them at the provider level will do the trick.
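
To make this concrete, here is a rough sketch of what this part of serverless.yaml could look like; the service name, environment variable name, and resource scope are assumptions, and newer Framework versions nest the role under provider.iam.role.statements instead of iamRoleStatements:

service: serverless-messages        # assumed service name

provider:
  name: aws
  runtime: nodejs20.x
  environment:
    MESSAGES_TABLE: Messages        # assumed variable name, shared by every function
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:*                # full permissions, as described above
      Resource: "*"                 # could be narrowed to the Messages table ARN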

The plugins and custom sections.
Figure 6: Serverless YAML file – serverless-localstack plugin configuration

Next, we extend the Serverless Framework’s features by adding the serverless-localstack plugin, allowing us to deploy our resources into the running LocalStack environment. In the custom section, we add custom variables and configurations used by plugins. In this case, we add a configuration block for the serverless-localstack plugin. Here, we specify which stage should interact with LocalStack, the host URL and the corresponding service port, and we enable live code reloading for Lambda functions.
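
A sketch of the corresponding plugins and custom sections, using the commonly documented serverless-localstack options (the exact keys can vary between plugin versions):

plugins:
  - serverless-localstack

custom:
  localstack:
    stages:
      - local                # only the local stage talks to LocalStack
    host: http://localhost   # where the LocalStack container is reachable
    edgePort: 4566           # the port exposed in docker-compose
    lambda:
      mountCode: true        # live code reloading for Lambda functions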

The functions section, defining createMessage, getMessages, and getFirstMessage.
Figure 7: Serverless YAML file – functions configuration

In the functions section, we define the location and endpoints of our functions. This way, when an HTTP request hits the API Gateway, it will invoke the correct function. We also set a timeout of 5 minutes so a function doesn’t run indefinitely. These functions were developed and compiled using TypeScript, but you could use a different programming language.
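
A sketch of the functions section; the handler paths are assumptions based on a typical compiled TypeScript layout, while the HTTP paths match the curl requests used later in the article:

functions:
  createMessage:
    handler: dist/createMessage.handler   # assumed handler path
    timeout: 300                          # 5 minutes, as described above
    events:
      - http:
          path: messages
          method: post
  getMessages:
    handler: dist/getMessages.handler
    timeout: 300
    events:
      - http:
          path: messages
          method: get
  getFirstMessage:
    handler: dist/getFirstMessage.handler
    timeout: 300
    events:
      - http:
          path: firstmessage
          method: get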

The resources section, defining the DynamoDB table.
Figure 8: Serverless YAML file – DynamoDB table resource

Finally, the last section of the serverless.yaml is the DynamoDB table resource. Here, we define the Messages table with a single primary key attribute. We also set a limit for the read/write capacity per second. Although we only define one attribute on the table, we can add more attributes within the functions, since DynamoDB is a schema-less database that allows attributes to be added to items dynamically as needed.
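
A sketch of the resources section; the logical resource name and the capacity units are assumptions, while the id key matches the message objects returned later:

resources:
  Resources:
    MessagesTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: Messages
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: S
        KeySchema:
          - AttributeName: id
            KeyType: HASH          # single primary key attribute
        ProvisionedThroughput:     # read/write capacity per second
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1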

Time to Execute

Once we execute:

docker-compose up

LocalStack will start emulating the specified AWS services, but no resources will be created until we deploy our application using the Serverless Framework. To do that, in the same directory as serverless.yaml, we simply run the command:

serverless deploy --stage local
The output of the serverless deploy command, listing the API Gateway endpoints and the deployed functions.
Figure 9: Output from the Serverless local deploy

With the deployment successfully done, we can confirm that LocalStack has configured the API Gateway, with the unique identifier a5hc57n19j, to serve our deployed functions. Additionally, we can confirm that the three functions have been deployed.
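
If you want to look up that REST API identifier yourself, you can query the emulated API Gateway by pointing the AWS CLI at LocalStack’s edge port (or use the awslocal wrapper, if you have it installed):

aws --endpoint-url=http://localhost:4566 apigateway get-rest-apis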

Let’s test it by calling the create message function, using the provided endpoint and adding the path defined in serverless.yaml:

curl -X POST http://localhost:4566/restapis/a5hc57n19j/local/_user_request_/messages -H "Content-Type: application/json" -d '{"message": "Hello, LocalStack!"}'

Output:

{"id":"8671b2b9-81ce-4349-b813-b1e35e671e4b","message":"Hello, LocalStack!","createdAt":"2024-06-13T13:17:18.141Z"}

The output confirms that the function had permission to access and write to the Messages table. Let’s try the getFirstMessage function, which will invoke the getMessages function:

curl http://localhost:4566/restapis/a5hc57n19j/local/_user_request_/firstmessage

Output:

{"message":"Hello, LocalStack!","createdAt":"2024-06-13T13:17:18.141Z","id":"8671b2b9-81ce-4349-b813-b1e35e671e4b"}

It’s working! We could also call the getMessages function directly, but to keep it simple, and because we only have one message saved in the database, we won’t do that.

Wrapping Up

By fol­low­ing this guide:

  • We successfully set up a local environment for developing and testing serverless functions by emulating AWS services with LocalStack.
  • We utilized the Serverless Framework to define and configure our infrastructure and resources through a YAML file.
  • We deployed our infrastructure and resources to the LocalStack environment with the help of the serverless-localstack plugin.
  • We invoked the functions via HTTP requests to the endpoints configured in the API Gateway.

This setup demonstrates how teams can benefit from using local environments to develop serverless applications. By emulating cloud services locally, teams can simplify the development process, reduce costs associated with using actual cloud resources, and improve testing accuracy. Additionally, it allows for faster iteration and debugging, enabling developers to work more efficiently without relying on a live cloud environment.