
Serverless applications with OpenFaaS
Doing Without
In the rapidly evolving world of software development, one of the most transformative shifts in recent years has been the rise of serverless computing. This approach allows developers to focus on building and deploying code without having to worry about the underlying infrastructure. OpenFaaS (function as a service) [1] is a popular open source framework that enables serverless functions on any infrastructure, including public cloud environments, private data centers, or even on-premises Linux servers.
Environment Setup
Before diving in, ensure you have the following prerequisites in place:
- Ubuntu Server: an on-premises machine or cloud virtual machine (VM) running Ubuntu (20.04 LTS or later is recommended) with sudo privileges.
- Kubernetes cluster: OpenFaaS runs on Kubernetes, so you need access to a Kubernetes cluster.
- Docker: to build and push container images for your serverless functions. Install Docker on your Ubuntu machine (if using MicroK8s, Docker is optional because it uses containerd, but having Docker is useful for the OpenFaaS command-line interface (CLI)).
- OpenFaaS CLI: faas-cli, for managing functions. The quickest way to install it on your machine is with the OpenFaaS-provided script (Figure 1):
curl -sSL https://cli.openfaas.com | sudo sh
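To confirm that the tools are in place, check that each one responds (output will vary with your versions):
kubectl version --client
docker --version
faas-cli version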
With these prerequisites satisfied, your Ubuntu system should have a running Kubernetes cluster, kubectl configured to communicate with your cluster, and Docker and faas-cli installed. Now you can deploy your first serverless function and scale and manage it. OpenFaaS supports deploying functions through the web user interface (UI), CLI, or GitOps. In this example, I use the CLI for precision and repeatability.
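If you need a function to experiment with right away, the following sketch logs in to the gateway and pulls a sample from the function store. It assumes OpenFaaS is already installed in the openfaas namespace (e.g., with arkade install openfaas) and uses the figlet store function as a stand-in:
# Expose the gateway locally
kubectl port-forward -n openfaas svc/gateway 8080:8080 &
# Fetch the generated admin password and log in
PASSWORD=$(kubectl get secret -n openfaas basic-auth -o jsonpath="{.data.basic-auth-password}" | base64 --decode)
echo -n $PASSWORD | faas-cli login --username admin --password-stdin
# Deploy and invoke a sample function from the store
faas-cli store deploy figlet
echo "Hi" | faas-cli invoke figlet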
Deployment
In a real-world scenario, you'll write your own functions. With OpenFaaS, you can use the faas-cli new command to scaffold a new function in your language of choice (templates for Node.js, Python, Go, etc., are available). I won't go deep into function development in this tutorial, but as an example, consider
faas-cli new --lang python my-function
which creates stack.yaml (the deployment manifest) and a my-function folder with a handler, handler.py (Figure 2). You can implement your logic there.
faas-cli up -f stack.yaml
This command builds a Docker image, pushes it to a registry, and deploys the function in one step (Figure 3). Ensure you're logged in to a container registry; by default, images are pushed to Docker Hub, or you can adjust the image: field in the YAML file.
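Once the deployment completes, you can list your functions and invoke the new one directly from the CLI (assuming the my-function example above):
faas-cli list
echo "serverless" | faas-cli invoke my-function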
For now, stick with the example function to focus on deployment mechanics.
Function Scaling
One of the powerful features of OpenFaaS is automatic horizontal scaling of functions in response to demand. When a function is deployed, by default it runs with one replica (pod). The OpenFaaS autoscaler monitors incoming requests and scales the number of function replicas between a minimum and maximum value.
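Those bounds are set per function with labels in stack.yaml. The following sketch extends the my-function manifest from earlier; the image name is a placeholder, and CE enforces its own ceiling on replica counts:
version: 1.0
provider:
  name: openfaas
  gateway: http://127.0.0.1:8080
functions:
  my-function:
    lang: python
    handler: ./my-function
    image: docker-user/my-function:latest
    labels:
      com.openfaas.scale.min: "1"
      com.openfaas.scale.max: "5"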
In OpenFaaS Community Edition (CE) [2], autoscaling is handled by Prometheus and AlertManager. The gateway and function pods expose metrics (e.g., request rates). AlertManager is configured with rules that trigger when the requests per second for a function exceed a threshold. When triggered, AlertManager sends an alert to the OpenFaaS gateway's /system/alert endpoint, which in turn scales up the function's Deployment (increasing replicas). Similarly, if load drops off, alerts can scale the function down (but not below the configured minimum). This process happens behind the scenes automatically.
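To see the autoscaler react, generate sustained load against the function and watch the replica count change. This sketch assumes the hey load generator; any HTTP benchmarking tool will do:
# Send POST traffic for 30 seconds at 10 concurrent connections
hey -z 30s -c 10 -m POST -d "scale me" http://127.0.0.1:8080/function/my-function
# In a second terminal, watch the Deployment scale
kubectl get deployment -n openfaas-fn my-function -w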
