Here at Clear Function we’ve been experimenting with new tools in cloud native computing, specifically serverless functions and container orchestration technologies like Kubernetes. Today I’d like to tell you a bit about serverless computing, why you might use it, and what resources are out there to help you get started.
Functions as a service (often written as FaaS) may be best known through Amazon’s Lambda product. Lambda is a platform that lets developers upload small, single-purpose programs to the AWS cloud. Amazon then runs these programs on demand when certain events arrive – either a message pops off a job queue (“run job X now with inputs A, B, C”) or a new web request comes in at something like http://my-lambda.my-company.net/do-useful-thing. This model may not look super compelling at first, but a few things about it make it hard to beat in the right contexts.
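To make the “small, single-purpose program” idea concrete, here’s a minimal sketch in Go of what such an event handler looks like. The `JobEvent` shape and field names are illustrative inventions mirroring the “run job X with inputs A B C” message above – they are not any vendor’s actual event schema.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// JobEvent mirrors the hypothetical "run job X now with inputs A B C"
// message described above; the field names are illustrative, not any
// cloud vendor's real event schema.
type JobEvent struct {
	Job    string   `json:"job"`
	Inputs []string `json:"inputs"`
}

// HandleEvent is the whole single-purpose program: decode one event,
// do one thing, return. The platform decides when (and how often) it runs.
func HandleEvent(raw []byte) (string, error) {
	var e JobEvent
	if err := json.Unmarshal(raw, &e); err != nil {
		return "", err
	}
	return fmt.Sprintf("ran %s with %d inputs", e.Job, len(e.Inputs)), nil
}

func main() {
	msg := []byte(`{"job": "X", "inputs": ["A", "B", "C"]}`)
	out, err := HandleEvent(msg)
	if err != nil {
		panic(err)
	}
	fmt.Println(out) // ran X with 3 inputs
}
```

On a real platform you’d hand `HandleEvent` to the vendor’s runtime shim instead of calling it from `main`, but the shape of the code is the same.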
What’s so great about a serverless function?
- Per-request pricing means you can compute ROI on your feature development. This is probably the biggest advance in serverless pricing – now you can figure out exactly how much each microservice or function is used and how much it’s costing you month to month. This makes software investment decisions much easier.
- Easily separates bursty workloads from user-facing traffic.
- Super useful for sysadmin tasks: Do this specific thing infrequently; don’t make me keep a whole server up forever.
- Good for highly variable infrequent workloads in general: Image thumbnailing and other media processing, ops alert handling, etc.
A lambda is a function that runs in its own context – a standalone, cloud-hosted event handler, more or less. The cloud vendors tend to push their own proprietary event buses, but there are ways to be lazy and just set up an HTTP microservice instead of learning everyone’s new events API.
What should I be doing with serverless functions?
Serverless functions are ideal for infrequent or bursty workloads, so if you have some background processing complicating a web app, you can always consider extracting the work into a background job and then moving that to a standalone lambda. Another great fit for lambdas is sysadmin alerts and event handling: imagine you’ve got 50 nodes that occasionally emit warnings and events that need to trigger other jobs or post to a chat room or email list. This type of work is easy to write up as a standalone function.
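Here’s a rough sketch of that alert-handling scenario in Go. The `NodeEvent` shape is hypothetical; in a real deployment the formatted line would be POSTed to your chat room’s webhook URL rather than printed.

```go
package main

import "fmt"

// NodeEvent is a hypothetical shape for the warnings the fleet emits.
type NodeEvent struct {
	Node     string
	Severity string
	Message  string
}

// chatLine turns an event into a one-line chat/email notification.
// A real lambda would POST this to a chat webhook instead of printing.
func chatLine(e NodeEvent) string {
	return fmt.Sprintf("[%s] %s: %s", e.Severity, e.Node, e.Message)
}

func main() {
	e := NodeEvent{Node: "node-17", Severity: "WARN", Message: "disk 90% full"}
	fmt.Println(chatLine(e))
	// The serverless runtime invokes this per event, so you pay only
	// for the handful of times a node actually misbehaves.
}
```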
Who supports serverless functions?
- Amazon: Lambdas
- Microsoft: Azure Functions
- Google: Cloud Functions
- Iron.io: Workers
What are the drawbacks?
Serverless functions naturally lead you in the direction of having microservices everywhere. This means you might have more distributed systems problems where you didn’t before: version mismatches in shared code, data migration complexities, thundering herd problems, and more. It’s not a silver bullet, but having this tool in your toolbox allows for smaller, more cohesive designs.
What tools make it easy?
Up
Up from open source veteran TJ Holowaychuk is a solid way for a web developer to get her feet wet writing Lambdas.
- I deployed some Go and JS functions with this, easy peasy. Except when I tried custom domain names – that part was terrible, thanks to AWS and CloudFormation.
```json
{
  "name": "emoji-service-5",
  "profile": "default",
  "regions": ["us-east-2"],
  "stages": {
    "development": { "domain": "emoji-dev2.pritchettbots.com" },
    "production": { "domain": "emoji.pritchettbots.com" }
  }
}
```
Serverless.com
- Feels heavier because it tries to be everything to everyone, but it’s still easier than actually learning Lambda up front. So many moving parts on AWS!
URL proxying
One common problem with hosting a single-purpose service is integrating it with a larger web application. You can always have your frontend app call out to a serverless function directly, but there are other tools that let you serve functions under your main URL. You’ll need to set up a proxying service using something like Amazon’s API Gateway or Kong, the NGINX-based microservice proxy toolkit. This lets you turn messy “my-service.my-cloud-functions.com” URLs like the ones above into something more palatable like myapp.com/api/my-function.
Examples
Instagrabber: Go on Lambda with up
Instagrabber is a little demo I put together to show how to build a modest lambda with Go. Here’s a snippet of the main function that calls out to an Instagram parsing library.
```go
func main() {
	rand.Seed(time.Now().Unix()) // initialize global pseudo-random generator
	addr := ":" + os.Getenv("PORT")
	http.HandleFunc("/", renderPhoto)
	http.HandleFunc("/raw", printRawPhotoUrl)
	log.Println("Serving on port " + os.Getenv("PORT") + "...")
	log.Fatal(http.ListenAndServe(addr, nil))
}
```
Emojiaas: Node on Lambda with up
This one is a spike on a full-featured emoji-as-a-service offering that stitches together a few Node.js libraries. The goal is to provide emoji translation from Unicode symbols (😊) to plain-text representations (:smile:) and to cross-platform image files used to render emoji in a way that doesn’t depend on Unicode.
```javascript
const http = require("http");
const url = require("url");
const emoji = require("./emoji/emoji");

const { PORT = 3000 } = process.env;

http
  .createServer((req, res) => {
    const route = url.parse(`http://${req.url}`).pathname;
    if (route.match(/\/named\/\w+/)) {
      const longName = route.split("/named/")[1];
      const imageUrl = emoji.imageUrlFromName(longName);
      res.end(`<img src="${imageUrl}" />`);
    } else {
      res.end(`Hello World from Node.js at ${route}\n`);
    }
  })
  .listen(PORT);
```
Emozure: Node on Azure Functions with serverless
This is a clone of emojiaas, but deployed to Azure Functions using the serverless tool. Here’s the serverless.yml manifest that wires up my Node code so that Azure can run it as a web service:
```yaml
service: emozure

provider:
  name: azure
  location: West US

plugins:
  - serverless-azure-functions

functions:
  hello:
    handler: handler.hello
    events:
      - http: true
        x-azure-settings:
          authLevel: anonymous
      - http: true
        x-azure-settings:
          direction: out
          name: res
```
What’s next in serverless?
Containerized FaaS is coming. There’s already a Kubeless project bringing FaaS concepts to Kubernetes, and more generally you can expect services like Lambda to eventually support container-based deployment. Currently most platforms (including Iron.io workers years ago) have you upload the code directly and then build it Heroku-style. Convenient if that’s all you need, but Amazon, for instance, ships old versions of Node and no Ruby at all.