Azure Functions and AWS Lambda offer similar functionality and advantages. Serverless compute shoppers should focus on the differences to make an informed choice.
Both Azure Functions and AWS Lambda come from major public cloud providers. Both offer, above all, the ability to pay only for the time that functions run, rather than paying continuously for a cloud server whether or not it's active. However, there are some important differences in pricing, programming language support and deployment between the two serverless computing services.
Pricing differences
Both cloud providers charge their serverless users based on the amount of memory that their functions consume and the number of times the functions execute.
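To make that billing model concrete, a rough monthly cost can be estimated from memory size, average execution duration and invocation count. The sketch below models the GB-second-plus-per-request shape both providers use; the rates in it are illustrative placeholders, not current published prices, so check each provider's pricing page before budgeting.

```python
# Rough cost estimator for a pay-per-use serverless billing model.
# Both AWS Lambda and Azure Functions bill on GB-seconds of memory
# consumed plus a per-execution charge; the rates below are
# ILLUSTRATIVE PLACEHOLDERS, not actual provider prices.

def estimate_cost(memory_mb, avg_duration_s, executions,
                  rate_per_gb_second=0.0000166667,   # placeholder rate
                  rate_per_million_execs=0.20):      # placeholder rate
    """Estimate cost: memory (GB) x duration x executions, plus a request fee."""
    gb_seconds = (memory_mb / 1024) * avg_duration_s * executions
    compute_cost = gb_seconds * rate_per_gb_second
    request_cost = (executions / 1_000_000) * rate_per_million_execs
    return compute_cost + request_cost

# Example: a 512 MB function averaging 0.3 s, run 5 million times a month.
monthly = estimate_cost(512, 0.3, 5_000_000)
print(f"${monthly:.2f}")  # compute dominates; requests add a small flat fee
```

Note that this captures only the compute side of the bill; the cross-region data transfer fees discussed below are charged separately.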
There is one important pricing difference between Azure Functions and AWS Lambda. AWS charges additional fees for data transfers between Lambda functions and its storage services, such as S3, if the data moves between cloud regions. There is no fee if the Lambda functions and data storage exist within the same region. Azure Functions doesn't charge for inbound data transfers, although it does charge for outbound data movement from one Azure data center to another Azure cloud region.
AWS also charges higher rates for Provisioned Concurrency in Lambda. Provisioned Concurrency keeps functions initialized so that they can handle requests more quickly. Rates are based on function memory consumption and execution time. Azure Functions offers a similar feature for users who sign up for the Premium plan, which also provides additional virtual networking and enhanced function performance over the base offering.
These small nuances in AWS Lambda vs. Azure Functions pricing are significant for certain types of deployments. Teams that use multiple cloud regions might find Azure Functions to be more cost-effective because it does not charge for inbound data transfers. The additional features, beyond concurrency, that come with Azure Functions' Premium plan may also be attractive for some organizations.
Programming language support
Serverless functions written in a programming language that Lambda supports but Azure Functions does not, or vice versa, will be easier to deploy on the service that supports that language. But it is possible to use virtually any programming language on either service through Lambda custom runtimes or Azure Functions custom handlers. Lambda custom runtimes use binary files compiled for Amazon Linux to run code written in a language that Lambda doesn't directly support. Azure Functions custom handlers rely on HTTP primitives to interface with code written in unsupported languages. The Azure Functions approach is a bit more complex for developers to implement, but it is also more flexible.
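To illustrate the HTTP-primitive model, here is a minimal sketch of what a custom handler looks like: a small local web server that listens on the port the Functions host supplies via the FUNCTIONS_CUSTOMHANDLER_PORT environment variable and answers each invocation with JSON. The function name and response fields here are a simplified sketch, not a complete, production-ready handler.

```python
# Minimal sketch of an Azure Functions custom handler. The Functions
# host forwards each invocation as an HTTP request to a local server
# listening on FUNCTIONS_CUSTOMHANDLER_PORT, and the handler replies
# with a JSON payload describing outputs and a return value.
import json
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

def build_response(function_name):
    """Build the JSON-serializable payload returned to the Functions host."""
    return {
        "Outputs": {},
        "Logs": [f"{function_name} invoked"],
        "ReturnValue": f"Hello from {function_name}",
    }

class InvocationHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # The request path names the function being invoked, e.g. /MyFunction.
        payload = json.dumps(build_response(self.path.lstrip("/"))).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

def serve():
    # The Functions host sets this variable; fall back to 8080 for local runs.
    port = int(os.environ.get("FUNCTIONS_CUSTOMHANDLER_PORT", 8080))
    HTTPServer(("127.0.0.1", port), InvocationHandler).serve_forever()
```

Because the contract is just HTTP and JSON, the same pattern works in any language that can run a web server, which is the source of the flexibility noted above.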
Azure Functions is more flexible and complex in another area too: how users deploy serverless functions as part of a larger workload.
AWS Lambda deploys all functions in the Lambda environment on servers that run Amazon Linux. Lambda functions can interact with other services on the AWS cloud or elsewhere in a variety of ways, but function deployment is limited to the Lambda service.
Azure Functions users can deploy code directly on the Azure Functions service. But they can also run software inside Docker containers, which gives programmers more control over the execution environment. Azure Functions works with Dockerfiles that define the container environment. These functions packaged inside Docker containers can also be deployed to Kubernetes, through an integration with Kubernetes Event-driven Autoscaling.
Azure Functions also offers the option to deploy functions to either Windows- or Linux-based servers. In most cases, the host operating system should not make a difference. However, if your serverless functions have OS-specific code or dependencies, such as a programming language or library that runs only on Linux, this is an important factor.