Serverless architectures are the latest craze among cloud providers, but this nascent approach to harnessing public cloud resources may be one trend deserving of all the hype.
Amazon Web Services (AWS) was the first to introduce so-called event-driven, serverless computing resources with AWS Lambda in 2014. The service remained largely unchallenged until this year, with IBM, Google and Microsoft each rolling out their own iterations. All are trying to get ahead of a market where users increasingly offload responsibility to cloud providers, while also seeking more granular control of resource allocation.
The idea behind such serverless services is that developers deploy their code without having to worry about procuring, provisioning or managing any underlying resources. Of course, there are servers in serverless architectures somewhere in public cloud vendors' massive data centers, but such abstraction allows both the user and the provider to achieve greater efficiencies and focus on what they each do best.
"In this case, the hype is definitely warranted," said Dave Bartoletti, principal analyst with Forrester Research.
Applications are traditionally designed in monolithic fashion, with all the code wrapped into one big piece. Serverless architectures allow developers to chop an app into smaller pieces and deploy them in highly scalable fashion on elastic infrastructure -- even more easily than with containers, Bartoletti said.
A common example of the merits of these serverless computing models is uploading a photo to a website. A developer could spin up an instance and write one large block of code with a host of responsibilities, including creating a folder, resizing the images, making a backup copy and ensuring the image loads properly.
Alternatively, the developer could write a snippet of code and use a Lambda function to watch a directory, execute the code and upload the image. The user only pays for the milliseconds this function runs, rather than the minutes or hours that cloud platforms otherwise require to run an instance.
Serverless computing, then, is less about the technology than it is about pricing and packaging, explained Andrew Reichman, research director at 451 Research. It has the potential to change how resources are used, more closely linking the infrastructure and app developer platform, existing somewhere between infrastructure as a service and platform as a service (PaaS).
"It's a huge leap forward to rent [a server] by the hour or minute, but realistically, even that is less granular than what you need," Reichman said. Ultimately, users want to "do the computing [they] need and actually pay for it when [they] actually use it, rather than pay and sit waiting for a job."
It can be difficult to know which server to choose for a job because of the uncertainty around demand, Reichman said. Even though it may not be a typical five-year commitment in a private data center, developers are still forced to pick a server size on which to run their workloads.
Google, Microsoft and IBM follow Amazon's lead
Lambda remains the best example of the potential of serverless computing because of Amazon's sizable lead in the market, longer track record and reputation among users. Google began alpha-testing Cloud Functions in February, but has been tight-lipped about it. IBM followed in March by adding OpenWhisk to its PaaS offering Bluemix, though that service is currently listed as experimental. Microsoft closed out the flurry of releases at the end of March by adding Azure Functions, currently in preview.
Serverless architectures didn't begin with Lambda, though -- much like how containers existed long before Docker. In fact, some cloud vendors have taken to rebranding existing services as serverless amid all the hype. At its recent user conference in San Francisco, Google cited at least four serverless products in Google Cloud Platform, including App Engine, its PaaS offering first introduced in 2008.
Amazon hasn't disclosed the growth rate for Lambda, and it's still seen as a service for early adopters, but it is being implemented by high-profile customers Netflix, Capital One and MLB. Popular use cases include serverless data processing, pairing it with Simple Storage Service and API Gateway to run microservices behind web applications, using it as a development platform for Internet of Things devices and providing connective tissue for myriad AWS environments.
The x86 revolution allowed for lazy app design because efficiency was irrelevant when a server was sitting idle 90% of the time, but now, serverless architectures are reversing course and getting deep into optimization, Reichman said. It's somewhat reminiscent of the early mainframe days of using punch cards and scheduling jobs to execute, he added.
Still early days
Tools like Lambda are hard for many IT pros to wrap their heads around, especially those primarily evaluating price versus performance between on-premises and public clouds, said David Pippenger, senior server operations engineer at GREE Inc., a San Francisco-based gaming company.
There are some really easy use cases, but the real potential lies down the road, Pippenger added.
"The whole analogy about cloud being like turning the dial to get more water -- we're getting closer and closer to that."
GREE has used Lambda, but the company is still getting used to the service. The gaming company originally intended to use it as triggers during a migration from Amazon Relational Database Service (RDS) to DynamoDB, but eventually scrapped that plan. RDS was in a virtual private cloud, and the number of extra steps needed to make sure the transfer was secure while using Lambda over the public Internet made it prohibitive, Pippenger said.
Though some of those security access controls have apparently improved, it's those types of examples that underscore its infancy. "It's not quite ready for prime time," Pippenger said.
Lambda only supports certain types of events, and while the big selling point is the ability to just write code and go, today it only supports Node.js, Python and Java. To encourage more production use cases, it would help to see more service-level agreement language around latency guarantees, Reichman said.
Before evaluating a serverless computing approach, companies should survey their developers to understand how much their current applications might benefit; there's no need to waste time doing a task that can be better handled by a microservice, Bartoletti said. For new apps, developers should be looking at microservices architectures, but proceed with caution because of the added complexity with dividing processes into smaller bits that may run all over the place.
It's not easy to retrofit for legacy applications and should be limited to companies working in a DevOps mode with cloud-native architectures, Reichman said.
"If you have an app that is really just an infrastructure play, then this is irrelevant to you," he said.
Trevor Jones is a news writer with TechTarget's data center and virtualization media group. Contact him at firstname.lastname@example.org.