
How to apply serverless in front-end cloud computing

As application front ends move to the cloud, it's important for app designers and cloud users to understand how serverless and other models affect performance and cost.

Instead of moving whole applications to the cloud, organizations have the option to build cloud front-end computing elements for existing applications. And they have options regarding the technology they use to execute this approach, including serverless computing and containers.

The use of web servers as front ends that provide online access to applications isn't a new idea. Neither is the tight integration of web pages with hosted processes -- the common gateway interface (CGI) has been in use for decades. However, front ends designed for cloud computing create a model where presentation or GUI features are hosted on cloud resources for scalability, resilience and performance improvements, whereas the application back end can reside anywhere.

An organization could still implement this hybrid model through the traditional web-server-and-CGI approach, but modern cloud technology offers better options. With the deployment of a cloud front end, reliant on serverless technologies and microservices, IT teams can reduce overhead and cut costs, while also adding flexibility and scalability to their applications.

Pressure to modernize

Typical modern application front ends center on an API gateway or broker. This broker element presents a series of APIs that are invoked from either webpages or mobile applications. These APIs can either be connected to web servers or invoked directly from webpages via a client-side language such as JavaScript. Behind the APIs are the software components of the applications themselves, hosted in the cloud or in the data center.
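The broker pattern can be sketched as a simple dispatch table that maps API routes to the components behind the gateway. This is a minimal illustration, not a real gateway; the route and handler names (create_order, get_status) are invented for the example.

```python
# Minimal sketch of the API gateway "broker" pattern: routes map to
# back-end handlers, which could live in the cloud or the data center.
# Handler names and routes are illustrative assumptions.

def create_order(payload):
    # Placeholder back-end component; a real one would invoke application logic.
    return {"order_id": 1, "items": payload.get("items", [])}

def get_status(payload):
    return {"status": "ok"}

ROUTES = {
    ("POST", "/orders"): create_order,
    ("GET", "/status"): get_status,
}

def gateway(method, path, payload=None):
    """Dispatch an API call to the component behind the gateway."""
    handler = ROUTES.get((method, path))
    if handler is None:
        return {"error": 404}
    return handler(payload or {})
```

A webpage or mobile app would call these routes over HTTP; here the dispatch is in-process only to show the shape of the broker.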

Even though this front-end cloud computing model has only begun to take hold over the past two years, there's already modernization pressure. The leading edge in application front-end design uses microservices, which are small stateless components of logic that can scale or get replaced dynamically. Serverless is an application architecture that consumes resources only while it executes code, such as these microservices.

A microservice and serverless approach makes the front end fully scalable and resilient to failures. With this type of strategy, there is no server management, and the cloud client pays only for active hosting -- low activity levels don't cost as much as always-on cloud-hosted applications.

Transactions and events

Microservice and serverless designs are about events, whereas other application designs are built around transactions. When designing cloud front ends for microservices and serverless, developers must think of transactions in relation to events.

In a typical application, users create a transaction through a multistep process. The steps of the transaction correspond to events. Each event must go into the transactional context somewhere. Microservices and serverless developers commonly dissect a transaction into events at the source -- meaning the mobile device or the web server.
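The decomposition at the source can be sketched as follows; the transaction steps and field names are hypothetical, but each event carries the transaction ID so it can be placed back into its transactional context downstream.

```python
# Hypothetical sketch: a multistep order transaction dissected into
# discrete events at the source (the mobile device or web server).
# Each event keeps the transaction ID so downstream serverless
# functions can restore the transactional context.

def transaction_to_events(txn_id, steps):
    """Turn the steps of one transaction into independent events."""
    return [
        {"txn_id": txn_id, "seq": i, "type": step["type"], "data": step["data"]}
        for i, step in enumerate(steps)
    ]

events = transaction_to_events("txn-42", [
    {"type": "add_item", "data": {"sku": "A1"}},
    {"type": "set_address", "data": {"zip": "10001"}},
    {"type": "checkout", "data": {}},
])
```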


The API gateway model suits serverless implementation. The gateway can invoke the proper serverless code based on a call from the front-end web server or mobile app. The front end can also access an online database. This access then triggers a serverless workflow. Applications built on this model, for example, access a database for order creation, then trigger a serverless workflow to transfer the processed order to the back-end application for inventory management.
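The database-triggered workflow in that example can be sketched in a few lines. This is a toy in-process model under assumed names; in a real deployment the trigger would be a cloud database change event invoking a serverless function.

```python
# Sketch of the pattern above: a front-end write to an online database
# triggers a serverless handler that passes the processed order to the
# back-end inventory application. All names are illustrative.

INVENTORY_QUEUE = []  # stands in for the back-end inventory system

def on_db_insert(record):
    """Serverless function fired by a database insert event."""
    processed = {"order_id": record["order_id"], "qty": record["qty"]}
    INVENTORY_QUEUE.append(processed)

def create_order(db, record):
    db.append(record)      # the front end writes the order...
    on_db_insert(record)   # ...which triggers the serverless workflow

orders_db = []
create_order(orders_db, {"order_id": 7, "qty": 2})
```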

Some application front ends are rich, more like a distributed processing function than a simple event handler. In these designs, cloud developers can use workflow orchestration tools -- such as AWS Step Functions or Microsoft Azure's Durable Functions -- to build complex multiserverless-function workflows. These workflows resemble traditional application logic, except that they are decomposed into microservices to maximize cloud value.
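A toy orchestrator in the spirit of those tools can be sketched as an ordered list of stateless steps, each receiving the previous step's output. This is a simplification of what Step Functions or Durable Functions actually provide, and the step names are invented for the example.

```python
# Toy workflow orchestration: each "serverless function" is a stateless
# step, and the orchestrator threads the payload through the chain.
# Step names (validate, price) are illustrative assumptions.

def validate(order):
    return {**order, "valid": len(order["items"]) > 0}

def price(order):
    return {**order, "total": 10 * len(order["items"])}

def run_workflow(steps, payload):
    """Execute each step in turn, passing state from one to the next."""
    for step in steps:
        payload = step(payload)
    return payload

result = run_workflow([validate, price], {"items": ["a", "b"]})
```

Real orchestration services add what this sketch omits: durable state between steps, retries and branching.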

Microservices, serverless and containers

The major cloud vendors offer ways to shift easily between a serverless deployment of microservices and an always-available container deployment. Microsoft focuses most directly on microservices deployments, though AWS and Google support them as well.

Application teams should aim to think in terms of microservices rather than serverless computing. A microservices architecture deals directly with one of the common issues surrounding serverless computing: Serverless is cost effective when it's used sparingly. Serverless customers pay for usage, so as usage increases, the cost of serverless activations can exceed the cost of dedicated, always-on container hosting of the same application code.
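The crossover point can be estimated with back-of-the-envelope arithmetic. The prices below are invented placeholders, not real cloud list prices; the point is only that per-invocation cost scales with usage while container cost is flat.

```python
# Break-even sketch between serverless and always-on container hosting.
# Both prices are assumed placeholders, not real cloud rates.

PRICE_PER_INVOCATION = 0.0000002   # assumed per-request serverless cost, USD
CONTAINER_MONTHLY = 30.0           # assumed always-on container cost, USD

def serverless_monthly_cost(requests_per_month):
    return requests_per_month * PRICE_PER_INVOCATION

def cheaper_option(requests_per_month):
    """Compare pay-per-use serverless against a flat container fee."""
    if serverless_monthly_cost(requests_per_month) < CONTAINER_MONTHLY:
        return "serverless"
    return "container"
```

At these assumed rates, a million requests a month favors serverless, while two hundred million favors the container -- the flat fee wins once usage is high enough.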

State control is an important consideration when building serverless applications, particularly if the application might switch to more conventional cloud-native hosting in containers. A microservice or serverless function is stateless. It can't store information between activations, which is what makes it suitable for on-demand activation, scaling and replacement. Thus, applications that involve multiple steps, with context that must be remembered, have to provide state control.

There are multiple ways to control state with the API gateway model of a cloud front end. The mobile device or web server accessing the application can provide state as part of the events it generates, so everything a microservice or function needs is passed to it in the event itself. The API gateway can be implemented to remember context, making it the state source. Or, the microservice or function can obtain state information from a back-end database that maintains the context for each user transaction.
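Two of those options can be sketched side by side: state carried inside each event, and state kept by the gateway as a context store keyed by transaction ID. The function and variable names are illustrative assumptions.

```python
# Option 1: state travels inside the event, so the function stays stateless.
# Option 2: the gateway remembers context per transaction ID.
# Names are illustrative, not a real gateway API.

GATEWAY_CONTEXT = {}  # gateway-maintained state, one entry per transaction

def stateless_step(event):
    """Everything this function needs arrives in the event itself."""
    return event["state"]["count"] + 1

def gateway_call(txn_id, increment):
    """The gateway holds the context, so callers need not carry state."""
    ctx = GATEWAY_CONTEXT.setdefault(txn_id, {"count": 0})
    ctx["count"] += increment
    return ctx["count"]
```

The third option, a back-end database holding per-transaction context, has the same shape as the gateway store but moves the state behind the APIs.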

Orchestration can also maintain state, in the form of an internal process or workflow map. To use this approach, first research whether this map is available or usable from your chosen cloud provider for microservices hosted in a container. If you're considering transitioning some serverless microservices into persistent containers, it's crucial to know how that is done before you commit to a specific cloud provider and orchestration model.

Watch serverless workflows carefully. Cloud providers must load and run serverless components on demand -- these components are inactive otherwise -- so there is a delay associated with execution. Too many serverless elements in a workflow can add up to noticeable increases in response time. This problem wouldn't occur if the same components were deployed in conventional containers.
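To see how that delay compounds, here's a toy latency model. The cold-start and execution figures are assumptions for illustration, not measurements of any real platform.

```python
# Illustration of cold-start delay accumulating across a serverless
# workflow. Latency figures are assumed, not measured.

COLD_START_MS = 250   # assumed per-function load-and-run delay
EXEC_MS = 20          # assumed per-function execution time

def workflow_latency(num_functions, cold=True):
    """Total response time for a chain of serverless functions (ms)."""
    per_call = EXEC_MS + (COLD_START_MS if cold else 0)
    return num_functions * per_call
```

Under these assumptions, a five-function chain takes 1,350 ms when every component loads on demand, versus 100 ms when the same components are already resident, as they would be in always-on containers.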

Microservices and stateless execution, not serverless, define the architecture of a cloud front end. The serverless hosting model is suitable for many applications, but many others are more cost-effective, and even perform better, when they're executed another way. If you map out workflows in advance, you can spot applications where cost and performance would suffer under serverless hosting. Don't get trapped into doing the latest thing when it's not the best thing.
