
The outlook for cloud computing 2017? Lower costs and more options

In 2017, serverless computing technologies, data analytics and Docker containers will help drive innovation in the cloud. And you'll pay only for the cloud time you actually consume.

Cloud computing is no longer a new idea. But it's still evolving in big ways. Here's a look at the trends that are poised to dominate the way companies tackle cloud computing in 2017.

Serverless computing explodes

Amazon Web Services (AWS) Lambda functions have become massively popular with DevOps folks, because Lambda lets you run code without provisioning an entire server.

The result is serverless computing, which means developers no longer have to worry about setting up or managing servers. Instead, they can just upload their code to the cloud and run it.
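To make that concrete, here is a minimal sketch of what "just upload your code" looks like. The `(event, context)` signature is Lambda's documented Python handler interface; the payload field (`"name"`) is a hypothetical example, not anything from a real service.

```python
# A minimal AWS Lambda-style handler in Python. Lambda invokes this
# function with the request payload (event) and runtime metadata
# (context); there is no server to set up or manage.
def handler(event, context):
    # "name" is a hypothetical payload field used for illustration.
    name = event.get("name", "world")
    return {"statusCode": 200, "body": "Hello, " + name + "!"}
```

A developer uploads a function like this and wires it to a trigger (an HTTP request, a file upload, a queue message); the platform handles everything underneath.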

Lambda is already a couple of years old, and it took other cloud vendors some time to follow AWS' lead on serverless computing. But they are starting to catch up -- see Google Cloud Functions, for example -- and this trend is set to continue in 2017, as more cloud hosts follow suit.

Lower costs for the cloud

Another cool thing Lambda lets you do is pay only for the cloud resources you consume while your code is actually running. There is no need to pay for a virtual server 24/7 to host a service that runs only a fraction of that time.
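A back-of-envelope comparison shows why that matters. The prices below are illustrative assumptions for a small always-on VM versus per-invocation serverless billing, not quoted rates from any provider.

```python
# Rough cost sketch: always-on VM vs. pay-per-use serverless.
# All prices are assumed for illustration, not actual published rates.
VM_HOURLY = 0.013                  # assumed USD/hour for a small VM
PRICE_PER_REQUEST = 0.0000002      # assumed USD per function invocation
PRICE_PER_GB_SECOND = 0.00001667   # assumed USD per GB-second of compute

def monthly_vm_cost(hours=730):
    # A VM billed around the clock, whether or not it is doing work.
    return VM_HOURLY * hours

def monthly_serverless_cost(requests, mem_gb=0.128, secs_per_request=0.2):
    # Billed only for invocations plus the memory-time actually consumed.
    compute = requests * mem_gb * secs_per_request * PRICE_PER_GB_SECOND
    return requests * PRICE_PER_REQUEST + compute
```

Under these assumptions, a service handling a million 200-millisecond requests per month at 128 MB costs well under a dollar, versus several dollars a month for the idle-most-of-the-time VM.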

That translates to lower costs for cloud computing. And it's only one of several forces that help drive down what companies pay for the cloud.

The ever-decreasing cost of commodity hardware, and the availability of platform-as-a-service products that organizations can run on premises instead of in the public cloud, are also pressuring public cloud providers to lower their prices. So are next-generation monitoring tools from vendors like Netuitive, which help cloud users analyze costs to ensure they get the best bang for their public cloud buck.

All of these changes help explain why AWS reduced the costs of some of its services toward the end of 2016. Expect this cloud computing 2017 trend to continue, as hosting providers vie to stay cost-efficient in the eyes of their customers.

More containers move to the cloud

For the first few years after Docker's 2013 debut, setting up and managing containerized infrastructure required a lot of manual work.

Today, however, cloud-ready containers-as-a-service (CaaS) platforms make Docker implementation a turnkey affair.

Some CaaS options are public cloud services, like Azure Container Service and AWS' EC2 Container Service. Others, such as Rancher and Kontena, can be deployed on private clouds, as well as on virtual servers in the public cloud.

The rise of so many cloud-friendly CaaS offerings is making the cloud an obvious place to run Docker containers. That means 2017 is likely to see many more containers running in the cloud.

Cloud analytics matter as much as cloud storage

Low-cost storage services in the cloud have made cloud servers an obvious place to store data.

As we look toward 2017, however, organizations want to do more with data than just store it. They also need to make it actionable. For this reason, real-time analytics on cloud data are now just as important as cloud storage itself.

Cloud providers meet this demand by building data analytics platforms, such as Spark and Hadoop, into their clouds as software-as-a-service offerings. Vendors that can deliver not just data storage, but also data analytics, through the cloud at the lowest cost will stand out in the market during the next year.

Security becomes a driver of cloud adoption

Traditionally, security concerns have limited cloud adoption.

But the next generation of security threats is turning this calculus on its head by giving enterprises reasons to move more workloads to the cloud as a way to stay secure.

Consider, for example, the Dyn domain name system distributed denial-of-service (DDoS) attack of October 2016. One way to help mitigate the effect of such an attack is for companies to move more apps to the public cloud -- and, ideally, to distribute them across multiple clouds, so a DDoS attack against one instance of an app does not bring down the entire operation.

Since DDoS attacks like the one that affected Dyn are poised to become even more common, the cloud is likely to grow more popular among companies seeking to withstand the security threats they might face in 2017 and beyond.

Next Steps

Learn how to use AWS Lambda and microservices together

How to prevent ransomware threats to cloud apps

Learn how to avoid common AWS Lambda problems
