Public cloud adoption has grown significantly over the past few years, as even the most skeptical IT pros admit the technology is right for certain use cases. Spending on public cloud services is expected to exceed $127 billion in 2018, up from $56.6 billion in 2014, according to industry analyst group IDC.
But, even as adoption grows, many organizations still question whether they can use public cloud to run all mission-critical workloads.
The answer depends on several factors. To protect data, businesses traditionally keep their most mission-critical applications close to home. In fact, by law, certain data must be protected. While that does not preclude processing or storing such data in a public cloud, it does require special care, including encryption of data both at rest and in transit.
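The in-transit half of that requirement is usually enforced on the client side. As a minimal sketch, using only Python's standard library, an application can build a TLS context that refuses unverified or legacy connections before it ever talks to a cloud endpoint:

```python
import ssl

# Build a client-side TLS context with secure defaults:
# certificate verification and hostname checking are enabled.
ctx = ssl.create_default_context()

# Refuse legacy protocol versions; TLS 1.2 is a common floor.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

assert ctx.check_hostname                    # server identity is verified
assert ctx.verify_mode == ssl.CERT_REQUIRED  # unverified certs are rejected
```

A context like this would then be passed to whatever HTTP or socket client the application uses; at-rest encryption is handled separately, typically by the provider's storage-level encryption options or a key-management service.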
Evaluating public cloud security, efficiency
Concerns around public cloud security abound. But cloud providers run multi-tenant environments and know a security lapse would damage their business, so most strive to be as secure as possible. And because they have significantly more resources than the average data center operator, they can invest far more heavily in security. As a result, an enterprise's in-house security operations rarely match what the cloud provides.
As with security, many IT pros question the efficiency of public cloud operations. Cloud instances tend to be cookie-cutter -- anything else would be unmanageable. While most of your own instances will map over easily, there will be corner cases, especially when handling big data, and public cloud is only beginning to offer extra-large and GPU instances.
The use of containers rather than traditional hypervisors also impacts server instances in a major way. And until the dust settles around this new container and infrastructure approach, organizations may delay moving jobs to the public cloud.
In addition to server instances -- which are really just one aspect of the cloud -- networking and storage have major efficiency implications. Most cloud or virtualized instances are underprovisioned with storage I/Os, substantially impacting both performance and cost.
Local instance storage makes more I/O available, but it must be treated as temporary storage: if a server crashes, local stores are no longer accessible, and it is difficult to guarantee that a copy of written data reached a network store in time. Failing to plan for this can mean losing access to data. Ask your cloud provider about this before signing up.
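One common way to get the speed of local instance storage without betting data on it is a write-through pattern: write to the fast local scratch area first, then replicate to the durable network store before acknowledging the write. The function name and directory layout below are illustrative, not a provider API; it's a minimal sketch of the idea:

```python
import os
import shutil
import tempfile

def durable_write(data: bytes, scratch_dir: str, durable_dir: str, name: str) -> str:
    """Write to fast local (ephemeral) storage first, then copy to the
    durable store before acknowledging. If the instance dies before the
    copy completes, the local copy is gone -- so the ack must wait."""
    local_path = os.path.join(scratch_dir, name)
    with open(local_path, "wb") as f:
        f.write(data)
        f.flush()
        os.fsync(f.fileno())                 # flush through to the local device
    durable_path = os.path.join(durable_dir, name)
    shutil.copy2(local_path, durable_path)   # replicate to the network store
    return durable_path                      # only now is the write "safe"

# Illustrative usage with throwaway directories standing in for the
# local scratch volume and the network-backed store.
scratch, durable = tempfile.mkdtemp(), tempfile.mkdtemp()
stored = durable_write(b"order-123", scratch, durable, "order.bin")
```

The trade-off is latency: the caller waits for the slower replication step, which is exactly the guarantee worth asking your cloud provider about.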
Meanwhile, networking is becoming more sophisticated as public cloud providers implement software-defined networking (SDN). Tenants are gaining more control over the VLANs that connect virtual servers and storage. As industry standards emerge, this should become relatively easy to manage and better than most on-premises alternatives.
Storage resources remain somewhat inflexible in the cloud, but SDN is leading to software-defined storage, which will offer another layer of orchestration and virtualization.
IT skills gap also deters some from public cloud adoption
While public cloud is ready to host many mission-critical workloads, the issues above show it isn't fully there yet.
Additionally, some IT pros don't feel ready to take the leap. Many have a lot of legacy gear and a profound reluctance to let it go. Others fear for their job security or dread the application rewrite that inevitably comes with cloud migration -- but the rewrite isn't really a cloud issue; it's a matter of corporate or strategic need.
The job security question is inevitable when data center teams consider that, with cloud, part or all of a data center moves from a hands-on model to a remote, virtual one. Running clouds properly takes new skills. Wise admins have been acquiring those skills, creating a gap between themselves and those who aren't ready for cloud. At some point, most data centers will reach a tipping point where the cloud-ready team has critical mass -- but the divide doesn't have to be that stark. A good IT strategy includes a training and career path for the whole team.
Hybrid clouds have been offered as a public cloud alternative, but they carry a serious risk: data can end up in the wrong place -- needed in-house but sitting in the cloud, or vice versa. Hybrid cloud data can also be vulnerable while moving between public and private clouds. Hybrid clouds are exceedingly popular at the moment, but, once organizations tackle their staffing and legacy-system issues, the hybrid model may prove to be just a halfway step to public cloud.
About the author
Jim O'Reilly was Vice President of Engineering at Germane Systems, where he created ruggedized servers and storage for the US submarine fleet. He has also held senior management positions at SGI/Rackable and Verari; was CEO at startups Scalant and CDS; headed operations at PC Brand and Metalithic; and led major divisions of Memorex-Telex and NCR, where his team developed the first SCSI ASIC, now in the Smithsonian. Jim is currently a consultant focused on storage and cloud computing.