As the large cloud providers duke it out for market share, they've expanded the variety of cloud instance types they offer. Users now, for example, can deploy instances that have access to graphics processing units, which are especially beneficial to compute-intensive workloads like artificial intelligence.
The competition between the top cloud providers over graphics processing unit (GPU) instances is only starting to heat up. Azure in December 2016 rolled out its N-Series virtual machines that include Nvidia GPUs, and, in September 2016, Amazon Web Services unveiled the new Amazon Elastic Compute Cloud P2 instance type, which also includes Nvidia GPUs. Google, for its part, revealed in February 2017 that Google Compute Engine and Cloud Machine Learning will support Nvidia GPU instances, as well.
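To get a feel for what deployment involves, here is a minimal sketch that launches a P2 instance through the AWS SDK for Python (boto3). The AMI ID, key pair name and region are placeholders, not values from the providers' announcements; substitute your own before running it.

```python
# Minimal sketch: launch an Amazon EC2 P2 GPU instance with boto3.
# The AMI ID and key pair below are placeholders -- use a CUDA-enabled
# AMI and a key pair from your own account and region.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="p2.xlarge",         # smallest P2 size: one Nvidia K80 GPU
    KeyName="my-key-pair",            # placeholder key pair name
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched GPU instance {instance_id}")
```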
Before you deploy one of these GPU instances, it's important to understand the types of applications and workloads they best support. These workloads include:
Business analytics applications benefit from the massively parallel computing that GPUs offer. Hadoop-like applications, whose data processing can be mapped across a set of engines, fit that bill. The public cloud adds a pay-as-you-go model and the ability to handle variable workloads through cloud bursting. These capabilities are especially important in, for example, the retail industry, where users require analytics responses within a couple of minutes and where workload demands peak at different times.
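As a toy illustration of that map-across-engines pattern, the sketch below uses CuPy, a NumPy-compatible GPU array library, to process simulated retail transactions in parallel. The data volume and the flat 8% tax rate are invented for the example.

```python
# Minimal sketch of mapping a computation across thousands of GPU cores,
# using CuPy as one possible stand-in for a GPU-backed analytics engine.
import cupy as cp

# Ten million simulated transaction amounts, generated on the GPU
amounts = cp.random.uniform(1.0, 500.0, size=10_000_000)

# Each element-wise operation runs across all transactions in parallel
taxed = amounts * 1.08        # apply an invented flat 8% tax
total = cp.sum(taxed)         # parallel reduction to a single figure

print(f"Projected revenue: {float(total):,.2f}")
```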
Video production work also benefits from GPU instances because it involves large amounts of rendering and real-time editing.
Artificial intelligence (AI) is still in its infancy, but ready access to GPUs provides a way to test scalability without a huge expense. This will stimulate the growth of AI startups and open up new AI opportunities for industries ranging from healthcare and biotechnology to military applications and self-driving vehicles.
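One way to run that kind of low-cost scalability test is to time the same dense matrix workload on a CPU and a GPU before committing to larger instances. The sketch below does so with PyTorch; the matrix size and iteration count are arbitrary choices for illustration.

```python
# Minimal sketch: compare CPU and GPU throughput on a dense matrix
# multiply, the core operation of most deep learning training loops.
import time
import torch

def benchmark(device: str, n: int = 4096, iters: int = 10) -> float:
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    start = time.perf_counter()
    for _ in range(iters):
        a = (a @ b) / n               # rescale to keep values from overflowing
    if device == "cuda":
        torch.cuda.synchronize()      # wait for queued GPU kernels to finish
    return time.perf_counter() - start

print(f"CPU time: {benchmark('cpu'):.2f}s")
if torch.cuda.is_available():
    print(f"GPU time: {benchmark('cuda'):.2f}s")
```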
Virtual desktop infrastructure (VDI) could also get a boost from GPU instances. Google, for example, revealed the future availability of AMD FirePro GPUs, aimed at high-performance VDI, on Google Compute Engine.
The effect of field-programmable gate array (FPGA) cloud instances is somewhat more complicated. In general, FPGA-based computing can deliver enormous performance for a single, narrowly defined use case. The technology is still young, and much of the emphasis is on accelerating specific compute tasks, such as compression and encryption. As the technology matures, FPGA-based cloud instances will likely follow the same adoption curve as GPUs, and organizations can use low-cost sandboxing to kick-start adoption.
Supercomputing has already been made available to smaller scientific organizations through high-performance computing clouds, which can dramatically accelerate projects. GPU instances will help bring supercomputing to the undergraduate level, benefiting academic research.
Engineering simulation, such as that used in the oil and gas and automobile industries, will also be impacted by cloud-based GPU instances. Car manufacturers rely on engineering simulations that can be very time-consuming, but GPU instances will remove the need for field-portable processing clusters, lower analysis costs and generally accelerate projects.