The definition of cloud, like the technology itself, is constantly changing. The concept of cloud has moved from identical x86 servers virtualized into identical small chunks of compute to large groups of computing resources that can be automatically orchestrated together into a shared pool. And that change opens a Pandora's box of new applications, such as big data analytics and high-performance computing.
Developments in non-traditional computing yield some spectacular alternatives. The advent of big data pushes computing toward new levels of performance and flexibility. Bigger virtual instances are one way forward, and in the last year, most cloud service providers' instance-size portfolios have exploded. Still, many question whether large instances can match on-premises performance for workloads such as high-performance computing (HPC).
Moving big data and HPC to the cloud brings graphics processing unit (GPU)-based options into focus. The GPU approach, which runs as many as 1,000 cores on a single chip, works well for searches, genome matching and other tasks with a high degree of parallelism. GPU performance is increasing at twice the rate of Moore's Law, placing GPUs prominently in next-generation supercomputers. Companies such as Nvidia offer clouds with GPU performance for the enterprise.
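To see why tasks like genome matching map so well onto thousands of GPU cores, consider a minimal sketch (not from the article, and using NumPy on the CPU as a stand-in for GPU hardware): every candidate position in the sequence can be checked independently of every other, so the work is "embarrassingly parallel" and splits cleanly across many cores.

```python
import numpy as np

def count_matches(genome: str, pattern: str) -> int:
    """Count occurrences of `pattern` in `genome`, overlaps allowed.

    Each candidate window is checked independently -- the data-parallel
    shape of work that GPUs (and, here, NumPy vectorization) exploit.
    """
    g = np.frombuffer(genome.encode(), dtype=np.uint8)
    p = np.frombuffer(pattern.encode(), dtype=np.uint8)
    n = len(g) - len(p) + 1
    if n <= 0:
        return 0
    # Build an (n, len(pattern)) view of all candidate windows, then
    # compare every window against the pattern in one vectorized pass.
    windows = np.lib.stride_tricks.sliding_window_view(g, len(p))
    return int(np.all(windows == p, axis=1).sum())

print(count_matches("ACGTACGTACGT", "ACGT"))  # 3
```

On a GPU, each window comparison would be handed to a separate thread; the same independence that lets NumPy vectorize the loop is what lets 1,000 cores attack it at once.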
Big data and beyond
For many enterprises, the short-term case for GPUs and the longer-term options may differ. Big data is in a transitional phase: most businesses see its potential, but data still moves slowly through the system. Big data trends point to the use of real-time streaming services. At the same time, data flow will increase exponentially, making it critical for companies to select the right big data strategy. And that makes large instances and GPU clouds even more valuable.
Niche public clouds are emerging to provide services for certain verticals, such as biomedical research and financial markets. To add value, these clouds provide networking and storage tuned to specific use cases. IBM uses Watson for inference generation on a neural net model. The concept is still in its early stages and suffers from scalability constraints, but it aligns well with the processing demands of big data analytics and could be a winner in several markets.
The cloud will evolve on multiple layers -- both in platforms and market segments -- and is moving beyond the early adopter stage. Enterprises face many choices, most of which center on performance. In this rapidly evolving space, IT teams must board the big data train before it leaves the station, and public performance-oriented clouds can help them avoid costly investment errors.
About the author:
Jim O'Reilly was vice president of engineering at Germane Systems, where he created ruggedized servers and storage for the U.S. submarine fleet. He has also held senior management positions at SGI/Rackable and Verari; was CEO at startups Scalant and CDS; headed operations at PC Brand and Metalithic; and led major divisions of Memorex-Telex and NCR, where his team developed the first SCSI ASIC, now in the Smithsonian. Jim is currently a consultant focused on storage and cloud computing.