Cloud computing benchmarks on the rise

CloudHarmony.com has released the first set of comprehensive benchmarks for various cloud computing providers, including VMware. But do fledgling cloud users care yet?

Just about everywhere you go in the IT world, benchmarks -- tests that perform the same procedure against competing devices -- are ubiquitous. You can find in-depth revelations on the performance of monitors, hard drives, CPUs, chips and embedded components of all types, down to the nanosecond.

Not so in cloud computing; there's no way for consumers to know exactly what they're getting when they buy, for example, the m1.small CPU instance from Amazon Web Services (AWS). But as independent researchers and analysts try to test the cloud -- including publishing detailed performance data on notoriously test-phobic VMware -- that situation has begun to change.

"There just isn't quantifiable information out there," said Jason Read, an IT infrastructure and software consultant and ex-IBMer. Read has busily tried to pick up the slack by running what may be the first comprehensive benchmarks for cloud computing out there and publishing them on his website, CloudHarmony.com.

Emerging cloud benchmarks
In February, Read published his first cloud "speed test" using AWS' Mechanical Turk job brokerage service, paying random users around the world a few cents each to test data transit speeds. Amazon and IBM had double the transit capacity of the worst performers, and the results showed a massive disparity in performance depending on location.
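The core measurement behind such a test is simple: time how long a fixed payload takes to arrive from each provider. Here is a minimal sketch of that idea in Python -- the endpoints are placeholders, not the payloads or infrastructure Read actually used:

```python
import time
import urllib.request

# Hypothetical endpoints -- the real test fetched payloads hosted on each
# provider, with Mechanical Turk workers running it from around the world.
ENDPOINTS = {
    "provider-a": "http://provider-a.example.com/test-1mb.bin",
    "provider-b": "http://provider-b.example.com/test-1mb.bin",
}

def throughput_mbps(url):
    """Download the payload once and return throughput in Mbit/s."""
    start = time.perf_counter()
    data = urllib.request.urlopen(url, timeout=30).read()
    elapsed = time.perf_counter() - start
    return len(data) * 8 / 1_000_000 / elapsed

for name, url in ENDPOINTS.items():
    print(f"{name}: {throughput_mbps(url):.2f} Mbit/s")
```

Distributing that same measurement to paid workers in many locations is what exposed the geographic disparities.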

Read has gone on to post comprehensive tests of CPU performance and disk I/O speeds for more than 20 providers side by side, the first independently produced metrics of what cloud providers (and some traditional hosters) can offer. Providers almost universally offer a CPU designation and price it according to capacity, but there are no rules about what constitutes a "small" or "large" CPU and -- until now -- no good way to judge what providers actually deliver.
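With no standard behind the size labels, the only portable comparison is to time an identical, deterministic CPU workload on each instance and compare wall-clock results -- the idea behind Super PI-style tests. A toy illustration of that approach (the pi computation here is arbitrary, chosen only because it is deterministic and CPU-bound):

```python
import time

def arctan_inv(x, scale):
    """Integer arctan(1/x) * scale, via the Taylor series."""
    power = scale // x          # scaled 1/x
    total = power
    k = 1
    while power:
        power //= x * x         # next odd power of 1/x
        term = power // (2 * k + 1)
        total += -term if k % 2 else term
        k += 1
    return total

def pi_digits(digits):
    """Compute pi to `digits` digits with Machin's formula --
    a deterministic, CPU-bound workload in the spirit of Super PI."""
    scale = 10 ** (digits + 10)  # 10 guard digits
    pi = 4 * (4 * arctan_inv(5, scale) - arctan_inv(239, scale))
    return pi // 10 ** 10

start = time.perf_counter()
pi_digits(5000)
print(f"fixed workload took {time.perf_counter() - start:.2f}s")
```

Run the same script on a "small" instance from each provider and the elapsed times become directly comparable, whatever the providers call the size.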

The benchmarks use a battery of common tools, such as the Phoronix Test Suite for disk I/O and POV-Ray and Super PI for CPU tests. Read said that he has collected dozens of benchmark tools into a "cloud benchmark suite."

"We tried all the standard benchmarks that seemed like they'd be relevant," he said.

CloudHarmony.com generated thousands of results on hundreds of different configurations available to cloud consumers, and Read said the work gave him a nuanced view of how providers operate. For example, he noted that "it's pretty easy to tell who's using local-attached storage and who's using network storage" from disk I/O behavior under some tests but not others. And as the size of a CPU instance increased, Amazon's I/O performance increased with it, which wasn't the case with other providers, he said.
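The tell Read describes comes down to latency: network-attached volumes pay a round trip on every small random read, which large sequential reads hide. A rough probe of that pattern follows -- file size, block size, and path are arbitrary, and the OS page cache will skew results unless the file is much larger than RAM, which is why real suites use tools like IOzone:

```python
import os
import random
import time

PATH, SIZE, BLOCK = "probe.bin", 256 * 1024 * 1024, 4096  # 256 MB file, 4 KB reads

with open(PATH, "wb") as f:            # create the test file once
    for _ in range(SIZE // (1024 * 1024)):
        f.write(os.urandom(1024 * 1024))

def timed_reads(sequential, count=2000):
    """Time `count` 4 KB reads at sequential or random offsets."""
    offsets = (
        range(0, count * BLOCK, BLOCK) if sequential
        else [random.randrange(SIZE - BLOCK) for _ in range(count)]
    )
    with open(PATH, "rb", buffering=0) as f:
        start = time.perf_counter()
        for off in offsets:
            f.seek(off)
            f.read(BLOCK)
        return time.perf_counter() - start

seq, rnd = timed_reads(True), timed_reads(False)
# A large random-to-sequential gap hints at per-read network round trips.
print(f"sequential: {seq:.3f}s  random: {rnd:.3f}s  ratio: {rnd / seq:.1f}x")
```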

Many providers do not reveal details about their infrastructure (although some, like Rackspace, share details fairly freely). Read said that users can review his benchmarks for clues, which may help them decide whether one provider or another is better suited to their needs.

So who cares?
"Most [cloud] users are really there just for the capacity," said Frank Gillett, a Forrester Research Inc. VP and principal analyst. He said that the great majority of cloud consumers don't seek a performance edge and wouldn't care much about the benchmarks for making decisions.

"The thing to remember is that cloud is a very, very new market," emphasized Gillett, with most users being impressed by the novelty of easy-on, utility-billed servers. When the market has settled down, more people will care about relative performance and expect a catalog of public information on cloud providers.

He didn't discount Read's work, however, saying that it is an early sign of future efforts, as providers begin to splinter into purpose-built environments for different types of applications.

"It's certainly the right time to be experimenting with this," he said, but he didn't estimate much demand for cloud benchmarks from the public today. Some cloud consumers welcome the idea, however.

"As a user, it's interesting. Sometimes, I care; I really want the bang for the buck. Sometimes I just want a server somewhere," said John Kinsella of Protected Industries, an infrastructure and security consultant who regularly develops and hosts on many of the major cloud computing environments. Kinsella said that Read's work is fascinating partly because of the data and partly because of its novelty, which he called a breath of fresh air.

The need for independent review
But as Kinsella noted, CloudHarmony.com's effort may begin to bring independent testing to the cloud marketplace. He likes that it's an end-run around cagey IT providers -- including some well-known holdouts on independent review.

"In the past, VMware has been notorious about letting people publish benchmarks on their software; these might be the first benchmarks I've seen on them [in public]," he said. VMware's vCloud Express service was compared side-by-side to several providers and demonstrated marked differences in performance and price.

Kinsella would like to see more, especially as a security wonk, but admits that's a tough nut to crack.

"There are two pieces I see missing: One is security -- that's hard to quantify -- and the networking side of it," he said.

Kinsella and Gillett both said the most interested audience for CloudHarmony.com would be the cloud providers themselves.

"I suspect many CTOs and architects at these cloud providers are thinking about how they can get their CloudHarmony numbers up," wrote John Treadway, the director of cloud computing at Unisys and veteran cloud watcher at CloudBzz, in an email. He said the results raised as many questions as they answer , like "how can Terremark's SAN-based storage outperform many of the local storage approaches, and how does GoGrid get such high performance?"

Read confirmed that cloud providers are the most interested parties so far; some, but not all, have donated server instances and are eager to see the results of his work. Read said that he hopes to gather enough information from his work to make a business out of it. Currently, he supports CloudHarmony.com with consulting work.

A few others have tried to track the cloud in various ways, but the list is short. The "State of the Cloud" series by Guy Rosen uses Web tracking data to estimate how many users the major cloud vendors have, and Matthew Sacks at The Bitsource has conducted several performance reviews of Amazon and Rackspace (one sponsored by Rackspace but performed independently). Developer Guillaume Plessis also recently benchmarked MySQL on EC2 against Amazon's own Relational Database Service (RDS) -- in which RDS lost hands down -- and that's about it.

In the meantime, Read said the site and the benchmarks are a good-faith effort to broaden the conversation on cloud computing, and that the market needs independent sources of technical information to act as a backstop against vendor self-interest.

"We're not selling anything [on CloudHarmony.com]; we really want to be an objective, third-party source for cloud data."

Carl Brooks is the Technology Writer at SearchCloudComputing.com. Contact him at cbrooks@techtarget.com.
