IT shops interested in moving workloads to the cloud should check out the performance rankings of cloud providers by CloudSleuth, a cloud monitoring service owned by Compuware Inc. But experts say to take the results with a hefty grain of salt.
Google App Engine came in first, followed by Microsoft Azure, then GoGrid. The rankings measure the response time of a test application that all providers in the study agreed to run in their clouds. The tests run every 15 minutes from 125 end-user locations across all 50 U.S. states and from 75 international locations in 30 countries. For more details on the methodology and results, check out CloudSleuth's application Global Provider View.
"A few seconds matter a great deal -- to some applications more than others, but at a certain scale it matters to almost everybody," said Geva Perry, cloud computing analyst and consultant. "If users have to wait for an application to load, you have a user experience problem right there."
At scale, Perry said, this also becomes a cost problem. How many calls and responses can your application handle in a 10-second period, for example? With hundreds or potentially thousands of users, that work can be parallelized, which has a direct bearing on cost savings.
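Perry's point can be made concrete with some back-of-the-envelope math. The numbers below are illustrative assumptions, not figures from the study: they simply show how response time translates into instance count, and therefore cost, at a fixed request rate.

```python
import math

def instances_needed(target_rps, response_time_s, concurrency=10):
    """Instances required to sustain target_rps requests per second,
    assuming each instance serves `concurrency` requests at a time."""
    # An instance's throughput is concurrency / response_time requests/sec.
    per_instance_rps = concurrency / response_time_s
    # Round up: you can't run a fraction of an instance.
    return math.ceil(target_rps / per_instance_rps)

# At 1,000 requests/sec, doubling response time doubles the fleet:
fast = instances_needed(1000, 2.0)  # 2-second responses -> 200 instances
slow = instances_needed(1000, 4.0)  # 4-second responses -> 400 instances
print(fast, slow)
```

Under these assumed numbers, a provider that shaves seconds off response time halves the fleet you pay for, which is Perry's cost argument in miniature.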
Real-world implications of cloud performance tests
CloudSleuth's test application is a simple, simulated retail shopping site, but user experience is important for any application, not just for online bookstores. Salesforce.com's business depends on it. CloudSleuth's data is helpful, but there are some gaps.
"It is useful, assuming your app is anything like the application they use," said James Staten, VP and principal analyst at Forrester Research.
Architectures and workloads differ. Applications can be compute-intensive versus data-intensive, or throughput-sensitive versus latency-sensitive, and each provider might rank differently depending on the kinds of applications its cloud service targets. An application's architecture might lean more heavily on memory, disk storage or the network, so the results would not be the same for every application.
Staten advised enterprises to perform a test of their own application on different providers' clouds to get a real indicator of performance. He added that there's some "gaming of the system" going on too, where service providers will tweak and tune their service to run the CloudSleuth app really well.
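Staten's advice to run your own test can start very simply. Below is a minimal sketch of a latency probe: time repeated requests against the same app deployed on each candidate provider and compare medians. The probe here is a stand-in callable so the sketch runs anywhere; in a real test it would issue requests to your actual application endpoints, which are specific to your deployment.

```python
import time
import statistics

def benchmark(probe, runs=5):
    """Time `probe` (any zero-argument callable) over several runs
    and return the median latency in milliseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        probe()
        samples.append((time.perf_counter() - start) * 1000)
    # Median resists one-off spikes better than the mean.
    return statistics.median(samples)

# In practice `probe` would hit your app on a given provider, e.g.
#   lambda: urllib.request.urlopen("https://your-app.example.com/health")
# Here a short sleep stands in for the request so the sketch is runnable.
latency_ms = benchmark(lambda: time.sleep(0.01))
print(round(latency_ms))
```

Run the same probe against each provider hosting your workload, at the times of day and from the regions your users actually occupy, and the comparison reflects your application rather than CloudSleuth's.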
"[CloudSleuth] is really a sales tool for Compuware to go to service providers and say, 'here's how you improve performance' … but still, putting the data out there is a way to keep everyone honest," Staten said.
Ideally, users would like to be able to see how much it costs for different kinds of performance and workloads. If the best performance costs twice as much and you only have one small app and a few users, it's not worth the price. But if you have to run 10,000 instances of your application and you can't tolerate even a microsecond's delay, it's a big deal.
The Wall Street banks are a good example of the latter, as they are typically price insensitive. Their trading applications are worth millions of dollars to the bank, so a microsecond matters, and they will spend any amount of money to get the best performance. But for most companies, cost matters significantly.
CloudSleuth is planning to offer users the ability to benchmark and test their own applications. Competitors to CloudSleuth include Keynote and Cedexis. The cloud providers mentioned in this story were contacted for comment but did not get back to us by press time.
Jo Maitland is the Senior Executive Editor of SearchCloudComputing.com. Contact her at firstname.lastname@example.org.