Cloud computing is firmly established as a technology option for companies looking to cut investment in hardware. Its pay-as-you-go model is used by online businesses such as Netflix and Target.com, and developers are increasingly turning to it for testing and development.
"With these large-scale distributed systems, there are so many uncertainties that come with them," said Dr. Liu, a specialist in system architecture and planning at the University. Liu said she built a simple web application to test response times and other capabilities for each cloud service. She also noted that the results for Azure were the least reliable, since the service was still in beta testing.
"We noticed quite dramatic performance variations over time," she said. She added that it was clear the different clouds were built for different uses. Google App Engine, for instance, was designed only for simple, fast web apps; any computation request that took more than 30 seconds was subject to error rates of up to 0.07%, she said. Meanwhile, AWS showed high variation in response time and availability for queries made against its bare-bones database product, SimpleDB.
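Liu's test harness hasn't been published, but the kind of probe her findings describe can be sketched as repeated timed requests against a cloud endpoint, summarized by mean, spread, and failure count. Everything below is illustrative, not her actual code; the URL and function name are placeholders.

```python
import statistics
import time
import urllib.request

def probe_latency(url, samples=10, timeout=10):
    """Time `samples` GET requests against `url`; return
    (mean latency, stdev, failure count). A hypothetical sketch of
    a response-time probe, not any provider's real test suite."""
    latencies, failures = [], 0
    for _ in range(samples):
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=timeout):
                latencies.append(time.monotonic() - start)
        except OSError:
            # Timeouts, refused connections, and HTTP-level errors
            # all count as failed probes.
            failures += 1
    mean = statistics.mean(latencies) if latencies else None
    stdev = statistics.stdev(latencies) if len(latencies) > 1 else 0.0
    return mean, stdev, failures
```

Run repeatedly over hours or days, a probe like this is what surfaces the "dramatic performance variations over time" Liu describes, rather than a single point-in-time number.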
Amazon is still trying to learn more about Liu's research, which hasn't yet been made public. "We believe that this particular test issued a large number of requests to Amazon SimpleDB in a very short period of time in an attempt to overload the system," said Amazon spokeswoman Kay Kinton in an email. She said SimpleDB returned "service unavailable" responses as a protective measure. Kinton also said she was confident SimpleDB could scale to handle "practical, real-world applications."
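The protective "service unavailable" behavior Kinton describes is the throttling pattern cloud clients typically absorb with retries and exponential backoff. A minimal sketch, assuming a generic request function rather than the real SimpleDB API (the exception class and names here are placeholders):

```python
import random
import time

class ServiceUnavailable(Exception):
    """Stand-in for a 503-style 'service unavailable' rejection."""

def call_with_backoff(request_fn, max_retries=5, base_delay=0.1):
    """Call request_fn, retrying on ServiceUnavailable with
    exponentially growing, jittered delays. Re-raises after the
    final attempt fails."""
    for attempt in range(max_retries):
        try:
            return request_fn()
        except ServiceUnavailable:
            if attempt == max_retries - 1:
                raise
            # Double the wait each attempt; jitter avoids clients
            # retrying in lockstep against an overloaded service.
            time.sleep(base_delay * (2 ** attempt) * random.uniform(0.5, 1.5))
```

A client wrapped this way rides out brief throttling episodes instead of counting each rejection as an outage, which is one reason raw error-rate probes and real-application experience can diverge.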
Similarly, as she ratcheted up traffic while testing Google App Engine, Liu said she ran into denial-of-service protections put in place by Google. She compared the experience to that of a site launched on Google App Engine in response to the Victoria wildfires in February. Google engineers had to fix the site after users overwhelmed it trying to learn about the blaze as it happened.
"What our study wants to do is get to the next level of just what to do" to understand seemingly arbitrary limitations and performance gaps in the cloud, Liu said. Users may not know where or when they'll hit a ceiling or experience lag, which makes it very hard to plan for problems. Liu said her research was not conclusive; it was a "snapshot in time" that would surely change as cloud technologies matured and providers adapted to growing needs. She said her research, expected to be published sometime this fall, wasn't an attempt to "benchmark the cloud," since providers clearly target many different types of use.
"A global benchmark would be pointless if it compared Amazon's server instances against Google's programming platform," she said, but Liu sees a stark need for independent performance numbers that can accurately describe how performance can vary in the cloud. "There will be commercial [benchmark] applications," she said, although she stressed that her research is very basic.
"It has shown up some real issues and complexities around cloud that no one has been talking about," said Kevin Francis, a systems architect with Australian firm Object Consulting. He said that this kind of research was necessary to show the limits of the cloud model, instead of how it's normally pitched. He added that research such as Liu's was only the tip of the iceberg in developing objective ways to plan for and use cloud computing. "It's showing up this idea that [cloud] is a wondrous thing you can throw anything at and it doesn't matter," he said.
Cloud users work around the weaknesses
Others see cloud users already developing ways to cope with these built-in weaknesses. Forrester infrastructure analyst James Staten said that Amazon and other public cloud resources were clearly developed with cost rather than performance in mind, but developers had adapted to uncertain conditions by using one of the cloud's notable features, instant-on self-service.
"There's a mantra that [cloud developers have] -- launch four times, keep the fastest," he said. Developers expect uncertainty, he said, so they'll fire up an array of virtual instances, figure out which one is the best performer and then shut down the rest, a 'shotgun' approach that could cost less than a dollar. This has implications for the growing private cloud market, he said.
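The "launch four times, keep the fastest" mantra Staten describes reduces to a small loop: start several instances, benchmark each, keep the winner, and terminate the rest. The sketch below is a hypothetical outline of that shotgun approach; the launch, benchmark, and terminate hooks stand in for whatever a real cloud SDK would provide.

```python
def keep_fastest(launch, benchmark, terminate, n=4):
    """Shotgun provisioning: launch n instances, score each with
    `benchmark` (lower is better, e.g. measured latency), terminate
    the losers, and return the best performer. The three callables
    are placeholders for provider-specific operations."""
    instances = [launch() for _ in range(n)]
    ranked = sorted(instances, key=benchmark)
    winner, losers = ranked[0], ranked[1:]
    for instance in losers:
        terminate(instance)
    return winner
```

Because the losing instances run only long enough to be benchmarked, the extra cost under per-hour metered pricing stays small, which is what makes the under-a-dollar figure plausible.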
Staten said that kind of uncertainty is not attractive to enterprise IT. Performance research like Liu's, he said, joins security and management concerns in adding weight to the case for private cloud development over adoption of public cloud resources.
Carl Brooks is the Technology Writer at SearchCloudComputing.com. Contact him at email@example.com.