Others say "not so fast." Cheap, on-demand computing power is fine, but cloud providers simply can't be trusted. They point out that information is uniquely valuable, especially to the firms that provide cloud services, and some of it simply should not go outside the walls, especially if the organization has ethical obligations to be independent.
"My first question, instead of why not, is why?" said Yale professor Dr. Michael Fischer.
Earlier this year, Fischer was instrumental in delaying Yale's switch from in-house email systems to Google Apps and Gmail. Among his objections: Google could not guarantee that user data would be safe from government snooping, or even that it would stay in the country.
He pointed out that even though Google offers its Apps service free to educational institutions, the switch would be expensive in the short term. Besides that, the university was perfectly capable of providing email to its students and staff.
"Running email is not rocket science," he said.
Fischer said he's perturbed by the idea that everyone should be rushing to use online services instead of doing it themselves. He doesn't see the necessity in many cases, only the convenience, which comes at a steep price in the loss of control over sensitive personal or business data.
"It's a little bit like my reaction to learning that some people thought they could make money selling [bottled] water," he said.
Fischer said that, fundamentally, an organization has to understand what it is getting into when it outsources an IT service and accept the ramifications of what that may mean. He said he understands the attraction of easy-on, always-there computing power, but that, especially for a modern university, IT services should be delivered in-house first and outsourced only after careful consideration.
Having your cloud and eating it, too
Some have it both ways. Dr. Mladen Vouk at North Carolina State University runs the school's Virtual Computing Lab, a cluster of about 2,000 blade servers that delivers anywhere from 80,000 to 120,000 virtual machine hours per semester in classes, labs and research. It's on-demand and user-controlled, the definition of a cloud environment.
However, the university also uses Google Apps mail service for its students. Vouk said that's because they negotiated long and hard with Google before the search engine giant would agree to park NCSU's email data in a specific location and not move it. Also, state law requires that staff and faculty emails be fully discoverable by regulators, so those services, along with other sensitive communications, will likely never move off campus.
"We do have certain services, especially those that are highly confidential, that will remain in-house," he said.
Vouk said that the ability to run a private cloud gave him comfort in moving a non-essential service like student email off the campus. He said that a big part of the hesitation was the fear of getting too invested in a third party. Cloud providers right now are far too disparate in practices and policies, and many don't come close to satisfying organizational concerns about compliance, security or even basic trust.
"I want to be able to pick up my toys and leave if I don't like your cloud. That is not easy to do these days," he said.
Like the majority of computer researchers, Vouk is a strong proponent of open standards and open software, and NCSU's cloud platform is maintained by the Apache Software Foundation as an open project.
Is federal regulation needed in the cloud?
Vouk said that cloud providers need to understand exactly how gun-shy large organizations can be about their data before providers will see widespread adoption. He said that even if a service is cheaper or better than what an enterprise has, it will not be considered until it can be relied on, and promises, even contracts, aren't good enough.
"I suspect there's going to be a need for federal regulation," said Vouk.
He cited recent moves by the FCC to partially regulate ISPs as communications providers, something that might prevent them from interfering in competition among cloud service providers. He added that the cloud market needed to be regulated more like the financial industry, which suffered a series of catastrophic crashes before being regulated in the wake of the Great Depression. Current events notwithstanding, Vouk said enterprises simply won't take a provider's word on it.
He said it may take a major cloud disaster to provoke such regulation, or it may never happen, and the utility model of IT services may never come to pass. Businesses will continue the trend of turning their own data centers into cloud-like environments, and the public cloud will stay the province of small businesses, individuals and applications uniquely suited to a Web-based, low-security model.
It's not an insurmountable challenge, said Vouk.
"We are all 'locked in' to power plants, we are all buying Internet from somebody," he said. "It's not like when you go outside you're on DARPANET anymore."
It's just that, unlike a bank collapse, a blackout or a telecom outage, nobody understands what will happen with a catastrophic failure at Amazon or Google, and that's the rub. "We really don't have the legal framework or, how shall I put it, the evidence to show that these things won't go wrong."
Vouk agreed in spirit with Yale's Fischer, who says that at some point in the future there might be truly private, anonymous and self-service commodity compute processing and data storage, but it's not here yet.
Until then, large organizations, or institutions with special ethical or legal considerations, are waiting for better assurances and better applications of public cloud before they'll move to a true utility model, and they'll happily wait until doomsday.
Carl Brooks is the Technology Writer at SearchCloudComputing.com. Contact him at firstname.lastname@example.org.