Anybody with a need to make data and processing available to a large number of users is a good candidate. Software development organizations will be among the first because they have a vested interest in making it work, but they are only marginally more likely than other organizations to adopt it. At this stage, it looks like most early adopters are putting non-mission-critical, consumer-facing applications on the cloud; CDNs [content delivery networks] are a good example. What needs to happen in cloud computing technologies for large-scale application development projects to thrive there?
Eliminate the buzzwords. "Cloud computing" by itself doesn't mean anything. Establish whether the offering is infrastructural, Software as a Service (SaaS), or Platform as a Service (PaaS). Provide credible SLAs and back them up with data. Underpromise and overdeliver. Provide better monitoring tools (i.e., transparency) for the environments. Amazon Web Services is quite good at that; Google App Engine, on the other hand, is rather opaque, and Salesforce is in between. Make it easy to monitor and manage the environments. Let's say a software development firm wants to run a pilot on the cloud. What are some good options? Should one part of a project be placed on the cloud for starters?
I'd put the entire project on the cloud. That's the only way to find out how viable an option it is. I'd also be the first
Yes, staging and production are where you need maximum scalability, redundancy and availability: the cornerstones of most cloud offerings. That's why I don't think the cloud is as relevant for quality assurance or development; quick turnaround and unfettered access to the system matter more to those groups. Let's look at virtualization and cloud computing. Should a development lab be fully virtualized before development is ported to the cloud?
It shouldn't matter. The applications can largely run in local virtual environments, or in dedicated clones such as Eucalyptus, which replicates the Amazon Web Services APIs. As long as the APIs don't change, and as long as basic application profiling is equivalent, development and QA can happen in a local virtual environment, with only staging and production going to the cloud. Couldn't a development department save enough money by virtualizing a lab to lessen the need for the cloud altogether?
It's hard to tell. Developers still need to have local development environments on their desktops and use a cloud-like environment for integration with other application components. Perhaps the savings kick in when migrating environments to QA and onward.
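The API-compatibility argument above can be sketched in code. This is a minimal, hypothetical Python example, not a description of any specific shop's setup: the hostname `eucalyptus.internal`, the stage names, and the `endpoint_for` helper are all invented for illustration. The idea is that the same EC2-compatible client code targets a local Eucalyptus clone for development and QA, and AWS for staging and production, by switching only the endpoint URL.

```python
# Sketch: route the same EC2-compatible API calls to a local Eucalyptus
# clone for dev/QA and to AWS for staging/production. All endpoint URLs
# and stage names here are illustrative assumptions, not real config.

ENDPOINTS = {
    # Local Eucalyptus clone used by developers and QA (invented hostname).
    "dev": "http://eucalyptus.internal:8773/services/Eucalyptus",
    "qa": "http://eucalyptus.internal:8773/services/Eucalyptus",
    # Real cloud for the stages that need its scalability and redundancy.
    "staging": "https://ec2.amazonaws.com",
    "production": "https://ec2.amazonaws.com",
}


def endpoint_for(stage: str) -> str:
    """Return the EC2-compatible endpoint for a pipeline stage.

    Because the clone mirrors the cloud's APIs, calling code never
    changes between environments; only this lookup does.
    """
    try:
        return ENDPOINTS[stage]
    except KeyError:
        raise ValueError(f"unknown stage: {stage}")


if __name__ == "__main__":
    print(endpoint_for("dev"))
    print(endpoint_for("production"))
```

An EC2 client library would then be pointed at `endpoint_for(current_stage)`; promoting an application from QA to staging changes the target cloud without touching application code, which is the cost-saving path the answer above hints at.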