Virtualization to cloud: Planning, executing a private cloud migration
A comprehensive collection of articles, videos and more, hand-picked by our editors
Creating an internal or private cloud that gives IT managers the benefits of a cloud running inside the data center isn't easy. IT pros will have to build it and glue it together themselves, as no single vendor currently provides all of the pieces needed. Some packaged software is beginning to appear to help create private clouds, such as Ubuntu Enterprise Cloud (UEC), but it's limited to Linux-based clouds.
What is a private cloud, anyway?
A private cloud resides inside your data center (on-premises), giving IT managers complete control over the available resources. A typical private cloud relies on the security measures available within the cloud software and the data center. It automates workflows and eliminates manual tasks such as configuring routers and load balancers or setting up firewall rules, steps that many enterprises with virtualized servers still perform by hand. Enterprises often have a firewall specialist, a router specialist, a load-balancing expert, a storage expert, someone responsible for the operating systems and more.
Chris Swan, CTO of Capital SCF, says that cloud computing goes well beyond server virtualization by offering customers more options for increasing data center flexibility and reducing costs. There is a big difference between setting up VMware ESX clusters in a data center for server virtualization and implementing an internal cloud.
For private clouds to be useful and continue in the direction set by public clouds, IT pros need to automate as many manual tasks as possible so they can bring up capacity on the fly and release it when it's not needed. This will require an inventory management system that does not exist commercially today: one that can track all IT space, the operating system running on each machine, how many physical devices you have, how much excess capacity is available, trigger points for adding disk space on storage area networks (SANs), the software installed on each machine and more.
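As a concrete illustration, such an inventory record with a SAN trigger point might look like the following minimal sketch. All names, the 80% threshold and the sample machines are invented for illustration, not taken from any real product:

```python
from dataclasses import dataclass

@dataclass
class Machine:
    """One entry in a hypothetical cloud inventory system."""
    name: str
    os: str
    disk_total_gb: int
    disk_used_gb: int
    software: tuple  # packages installed on the machine

# Assumed trigger point: request more SAN space at 80% disk utilization.
SAN_TRIGGER = 0.80

def san_expansion_candidates(inventory):
    """Return names of machines whose disk use has crossed the trigger point."""
    return [m.name for m in inventory
            if m.disk_used_gb / m.disk_total_gb >= SAN_TRIGGER]

inventory = [
    Machine("db01", "RHEL 5.3", 500, 420, ("mysql",)),   # 84% full
    Machine("web01", "RHEL 5.3", 250, 100, ("httpd",)),  # 40% full
]
candidates = san_expansion_candidates(inventory)  # flags "db01"
```

A production system would of course persist this data and feed the trigger into an automated provisioning workflow rather than a simple list.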
It also requires orchestration to ensure that tasks are executed in the right order. When a user asks for a server with one CPU, one GB of RAM, a 250 GB hard drive and Red Hat Enterprise Linux (RHEL) 5.3, the cloud software layer has to go out and grab an IP address, set up a virtual local area network (VLAN), put the server in the load balancing queue, put the server in the firewall rule set for the IP address, load the correct version of RHEL, patch the server software when needed and place the server into the nightly backup queue.
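The provisioning sequence described above can be sketched as an ordered workflow. This is a toy illustration with stubbed-out steps; the function names and the Server record are invented, and a real cloud layer would call network, storage and configuration APIs at each step:

```python
from dataclasses import dataclass, field

@dataclass
class Server:
    """Hypothetical record of a server being provisioned."""
    ip: str = ""
    os_image: str = ""
    steps_done: list = field(default_factory=list)

# Stubbed steps; each real step would call a network/storage/config API.
def allocate_ip(s):          s.ip = "10.0.0.42"; s.steps_done.append("ip")
def create_vlan(s):          s.steps_done.append("vlan")
def add_to_load_balancer(s): s.steps_done.append("load_balancer")
def add_firewall_rule(s):    s.steps_done.append("firewall")
def install_os(s, image):    s.os_image = image; s.steps_done.append("os")
def patch_os(s):             s.steps_done.append("patch")
def schedule_backup(s):      s.steps_done.append("backup")

def provision(os_image):
    """Run the workflow steps in the required order."""
    s = Server()
    allocate_ip(s)            # grab an IP address
    create_vlan(s)            # set up a VLAN
    add_to_load_balancer(s)   # put the server in the load-balancing queue
    add_firewall_rule(s)      # add a firewall rule for the IP address
    install_os(s, os_image)   # load the requested OS image
    patch_os(s)               # patch the server software
    schedule_backup(s)        # place the server in the nightly backup queue
    return s

server = provision("RHEL 5.3")
```

The point of the orchestration layer is exactly this ordering: the VLAN cannot be created before the IP is allocated, and the backup job is meaningless until the OS is installed.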
Automation beyond our current means
This type of automation replaces hundreds of discrete tasks normally done by hand by specialists. Cloud vendors are investing in software layers that complete in hours tasks that used to take many days. None of the current external cloud providers, however, comes close to providing this level of automation.
The data center staff will have to create the automation layer for their internal cloud because no current vendor provides a complete software layer; the staff will essentially have to buy the pieces and put them together. Jeff Deacon says that Verizon Business uses a combination of Cisco software on the front end, HP hardware for compute power and Opsware software to create its cloud environment. Deacon also says, however, that building a real production-quality private cloud costs a great deal of money and time.
IT managers will have to meld together tools to manage private clouds and other resources in a data center. The tools to manage both physical and virtual resources have been slow to emerge. Even though server virtualization is growing rapidly, the management tool industry is playing catch-up.
There are no system management tools to seamlessly manage a mixed environment that incorporates existing data centers and cloud computing. System management tools were developed at a time when hardware was expensive and difficult to replace. Clouds are designed based on very different assumptions: hardware is cheap and hardware will fail. Cloud providers also build in redundancy. This requires a different management philosophy, one that public cloud providers such as Amazon and Google have adopted.
As you might expect, not everyone believes that applications should be virtualized in your own data center using server virtualization or run on private clouds. According to Hylton van Zyl, R&D specialist at Credit Suisse, IT managers should consider using Software as a Service (SaaS) as a delivery model for their applications. He says that SaaS providers, such as Salesforce.com, will likely provide good security because they own the entire software stack.
Private clouds are less risky but not perfect
Public clouds and externally hosted private clouds carry a number of barriers and risks that give many IT managers pause. Some of the risks that surround public clouds also apply to private clouds, but the degree of risk is generally lower. In some cases, such as control of resources and security, the risks are much lower.
Cloud security is always a top risk when using external clouds. This is one of the major reasons that IT managers favor private clouds over public clouds for many of their applications. Chris Hoff, a well-known security guru at Cisco, says that security is just one of several issues with public clouds. He says that on Amazon EC2 you have virtual machines, virtual appliances, and Amazon Machine Images (AMIs) running on hundreds or even thousands of servers. You have no idea what is in these images, who built them and where they came from. With private clouds, though, IT managers have the control that public clouds are missing.
While the degree of risk can be much lower with private clouds, compliance and regulatory issues, software licensing, availability, scalability, service-level agreements (SLAs) and the effects of new technology such as server virtualization on IT workers and their jobs remain issues with private clouds.
One of the issues with public clouds is that cloud providers oversubscribe, and oversubscribing can lead to the equivalent of downtime, a loss of availability, for unlucky users. Cloud providers occasionally move workloads around because too many users are sharing the same resources at one time and performance suffers. These issues are less likely to happen with private clouds, but they can still occur without proper management tools.
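The oversubscription arithmetic is easy to illustrate. In this sketch the core count, tenant numbers and 80% concurrency figure are all invented for illustration:

```python
# A host with 16 physical cores sells 24 vCPUs across six tenants,
# betting that not all tenants will peak at once.
PHYSICAL_CORES = 16
vcpus_sold = [4, 4, 4, 4, 4, 4]

ratio = sum(vcpus_sold) / PHYSICAL_CORES  # 1.5x oversubscribed

# If enough tenants get busy at the same time, demand exceeds supply,
# performance suffers, and the provider must migrate workloads elsewhere.
demanded_cores = sum(v * 0.8 for v in vcpus_sold)  # assume 80% concurrent load
must_migrate = demanded_cores > PHYSICAL_CORES
```

A private cloud can make the same bet deliberately, but only if its management tools can see when demand approaches the physical limit.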
Adhering to cloud standards is important for private cloud users
Private cloud users must work with standards organizations to ensure that their clouds are tracking the important standards when they appear. The reason standards are so important is that users will eventually want to move applications from cloud to cloud, and this is very difficult without standard interfaces.
Amazon's EC2 interface is currently as close to a standard as we have; this is the reason Canonical adopted it for UEC. Red Hat's Deltacloud open source project is an effort to make it easier for a data center to work with different cloud providers through a single interface portal, using drivers to interface with each provider's application programming interface (API).
Given the near-absence of cloud interface standards, companies like Johnson & Johnson are working with RightScale to develop an abstraction layer for Amazon that will make it easier to deploy and move applications onto clouds. Management software that creates an abstraction layer, such as RightScale's, will serve as a stopgap until real cloud standards are created and adopted.
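The driver-based abstraction that Deltacloud and tools like RightScale aim for can be sketched as a single portal dispatching to per-provider drivers. All class and method names here are invented for illustration; real Deltacloud drivers speak each provider's actual API:

```python
class CloudDriver:
    """Common interface implemented once per cloud provider."""
    def start_instance(self, image_id: str) -> str:
        raise NotImplementedError

class FakeEC2Driver(CloudDriver):
    def start_instance(self, image_id):
        # A real driver would call Amazon's EC2 API here.
        return f"ec2-instance:{image_id}"

class FakePrivateCloudDriver(CloudDriver):
    def start_instance(self, image_id):
        # A real driver would call the internal cloud's API here.
        return f"private-vm:{image_id}"

class CloudPortal:
    """Single interface portal: applications call the portal, never a provider API."""
    def __init__(self):
        self._drivers = {}

    def register(self, name, driver):
        self._drivers[name] = driver

    def start(self, cloud, image_id):
        return self._drivers[cloud].start_instance(image_id)

portal = CloudPortal()
portal.register("ec2", FakeEC2Driver())
portal.register("internal", FakePrivateCloudDriver())
instance = portal.start("ec2", "ami-1234")
```

Because applications only ever talk to the portal, moving a workload from one cloud to another becomes a matter of changing which driver is registered, which is precisely why standard interfaces matter so much.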
About the author:
Bill Claybrook is a marketing research analyst with more than 30 years of experience in the computer industry, with the last 10 years in Linux and open source. From 1999 to 2004, Bill was Research Director, Linux and Open Source, at Aberdeen Group in Boston. He resigned his competitive analyst/Linux product marketing position at Novell in June 2009 after spending over four and a half years at the company. He is now president of New River Marketing Research in Concord, Mass. He holds a doctorate in computer science.