
New OpenStack standards boost cloud interoperability

With new federated identity standards, OpenStack is closer to delivering on the promise of being able to scale and switch between cloud vendors.

VANCOUVER -- OpenStack has taken a step to meet its long-held promise of no vendor lock-in amongst its myriad cloud offerings.

The OpenStack Foundation this week added new interoperability and federated identity standards for some of the core technologies, announced at its semiannual conference here. Some of the biggest vendors in the ecosystem support the new criteria, which are seen as an important step in the maturation of the technology.

Internet betting exchange Betfair Group plc in London is in the proof-of-concept phase of a total migration to OpenStack with Red Hat. The company has looked at bursting to the public cloud with Amazon Web Services (AWS), but the federated identity feature could change those plans, said Steven Armstrong, principal DevOps automation engineer for the company.

"That really struck home," Armstrong said. "We could use that so instead of bursting to AWS, I just burst it to another OpenStack cloud."

Sixteen vendors of the open source technology meet the new interoperability requirements, including IBM, HP, Red Hat, Rackspace and Mirantis. Public and private cloud products branded "OpenStack Powered" will have to meet interoperability benchmarks for certain core services, and the test results will be available on the OpenStack Marketplace. The requirements to attain the branding vary, but they generally involve the core technologies of compute, storage and networking.

In addition, 32 vendors support a new federated identity feature for OpenStack Kilo, the latest release of the software. The feature will be available later this year and is intended to enable hybrid or multi-cloud deployments within the OpenStack ecosystem.
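As a rough illustration of the concept -- not the foundation's reference implementation -- Keystone-to-Keystone federation lets a user authenticate against a home cloud and exchange that identity for a token on a partner OpenStack cloud. Below is a minimal sketch assuming the keystoneauth1 Python library's Keystone2Keystone plugin; the URL, credentials, service provider ID and project ID are all placeholders.

from keystoneauth1 import session
from keystoneauth1.identity import v3

# Authenticate against the home cloud's Keystone (the identity provider).
home = v3.Password(auth_url="https://keystone.home.example:5000/v3",
                   username="demo", password="secret",
                   user_domain_id="default",
                   project_name="demo", project_domain_id="default")

# Exchange that identity for a token on a federated partner cloud,
# registered in the home Keystone as service provider "partner-sp".
remote = v3.Keystone2Keystone(home, "partner-sp",
                              project_id="remote-project-id")
sess = session.Session(auth=remote)
print(sess.get_token())  # token usable against the partner cloud's APIs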

Guillaume Aubuchon, CTO of DigitalFilm Tree, a post-production, software and process consulting company in Los Angeles, demonstrated a proof-of-concept implementation of the new standards in action with HP and Blue Box during the keynote here.


The OpenStack standards are an important first step in meeting the goal of interoperability, Aubuchon said. There are still rough edges around authentication, and trust between vendors is always a hurdle, but getting the rest of the way shouldn't take too much longer, he added.

"A large reason why I chose OpenStack to begin with was the idea of federated identity three and a half years ago -- the promise of being able to scale and switch between vendors as necessary," Aubuchon said. "I feel like, as we have the first taste of that, it will come to fruition quickly."

OpenStack has been used primarily for private cloud, but the new identity feature could foster more growth of public cloud platforms because enterprises want hybrid environments, said Donna Scott, vice president and distinguished analyst with Gartner, Inc., based in Stamford, Conn.

"It's a big deal because it would enable you to federate across not just your own OpenStack clouds but public clouds too, and enable that bursting and getting extra capacity," Scott said. "If OpenStack is going to compete in the API-driven, faster software development [world], it needs to do both private and public."

This addition doesn't help with compatibility between OpenStack clouds and the major public clouds such as AWS, but if OpenStack APIs become dominant in the next five years, it's feasible that those public-only cloud vendors could implement a translation layer to support those APIs, Scott said.
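Purely as a hypothetical sketch of what such a translation layer might look like -- no vendor has announced one -- the snippet below accepts an EC2-style RunInstances request and forwards it to an OpenStack cloud via openstacksdk. The flavor mapping and request field names are illustrative assumptions, not any real service's API.

import openstack

# Hypothetical mapping of EC2 instance types to OpenStack flavor names.
FLAVOR_MAP = {"t2.small": "m1.small", "m4.large": "m1.large"}

def run_instances(ec2_request):
    """Translate an EC2-style request dict into an OpenStack server boot."""
    conn = openstack.connect(cloud="openstack")
    flavor = conn.compute.find_flavor(FLAVOR_MAP[ec2_request["InstanceType"]])
    image = conn.compute.find_image(ec2_request["ImageId"])
    return conn.compute.create_server(
        name=ec2_request.get("ClientToken", "translated-instance"),
        image_id=image.id, flavor_id=flavor.id)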

Interoperability has improved since the Icehouse release a year ago and these new OpenStack standards are important, too, Scott added.

"You can't rely on a foundation that isn't strong, reliable and stable," Scott said. "They're doing a better job of certifying vendors so vendors that just pay lip service are out of there."

Providers declared the new standards to be important steps in meeting the mission of the project, but acknowledged the ultimate goal isn't a reality yet.

"There's still a lot more work to be done," said Jonathan Bryce, executive director of the OpenStack Foundation. "We've been taking our time and talking through the process, but we feel very comfortable this is going to help establish an interoperable set of capabilities."

There had been a level of interoperability before this, but there were gaps, Bryce said. The foundation spent more than a year going through the key pieces of functionality to establish a formal set of standards.

"As you go very broad, you risk fragmentation and that's one of those concerns we've heard over time," Bryce said. "[These standards] are preventative against that because it provides a target so everyone knows this is how closely I'm matched up to the OpenStack core."

Trevor Jones is the news writer for SearchCloudComputing. You can reach him at tjones@techtarget.com.
