Google is starting to piece together a cloud platform strategy beyond just being a lower-cost option than the competition, but it's a considerable risk, banking on a set of services many enterprises likely won't use for years to come.
Machine learning and deep analytics are the latest trends to gain attention in the cloud market, and Google has latched on wholeheartedly -- it sees these tools as a way to differentiate itself in the market, by externalizing the technology that has made it one of the largest corporations in the world.
The most full-throated endorsement of this strategy emerged at the GCP Next user conference back in March. Eric Schmidt, chairman of Google parent company Alphabet Inc., talked in broad strokes about creating an internet operating system, adding that in five years, every major IPO will be for companies using machine learning.
"The platform is not the end; it's the bottom, and above it is machine learning," Schmidt said.
Google also released several machine learning services for image and speech recognition and translation. And recently, the firm said it's been running specialized Tensor Processing Units tailored for machine learning inside its own data centers for the past year.
TensorFlow, which was open-sourced in 2015 as a machine learning framework, follows a similar route as Kubernetes, the container orchestration tool based on internal Google technology that was open-sourced in 2014, just as Docker and Linux containers were taking off. Kubernetes has become one of the go-to tools for running containers at scale, and Google hopes TensorFlow can do the same for machine learning, while luring in customers with higher-level services.
Focusing on machine learning won't bring the meat-and-potatoes workloads onto its platform, but it could position Google well for the future, said Andrew Reichman, research director for cloud data at 451 Research.
"It's less about the stuff you do today and more about the stuff you're going to do tomorrow," Reichman said. "Analyzing web traffic and understanding sensor data and stuff outside the data center is going to reap huge amounts of IT spending in the next 20 years, so Google is positioning itself around its strengths."
That will require a holistic approach, because users will want all that underlying data on the platform they use to run analytics, if and when these newer types of applications eventually come into vogue. If Google is too far ahead of the market and doesn't do enough to get enterprise workloads on the platform now, it runs the risk of losing out, because potential customers will be locked in to other platforms.
"It's risky because it has a lot of dependencies and lots of bridges that need to be built, and it's unclear that enterprises are going to spend money or plan products around it when it's so nascent," Reichman said.
Google is in a tenuous situation. Amazon has the early lead and tremendous pace of innovation, while IBM and Microsoft have deep roots with enterprise customers and know how to work on a broad range of workloads at scale, Reichman said. If those companies are viewed as the bookends of the public cloud market, Google basically has neither of those things going for it.
Google has emerged as a hyperscale leader in the market and is expected to be among a group that takes a larger share in the future, though its revenue falls well below Amazon Web Services (AWS), which commands close to 40% of the total infrastructure-as-a-service and platform-as-a-service market, according to 451 Research's Market Monitor Cloud Computing service.
Price has been Google's biggest selling point to date. The strategy played a role in the tit-for-tat price cuts bandied about by Google, Amazon and Microsoft in 2014 and 2015. That never seemed to help the company improve its traction in the market, however, as enterprise adoption lagged and Azure settled comfortably into the No. 2 role behind AWS. In fact, the conversation in 2016 largely has shifted to which platform provides the best means to utilize the more expensive, higher-level services.
Big data, big draw
Google Cloud Platform customers are quick to point to big data services as the major draw to the platform, and the first item they often list is data warehouse service BigQuery. Some industry observers don't see BigQuery as a differentiator for Google, but users say it saves them huge amounts of time and money.
Kabam Inc., a San Francisco-based entertainment company that makes video games for mobile devices, shifted to BigQuery over the past 18 months to address scale and performance issues after launching Marvel: Contest of Champions, which was generating a terabyte of data a day. Kabam had been relying on internal services and Amazon Redshift, but Redshift had limitations around scaling storage and CPU, said CTO Jeff Howell.
"We've been down that path, and it's just a night and day difference" between Redshift and BigQuery, Howell said.
Kabam also has machine learning running behind the scenes in many of its games, for predictive analytics based on how players interact with them, and it plans to expand those capabilities and the number of games on the platform as it invests more in data science.
"[Google is] solving the big problems that you don't want to solve, and they're managing these solutions, which is a lot different than someone hosting a database on hardware and you manage scalability and maintenance," Howell said.
Another Google Cloud Platform convert is WinField Solutions LLC, an arm of Land O'Lakes Inc. that provides IT services to the food industry. WinField developed a data management system for farmers, called Data Silo. When the company researched where to build the system, there were certainly benefits to moving to AWS, particularly around the breadth of services available, said Teddy Bekele, vice president of IT. WinField also uses Azure for a variety of services, but it ultimately opted to build Data Silo on Google Cloud Platform because of mapping capabilities tied to Google Maps.
WinField also uses Google Cloud Storage and a PostgreSQL database. And while Data Silo has been built so it can be lifted off Google's cloud if needed, the company eventually wants to use Google's machine learning and analytics capabilities, too.
"Right now, we are more focused on getting the data into Silo, and then obviously making sure folks can get access to it -- but that's today's problem," Bekele said. "What we really want to get into is the ability to leverage that machine learning capability, whether it's Google BigQuery or Cloud Bigtable -- things that they have available like that to look at the data, and then start looking at patterns both for a single farmer, but also a group of farmers together."
Google's efforts around machine learning are reminiscent of when IBM came out with Watson -- it was interesting, but unclear how it would translate to something most traditional enterprises could use, said Meaghan McGrath, analyst with Technology Business Research Inc., based in Hampton, N.H.
"Machine learning and artificial intelligence are still really new ideas," McGrath said. "For more traditional enterprises, it's more like pie-in-the-sky sort of stuff."
It's going to take someone appealing to IT departments and turning them into advocates, because at this point, most use cases don't apply to traditional enterprises, she added.
SearchCloudComputing site editor Kristin Knapp contributed to this report.
Trevor Jones is a news writer with TechTarget's data center and virtualization media group. Contact him at email@example.com.