Rigid rules-based analytics falls short with unstructured data arriving from multiple sources. Cognition allows analytics to flex and consider a wider range of possibilities.
First there was analytics; then there was cognitive computing. Put them in a blender and the result is something completely new: cognitive analytics.
"We see cognitive analytics as the way in which the human brain approaches a problem," says Stuart Gillen, director of business development at SparkCognition Inc., an Austin, Texas, company whose artificial intelligence products enhance cybersecurity and use machine learning to predict equipment failures before they happen. "Rather than being focused on one particular technique, where we see a lot of artificial intelligence organizations going, we use a variety of different patterns and learn from them."
Developers within corporate IT will play a key role in integrating cognitive analytics services with existing data stores and incoming data streams.
Traditional analytics is built upon a rigid rules-based approach. Cognitive analytics, by contrast, attempts to assimilate collected data and metrics using an analytics engine that flexes, or adjusts, in response to changing conditions. Just as the human brain does not look at one isolated aspect of any situation, cognitive analytics considers everything within view and context to come up with the best answer to a question or propose the most advantageous solution to a problem, Gillen says.
SparkCognition applies its analytics technology to its SparkSecure security offering, treating IP addresses or network ports as if they were internet of things sensors. The company provides cybersecurity to customers in the retail, banking and power-generation industries. The same underlying technology drives the company's other product, SparkPredict, which is geared toward the predictive maintenance industry, he says.
Citing electricity-generating wind turbines as an example, Gillen says SparkCognition can predict failures, allowing utilities to save money by servicing several of the giant towers as a group, rather than responding to individual failures in hundreds of towers spread across thousands of square miles. With turbine tower heights now exceeding 250 feet plus individual rotor blade lengths of nearly 150 feet, the cost of a crane can approach a million dollars per day. If the analytics indicate that five turbines are likely to fail within a month, a utility can send the crane and crew out once to service them all, avoiding millions of dollars in wasteful spending. Planned group servicing is far more economical than reactive repair, he says.
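The economics Gillen describes can be sketched with simple arithmetic. The roughly $1 million-per-day crane cost comes from the article; the number of at-risk turbines and the days per crane mobilization are assumptions for illustration.

```python
# Illustrative comparison of reactive vs. planned turbine servicing.
# CRANE_COST_PER_DAY reflects the ~$1M/day figure cited in the article;
# the other numbers are assumed for the sake of the example.
CRANE_COST_PER_DAY = 1_000_000
DAYS_PER_MOBILIZATION = 2   # assumed: transport, setup and teardown
at_risk_turbines = 5        # predicted to fail within the month

# Reactive: a separate crane mobilization for each individual failure.
reactive_cost = at_risk_turbines * DAYS_PER_MOBILIZATION * CRANE_COST_PER_DAY

# Planned: one mobilization covers the whole group, with one extra
# on-site day assumed to service all five towers.
planned_cost = (DAYS_PER_MOBILIZATION + 1) * CRANE_COST_PER_DAY

print(f"reactive: ${reactive_cost:,}")              # $10,000,000
print(f"planned:  ${planned_cost:,}")               # $3,000,000
print(f"saved:    ${reactive_cost - planned_cost:,}")
```

Even with generous assumptions about setup time, one planned trip in place of five reactive ones saves millions, which is the point Gillen is making.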
Data uniqueness is a myth
One reason cognition plays a role in modern analytics is that data is not structurally static as it was in decades past. "Data is continuously changing and being modified," Gillen says. As an example, he notes that the same model of gas turbine operated in California and in Alaska will generate vastly different performance metrics, which, in turn, dictate different approaches to maintenance. "Taking a strict, structured approach is not going to work," he says. The melding of machine learning, cognitive computing and artificial intelligence represents the next revolution.
Similarly, while it's something of a tradition for companies in any industry to say that their operations and information flows are distinctive compared with competitors, it just isn't so, says Gillen. "Everybody thinks their data is unique," he says. "I think a lot of processes are unique, but the data and patterns we are finding are not unique." For that reason, SparkCognition is able to apply the same cognitive analytics to a pump and interpret the data properly, regardless of where it's installed -- working miles below ground in a mine, in an oil field or in a nuclear power plant. "A piece of data is a piece of data, and we're looking for patterns," Gillen says.
Discovering hidden data patterns
Cognitive analytics excels at finding patterns in data that users might not have been searching for in the first place. These latent patterns can turn out to be crucial.
Returning to his example of a company that has been manufacturing pumps for more than a century, Gillen says SparkCognition was handed a mass of accrued data which the manufacturer believed contained five failure patterns. SparkCognition's analytics quickly found those five failure patterns, and it also discovered two additional "modes of failure" within the data that the manufacturer had not seen. The key, Gillen says, was allowing the cognitive analytics to approach the data without preconceptions about what it should contain.
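The idea of finding more failure modes than the labels suggest can be illustrated with a toy sketch. This is not SparkCognition's method; it simply groups hypothetical pump sensor snapshots by a coarse signature and shows the raw data splitting into more patterns than the manufacturer's five labels.

```python
from collections import defaultdict

# Hypothetical (vibration, temperature) snapshots taken before pump
# failures, tagged with the manufacturer's five historical labels.
# All values and labels are invented for illustration.
snapshots = [
    ("bearing_wear",   (9.1, 72)),  ("bearing_wear",   (9.3, 70)),
    ("seal_leak",      (2.2, 95)),  ("seal_leak",      (2.4, 97)),
    ("cavitation",     (7.8, 55)),  ("cavitation",     (7.9, 54)),
    ("impeller_crack", (5.5, 60)),  ("impeller_crack", (5.7, 61)),
    ("overheating",    (3.0, 122)), ("overheating",    (3.1, 125)),
    # Signatures the historical labels lump in with other modes:
    ("bearing_wear",   (9.2, 110)), ("bearing_wear",   (9.0, 112)),
    ("seal_leak",      (6.0, 96)),  ("seal_leak",      (6.2, 94)),
]

def signature(vibration, temperature):
    # Quantize readings into coarse buckets. A real system would use
    # proper clustering, not a fixed grid; this keeps the sketch simple.
    return (int(vibration // 2), int(temperature // 20))

clusters = defaultdict(list)
for label, (v, t) in snapshots:
    clusters[signature(v, t)].append(label)

labeled_modes = {label for label, _ in snapshots}
print(f"labeled failure modes: {len(labeled_modes)}")   # 5
print(f"distinct data patterns: {len(clusters)}")       # 7
```

The two extra clusters correspond to the "modes of failure" the labels never distinguished, which is the kind of latent pattern Gillen describes.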
A key role for developers
"Everybody today talks about open SDKs (software development kits) and open APIs. We're no different," Gillen says. The nature of SparkCognition's technology is that it can be everywhere "except the last mile," he says. Through its SDK and APIs, the final 20% of a typical implementation -- whether on premises or in the cloud -- falls to developers, who gather results and present them through a user experience such as a dashboard, business intelligence system or custom application. "It is up to developers … to take results and present them in a meaningful way within their organizations," he says.
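That "last mile" might look something like the following sketch: pulling scored results from an analytics API and shaping them for a dashboard. The JSON payload, field names and thresholds here are hypothetical, not SparkCognition's actual API.

```python
import json

# Hypothetical response from a predictive-maintenance API; the schema
# is invented for illustration.
payload = json.loads("""
{
  "assets": [
    {"id": "turbine-014", "failure_probability": 0.87, "window_days": 21},
    {"id": "turbine-102", "failure_probability": 0.12, "window_days": 90},
    {"id": "pump-007",    "failure_probability": 0.64, "window_days": 30}
  ]
}
""")

# The developer's job: turn raw scores into something actionable,
# e.g. a ranked at-risk list for a dashboard or BI feed. The 0.5
# alert threshold is an assumed business rule.
at_risk = sorted(
    (a for a in payload["assets"] if a["failure_probability"] >= 0.5),
    key=lambda a: a["failure_probability"],
    reverse=True,
)

for asset in at_risk:
    print(f"{asset['id']}: {asset['failure_probability']:.0%} "
          f"risk within {asset['window_days']} days")
```

The analytics engine supplies the scores; deciding the threshold, the ranking and the presentation is exactly the 20% Gillen says falls to the organization's own developers.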
As uses for cognitive analytics continue to broaden -- now extending to the realm of driverless cars -- the handling of data and how developers apply the results of analysis will continue to present new opportunities.