Developers who build artificial intelligence applications have reaped the benefits of improved machine learning tools and souped-up hardware performance.
Advancements in graphics processors and AI accelerators have contributed to generational advancements in AI application development, said AI experts on a panel at IBM's Index Developer Conference last week.
New AI frameworks such as TensorFlow, the open source library for data flow programming developed by Google, have lowered the barriers for developers to create AI applications, said Alex Smola, director of machine learning and deep learning at AWS.
"Things I would give my Ph.D. students in 2015 to solve I can now do with a few lines of code," he said.
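As a rough illustration of that "few lines of code" claim, a small neural network classifier can now be defined, trained and used in a handful of lines with TensorFlow's Keras API. This is a minimal sketch, assuming TensorFlow 2.x is installed; the synthetic data and layer sizes are illustrative choices, not anything from the panel.

```python
import numpy as np
import tensorflow as tf

# Tiny synthetic dataset standing in for real training data:
# label is 1 when the feature sum exceeds 2, else 0.
x = np.random.rand(100, 4).astype("float32")
y = (x.sum(axis=1) > 2).astype("int32")

# Define, compile and train a small classifier in a few lines.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=5, verbose=0)

preds = model.predict(x, verbose=0)  # one probability per sample
```

The point is not the model itself but the surface area: a task that once meant hand-deriving gradients is now a declarative model definition plus a `fit` call.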
Order-of-magnitude improvements in hardware performance help to power performance-hungry AI applications, said Smola, who previously taught machine learning at Carnegie Mellon University. Companies such as AWS offer ready-made AI components and services, such as speech recognition, image recognition, natural language processing and others, that enable developers to quickly assemble more intelligent applications.
"The tools are bringing AI to normal engineers," added Rajat Monga, principal engineer for TensorFlow at Google.
Facebook also has some homegrown machine learning tools and has built its own hardware to run machine learning apps that screen out offensive images and text, said Index panelist Yuri Smirnoff, a Facebook optimization expert.
Although the leading AI players each have their development toolkits, they also encourage developers to use what is most suitable to them, the AI experts agreed. AWS supports all of the major AI development tools to give developers the choice of any tool they want to use, Smola said.
Google also supports many tools, Monga said. "We're not trying to lock people in."
IBM allows developers to use whatever AI tools they like to build apps on the IBM cloud, but the company also wants to help developers understand which tool or combination of tools is best suited to a particular problem, said Francesca Rossi, a distinguished researcher in AI ethics at IBM's Thomas J. Watson Research Center in Yorktown Heights, N.Y.
What's next for AI development tools
The next generation of AI tools will continue to make developers' lives easier. Almost certainly, they will need to be easy to use on all hardware platforms -- PCs, servers, mobile devices, embedded devices -- and generate models that look much more like programming languages, Smola said.
Deep learning toolkits such as TensorFlow, Apache MXNet, Deeplearning4j, Torch, PyTorch, Microsoft's Cognitive Toolkit, Caffe, Theano and others address the wide range of capabilities for which people want to use AI and machine learning.
AI is still complex because practitioners must understand data models and how to train neural networks. Fortunately, AI tooling has evolved rapidly over just the past five or six years, said Ronald Schmelzer, senior analyst at Cognilytica, an Ellicott City, Md., firm that specializes in AI.
"The market for AI developer and data scientist mindshare and usage is really heating up and very competitive," he said. "This is driving vendors to make these tools a lot more advanced, user-friendly and usable to both sophisticated data scientists and more typical enterprise application developers who aren't AI experts."
The most excitement in the AI tools space right now is around machine learning, with major innovation over the past five to seven years, said Sam Charrington, analyst at CloudPulse Strategies.
"In the good old days, most models were developed in SAS or R and thrown over the wall to developers to code and deploy by hand," he said. Now, much of the exploratory work is done in true programming languages like Python, which greatly simplifies deployment. "There's a ton of work left to do in this space, to be sure, but without a doubt, it's easier than ever before for developers to incorporate machine learning and AI into their applications."
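The deployment shift Charrington describes can be sketched concretely: when the exploratory model is built in Python, the same trained artifact can be serialized and reloaded on the serving side instead of being reimplemented by hand. This is a hedged sketch, assuming scikit-learn is installed; the library choice, synthetic data and use of `pickle` are illustrative assumptions, not details from the article.

```python
import pickle

import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for exploratory training data.
X = np.random.rand(50, 3)
y = (X[:, 0] > 0.5).astype(int)

# The data scientist's exploratory step: train a model in Python.
model = LogisticRegression().fit(X, y)

# The handoff: serialize the trained model instead of a spec document.
blob = pickle.dumps(model)

# The deployment step: the serving side reloads the exact same artifact.
deployed = pickle.loads(blob)
preds = deployed.predict(X)
```

Because both sides speak Python, nothing gets "thrown over the wall" and recoded; in practice teams often swap `pickle` for a format such as ONNX or a model registry, but the shape of the workflow is the same.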