3 December 2017
Artificial Intelligence is hip. Hype seems more descriptive, but at the very least it’s hip. With so many promising new applications seeing the light of day, I can appreciate why venture capital investors are keen to stay on top of this. A quick Google search suggested that some $15 Bn has been invested in AI over the past 5 years. At the same time, I struggle to think of any killer applications that have wowed me. Why? Maybe because my perspective on AI is a little too “down to earth”…
What is AI, really? The way I see it, data science has put Machine Learning largely in the mainstream, and the next step towards integration in a value chain leads to AI. It is that simple. If an algorithm that can “predict” spelling errors gets integrated into a text processing solution, you have a piece of AI: “auto correct.” Love it or hate it, these efforts are indisputably aimed at making software smarter (as in: more intelligent).
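To make the auto-correct example concrete, here is a minimal sketch of the underlying idea: suggest the most likely known word within one edit of a typo. The tiny word-frequency table is made up for illustration; a real system would learn word frequencies from a large corpus.

```python
# Hypothetical word-frequency table; a real one comes from a corpus.
WORD_FREQ = {"the": 500, "they": 120, "then": 90, "hello": 60, "help": 40}

def edits1(word):
    """All strings one edit (delete, transpose, replace, insert) away."""
    letters = "abcdefghijklmnopqrstuvwxyz"
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    transposes = [a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1]
    replaces = [a + c + b[1:] for a, b in splits if b for c in letters]
    inserts = [a + c + b for a, b in splits for c in letters]
    return set(deletes + transposes + replaces + inserts)

def autocorrect(word):
    """Return the word itself if known, else the most frequent candidate."""
    if word in WORD_FREQ:
        return word
    candidates = edits1(word) & WORD_FREQ.keys()
    return max(candidates, key=WORD_FREQ.get) if candidates else word

print(autocorrect("helo"))  # "hello" wins over "help" by frequency
print(autocorrect("teh"))   # transposition repaired to "the"
```

Embedding something like this behind a keyboard is exactly the “integration into a value chain” step: the model itself is simple, the value comes from wiring it into the workflow.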
When the computer program AlphaGo defeated Lee Sedol (9–15 March 2016), this came as a surprise to the Go community. Even more so because only a year before (October 2015) a previous release had shown great promise (AlphaGo vs Fan Hui), but wasn’t nearly on a par with world-class players. For thousands of years people have been playing and extensively studying this game, yet AlphaGo came up with several unexpected and highly innovative moves. This demonstrates the remarkable promise of “thinking machines”. Unlike some of the silly and stubborn auto-correct jokes that float around the internet 🙂
Integrating machine learning outputs into primary process workflows is anything but new. In one of the (very) early applications of machine learning, retail consumer credit scoring, we have been doing that for decades. I have never heard anyone refer to that as “artificial intelligence”, yet we have known for many years that computers are (much) better at assessing credit risks (underwriting). It has turned that whole industry upside down, albeit quietly.
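A hedged sketch of what that decades-old embedding looks like: a trained credit model is typically reduced to a scorecard (points per applicant attribute), and a cut-off score drives the accept/decline decision in the workflow. All point values and the cut-off below are invented for illustration.

```python
# Hypothetical scorecard: points awarded per applicant attribute.
SCORECARD = {
    "age_over_30": 25,
    "owns_home": 30,
    "years_at_employer_over_2": 20,
    "prior_delinquency": -60,
}
BASE_SCORE = 500
CUTOFF = 540  # accept at or above this score; made-up threshold

def score(applicant: dict) -> int:
    """Sum the scorecard points for the attributes the applicant has."""
    return BASE_SCORE + sum(
        pts for attr, pts in SCORECARD.items() if applicant.get(attr)
    )

def underwrite(applicant: dict) -> str:
    """The embedded step: turn a model score into a workflow decision."""
    return "accept" if score(applicant) >= CUTOFF else "decline"

applicant = {"age_over_30": True, "owns_home": True, "prior_delinquency": False}
print(score(applicant), underwrite(applicant))
```

The interesting part is not the arithmetic but the plumbing: once the decision rule sits inside the underwriting process, no human needs to look at most applications.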
What we do when we embed a trained (!) algorithm in our work process is fairly advanced, yet at the same time pretty mundane. You remove friction from deployment to streamline processing. That’s it. No more, no less. Maybe I am jaded and old, but I struggle to justify artificially inflating it to anything else…