Artificial intelligence (AI) is often described as a major breakthrough that demands attention, with all companies encouraged to have an “AI strategy.” The AI technology producing the results that fuel this perspective is machine learning with deep neural networks (“Deep Learning”). Deep Learning has indeed provided significant breakthroughs, including improved speech recognition and natural language understanding in digital assistants such as Apple’s Siri, Google Assistant, and Amazon Alexa. Deep Learning has also enabled more specific enterprise applications, such as reducing energy consumption in Google’s data centers or helping Facebook remove objectionable content more quickly.
What is Deep Learning?
But Deep Learning is not based on new inventions. The core technology has been around for decades. For example, the “backpropagation” technique used to fit a deep neural network to a dataset was developed by Rumelhart, Hinton, and Williams in 1986. A 729-page book from MIT Press, Neurocomputing: Foundations of Research, published in 1988, reprints 43 articles on the subject. As the power of Deep Learning became evident, researchers naturally made substantial methodological improvements, but no one claims to have recently “discovered” the core technology.
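For readers unfamiliar with the technique, the following is a minimal sketch of backpropagation, not the original authors’ code: a tiny two-layer network is fitted to the XOR function by repeatedly propagating the output error backward and adjusting the weights by gradient descent. The network size, learning rate, and iteration count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

# Randomly initialized weights for a 2-4-1 network (illustrative sizes).
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0  # learning rate
for step in range(10000):
    # Forward pass: compute the network's current predictions.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the squared-error gradient toward the inputs.
    err_out = (out - y) * out * (1 - out)
    err_h = (err_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates to weights and biases.
    W2 -= lr * h.T @ err_out
    b2 -= lr * err_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ err_h
    b1 -= lr * err_h.sum(axis=0, keepdims=True)

# Typically approaches [[0], [1], [1], [0]] after training.
print(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).round(2))
```

Even this toy example makes the 1972-era cost problem plausible: the method is nothing but many repeated passes of arithmetic over the data, so its practicality depends almost entirely on how cheap that arithmetic is.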
If the methodology is not new, what has caused AI to have such an impact? The answer is simple: the increase in computing power and memory has crossed a threshold that makes these methods practical. When I wrote a book on machine learning for computer pattern recognition in 1972, computing power was about a billion times more expensive than it is today. When a company I founded applied the technology to speech recognition development a decade later, it was typical to run a machine learning analysis for several months before it converged. This limitation led to what was called the “AI winter,” during which research on the core technology was discredited.
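To make the scale of that change concrete, here is a back-of-the-envelope illustration of my own, taking the “about a billion times more expensive” figure at face value; the $100 starting point is purely hypothetical.

```python
# Hypothetical: $100 of compute purchased today, scaled by the article's
# rough billion-fold cost ratio to estimate the 1972 price.
cost_today_dollars = 100.0
cost_ratio_1972 = 1e9
print(f"${cost_today_dollars * cost_ratio_1972:,.0f}")  # ~$100,000,000,000 in 1972
```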
A Technical Prerequisite
Part of the growth in computing power has come from the improvement of computer chips described by Moore’s Law, with the number of transistors on a chip doubling approximately every two years. The cost of an hour of computation has also decreased at about the same rate.
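The arithmetic behind that statement compounds quickly. The sketch below, my illustration rather than anything from the article, assumes one doubling every two years and shows the resulting growth factor over a few time spans.

```python
def moores_law_factor(years, doubling_period_years=2.0):
    """Growth factor after `years`, assuming one doubling per period."""
    return 2 ** (years / doubling_period_years)

# Transistor counts grow by this factor; cost per computation falls by
# roughly the inverse factor, if cost tracks density as assumed above.
for years in (10, 20, 40):
    factor = moores_law_factor(years)
    print(f"{years} years: ~{factor:,.0f}x the transistors, "
          f"~1/{factor:,.0f} the cost per computation")
```

Over ten years the factor is about 32; over forty years it exceeds a million, which is why thresholds of practicality keep being crossed.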
Deep Learning is just one technology that the increase in computing power has enabled over time. Smartphones, which have had a significant impact on our lives, are another obvious example.
What Limits Growth?
Will the growth of computing power continue, or is it reaching the limits of Moore’s Law? Even if chip improvement slows, other trends may accelerate the growth of affordable computing power. A long-term prospect is quantum computing, but shorter-term trends also drive improvement. One is the growth of cloud computing, where computing power can be rented rather than requiring costly investment in a server farm. Another is the incorporation of specialized chips, such as graphics processing units, into data centers, providing parallel computing for specialized tasks such as Deep Learning. By performing many operations simultaneously, parallel computing significantly accelerates suitable tasks.
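How much acceleration parallel hardware delivers depends on how much of a task can actually run in parallel. Amdahl’s law, a standard formula not discussed in the article, gives a rough bound; the sketch below assumes, purely for illustration, that 95 percent of the work parallelizes, which is in the spirit of the matrix-heavy arithmetic of Deep Learning.

```python
def amdahl_speedup(p, n_processors):
    """Overall speedup with n processors when fraction p of the task parallelizes."""
    return 1.0 / ((1.0 - p) + p / n_processors)

# Hypothetical workload: 95% of the work can run in parallel.
for n in (1, 8, 64, 1024):
    print(f"{n:5d} processors -> {amdahl_speedup(0.95, n):5.1f}x speedup")
```

The returns diminish as processors are added, but for workloads dominated by parallelizable arithmetic the gains are still large, which is why GPUs suit Deep Learning so well.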
Moreover, devices connecting cloud computing to individuals, such as smartphones and cars, gain more computing power with each new model. This further increases the total computing power available.
A Strong Trend for AI
AI is nothing new; it is just a reflection of the increase in “computational intelligence,” as I characterized it in my recent book. If the impact of AI is simply an example of the exponential growth of computing power over time, we can expect future breakthroughs as computing power crosses new thresholds, perhaps crossing them even more rapidly than in the past.
Continuing today’s trends, for example, will lead to increasingly personalized digital assistants that are easy to use and provide growing amounts of information and services. Connecting with computers will become more and more like a human conversation. Children who grow up with a digital assistant constantly at hand through smartphones or smartwatches will find that such “augmented intelligence” becomes almost a part of being human.
Artificial intelligence is just one category of applications symptomatic of what is to come, driven by the long-term trend of increasing computing power. More generally, computational intelligence will increasingly impact our lives in surprising ways.