When we look at history, we often compress a series of events into a larger narrative that looks inevitable in hindsight. It is not novel to note that we’re in the middle of a period of astounding change in human history. During the agricultural revolution, ancient man may have debated the revolutionary types of dances necessary to bring on the rain. During the industrial revolution, the emphasis may at times have been on coal rather than on the larger picture of automation.
If we look down on our current landscape from a sufficiently high altitude, all of our technological progress points toward a future of intelligent machines.
In 1965, a few years after the integrated circuit was invented, Gordon Moore, who would go on to co-found Intel, noted that chips were essentially doubling in capability every couple of years. This exponential progress continued for decades, deep into the new millennium, and not just with processors: computer memory and storage improved at a similar pace.
We’ve since recognized this as a general tendency of technology to improve exponentially over time, rather than with the slower, linear improvement we intuitively expect from the natural world. People use improved technology to create even better technology.
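To make the contrast concrete, here is a rough sketch (with illustrative numbers, not historical data) of how doubling every two years compares with steady linear improvement over the same period:

```python
# Illustrative sketch: exponential doubling vs. linear growth.
# Starting value and rates are made up for illustration, not measured data.
years = 20
doubling_period = 2  # Moore's observation: capability doubles roughly every 2 years

# Exponential: capability multiplies by 2 every doubling period.
exponential = 2 ** (years / doubling_period)

# Linear: capability gains a fixed amount (here, 1 unit) per year.
linear = 1.0 + years

print(f"After {years} years: exponential = {exponential:.0f}x, linear = {linear:.0f}x")
# After 20 years: exponential = 1024x, linear = 21x
```

Twenty years of doubling yields a thousandfold improvement, while the same span of linear growth yields only about twentyfold, which is why exponential trends catch intuition off guard.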
Much of the improvement in processor speeds that Moore observed has slowed or even stopped since the late 2000s, but that amazing advancement in hardware set the stage for the next advances: communication and collaboration.
The Internet and global collaboration
In the 1990s, internet use began to accelerate with the introduction of the World Wide Web. The internet of the 90s and beyond was arguably enabled by hardware innovations from the Moore’s law explosion. Routers and other core networking hardware were finally fast enough to support the masses. Technology to convert cable signals to internet service matured. Desktop computers and display hardware reached the point where ordinary people could share media online.
In addition to the rise of the “sharing economy” in the form of media piracy, the internet also accelerated the pace of discovery and collaboration. Open source software, including the Linux operating system, was itself enabled by the internet. The availability of low-cost, high-quality software would become the underpinning of the world to come, allowing companies to shortcut decades of person-years of development when creating new platforms.
Linux and Darwin, and to an even greater extent, open source web browsers would form the core of the upcoming mobile revolution.
Mobile and cloud computing
In the mid-2000s, the mobile revolution grabbed the baton and started sprinting down the track. As desktop computing waned in popularity, mobile brought ubiquitous computing to everyone, enabled by the communication innovations and open source software made possible by the internet. The explosion in mobile devices brought scaling challenges to service providers on the internet. It is no coincidence that just as billions of devices were coming online, “cloud computing” was beginning to enter the lexicon.
Cloud computing may be a buzzword, but it represented the availability of massive amounts of computing to those who needed it. It accelerated the pace of experimentation in the new world of mobile and internet computing. With the recent availability of cloud GPUs (graphics and AI accelerators) and FPGAs (custom programmable chips), the cloud continues to play a role in bringing exotic and powerful hardware to people on demand.
With effectively unlimited computing available from cloud vendors and billions of devices in users’ hands around the globe, internet providers began to gather incredible volumes of data. Coupled with advanced GPUs, this data let researchers revisit previously discarded artificial intelligence approaches, which suddenly began to work with startling results. The data collected by mobile devices, processed with cloud computing, enabled the next, and most important, revolution: artificial intelligence.
The new artificial intelligence revolution became apparent in 2012, when researchers at Microsoft presented breakthrough speech recognition and translation built on neural networks and deep learning.
Deep learning requires massive amounts of data to train neural networks, and that data was supplied by the billions of users of mobile devices.
In the future, it’s likely that all of these different revolutions will be viewed together as the artificial intelligence revolution, much as laymen now group many distinct events under the “Industrial Revolution.”
Artificial intelligence will be so important moving forward that it will seem as if it must have been the ultimate goal of our technology all along. With recent improvements in speech recognition and understanding, and in computer vision and object recognition, we don’t need any more innovations to transform our world. Machines do not have to think like people do, or exhibit behavior we’d normally associate with consciousness, to change our world. We only require them to “think” about our problems and provide us with solutions, which they’ve already begun to do. If progress in artificial intelligence stopped today, we’d already have the tools necessary to disrupt every industry in our economy.
Of course, innovation isn’t slowing. Our industries have been unable to absorb even the last several years’ innovations before they are overshadowed by more recent ones. The changes coming to our world are hard to predict and difficult to exaggerate.
Human-like intelligence is likely very far off in the future, if it is possible with our current approaches at all, but it hardly matters. Perhaps in the future all our technological progress over the last couple hundred years will look like a natural progression toward what we are all witnessing now: the introduction of a new type of mind to the world.