The Industrial Era Ended, and So Will the Digital Era
19 July 2018

In a famous scene in the 1967 movie The Graduate, a family friend takes aside Dustin Hoffman’s character, Benjamin Braddock, and whispers in a conspiratorial tone, “Plastics…. There’s a great future in plastics.” It seems quaint today, but back then plastics really were new and exciting.
If the movie had been set in another age, the advice to young Braddock would have been different. He might have been counseled to go into railroads or electronics or simply to “Go West, young man!” Every age has things that seem novel and wonderful at the time, but tepid and banal to future generations.
Today digital technology is all the rage because after decades of development it has become incredibly useful. Still, if you look closely, you can already see the contours of its inevitable descent into the mundane. We need to start preparing for a new era of innovation in which different technologies, such as genomics, materials science, and robotics, rise to the fore.
To understand what’s happening, it helps to look at earlier technologies. The rise of electricity, for example, began in the 1820s and early 1830s, when Michael Faraday invented the electric motor and then the dynamo. Still, it wasn’t until some 50 years later that Edison opened his first power plant, and then another 40 years after that, during the 1920s, electricity began to have a measurable impact on productivity.
Every technology follows a similar path of discovery, engineering, and transformation. In the case of electricity, Faraday uncovered new principles, but no one really knew how to make them useful. They first had to be understood well enough that people such as Edison, Westinghouse, and Tesla could figure out how to make things that people would be willing to buy.
However, creating a true transformation takes more than a single technology. First, people need to change their habits, and then secondary innovations need to come into play. For electricity, factories needed to be redesigned and work itself had to be reimagined before it began to have a real economic impact. Then household appliances, radio communications, and other things changed life as we knew it, but that took another few decades.
Our world has been thoroughly transformed by digital technology. It would be hard to explain to someone looking at an IBM mainframe back in the 1960s that someday similar machines would replace books and newspapers, give us recommendations on where to eat and directions for how to get there, and even talk to us, but today those things have become matters of everyday habit.
And yet today there are several reasons to believe that the twilight of the digital age is upon us. (Importantly, I’m not arguing that we’ll stop using digital technology — after all, we still use heavy industry; we just no longer refer to ourselves as being in the Industrial Age.)
I see three main reasons that the digital era is ending. First is the technology itself. What’s driven all the excitement about computers is our ability to cram more and more transistors onto a silicon wafer, a phenomenon we’ve come to know as Moore’s Law. That enabled us to make our technology exponentially more powerful year after year.
Yet now Moore’s Law is ending and advancement isn’t so easy anymore. Companies such as Microsoft and Google are designing custom chips to run their algorithms because it is no longer feasible to just wait for a new generation of chips. To maximize performance, you increasingly need to optimize technology for a specific task.
Second, the technical skill required to create digital technology has dramatically decreased, marked by the rising popularity of so-called no-code platforms. Much as with auto mechanics and electricians, the ability to work with digital technology is increasingly becoming a midlevel skill. With democratization comes commoditization.
Finally, digital applications are becoming fairly mature. Buy a new laptop or mobile phone today, and it pretty much does the same things as the one you bought five years ago. New technologies, such as smart speakers like Amazon Echo and Google Home, add the convenience of voice interfaces but little else.
While there is limited new value to be gleaned from things like word processors and smartphone apps, there is tremendous value to be unlocked in applying digital technology to fields like genomics and materials science to power traditional industries like manufacturing, energy, and medicine. Essentially, the challenge ahead is to learn how to use bits to drive atoms.
To understand how this will work, let’s look at the Cancer Genome Atlas. Launched in 2005, the project had a simple mission: sequence tumor genomes and put the data online. To date, it has catalogued over 10,000 genomes across more than 30 cancer types and unlocked a deluge of innovations in cancer science. It has also helped inspire a similar program for materials called the Materials Genome Initiative.
These efforts are already greatly increasing our ability to innovate. Consider the effort to develop advanced battery chemistries to drive the clean energy economy, which requires discovering materials that don’t yet exist. Historically, that would have meant testing hundreds or thousands of molecules in the lab, but researchers can now use high-performance supercomputers to simulate candidate materials and greatly narrow down the possibilities before anything is synthesized.
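To make the screening idea concrete, here is a minimal illustrative sketch, not taken from any real materials project: a placeholder scoring routine stands in for the expensive physics simulation, and every candidate name, property, and threshold below is hypothetical. The point is only that cheap computational screening lets researchers rank thousands of candidates and send a short list to the lab.

    # Illustrative only: screen hypothetical battery-electrolyte candidates by
    # simulated properties before any of them are synthesized in a lab.
    from dataclasses import dataclass
    import random

    @dataclass
    class Candidate:
        name: str
        conductivity: float   # simulated ionic conductivity (arbitrary units)
        stability: float      # simulated electrochemical stability (arbitrary units)

    def simulate(name: str) -> Candidate:
        """Placeholder for an expensive physics simulation run on a supercomputer."""
        random.seed(name)  # deterministic stand-in values for this sketch
        return Candidate(name, random.uniform(0, 1), random.uniform(0, 1))

    # Thousands of candidates can be enumerated computationally; lab testing cannot keep up.
    candidates = [simulate(f"compound-{i}") for i in range(5000)]

    # Keep only candidates that clear both (hypothetical) property thresholds,
    # then rank them so the lab tests a handful instead of thousands.
    shortlist = sorted(
        (c for c in candidates if c.conductivity > 0.9 and c.stability > 0.9),
        key=lambda c: c.conductivity + c.stability,
        reverse=True,
    )[:10]

    for c in shortlist:
        print(f"{c.name}: conductivity={c.conductivity:.2f}, stability={c.stability:.2f}")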
Over the next decade, these techniques will increasingly incorporate machine learning algorithms as well as new computing architectures, such as quantum computing and neuromorphic chips, that function very differently than digital computers do.
The possibilities of this new era of innovation are profoundly exciting. The digital revolution, for all of its charms, has had a fairly limited economic impact, compared with earlier technologies such as electricity and the internal combustion engine. Even now, information technologies make up only about 6% of GDP in advanced economies.
Compare that to manufacturing, health care, and energy, which make up 17%, 10%, and 8% of global GDP, respectively, and you can see how there is vastly more potential to make an impact beyond the digital world. Yet to capture that value, we need to rethink innovation for the 21st century.
For digital technology, speed and agility are key competitive attributes. Techniques such as rapid prototyping and iteration greatly accelerated development and often improved quality, because we understood the underlying technologies extremely well. Yet with the technologies that are emerging now, that is often not the case.
You can’t rapidly prototype a quantum computer, a cure for cancer, or an undiscovered material. There are serious ethical issues surrounding technologies such as genomics and artificial intelligence. We’ve spent the last few decades learning how to move fast. Over the next few decades, we’re going to have to learn how to go slow again.
So while the mantras of the digital age have been agility and disruption, in this new era of innovation, exploration and discovery will once again become prominent. It’s time to think less about hackathons and more about tackling grand challenges.