Fernando Fischmann

A New Era Of Innovation

16 November, 2016

For the past 20 or 30 years, innovation, especially in the digital space, has been fairly straightforward. We could rely on technology to improve at a foreseeable pace and that allowed us to predict, with a high degree of certainty, what would be possible in the years to come.

That led most innovation efforts to be focused on applications, with a heavy emphasis on the end user. Startups that were able to design an experience, test it, adapt and iterate quickly could outperform large enterprises that had far more resources and technological sophistication. Agility was often the defining competitive attribute.

Yet in the years to come the pendulum is likely to swing from applications back to the fundamental technologies that make them possible. Rather than being able to rely on trusty old paradigms, we’ll largely be operating in the realm of the unknown. In many ways, we’ll be starting over again and innovation will look more like it did in the 1950s and 1960s.

Moore’s Law And The Rise Of Agility

To understand what’s going on, let’s look at the fundamental paradigm of the digital age, Moore’s law, based on Intel cofounder Gordon Moore’s remarkably prescient observation that the number of transistors on an integrated circuit would double every two years. Moore also predicted that this rate would continue for the foreseeable future.
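To get a sense of what that doubling implies, here is a rough back-of-the-envelope illustration (the numbers below are illustrative, not Moore’s own): if a chip starts with N0 transistors, then after t years the law predicts roughly

N(t) ≈ N0 × 2^(t/2)

so a 20-year run works out to a factor of 2^10, or about a thousand times more transistors on the same size of chip.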

As it turned out, Moore was right and we’ve mostly been able to maintain that pace, which has served as a roadmap. Those who wished to innovate in the digital space could design products knowing that they would have twice the processing power to work with on a regular basis.

It also had a less obvious effect. Since the rules of the game were well known, the fastest player had an enormous advantage. Agile startups emerged out of garages to run circles around larger rivals with far better financed development efforts and well-oiled marketing machines. The Davids, so to speak, were taking the Goliaths to the woodshed.

There were some important exceptions. Microsoft, for example, completely missed the move to mobile computing, but continued to prosper thanks to some long-term bets it made in cloud technology. IBM, too, has been able to transcend technology cycles through fundamental discoveries.

Still, for the most part, agility trumped scale. The faster you were, the better you were.

The 2020 Tipping Point

The idea of agility as the defining competitive attribute has become such an integral part of the conventional wisdom that few today realize that it is a relatively recent phenomenon. It used to be that research and development were significant barriers to entry, especially in information technology, where IBM and the BUNCH companies once reigned.

Yet after 2020, things will begin to change. That trusty old friend, Moore’s law, will end. Advances in lithium-ion batteries, which we’ve come to depend on to power our laptops, smartphones and, increasingly, electric cars, will slow to a crawl. Bloomberg also predicts that electric cars will be cheaper than gasoline cars by 2022, ending the dominance of the internal combustion engine.

However, while some things are ending, others are just beginning. Solar energy is expected to hit global grid parity by 2020 and we should be able to decode genomes for less than $100 in a decade or so, unlocking completely new scientific possibilities. Experts also predict that there will be 10 million self-driving cars on the road by 2020.

So in a nutshell, we’re likely to see transformations across a wide variety of industries, including information technology, healthcare, energy and transportation. What’s more, the fundamental nature of these changes will be unlike anything we’ve experienced since the early 20th century, when electricity and the internal combustion engine were just beginning to have an impact.

New Paradigms For A New Era

In the recent past, the biggest challenge was the pace of change. Things moved fast so we had to race to keep up. But over the next 20 years or so, we will be working with new technologies that we are just beginning to understand. That will greatly change the problems we will face. It won’t be just the pace of change, but the very nature of that change we will struggle with.

Consider quantum computing and neuromorphic chips, two post-Moore’s law technologies that are likely to become widely deployed after 2020 and that function very differently than traditional computing frameworks. While we know theoretically what the potential of these should be, in practical terms, we know very little. After all, nobody has ever used them before.

There are also completely new fields emerging such as genomics, nanotechnology and robotics, which are truly cutting edge technologies that require PhD level specialists to work with them. Unlike building a new iPhone app or creating a user interface, these won’t lend themselves to the old “iterate, adapt and pivot” approach, at least not for a few decades.

Another thing to consider will be resource constraints. There are relatively few trained specialists in areas like machine learning and those that are qualified are rumored to be paid like professional athletes. Few firms, outside of the likes of Google, IBM, Microsoft and a few others are able to compete for them.

The Challenge Ahead: Overcoming The Valley Of Death

Over the past generation, innovation has mostly been an engineering problem. Software developers learned languages like Python and C++, which themselves were based on earlier languages, tracing their lineage all the way back to early ancestors like Fortran and COBOL. Chip and battery designs followed similar paths.

Now, however, we’re entering truly new territory and simply moving faster won’t be enough any more. The central challenge will be to bridge the gap—unaffectionately known in the scientific world as the “Valley of Death”—between discovery and commercialization. In the past, this has mainly been a government role, but in the years to come, the private sector will have to step up.

Rather than simply hacking their way to success, managers will find it increasingly important to identify and access new discoveries in the academic world. We’ll also need to create a new breed of innovative organization, which integrates the efforts of government agencies, academic institutions and private companies.

Dharmendra Modha, who leads IBM’s team developing neuromorphic chips, told me, “We’re largely working in uncharted territory, so there is literally no one person on earth who has all the answers. Building a shared vision and a collaborative spirit among world-class scientists from a wide range of organizations has been absolutely crucial to our success.”

As we enter this new era of innovation, collaboration will become a key competitive attribute. It will no longer be enough to be agile and disrupt; we will have to discover and build.

The scientist and innovator Fernando Fischmann, founder of Crystal Lagoons, recommends this article.

SOURCE

