Evaluating for the Future: Measuring What Works and What Doesn't in Information Technology
29 August, 2017 / Articles
The digital transformation of the business world is at the top of CEO and CIO agendas, as it should be. Customer behaviors and expectations are changing at a dizzying pace, and the innovation required to effectively compete will increasingly be defined by software.
While this shift is underway, completely reinventing the information technology wheel is not a sustainable starting point. Rebuilding existing business processes and services on new platforms consumes too many valuable resources, takes too long and introduces unacceptable risk.
As I’ve discussed before, the pressure is on to decide what technology can move from being part of the “keep the lights on” agenda to delivering innovation in the newly digitized enterprise. Not every technology will make the cut, nor should it. Yet wholesale recreation is not an option.
How should beleaguered CIOs decide which parts of the IT infrastructure have a viable future on the other side of the digital event horizon and which will be phased out to make room for advancement? Essentially, the decision must be based on business value, impact and risk. There are also technology hurdles that investments must clear simply to be considered.
When evaluating where it makes sense to push technology into the new digital age, there are three critical areas to measure against: compatibility with a DevOps practice, how easily that platform or service can extend into a more connected world, and where it stands in its security lifespan.
DevOps Compatibility
If innovation is defined by software, then the capacity to deliver software capabilities faster and more effectively than the competition is going to define successful companies. DevOps practices, and the culture of continuous delivery that accompanies them, are going to be at the heart of that competitive weapon.
For an IT investment to live a long and happy life in the brave new world of faster, more agile software delivery, it needs to effectively mesh with a DevOps worldview. How can CIOs ensure this happens? The answer, in some cases, may surprise you.
Many mainframe developers, for example, are actively integrating what are very traditional application development environments into the high-speed DevOps world with great success. DevOps compatibility will soon need to be part of your evaluation criteria — or that IT investment today may not be able to keep up with tomorrow.
Connectivity And Flexibility
Hybrid IT environments will rule the business IT world, not just as an accumulation of on-premises and cloud service solutions but as a true hybrid model: a complex (and changing) mixture of platforms, delivery mechanisms and consumption models, built and operated with enough agility to keep pace with constantly changing business needs.
Therefore, the next question that needs to be asked of any technology is, “How well will it coexist in this world with everything else?” Standalone platforms are rapidly becoming a thing of the past.
Soon, everything will need to be connected, flexible and responsive to change. If a critical business system is unable to find a new connected home, it may be approaching the end of its days. However, the most venerable business systems could have long lives ahead of them if they are able to make that critical connectivity work.
Security Lifespan
As more and more of a company's core business model runs on information technology, the criticality and scope of its intellectual property rise in proportion. Therefore, security will finally complete its slow evolution from often-forgotten business impediment to pivotal element in innovation strategy.
As a result, it’s essential that systems and tools — both current and future — have long security lifespans ahead of them. The recent global ransomware attacks were successful in no small part because they targeted systems that were already well past their security sell-by dates.