
Faster delivery starts with knowledge – especially in Enterprise IT

By Derek Britton (Micro Focus) and Steven Dickens (IBM)

The increasing IT backlog, often referred to as “IT Debt” (see, for example, this 2010 Gartner report), illustrates the urgency of improving the throughput IT can deliver. DevOps is the latest IT industry buzzword, and everyone is excited about the possibility of delivering a new generation of business value faster than ever. Rapid delivery of digital services at the ever-increasing speed the business needs, for a new generation of vocal users conditioned by the consumerisation of IT, is a lofty promise we would all welcome coming to fruition.

Digital transformation and the API Economy represent a $3.75B market, and to capture a share of it, clients must first understand and modify so-called legacy mainframe software so that it can sit at the centre of their digital enterprise. Alas, nothing stops ambition in its tracks faster than looking at a monolithic system of record and asking, “OK, so where do we start?”

The Trouble with Big Business

The trouble with big business is that it is, well, big. And the trouble with big things is that they are harder to understand, let alone manage. What makes it hard is fundamentally twofold: changes in technology and changes in people.

First, the technology.

One banking customer commissioned an external consultancy to build them a single “application atlas” – a visual representation of all applications in the IT portfolio. The CIO felt they needed to fully understand the scope of the estate. A project of several months ensued as the third party interviewed key IT stakeholders to understand the names, relationships and functions of all key IT systems. The day finally arrived to unveil the “atlas”, and upon seeing it the CIO said, “It’s wrong.” Why? Some of the information, collected early in the process, was now out of date. IT had moved on.

The problem with IT is that it is typically a nebulous, evolving mass of inter-related, disparate systems spanning platforms, geographies and business functions. Understanding all of this technological ectoplasm (Ed: this was Derek’s choice of word, not mine) is not easy, and is typically not possible through manual processes.

Second, the people.

The challenge of continuing to “know” what IT is, what it does, and where everything sits is exacerbated when you consider the people who are employed to know such things. IT teams are increasingly dynamic, with project lifecycles driving more fluid department structures (especially in more agile-oriented organisations). This, coupled with increased turnover, means the people in IT who really know the applications and the business they serve move around, or out, leaving gaps in the people-driven knowledge base.

Another issue is skills. The impending shortage of COBOL specialists, highlighted in this article from CIO magazine, is just one example of a trend that will complicate efforts to keep on top of all core systems. The challenge faced is simple: a team that has been in place for decades, and has probably created a significant proportion of the portfolio it is now maintaining, will have an easier time keeping up with the backlog than a team of individuals who are unfamiliar with the code.

Quenching a Thirst for Knowledge

So, how do you fix the problem, which, put simply, is that you don’t know what you don’t know?

People alone cannot be the answer. The answer hinges on finding a way, just as DevOps would propose, to automate the process.

Application discovery is a necessary part of the work of any developer or programmer who is new to a project, or to a part of the application portfolio they are unfamiliar with. Traditionally, it is a trial-and-error process consisting of searching through tens or hundreds of source files, deciphering cryptic comments and locating references to significant data elements. And the language of these core systems? More often than not, COBOL.
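
To make that hunt a little more concrete, here is a minimal sketch in Python. It is not a vendor tool, and the directory layout, file extension and data element name are all hypothetical; it simply scans a tree of COBOL sources and lists every line that references a given data element – the kind of cross-reference a new programmer would otherwise assemble by hand.

```python
import pathlib
import re
import sys

def find_references(source_dir, data_element):
    """Scan COBOL source files for lines that mention a given data element.

    A crude stand-in for the manual search a programmer would otherwise
    perform across tens or hundreds of source files.
    """
    pattern = re.compile(rf"\b{re.escape(data_element)}\b", re.IGNORECASE)
    for path in sorted(pathlib.Path(source_dir).glob("**/*.cbl")):
        for line_no, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if pattern.search(line):
                yield path.name, line_no, line.strip()

if __name__ == "__main__":
    # Hypothetical usage: python xref.py ./cobol-src CUSTOMER-BALANCE
    for file_name, line_no, text in find_references(sys.argv[1], sys.argv[2]):
        print(f"{file_name}:{line_no}: {text}")
```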

A DevOps Approach

The benefits of replacing error-prone manual tasks with automated tools are well understood and form the bedrock of the rationale for the DevOps initiative.

Understanding an application is crucial not just for getting the new programmer up to speed. It is also necessary for performing due diligence and following good practice. DevOps is about automating as much of the application lifecycle as is feasible, to shorten time to production and reduce errors and the resulting delays. This includes the early stages of discovery, analysis, requirements gathering, and so on.

Automating the Discovery Process

If we take the DevOps perspective and ask what could be done about application discovery – usually a laborious, manual effort – it follows that this is an activity ripe for automation. What if, instead of chasing through one file after another, the programmer had at their disposal a means to quickly and accurately visualise the structure and flow of the application? Such a solution could not only reduce the effort of discovery; it could also automate another crucial task: complete and accurate impact analysis. Application updates have been known to fail in production because the impact of the update was not adequately understood.
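
As a similarly hedged illustration of the impact-analysis side, the sketch below (again Python, not either vendor’s product, and assuming only simple static CALL ‘PROGRAM’ statements between COBOL programs) builds a who-calls-whom graph from a source tree and lists every program that directly or transitively depends on the one being changed. Real tools must also understand copybooks, JCL, dynamic calls and data flows, which is precisely why automation matters.

```python
import pathlib
import re
from collections import defaultdict

# Matches simple static calls such as: CALL 'ACCTUPDT' USING ...
CALL_PATTERN = re.compile(r"\bCALL\s+'([A-Z0-9-]+)'", re.IGNORECASE)

def build_call_graph(source_dir):
    """Map each called program to the set of programs that call it."""
    callers = defaultdict(set)
    for path in pathlib.Path(source_dir).glob("**/*.cbl"):
        caller = path.stem.upper()
        for callee in CALL_PATTERN.findall(path.read_text(errors="ignore")):
            callers[callee.upper()].add(caller)
    return callers

def impacted_by(program, callers):
    """Return every program that directly or transitively calls `program`."""
    impacted, frontier = set(), [program.upper()]
    while frontier:
        current = frontier.pop()
        for caller in callers.get(current, set()):
            if caller not in impacted:
                impacted.add(caller)
                frontier.append(caller)
    return impacted

if __name__ == "__main__":
    graph = build_call_graph("./cobol-src")          # hypothetical source tree
    print(sorted(impacted_by("ACCTUPDT", graph)))    # hypothetical program name
```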

Application Discovery Benefits

Both IBM and Micro Focus are tackling the issue. Solutions from Micro Focus and IBM help automate discovery by automatically creating a visual representation of the application. By revealing artifacts such as control flow and data references in an IDE, instead of through the ISPF editor, they simplify the developer’s task of familiarising themselves with a new application. At the same time, the capability to automatically create impact analysis reports helps move the organisation further along the path to DevOps.

Better yet, the same analysis information can be provided not only at the stage of initial examination (potentially scoping out a task for others), but also at the point of change, when the developer needs to know what to change, where and why, and what impacts the change will have. The improvements in code quality and efficiency this makes possible are considerable.

 

Conclusion – Automating the Journey

The pace of business evolution, rising business expectation levels, demographic trends in the IT world and the increasing role of outsourcing or out-tasking the development function are all exacerbating the IT backlog issue. The challenge that senior IT leaders face is how to not only deliver on the increasing demands their line-of-business peers place on them, but also get their own house in order.

A solution that speeds up development activities, reduces risk through the elimination or reduction of manual steps, and delivers on the promise of ‘agile’ makes a lot of sense. Beyond the cultural change involved, moving the organisation closer to its own DevOps objectives means automating as much as possible. Starting that automation by building a scientific understanding of the systems being changed, using contemporary technology, should be seriously considered. Regardless of the vendor you choose, be it IBM or Micro Focus, this area must be high on the list of priorities for any senior IT leader.

 

Editor’s Note: Derek Britton is a fantastic collaborator and supporter of the mainframe platform, and while IBM and Micro Focus are competing for your DevOps tools business, he and the rest of Micro Focus know that when the mainframe wins, we all win! Find and follow Derek on Twitter here: @DerekBrittonUK