Re: Genuine question...
Ok. I feel qualified to answer this. I was tech lead on a project for a major bank that needed to migrate a few hundred small programs from old, obsolete x86 architectures to a modern, fully supported and maintainable environment. These were not the core banking applications; they ran in the gaps between the main systems, doing one or more of the jobs needed to keep data processed and flowing across the enterprise.
It sounds like a straightforward task:
- we had the knowledge and skills to read the old code;
- all the programs were in a small, isolated area, walled off from most other systems;
- there were minimal functional/logical changes: just rewrite each program to do the same as it did before.
The project ran for over a year and still had not been completed when I moved on. We found significant challenges in multiple areas:
a. No-one knew what each program was doing or why. There was no documentation. There were no business users who remembered why the program had been needed. There was no IT knowledge as to why something was done.
b. No-one understood what the data was. As much of it was scraped from application screens as was written to database tables. Many fields held multiple types of data: mixes of dates, numbers and alphanumeric codes. Many fields were bit masks that changed the meaning of other fields (there's a sketch of what decoding that looked like below, after this list).
c. The enterprise world is dynamic and ever changing. One example springs to mind: a program takes data from source A and outputs it to destination B, then takes data from B, mixes it with sources C and D, and writes out to destinations E, F and G. But when source C is a mainframe screen with 15 fields, of which only 12 still exist, and an update part way through the rewrite removes that screen completely, what now? Does anyone know where that data can still be found?
d. There were hundreds of hard-coded edge cases, all interwoven into an impenetrable web of conditional logic, trees and branches (a flavour of it is sketched below).
e. The test data was not fit for purpose in most cases. With 20+ years of industry mergers, new products, obsolete products, half-migrated systems etc., there could be 4,000 different shapes of record in a single table.
f. The number of test cases required to test every possible combination of data was heading towards infinity. We stopped calculating past 100 billion (the back-of-envelope maths is below).
g. For the majority of the programs, no-one knew how many systems an individual program impacted, let alone how to plan a proper regression test around it.
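To give a flavour of (b): here is a minimal sketch, in Python, of what decoding one overloaded field looked like. Every field name, mask value and format below is invented for illustration; the real ones were worse.

```python
from datetime import datetime

# Hypothetical example: FLAG2 is a bit mask telling you how to read FIELD7.
# All names, mask values and formats here are invented for illustration.
FIELD7_IS_DATE   = 0x01  # FIELD7 holds a DDMMYY date
FIELD7_IS_AMOUNT = 0x02  # FIELD7 holds an amount in pence
FIELD7_IS_CODE   = 0x04  # FIELD7 holds an alphanumeric product code

def decode_field7(flag2: int, field7: str):
    """Return (kind, value) for the overloaded FIELD7 column."""
    if flag2 & FIELD7_IS_DATE:
        return ("date", datetime.strptime(field7.strip(), "%d%m%y").date())
    if flag2 & FIELD7_IS_AMOUNT:
        return ("amount_pence", int(field7))
    if flag2 & FIELD7_IS_CODE:
        return ("product_code", field7.strip())
    # 20+ years of mergers guarantees records that fit no known rule.
    return ("unknown", field7)

print(decode_field7(0x01, "310399"))  # ('date', datetime.date(1999, 3, 31))
```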
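For (d), imagine hundreds of rules like the following, except undocumented and scattered across dozens of programs. Again, every branch number, product code and rule here is made up:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Txn:
    branch: int
    product: str
    code: str
    amount_pence: int
    posted: date
    hold: bool = False

# Invented flavour of the interwoven hard-coded edge cases we inherited.
def route(txn: Txn) -> Txn:
    if txn.branch == 4217 and txn.product in ("SV01", "SV02"):
        txn.product = "SV09"            # branch absorbed in a merger; remapped
    if txn.amount_pence < 0 and txn.code == "RD":
        txn.code = "RD" if txn.branch == 4217 else "RD2"  # except that branch
    if txn.posted.weekday() == 4 and txn.product == "SV09":
        txn.hold = True                 # Friday hold; nobody remembered why
    return txn
```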
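And for (f), the arithmetic is brutal. A back-of-envelope sketch with purely invented field cardinalities shows how quickly you sail past 100 billion:

```python
from math import prod

# Hypothetical: one program reads 15 fields, each of which has accumulated
# a handful of distinct meanings over the years. Counts are invented.
distinct_values_per_field = [4, 6, 3, 8, 5, 4, 7, 3, 6, 4, 5, 9, 10, 8, 6]

combinations = prod(distinct_values_per_field)
print(f"{combinations:,}")  # 125,411,328,000 -- and that is one small program
```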
Yes, we had the original source. Yes, we could write it again to do apparently the same thing in the new environment. That was usually the trivial bit. Finding out whether it still did the same thing in every use case, in every combination, with no impact on anything else? Almost impossible.
These programs dealt with moving transactions between hundreds of thousands of current accounts, savings accounts, loan accounts, bad debtor accounts, accounting systems and so on, every day, with total revenue in the hundreds of millions.
To anyone who has never seen the complexity and scale of enterprise systems: it is like trying to hand over air traffic control duties at Heathrow to a parking attendant in a small car park.
So save the smug old-timer comments for the playground. Realise that there is a whole world of IT technology, skills, knowledge and implementation that has been built by the efforts of many people from several different generations over the last 50 years. You think you know it all. Think again.