adesso Blog

A few weeks ago, an architect at a mid-sized bank showed me his dependency graph. Not a pretty one—an honest one. A Wiki export, supplemented with handwritten notes on printed pages. What it revealed: dozens of COBOL modules, chained together via batch jobs, most of which haven’t been properly documented in fifteen years. Somewhere in this chain, it was clear, there’s also a SEPA converter. He couldn’t tell me exactly where. The developer who knew that retired two years ago.

I’m telling you this because situations like this are not uncommon. Anyone who regularly visits the payment processing departments of German banks is all too familiar with variations of this scenario. And if the environment were stable, one could live with it. But it isn’t.

Banks at the Limit: Outdated Systems, Dwindling Expertise

McKinsey estimates that operational costs for payment processing account for 30 to 40 percent of a typical universal bank’s expenses. BCG estimates that over 60 percent of IT budgets are tied up in “Run the Bank” operations, and with regulatory changes, this figure rises to as much as 70 percent. That is the leeway left for genuine transformation—and everything is currently crowding into that space at once.

Instant payments with 24/7 availability and ten-second processing. ISO 20022 with mapping, enrichment, and years of parallel operation. DORA with resilience as a board-level obligation. PSD3/PSR with heightened liability for fraud. Then on the horizon: FiDA, the Digital Euro, the EUDI wallet. And all of this impacts the same systems—payment engine, core banking, fraud, sanctions screening, channels, reporting. It doesn’t just add up. It multiplies.

It’s important to realize what instant payments alone mean in practice: real-time scoring in the sub-second range, 24/7 fraud monitoring, pre-funding in TIPS even at night and on weekends, and payee verification across all channels. This impacts architectures designed for nightly batches. For settlement windows that will soon simply no longer exist.

Anyone who wants to change such systems faces a problem that has less to do with the technology than with knowledge of it—or more precisely: with the lack of that knowledge.

The problem isn’t COBOL; it’s a lack of system knowledge

In most discussions I have on this topic, the conversation reflexively turns to COBOL, mainframes, and batch processing. But let’s be honest: COBOL does what it’s supposed to do. The language and the systems have been running reliably for decades.

The real risk looks different. It’s about a lack of system knowledge in an environment that can’t afford any mistakes. Converter logic that has to be manually tweaked with every SEPA release by a team that’s getting smaller every year. Fee modules with special routines that were created as workarounds at some point and that no one wants to touch today because no one knows for sure what else is tied to them. Batch dependencies that only become apparent when something goes wrong.

The figures are sobering. Pelican estimates that exception handling accounts for 42 percent of payment processing costs. The global STP rate in the cross-border sector stood at just 26 percent in 2023; roughly three out of four transactions require manual rework. IDC expects operating and maintenance costs for legacy payment systems worldwide to rise to $57 billion by 2028. This has little to do with the programming language and a lot to do with a lack of control.

When the foundation becomes a black box

“We’re already modernizing.” I hear this regularly, and most of the time it’s actually true. Many institutions have set up payment hub programs, work with equens or PPI, are migrating to the cloud, and are implementing ISO 20022 projects. All of that is correct.

What concerns me, however, is a different question: Does the bank actually know, in detail and not just at the slide-deck level, what is happening in its legacy systems before it makes decisions?

In my experience: often not. I know institutions that commission SEPA converters without having an overview of the dependent batch jobs, that evaluate payment engines without a complete dependency graph of the existing landscape, and that launch DORA programs without being able to quantify how few people possess which critical knowledge. Incidentally, this is not a criticism of anyone; it is the logical consequence of legacy landscapes that have been functionally expanded over twenty years without keeping the documentation up to date.

And it by no means affects only the major players. Every institution that operates host systems for payment transactions faces this question—the medium-sized private bank just as much as the specialized institution or a firm that has outsourced parts of its operations to a data center but continues to be responsible for the business logic itself. It is not about total assets; it is about whether a professionally assessed view of dependencies and impact paths exists.

At the same time, the economic leeway continues to narrow: Account-to-account payments bypass card systems at transaction costs of 0.1 to 0.5 percent instead of 0.3 to 3 percent for cards. Anyone operating legacy card systems, clearing, and instant rails simultaneously without consolidating is caught in a bind.

If that sounds abstract: TSB in the UK, 2018. Eight months of disruptions following a core migration. £400 million in costs. A £48.65 million regulatory fine. The problem wasn’t the new platform, but that they hadn’t understood the old one.

AI in the Engine Room: Where It Helps and Where It Doesn’t

The question comes up every time: “And where do we use AI here?”

My answer is usually less enthusiastic than my counterpart hopes for.

Generative AI can help with documentation, no question. Summarizing COBOL modules, generating comments, describing business logic in natural language. When someone has to familiarize themselves with an unfamiliar codebase, it actually saves weeks. It also provides useful services in test case generation or in detecting potentially dead code paths.

Where things get tricky: in the complete analysis of dependencies. A Large Language Model can plausibly describe a COBOL routine, but no one knows whether it has actually correctly captured every call path, every JCL concatenation, and every side-effect chain. And in an environment where an overlooked batch job can trigger millions in erroneous postings, “probably correct” simply isn’t enough. That’s where you need deterministic parsers that go through the code line by line—not probabilistic models that hallucinate in five percent of cases.
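To make the distinction concrete, here is a deliberately minimal sketch of what "deterministic" means in this context. The regexes cover only the simplest static `CALL 'LITERAL'` statements and `EXEC PGM=` job steps; real COBOL/JCL analysis needs a full parser (copybooks, dynamic calls via data names, PROC expansion), and the file names and module names below are invented for illustration. The point is the property, not the coverage: every match is recorded, nothing is summarized or guessed, and two runs on the same input always yield the same graph.

```python
import re
from collections import defaultdict

# Hypothetical, heavily simplified patterns for illustration only.
COBOL_CALL = re.compile(r"\bCALL\s+'([A-Z0-9-]+)'", re.IGNORECASE)
JCL_EXEC = re.compile(r"^//\S+\s+EXEC\s+PGM=([A-Z0-9]+)", re.MULTILINE)

def extract_dependencies(sources):
    """Scan every source line by line and record every match.

    Deterministic: the result is reproducible and complete with respect
    to the patterns, unlike a probabilistic summary of the same code."""
    graph = defaultdict(set)
    for name, text in sources.items():
        graph[name].update(m.upper() for m in COBOL_CALL.findall(text))
        graph[name].update(JCL_EXEC.findall(text))
    return dict(graph)

def callers_of(graph, target):
    """Every module or job that directly or transitively reaches `target`,
    i.e. the answer to 'which batch chains touch the SEPA converter?'."""
    reverse = defaultdict(set)
    for src, callees in graph.items():
        for callee in callees:
            reverse[callee].add(src)
    seen, stack = set(), [target]
    while stack:
        for caller in reverse[stack.pop()]:
            if caller not in seen:
                seen.add(caller)
                stack.append(caller)
    return seen

# Toy input: one JCL job step invoking a COBOL module that calls a converter.
sources = {
    "NIGHTRUN.jcl": "//STEP01  EXEC PGM=PAY0042",
    "PAY0042": "PROCEDURE DIVISION.\n    CALL 'SEPACONV' USING WS-MSG.",
}
graph = extract_dependencies(sources)
assert callers_of(graph, "SEPACONV") == {"PAY0042", "NIGHTRUN.jcl"}
```

An LLM can describe what `PAY0042` probably does; only an exhaustive scan like this can certify that no caller of `SEPACONV` was missed, which is what matters when an overlooked batch job can trigger erroneous postings.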

The same basically applies to code conversion—COBOL to Java, for example. Deterministic, module by module, verifiable through regression testing: that works. AI as a supplement, fine by me. But as a primary tool in critical payment infrastructure, it simply isn’t robust enough today.

What matters is a sober distinction: Where does AI deliver genuine time savings, and where does it create a new risk that looks strikingly similar to the old one?

Why Legacy Control Is a Discipline of Its Own

What’s missing in the market (and I say this after many years in this field) is not the next replatforming project. Nor is it a better code parser. And certainly not a Big Bang.

What is missing is a control discipline. The ability to know, before every decision in payment processing, what you’re dealing with, what’s connected to it, and what can be shut down.

Sound trivial? Practice shows otherwise. Most analysis tools provide technical facts: call graphs, data flows, complexity metrics. What they don’t provide is the business context. No code parser in the world will tell you: “This module is critical from a regulatory standpoint because it contains your SEPA credit converter.” Or: “This batch is linked to sanctions screening and is therefore immediately relevant.” Or, and this is particularly critical under DORA: “Knowledge of this logic lies with two people, both over 60.”

It is precisely this translation service, technical analysis plus business evaluation in payment processing, that we at adesso and KIWI Consulting refer to as “Payment Legacy Control”: the ability to know, before making any regulatory or technological decision, what it will trigger in the existing system.

What this means in practice

Specifically, we combine the automated analysis of the adesso transformer platform, which deterministically parses the entire host code (COBOL modules, JCL batch structures, converter logic, and interfaces), with the payment transaction expertise of KIWI Consulting. The platform provides the graph.

We provide the meaning:

  • Which module is the SEPA converter?
  • Where is the fee logic located?
  • Which routine validates the structured addresses that will become mandatory starting in November 2026?
  • What can be eliminated, what must remain, and what is worth rebuilding?

This results in a system map with impact paths, a risk heatmap, and a demographic overlay, that is, the question of who holds which knowledge and for how much longer. Based on this, we make a Retire/Replace/Rewrite decision for each business domain. Not a consultant’s slide deck, but something a Steering Committee can work with: a foundation for a CFO to build their business case, and a basis for a CRO to conduct their DORA assessment.
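To show the shape of such a per-domain decision basis, here is a toy sketch. The field names and the decision rule are my own illustration, not the actual adesso/KIWI scoring model; the only claim taken from the text is that the verdict per domain is driven by dependencies, regulatory criticality, and knowledge demographics.

```python
from dataclasses import dataclass

@dataclass
class DomainAssessment:
    # Illustrative schema: the inputs (impact paths, regulatory
    # criticality, demographic overlay) are named in the article,
    # this exact shape is not.
    domain: str
    regulatory_critical: bool   # e.g. SEPA converter, sanctions screening
    inbound_dependencies: int   # from the system map / impact paths
    key_people: int             # demographic overlay: who holds the knowledge
    years_to_retirement: float  # ...and for how much longer

def recommend(a):
    """Toy Retire/Replace/Rewrite rule. A real program weighs cost, risk,
    and sequencing; this only illustrates a per-domain verdict."""
    if not a.regulatory_critical and a.inbound_dependencies == 0:
        return "Retire"      # nothing depends on it, no regulator cares
    if a.key_people <= 2 and a.years_to_retirement < 3:
        return "Rewrite"     # act before the knowledge leaves the building
    return "Replace"

assert recommend(DomainAssessment("dormant fee module", False, 0, 1, 1.0)) == "Retire"
assert recommend(DomainAssessment("SEPA converter", True, 12, 2, 1.5)) == "Rewrite"
assert recommend(DomainAssessment("card clearing", True, 8, 6, 12.0)) == "Replace"
```

Even in this toy form, the output is something a steering committee can argue about, which is the article's point: a decision per domain rather than a code summary.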

What sets us apart from traditional modernization projects? It’s not the technology. It’s the result. Not “we’ve understood your code,” but “we’ve understood your payment processing and can identify where the risks lie, what it will cost, and the order in which you should proceed.”

Not a massive project, but a starting point

A Payment Legacy Risk Scan delivers the facts in four to six weeks. No commitment to a multi-million-dollar program.

Simply a solid foundation on which to decide what needs to happen next. Once the target state and decommissioning plan are in place, the transformation begins domain by domain with regression testing to verify that the old and new systems deliver identical results, down to the cent.
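To make "identical down to the cent" concrete, here is a minimal sketch of such a regression comparison, assuming both systems can export a posting run as records with an id and an amount; the record shape and field names are illustrative, not taken from any real system. Amounts are compared as exact decimals rather than floats, so formatting differences are tolerated but a single cent of divergence is not.

```python
from decimal import Decimal

def compare_runs(old_postings, new_postings):
    """Compare a posting run from the legacy system against the same run
    on the new system. '100.10' equals '100.1', but not '100.11'."""
    diffs = []
    old_by_id = {p["id"]: p for p in old_postings}
    new_by_id = {p["id"]: p for p in new_postings}
    for pid in sorted(old_by_id.keys() | new_by_id.keys()):
        old, new = old_by_id.get(pid), new_by_id.get(pid)
        if old is None or new is None:
            diffs.append((pid, "present on one side only"))
        elif Decimal(old["amount"]) != Decimal(new["amount"]):
            diffs.append((pid, f"{old['amount']} != {new['amount']}"))
    return diffs

# An empty diff list is the acceptance criterion for cutting a domain over.
legacy  = [{"id": "TX1", "amount": "100.10"}, {"id": "TX2", "amount": "9.99"}]
rewrite = [{"id": "TX1", "amount": "100.1"}, {"id": "TX2", "amount": "9.98"}]
assert compare_runs(legacy, rewrite) == [("TX2", "9.99 != 9.98")]
```

In practice such comparisons run over full production-scale batch outputs per domain, but the acceptance logic stays this simple: zero differences, or no cutover.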

The real savings don’t come from the analysis. They come from decommissioning. As long as old systems continue to run in parallel, the costs keep running too. Those who fail to control legacy systems end up paying twice for every new regulation: once for the adaptation and once for the parallel operation that no one ever shuts down. Projects with ING Germany, ITZBund, and DAK-Gesundheit have demonstrated that this approach works, even in environments where downtime is not an option.

In the end, a simple question remains: Does anyone in the organization really know what is being modernized? And if not—who is actually in charge? Regulation sets the pace, the shortage of skilled workers exacerbates the situation, and margins are shrinking. Against this backdrop, Payment Legacy Control is not an efficiency project. It is a matter of operational capability.


Digitalization in Finance

Consulting, Development, Industry Expertise

As a creative and reliable service provider for banks, we take responsibility and stand by your side as a partner. Our solutions combine technology, expertise, and methodology to create personalized customer experiences. We see ourselves as an end-to-end service provider for future-proof business models, delivering tailored solutions and integrating high-performance standard software. We act in the best interests of our clients as entrepreneurs.




Author Enrico Köhler

Enrico Köhler is Senior Manager and Head of the Payment Transactions Competence Centre at KIWI Consulting, a subsidiary of the adesso Group. He has more than 20 years of experience in financial IT and supports banks and payment service providers in the strategic development of their payment transaction systems. His focus is on the implementation of regulatory requirements and the harmonisation of payment infrastructures. With analytical depth and an eye for the big picture, he designs future-proof solutions at the interface between technology, regulation and market requirements.