Learning from COBOL modernization in the real world


There is a lot of excitement right now about AI-enabled mainframe application modernization. Boards are paying attention, and CIOs are being asked for a plan. AI is a real accelerator for COBOL modernization, but to deliver results it needs context that source code alone cannot provide. Here's what we've learned from working with 400+ enterprise customers: mainframe modernization has two very different parts. The first is reverse engineering, understanding what your existing systems actually do. The second is forward engineering, building new applications.

The first part is where mainframe projects live or die. Coding assistants, however, are really only good at the second part: give them a clear, valid specification and they will build modern applications faster.

We learned that delivering successful COBOL modernization requires a solution that can deterministically reverse engineer, produce valid and traceable specifications, and flow those specifications into any AI-powered coding assistant for forward engineering. A successful modernization requires both reverse engineering and forward engineering.

What is needed for a successful mainframe modernization

Bounded, complete context

Mainframe applications are big. Really big. A single program can run thousands of lines, pull shared data definitions from copybooks across the system, and call other programs orchestrated through JCL that spans the entire landscape. Today's AI can only process a limited amount of code at a time. Feed it a single program and it can't see the copybooks, called subroutines, shared files, or the JCL that ties everything together. It will produce output that looks reasonable for the code it can see but misses dependencies it was never shown. In working with customers, we solve this by first abstracting out all the underlying dependencies, then feeding the AI complete, self-contained units with everything they need. That way the AI focuses on what it is good at (understanding business logic, formulating specifications) rather than guessing at connections it can't see.
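The idea of abstracting dependencies first can be sketched in miniature. Everything here is our illustration, not AWS Transform internals: real COBOL dependency analysis must follow compiler semantics (COPY REPLACING, dynamic CALL targets, JCL job steps), but the core shape is a transitive closure over references:

```python
import re
from collections import defaultdict

# Hypothetical minimal scanner for static COPY and literal CALL statements.
COPY_RE = re.compile(r"\bCOPY\s+([A-Z0-9-]+)", re.IGNORECASE)
CALL_RE = re.compile(r"\bCALL\s+'([A-Z0-9-]+)'", re.IGNORECASE)

def extract_dependencies(sources: dict[str, str]) -> dict[str, set[str]]:
    """Map each program to the copybooks and subprograms it references."""
    deps = defaultdict(set)
    for name, text in sources.items():
        deps[name].update(m.group(1).upper() for m in COPY_RE.finditer(text))
        deps[name].update(m.group(1).upper() for m in CALL_RE.finditer(text))
    return dict(deps)

def closure(program: str, deps: dict[str, set[str]]) -> set[str]:
    """Transitive closure: everything a unit needs before AI ever sees it."""
    seen, stack = set(), [program]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(deps.get(node, ()))
    return seen
```

Bundling `closure("PAYROLL", deps)` rather than the lone program is what turns "guess the missing pieces" into "summarize a complete unit".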

Platform-aware behavior

Here's something that surprises people: the same COBOL source code behaves differently depending on the compiler and runtime. How numbers are rounded, how data sits in memory, how programs talk to middleware: none of this is in the source code. It is determined by the specific compiler and runtime environment the code was built for, and decades of hardware-software integration cannot be replicated by simply moving code. We found that AI does its best work when platform-specific behavior has already been addressed. Feed it clean, platform-aware inputs and it delivers. Feed it raw source code and it generates output that looks correct but behaves differently from the original. In financial systems, rounding differences are not a cosmetic issue; they are material errors.
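A toy illustration of why rounding semantics matter, with hypothetical values, using Python's `decimal` module to mimic COBOL fixed-point arithmetic: a COBOL COMPUTE into a two-decimal field truncates excess digits by default and rounds half-up only when the ROUNDED phrase is present, while a naive rewrite with binary floats does neither:

```python
from decimal import Decimal, ROUND_DOWN, ROUND_HALF_UP

# Hypothetical values. Which behavior the original program exhibits is a
# property of the source *and* the compiler, and must be preserved.
amount = Decimal("10.05")
rate = Decimal("0.1")
product = amount * rate  # 1.005, exact in decimal arithmetic

truncated = product.quantize(Decimal("0.01"), rounding=ROUND_DOWN)    # 1.00
rounded = product.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)   # 1.01

# A float rewrite computes 10.05 * 0.1 in binary, where neither operand
# is exactly representable, so the result drifts before any rounding
# rule is even applied.
```

A one-cent difference per transaction, multiplied across millions of daily transactions, is exactly the kind of material error regulators care about.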

A traceable foundation

If you're in banking, insurance, or government, your regulators will ask one question: can you prove you didn't leave anything behind? AI on its own cannot extract business logic and produce documents that regulators will accept. Regulatory compliance requires each output to have a formal, auditable connection back to the parent system. We learned that traceability does not come from AI reading source code. It comes from structuring the code into precise, bounded units, so that we know exactly what goes into the AI and can trace each output back to its source. For clients in regulated industries, this is often the difference between a project that moves forward and one that stalls.
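One hedged sketch of what an auditable connection can look like (the names and record format here are our illustration, not the product's internal representation): bind each generated spec to a hash of the exact bounded unit it was derived from, so every output can be traced back and re-verified later:

```python
import hashlib
from dataclasses import dataclass

@dataclass(frozen=True)
class SpecRecord:
    unit: str        # bounded unit name, e.g. "PAYROLL/PARA-0300" (illustrative)
    source_sha: str  # hash of the exact source text fed to the AI
    spec_text: str   # extracted business-logic specification

def record_spec(unit: str, source: str, spec_text: str) -> SpecRecord:
    """Bind a generated spec to the precise source it was derived from."""
    digest = hashlib.sha256(source.encode()).hexdigest()
    return SpecRecord(unit=unit, source_sha=digest, spec_text=spec_text)

def audit(record: SpecRecord, current_source: str) -> bool:
    """Re-verify that a spec still matches the source it claims to cover."""
    return hashlib.sha256(current_source.encode()).hexdigest() == record.source_sha
```

With records like these, "can you prove you didn't leave anything behind?" becomes a mechanical check: every unit has a record, and every record still matches its source.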

How we positioned AI for success in AWS Transform

We built AWS Transform to modernize mainframe applications at scale. The idea is simple: give AI the right foundation, and customers get traceable, accurate, and complete results they can take into production.

AWS Transform starts by creating a complete, deterministic model of the application. Specialized agents extract code structure, runtime behavior, and data relationships across the entire system, not one program at a time but entire scenarios. This produces a dependency graph aligned with actual compiler semantics, capturing cross-program dependencies, middleware interactions, and platform-specific behavior before AI gets involved.

From there, larger programs are decomposed into bounded, processable units. Platform-specific behavior is deterministically resolved, and units are sized for the AI to process effectively. The AI then extracts business logic in natural language, and each output is validated against the deterministic evidence we have already extracted. Specs map back to the original code, so when a regulator asks "Did you miss something?", there is a verifiable answer.

What sets this apart is that AI never works in the dark. Each unit it processes has known inputs and expected outputs, so we can verify what comes back. No other approach on the market closes that loop. What emerges is a set of validated, traceable technical specifications that plug into any modern development environment. The hard part of modernization is understanding what exists today; once you've captured it in precise specifications, AI-powered IDEs can build new applications with confidence.
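The "known inputs and expected outputs" idea reduces to a simple contract, sketched here under our own assumptions (this is not AWS Transform code): every bounded unit carries behavior cases captured from the original system, and anything regenerated from the spec is checked against them rather than trusted:

```python
from typing import Callable, Iterable

def validate_unit(run_candidate: Callable, cases: Iterable[tuple]) -> list:
    """Run a candidate implementation against (input, expected) pairs
    captured from the original system's observed behavior.
    Returns the list of failures; an empty list means the unit verified."""
    failures = []
    for inp, expected in cases:
        got = run_candidate(inp)
        if got != expected:
            failures.append((inp, expected, got))
    return failures
```

Because the expected outputs come from the deterministic model of the original system, not from the AI, this closes the loop: the AI proposes, the captured evidence disposes.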

A complete platform for enterprise transformation

No one modernizes just one application. Our customers are looking at portfolios of hundreds or thousands of interconnected applications, and they need much more than analysis support. AWS Transform automates the entire lifecycle: analysis, test planning, refactoring, reimagining. The whole thing.

And within that, different applications require different paths. Some are reimagined. Some just need a clean, deterministic conversion to Java. Some need to exit the data center first and modernize later. Some will remain on the mainframe. We learned the hard way that treating them all the same causes projects to fail: portfolio decisions (which app, which path, which order) matter as much as the technology. In our experience, a one-size-fits-all approach is what sinks these projects.

Another thing that is consistently overlooked: test data. You can't prove that a modernized application works without real production data and real scenarios. We've seen teams get all the way to code transformation and then stop because no one planned for data capture. So we built test planning and on-premises data capture into the platform from day one, not as a cleanup exercise at the end. That is what it looks like when it works: end-to-end automation, with the right path and validation for each application.

How to get this right

The question is not "Should we use AI to modernize COBOL?" Of course you should. The question is how you set AI up to deliver: traceability for regulators, platform-specific behavior properly controlled, consistency across your application portfolio, and the ability to scale to hundreds of interconnected programs. That's what led us to build AWS Transform: deterministic analysis as the foundation, AI as an accelerator, and an AWS service that covers the full range of modernization patterns.

And it is working.

The BMW Group reduced test time by 75% and increased test coverage by 60%, significantly reducing risk while accelerating modernization timelines.

Fiserv completed a mainframe modernization project that was projected to take 29+ months in just 17 months.

Itaú cut mainframe application discovery and testing time by more than 90%, enabling teams to modernize applications 75% faster than previous manual efforts.


About the authors

Dr. Asa Kalavade

Asa leads AWS Transform, which helps customers migrate and modernize their infrastructure, applications, and code. Previously, she led the transformation of AWS go-to-market tools to incorporate generative AI capabilities, and managed hybrid storage and data transfer services. Before joining AWS in 2016, Asa founded two venture-backed startups and remains active in advising Boston startups. She holds a PhD in electrical engineering and computer science from UC Berkeley and over 40 patents.
