
      Rewiring Wall Street: Modernizing Without Forgetting What Works

      How Banks Can Integrate Legacy and Modern Systems Through Data Virtualization and Real-Time Streaming, Without Disruption

      For more than half a century, every major software advance has been built upon what preceded it. As a result, capital markets run on both yesterday’s code and today’s interfaces. This creates a paradox: the same systems that process trillions of daily transactions also anchor institutions in architectures from a different era.

CTOs and their teams spend most of their time and budget “running the bank” (managing and maintaining existing systems) rather than “changing the bank” (introducing new technologies and processes). That imbalance slows modernization and, by extension, limits the impact of newer capabilities like AI. Building a strategic, sustainable path forward doesn’t require pretending the past never happened; it requires understanding that past and treating the mistakes of years prior as lessons for the future.

      Historically, durable progress in software has come from abstraction (hiding complex details behind simpler interfaces) and layering (structuring systems in levels to separate concerns), principles that still hold true on Wall Street. If banks apply those lessons at the enterprise level, they can modernize safely while preserving what works.

      The Inheritance Trap, and Why “Big-Bang” Rewrites Fail
      Large financial institutions operate on billions of lines of legacy code, some of which was written decades ago. Many still use mainframes for books and records because they provide near-continuous uptime. This reliability keeps mission-critical systems in place, but it also creates brittleness. Even small changes must reconcile with half a century of architectural decisions. Attempts to replace a monolith wholesale usually falter due to cost, risk, and timeline pressures.

This tension shows up in budgets: about 70% of bank IT spending goes to maintaining, patching, and reconciling legacy platforms, while 45 of the world’s 50 largest banks still rely on IBM Z in the core. Even top tech spenders retire only 6% of legacy per year, which helps explain why “big-bang” replacements rarely succeed and why an incremental approach becomes necessary.

As with past advances that built upon earlier systems, modernization succeeds when it works in conjunction with the core rather than in opposition to it. The following three-step model outlines how to achieve this deliberately.

      A Three-Step Model to Modernize Without Disruption
1) Shield the new from the old: Introduce a high-performance abstraction layer that virtualizes access to core systems. Instead of connecting every new application directly to fragile interfaces, standardize access through governed endpoints (a minimal sketch of this pattern follows the three steps). This reduces integration risk, centralizes entitlements and lineage, and gives teams a real-time source of truth without re-engineering the back end.

2) Replace iteratively: Retire legacy components in bounded increments, by workflow, application cluster, or product line (the sketch below shows one way to wire such a cutover). Each sunsetted system frees budget and specialist effort that can be reinvested in the next phase, turning modernization into a compounding cycle rather than a one-time gamble.

      3) Architect for resilience from the start: Adopt event-driven streaming so data is timely by default; embed observability for latency and lineage; and design for elastic scale and governance. Building these controls in rather than bolting them on later keeps modernization from recreating fragility in a new form.
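To make steps 1 and 2 concrete, here is a minimal Python sketch of a governed, virtualized endpoint that also serves as the cutover switch for incremental replacement: entitlements are enforced at one choke point, and each product line is routed to the legacy or modern back end depending on migration status. All class, field, and product names are hypothetical illustrations, not any particular vendor’s API.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Position:
        account: str
        symbol: str
        quantity: int
        source: str   # lineage: records which back end supplied the row

    class MainframePositions:
        """Adapter over the legacy books-and-records system (stubbed here)."""
        def fetch(self, account: str) -> list[Position]:
            return [Position(account, "US-TSY-10Y", 1_000, source="mainframe")]

    class ModernPositions:
        """Adapter over a newer positions service (stubbed here)."""
        def fetch(self, account: str) -> list[Position]:
            return [Position(account, "FX-EURUSD", 250, source="modern")]

    class VirtualizedEndpoint:
        """Step 1: one governed entry point instead of N point-to-point links.
        Step 2: a routing table that grows as product lines are migrated,
        so back ends can be retired without touching the apps above them."""

        def __init__(self, entitlements: dict[str, set[str]]):
            self._entitlements = entitlements
            self._legacy = MainframePositions()
            self._modern = ModernPositions()
            self._migrated: set[str] = {"fx"}   # product lines already cut over

        def positions(self, user: str, account: str, product_line: str) -> list[Position]:
            # Governance lives here, once, rather than in every application.
            if account not in self._entitlements.get(user, set()):
                raise PermissionError(f"{user} is not entitled to {account}")
            backend = self._modern if product_line in self._migrated else self._legacy
            return backend.fetch(account)

    endpoint = VirtualizedEndpoint(entitlements={"trader1": {"ACC-42"}})
    print(endpoint.positions("trader1", "ACC-42", product_line="fx"))     # modern path
    print(endpoint.positions("trader1", "ACC-42", product_line="rates"))  # legacy path

Growing the `_migrated` set one product line at a time is what turns replacement into the bounded, compounding cycle described in step 2: applications above the endpoint never notice the cutover.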

      Why Data Virtualization and Real-Time Streaming Are the Hinge
Virtualization and streaming alter the modernization calculus by enabling institutions to deliver value at the edge of the stack without first refactoring the core. Traders, risk, and operations teams can work from a shared, real-time view while the underlying systems are upgraded step by step. In production, these patterns keep dashboards and workflow apps continuously updated at volumes measured in millions of orders, with bank-grade reliability and governance.
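A similarly small sketch shows the event-driven side of step 3 above: each event is timestamped at ingestion, consumers react as events arrive, and per-event latency is recorded so slowness is observed rather than inferred. In production this role is played by a streaming platform (Kafka and similar systems are common); the in-process queue and field names below are stand-ins for illustration.

    import time
    from collections import deque
    from dataclasses import dataclass, field

    @dataclass
    class Event:
        payload: dict
        ingested_at: float = field(default_factory=time.monotonic)

    class ObservedStream:
        """Tiny event-driven pipeline: data is pushed to consumers as it
        arrives, and end-to-end latency is measured for every event."""

        def __init__(self) -> None:
            self._queue: deque[Event] = deque()
            self.latencies_ms: list[float] = []

        def publish(self, payload: dict) -> None:
            self._queue.append(Event(payload))

        def drain(self, handler) -> None:
            while self._queue:
                event = self._queue.popleft()
                handler(event.payload)   # e.g., update a dashboard or risk view
                self.latencies_ms.append((time.monotonic() - event.ingested_at) * 1_000)

    stream = ObservedStream()
    stream.publish({"symbol": "EURUSD", "px": 1.0842})
    stream.drain(lambda payload: None)   # a real handler would update views
    print(f"sample end-to-end latency: {stream.latencies_ms[0]:.3f} ms")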

When data access, processing, and UI share a unified build/runtime environment, teams can iterate without brittle point-to-point integrations. This approach aligns with data-fabric guidance, in which data virtualization provides an abstraction layer instead of requiring heavy refactors, and the budget reality described above points to the same incremental conclusion. In short, coexistence only works if it enhances what users are already doing.

      Practical Hallmarks of a Modernized Bank

      • Unified front ends on governed data: Front-office, risk, and compliance operate from a single source of truth. Live publications and historical context ensure that decisions and audit trails remain aligned.
• Low-latency performance at scale: Interfaces remain responsive under load. Tables and charts update as data changes, reporting is generated on the fly, and derived metrics recalculate with the stream (see the sketch after this list). That stability is more than a UX detail: it is the difference between seeing risk as it happens and reconstructing it after the fact.
      • Shorter delivery cycles without sacrificing control: With a workbench that unifies scripting, data exploration, and GUI design, teams ship new views quickly. Entitlements, lineage, and testing remain in the loop. This control is critical in regulated markets.
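As an illustration of metrics that “recalculate with the stream,” the sketch below maintains a running per-symbol VWAP that updates incrementally on every fill instead of being recomputed in batch. The tick fields and prices are hypothetical.

    from collections import defaultdict

    class StreamingVWAP:
        """Incrementally maintained volume-weighted average price per symbol.

        Each fill updates two running sums, so the derived metric is always
        current without re-scanning history -- the streaming counterpart of
        an end-of-day batch recalculation."""

        def __init__(self) -> None:
            self._notional = defaultdict(float)   # sum(price * qty) per symbol
            self._volume = defaultdict(float)     # sum(qty) per symbol

        def on_fill(self, symbol: str, price: float, qty: float) -> float:
            self._notional[symbol] += price * qty
            self._volume[symbol] += qty
            return self._notional[symbol] / self._volume[symbol]

    vwap = StreamingVWAP()
    print(vwap.on_fill("AAPL", 187.00, 100))   # 187.0
    print(vwap.on_fill("AAPL", 188.00, 300))   # 187.75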

      A Modernization Agenda That Mirrors What CTOs Want
      AI is only as reliable as its systems and data. In regulated markets, models should use information that is auditable, governed, and current. Treating AI as an evolutionary step, a new layer on a resilient foundation, makes automation meaningful, not just aspirational.
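One way to operationalize “auditable, governed, and current” is to gate model inputs on their metadata before inference. The sketch below rejects features that lack lineage or are stale; the field names and freshness threshold are illustrative assumptions, not a prescribed standard.

    import time

    MAX_AGE_SECONDS = 5.0   # illustrative freshness threshold

    def validate_feature(feature: dict) -> dict:
        """Admit a feature record to a model only if it is governed:
        it must carry lineage metadata and be recent enough to act on."""
        if not feature.get("lineage"):
            raise ValueError("rejected: no lineage recorded for this feature")
        age = time.time() - feature["as_of"]
        if age > MAX_AGE_SECONDS:
            raise ValueError(f"rejected: feature is {age:.1f}s old (stale)")
        return feature

    fresh = {"name": "net_exposure", "value": 1_250_000,
             "lineage": "risk_engine>virtualized_endpoint", "as_of": time.time()}
    print(validate_feature(fresh)["name"])   # passes the governance gate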

Recent executive conversations reveal a consistent theme: shifting from AI-first promises to modernization-first execution, freeing budget from maintenance to fund measurable change, reducing stack fragmentation, and strengthening resilience and governance so new capabilities scale safely. The three-step model aligns with this agenda, translating historical lessons into daily architectural choices on trading floors and in control rooms.

      Where This Ends: Continuous, Not Catastrophic, Change
      The financial services industry doesn’t need another idea for a headline-grabbing rewrite. It needs a reliable way to evolve. By shielding new systems from legacy, replacing components iteratively, and architecting for resilience, banks can modernize while markets remain open and obligations are met. This approach moves Wall Street forward without forgetting what works or reinventing the wheel, and it’s happening at scale.

      About the Author

Robert Cooke is the CEO and Founder of 3forge, bringing over two decades of experience in designing mission-critical systems for global financial institutions. Early in his career, he optimized middle-office workflows at JPMorgan and managed regulatory compliance and high-frequency electronic trading teams at Bear Stearns. He later modernized post-trade processing and transaction cost analysis at Liquidnet, establishing himself at the forefront of real-time data streaming, market infrastructure, and complex event processing. At 3forge, Cooke has built the company into a global powerhouse in high-impact application development, enabling firms like Morgan Stanley, Bank of America, Barclays, and Wafra to accelerate trading, risk management, and operations.

       
