Banks Confront the Cloud Contracts Built for Yesterday

Highlights

Legacy cloud agreements designed for predictable workloads don’t fit AI’s compute-heavy demands, forcing banks to rethink pricing, scalability and data movement terms.

As AI moves into core banking functions, institutions are shifting toward greater sovereignty, flexibility, and reduced dependence on single providers.

Banks are renegotiating cloud contracts around AI needs — interoperability, regulatory compliance and avoiding vendor lock-in — in deals where control and adaptability matter as much as price.

Financial services are facing a technological reckoning. Cloud contracts are at the center of it.

    Ever since cloud migration began displacing on-premises solutions for core banking architectures, banks have treated modernization primarily as a story of cost optimization, scalability and resilience. Contracts were negotiated around storage tiers, uptime guarantees and predictable compute usage. Those agreements reflected a world where workloads were largely transactional: payments processing, customer databases, risk models running on scheduled cycles.

    That world no longer exists, and the agreements underpinning financial infrastructure may already be outdated.

    The reason, of course, is artificial intelligence. As the Monday (April 27) earnings call from Customers Bank showed, with the lender’s CEO using an AI avatar to host the first part of the investor call, AI isn’t going anywhere except closer to the core of banking. That trendline was underscored separately by news that the Dutch Central Bank was breaking with Amazon Web Services (AWS) in favor of a homegrown IT provider, in large part over data sovereignty and AI concerns.

    Running AI inference and scaling privacy-critical data pipelines can require far more compute, and far tighter integration, than traditional workloads. Against this backdrop, control over data and AI capabilities can matter as much as price for how core banking infrastructure is bought, governed and scaled.

    See also: Earnings Show Banks Turning Transaction Banking Into a Platform Business 

    AI Is Reshaping the Economics of Compute and Control

    AI is not just another application layer. It is becoming the organizing principle for how data is processed, analyzed and monetized. That shift is forcing banks to reconsider where their data lives, how it moves, and who ultimately controls the intelligence derived from it.

    Legacy cloud agreements weren’t necessarily designed for AI training workloads, stricter regulatory scrutiny, or growing geopolitical pressure around data sovereignty. Institutions that once viewed cloud providers as utilities are discovering that their agreements may constrain how, and how fast, they can deploy AI.

    Legacy cloud agreements often assumed a more siloed architecture. They did not anticipate the need for continuous data flows across multiple systems and providers. As a result, banks are encountering friction as they try to build AI-driven platforms on top of these foundations.

    “Legacy core is an impediment to innovation,” PYMNTS CEO Karen Webster noted during a recent conversation with Kathleen Pierce-Gilmore, senior vice president and global head of issuing solutions at Visa, as part of “The Edit” series.

    “If you don’t have well understood, well-managed, well-governed data, it’s going to be really hard to use AI,” Pierce-Gilmore said. “I would put my whole life savings on the modernization of infrastructure.”

    Citigroup, for example, has reportedly increased its global AI market forecast amid rising enterprise adoption. The banking giant now expects the worldwide AI market to exceed $4.2 trillion by 2030, with nearly half of that total — $1.9 trillion — related to enterprise AI.

    See also: CFOs Are Flying Blind on Counterparty Risk as Cross-Border Uncertainty Spikes 

    Future of Core Banking Infrastructure

    The battleground is shifting from infrastructure provisioning to ecosystem orchestration. Banks are seeking partners who can support this shift, but they are also recognizing the need to retain greater control over how their systems are connected.

    In legacy cloud contracts, banks often ceded a degree of control in exchange for convenience and scalability. Data was stored and processed within provider ecosystems, with integration managed through proprietary tools. For traditional applications, this trade-off was acceptable. For AI, it is problematic.

    Data sovereignty is emerging as a particularly critical issue. Geopolitical tensions and evolving regulatory frameworks are pushing banks to ensure that sensitive data remains within specific jurisdictions. This requirement can conflict with the global architectures of major cloud providers, which often distribute data across regions for efficiency and resilience.

    Data in the “Global Payments Tracker Series” report by PYMNTS Intelligence, “Moving Money Forward: The Power of Payment Hubs,” showed that 60% of banks have implemented payment hubs or are in the process of doing so.

    There is also a growing focus on exit strategies. As the importance of AI increases, so does the risk of being locked into a single provider’s ecosystem. Banks are seeking terms that allow them to move workloads, access their data, and adapt to changing technology.

    For banks, the challenge is to align their infrastructure with their AI ambitions. This may require not only technical innovation but also a rethinking of the agreements that underpin their systems.