i2c Urges Banks to Use Caution When Adding AI to Playbooks

Highlights

Traditional data sources are critical but must be combined with vetted alternative data to optimize decision-making.

AI should accelerate analysis, not replace human-led data intelligence oversight.

Fraud prevention is a competitive-neutral space where consortium data sharing can benefit the entire payments ecosystem.

Watch more: Fueling Payments Innovation with Traditional and Alternative Data Power

    In the payments industry, data isn’t just a tool; it’s the bloodstream of operations.

    From credit risk models to customer loyalty programs, the accuracy and accessibility of data shape decisions and underpin entire financial ecosystems. Yet the very foundation of that information is under strain.

    Government data conduits, long considered bedrock sources, are facing scrutiny. Public agencies, such as the Bureau of Labor Statistics, have seen their datasets questioned, sometimes due to political pressures, but increasingly because of the rise of artificial intelligence and its ability to create, manipulate or reinterpret large data pools.

    The situation demands vigilance toward the old guard, not its abandonment, said i2c Senior Vice President of Transformation David Durovy.

    “We need to think about which legacy data sources really drive the foundation of the models and outcomes we’re trying to achieve,” he told PYMNTS as part of the What’s Next in Payments series, “Searching for Reliable Signals in Banking’s New Data Reality.”

    The challenge is to preserve the credibility of these sources while supplementing them with relevant, vetted alternative data without losing sight of compliance and risk oversight.

    Data’s Role Before the First Transaction

    While many industry conversations focus on data’s impact on ongoing performance, Durovy stressed its importance before any customer makes a purchase or swipes a card.

    “Before we even get to the first transaction, what are we targeting?” he said. “Are we designing our programs, our loyalty schemes for a particular audience, a particular market segment and geography? Are we using effective channels to get to those customers, onboard them, and then optimize their performance and experience?”

    Data fuels more than revenue and profitability, he said. It influences customer satisfaction, product design and the emotional contours of the user experience. Customer experience metrics are inseparable from the analytics that track profitability.

    The payments sector is moving toward “top-of-funnel” analytics that inform product development, marketing strategy and onboarding practices, he said. This shift will allow providers to engage with customers in ways that drive lifetime value rather than just short-term gains.

    The Limits of AI

    AI’s promise in financial services includes pattern recognition, risk modeling and process automation. But Durovy warned against over-reliance.

    “It becomes very dangerous when we don’t put the rigor behind it,” he said.

    If AI takes the “51% seat” in decision-making, meaning it tips the scales in most cases, institutions may lose sight of the data’s provenance and the methodologies that produced it, he said.

    The key safeguard is maintaining “data intelligence” capacity, or human-led analytics and oversight that ensure data quality, context and regulatory compliance, he said.

    “We all need to have that data intelligence capability to see the first-party data, to make sense of it, to ensure our third-party data is not driving aberration in our portfolios and customer experience,” Durovy said.

    He said he envisions a model in which AI accelerates certain processes but never replaces the human expertise needed to validate and interpret outputs. Without that balance, institutions risk building models on unstable foundations, which could undermine performance and trust.

    The Enduring Value of Traditional Data and the Consortium Approach

    Despite the rise of real-time behavioral data and geolocation analytics, traditional historical data, especially first-party data, remains indispensable.

    “It will always have a significant place,” Durovy said, pointing to its role in underwriting, compliance and customer experience design.

    However, integrating traditional and new data sources requires a careful vetting process.

    “When you’re doing things like underwriting, you can’t use unproven data,” he said.

    The danger is not only regulatory exposure but also the potential for disparate treatment of customers if flawed datasets influence credit decisions, he said.

    Firms should use parallel sourcing, or marrying trusted legacy data with new, well-tested data streams, Durovy said. This layered approach allows for more dynamic decision-making while preserving reliability.
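The parallel-sourcing idea can be pictured as a simple layered scoring step: a trusted legacy signal sets the baseline, and a vetted alternative signal may only adjust it, never override it. The sketch below is purely illustrative; the field names, weights, and vetting flag are assumptions, not a description of i2c's methodology or any real underwriting model.

```python
# Illustrative sketch of "parallel sourcing": the proven legacy source
# keeps the majority weight in the decision, and unvetted alternative
# data is ignored entirely. All names and weights are hypothetical.
from typing import Optional


def layered_credit_score(bureau_score: float,
                         alt_signal: Optional[float],
                         alt_vetted: bool) -> float:
    """Blend a legacy bureau score (0-1) with an optional alternative
    signal (0-1). Unproven alternative data never enters the decision."""
    if alt_signal is None or not alt_vetted:
        return bureau_score  # legacy data stands alone
    # The alternative signal is capped at a minority weight, so the
    # trusted source retains the deciding share of the outcome.
    return 0.7 * bureau_score + 0.3 * alt_signal


print(layered_credit_score(0.80, 0.40, alt_vetted=False))  # legacy only
print(layered_credit_score(0.80, 0.40, alt_vetted=True))   # blended
```

The fixed 70/30 split is only a stand-in for whatever weighting a risk team would validate; the point is the structural guardrail, not the numbers.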

    One promising avenue for improving data quality and resilience is the consortium model.

    “Nobody wins when there is rampant fraud in the industry,” he said.

    Fraud risk management is a rare, competitive-neutral space where institutions can share intelligence without compromising proprietary advantage, he said.

    The idea is to collaborate, without colluding, on high-quality, real-time fraud data. Such efforts could feed individual models with the best intelligence on bad merchants, fraud rings and emerging threats, while protecting customer privacy and regulatory compliance.

    How i2c Supports Clients in the Data Race

    As a global financial technology innovator, i2c sits at the intersection of data processing and decision-making. The company serves as the “system of record” for transactions, meaning the integrity of its data feeds directly into the integrity of client operations, Durovy said.

    “If the information on the platform is not accurate, if our intelligence platforms, if our reporting tools are not accurate, then we all run a huge risk,” he said.

    That risk extends beyond decision accuracy to data security, information governance and regulatory compliance.

    The company’s approach includes building secure pathways for receiving and transmitting data, ensuring only the right parties have access and maintaining consistent validation processes. Internally, the company is also exploring ways to surface anonymized, aggregated fraud trends across portfolios to improve detection rates without compromising client confidentiality.
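One generic way such cross-portfolio aggregation can surface fraud trends without exposing any single client is a minimum-contributor suppression rule: a trend is published only when enough distinct portfolios report it. The sketch below is a hypothetical illustration of that privacy pattern, not a description of i2c's platform; the threshold and field names are assumptions.

```python
# Generic sketch of aggregating fraud reports across client portfolios
# while suppressing categories seen in too few portfolios to be safely
# anonymous. Threshold and field names are hypothetical.
from collections import defaultdict

MIN_PORTFOLIOS = 3  # publish a trend only if enough clients contribute


def aggregate_fraud_trends(events):
    """events: iterable of (portfolio_id, merchant_category) fraud reports.
    Returns {category: report_count}, omitting any category reported by
    fewer than MIN_PORTFOLIOS distinct portfolios."""
    counts = defaultdict(int)
    contributors = defaultdict(set)
    for portfolio_id, category in events:
        counts[category] += 1
        contributors[category].add(portfolio_id)
    return {cat: n for cat, n in counts.items()
            if len(contributors[cat]) >= MIN_PORTFOLIOS}


events = [("p1", "electronics"), ("p2", "electronics"), ("p3", "electronics"),
          ("p1", "travel"), ("p1", "travel")]
print(aggregate_fraud_trends(events))  # "travel" is suppressed: one portfolio
```

Real deployments would layer stronger protections (noise addition, formal anonymization) on top; the suppression threshold is just the simplest version of the idea.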

    Challenges lie ahead, especially in creating industry-wide frameworks for secure, compliant data sharing, Durovy said. However, collaboration, rather than competition, is the way forward for certain high-stakes use cases.

    “There are many areas where we don’t need to view our neighbor as our competitor because we’re in business together, and fraud hurts the entire industry,” he said.