Payments fraud continues to evolve at a pace that strains conventional defenses, as increasingly automated attacks expose the limits of static controls and legacy authentication methods. The risks are no longer confined to isolated incidents. They now reflect a system under continuous probing, where response times and data quality determine whether losses are contained or compounded.
As William Fitzgerald, vice president of Global Anti-Financial Crimes at WEX, explained, traditional anti-fraud efforts have imposed a natural ceiling on performance.
“You were really reliant on what a human or a spreadsheet or a binary rule engine could correlate at one time,” he told PYMNTS, noting that analysts often needed hours or even days to connect related signals across accounts. That lag created exposure.
Fraud decisions are no longer made with the benefit of extended review cycles.
“We don’t get 30 seconds to evaluate that transaction,” Fitzgerald said. “We get time boxed to 500 milliseconds.” Within that narrow window, systems must assess risk, authenticate users and determine whether to approve or decline a transaction. The shift has made real-time data processing and model-driven analysis essential rather than optional.
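That 500-millisecond budget can be made concrete. The sketch below is not WEX's implementation; it is a minimal illustration, in Python, of scoring a transaction under a hard deadline, with the scoring logic, field names, and fallback action all hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError

DECISION_BUDGET_SECONDS = 0.5  # the 500 ms window described above

def score_transaction(txn: dict) -> float:
    """Placeholder risk score; a real system would call a trained model."""
    risk = 0.0
    if txn.get("amount", 0) > 10_000:  # illustrative threshold
        risk += 0.4
    if txn.get("new_device"):
        risk += 0.3
    return risk

def decide(txn: dict, threshold: float = 0.6) -> str:
    """Approve or decline within the time budget; fail safe on timeout."""
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(score_transaction, txn)
        try:
            risk = future.result(timeout=DECISION_BUDGET_SECONDS)
        except TimeoutError:
            return "review"  # budget exhausted: route to step-up or manual review
    return "decline" if risk >= threshold else "approve"
```

The key design point is that the deadline is enforced by the caller, not trusted to the model: if scoring overruns the window, the system still returns a safe default on time.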
Data as the Control Layer
At the center of that transformation is data itself. Fitzgerald was direct on this point. “Data is the lifeblood of AI,” he said. “Your capabilities with AI are directly tied to how governed and accurate and enriched and contextualized your data is.” In practice, this means fraud detection is no longer about collecting large volumes of information but about structuring and enriching that information so it can be acted upon instantly. The quality of inputs determines the precision of outputs.
That precision carries implications for the customer experience. Fraud controls that rely heavily on one-time passwords and other interruptive methods introduce friction that can alienate legitimate users. Fitzgerald argued that the objective is to reverse that order of operations.
“We want to be as passive as possible upfront,” he said, as part of the What’s Next in Payments series on the “data game.”
Instead of forcing users through repeated authentication steps, institutions can rely on background signals that confirm identity without disruption.
Artificial intelligence has accelerated that transition by enabling systems to process far more variables simultaneously than earlier tools could manage.
“AI takes binary decisions and allows us to correlate hundreds of features all at once,” Fitzgerald said. This broader view raises detection rates while reducing false positives, two outcomes that have historically been difficult to achieve together.
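The contrast between a binary rule and multi-feature correlation can be sketched in a few lines. This is an illustrative toy, not a production model: the feature names and weights are invented, and a real system would learn them from data.

```python
import math

# Hypothetical weights a trained model might learn; values are illustrative.
WEIGHTS = {
    "amount_zscore": 1.2,           # how unusual the amount is for this account
    "new_device": 0.9,
    "geo_velocity": 1.5,            # impossible-travel style signal
    "typing_cadence_anomaly": 0.8,  # behavioral biometric signal
    "odd_hour": 0.4,
}
BIAS = -2.0

def binary_rule(features: dict) -> bool:
    """Legacy-style rule: a single threshold on a single field."""
    return features.get("amount_zscore", 0.0) > 3.0

def model_score(features: dict) -> float:
    """Logistic score correlating all features at once (0 = safe, 1 = risky)."""
    z = BIAS + sum(w * features.get(name, 0.0) for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))
```

A transaction that is only mildly unusual on any one dimension slips past the binary rule, but several moderate anomalies combined push the model score high, which is the correlation effect Fitzgerald describes.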
Behavioral Signals and Constrained Payments
Those features extend beyond transactional data into behavioral analytics, which has become a valuable signal set. Fitzgerald pointed to behavioral biometrics as one example, including how users type, how they interact with devices and even whether they are left- or right-handed. When combined with spending patterns and temporal indicators such as transaction sequencing, these signals create a layered profile of expected behavior. “When you consider all of those signals … they become highly predictive,” Fitzgerald said.
He highlighted passkeys as a mechanism that can strengthen identity assurance early in the customer journey. By establishing a higher level of confidence at the outset, institutions can reduce the need for repeated challenges later. “If we can collect that and get a user to interact with that authentication protocol upfront, everything downstream becomes easy,” he said.
The same data principles extend into payment instruments themselves, particularly virtual cards. Fitzgerald described how constrained parameters can limit exposure if credentials are compromised. “You can only spend it at this merchant for this amount,” he said. “To a fraudster … that’s a far less useful card.” By narrowing where and how funds can be used, organizations reduce both the incentive and the opportunity for misuse.
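The constrained-parameter idea reduces to a simple authorization check at charge time. The sketch below is a hypothetical illustration of the concept, not WEX's virtual card logic; the field names and limits are assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VirtualCard:
    """Illustrative virtual card bound to one merchant and a spend cap."""
    merchant_id: str  # the only merchant allowed to charge this card
    limit: float      # maximum amount per charge

def authorize(card: VirtualCard, merchant_id: str, amount: float) -> bool:
    """Approve only a charge at the bound merchant, up to the limit."""
    return merchant_id == card.merchant_id and 0 < amount <= card.limit

card = VirtualCard(merchant_id="ACME-1042", limit=500.00)
print(authorize(card, "ACME-1042", 499.99))   # legitimate charge: True
print(authorize(card, "OTHER-9000", 20.00))   # stolen credential used elsewhere: False
```

Because the credential is useless outside its narrow parameters, a leaked card number carries little value, which is exactly the "far less useful card" Fitzgerald describes.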
Across these developments, a consistent theme emerges around how institutions will compete in what Fitzgerald described as the “data game.” Success is not defined solely by blocking fraud. It is defined by doing so while maintaining a seamless experience and controlling operational costs.
“Winning starts with customer experience,” he said, emphasizing low friction and high precision as core objectives. At the same time, organizations are expected to scale without proportional increases in expense, placing pressure on efficiency.
Adaptability rounds out the equation. Fraud tactics change quickly, and defenses must evolve just as rapidly. Fitzgerald pointed to the importance of modular infrastructure and flexible data architectures that support fast decision-making and continuous refinement. Institutions that cannot adjust in real time risk falling behind.
Fitzgerald underscored the balance required to move forward. “Low friction, low interference, very high precision,” he said, describing the standard institutions must meet. “If those aren’t your three, you’re going to be pretty far behind.”