The contours of the modern scam economy are becoming clear. And its edges are growing sharper.
Fraud is no longer a fringe problem driven by clumsy phishing emails or naive users clicking the wrong link. It is a sophisticated, fast-moving industry built on social engineering, impersonation and psychological pressure, all amplified by artificial intelligence and always-on digital channels.
To learn more about how the conversation is evolving for businesses, PYMNTS sat down with Brian Boates, chief risk officer at Block; Dave Szuchman, head of global financial crimes at PayPal; and Angelena Bradfield, head of policy at the Financial Technology Association.
“When a lot of us think about the typical profile of a scam victim, we often think of elderly customers,” Boates said. “But these days … it’s not just older customers falling victim to scams, but it’s also a lot of Gen Z and millennial customers as well. Scammers don’t discriminate.”
Nearly 1 in 5 adults in the United States have experienced at least one scam in the past five years, and the fallout extends beyond dollars, the experts said. Victims question institutions, withdraw from digital life, and reconsider where, and whether, they should trust financial platforms at all.
The Myth of the ‘Typical’ Scam Victim
The scam attack surface has shifted from email inboxes to social feeds, direct messages and online marketplaces, all places where trust is built socially, not institutionally. Younger, digitally native users are increasingly targeted precisely because they are comfortable moving money quickly, navigating multiple apps and responding in real time.
“Social media has been a great equalizer in scams,” Szuchman said. “That’s where the grooming activities are occurring … and then our platforms are being utilized as part of that ecosystem. This is an ecosystem problem.”
In other words, by the time a payment hits PayPal, Venmo or Cash App, the damage has often already been done. Trust has been established elsewhere, urgency has been manufactured, and the victim is primed to act.
“We do need to address the approach vectors,” Bradfield said. “It is difficult to move at the speed that scammers are moving, sometimes because of some of the regulatory frameworks and pressures.”
Speed favors the scammer. More than half of victims send money within the first 24 hours of contact, and nearly 1 in 4 do so within 30 minutes.
“Either they try to create a sense of urgency in the imposter case—someone’s in trouble and they want to get them to move money quickly—or it’s a long game,” Boates said. “They build rapport and trust … for a long time before asking them to move funds.”
AI has only made these tactics more convincing, enabling more realistic messages, cloned voices and plausible narratives at scale. For platforms, the challenge is intervening at the right moment: late enough to avoid disrupting legitimate activity, but early enough to break the psychological momentum scammers rely on.
“We don’t want to get in the middle of a good payment,” Szuchman said. “We want trust in our product. It’s finding that right needle to thread.”
Smart Friction as a Design Strategy
Technology alone, however, is not enough. Education must happen before users are under pressure. That insight underpins the Financial Technology Association’s Smarter Than Scams campaign, a national effort focused on consumer awareness during high-risk periods like the holiday shopping season.
“Scammers really do play on timing and the emotions of their victim,” Bradfield said, adding that the campaign encourages simple but powerful behaviors, like strong passwords, two-factor authentication, checking messages through official apps and, most importantly, pausing before you pay.
This is where smart friction enters the conversation. Rather than blanket warnings or constant interruptions, Block and PayPal are experimenting with highly targeted, risk-based interventions designed to create a pause without alienating users.
At Block, payment warnings are powered by machine learning and appear in roughly 1% of transactions, Boates said.
“It’s based on a risk assessment of our models,” he said. “When we surface that warning, we’re catching the customer’s attention, giving them a moment to pause.”
Those warnings are contextual. Users can click into the counterparty’s profile and see what Block calls “trust signals,” like when the account was created, whether the recipient is in their contacts, and whether they’ve transacted before. In impersonation scams, the absence of shared history can be enough to snap someone out of the illusion.
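For readers who want the mechanics spelled out, here is a minimal, hypothetical sketch of how a risk-scored warning might surface those trust signals at payment time. The field names, the threshold and the scoring hook are illustrative assumptions for this article, not Block’s actual code, models or cutoffs.

```python
# Illustrative sketch only: field names, threshold and scoring logic are
# hypothetical assumptions, not Block's (or PayPal's) actual implementation.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class Counterparty:
    account_created: datetime
    in_senders_contacts: bool
    prior_transactions_with_sender: int


def trust_signals(cp: Counterparty) -> dict:
    """Contextual facts a sender could review before paying."""
    age_days = (datetime.now(timezone.utc) - cp.account_created).days
    return {
        "account_age_days": age_days,
        "in_contacts": cp.in_senders_contacts,
        "has_prior_history": cp.prior_transactions_with_sender > 0,
    }


def should_warn(risk_score: float, threshold: float = 0.99) -> bool:
    """Interrupt only the small, highest-risk slice of payments; the article
    puts Block's warnings at roughly 1% of transactions. The threshold here
    is an arbitrary stand-in for the model's cutoff."""
    return risk_score >= threshold


# Example: a brand-new account with no shared history triggers the pause.
counterparty = Counterparty(datetime(2025, 1, 1, tzinfo=timezone.utc), False, 0)
if should_warn(risk_score=0.997):
    print("Before you pay, check these details:", trust_signals(counterparty))
```

The pairing is the point: a model decides when to interrupt, and the trust signals give the customer something concrete to check during that pause, such as the missing transaction history with a supposed friend.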
PayPal takes a similar approach. Boates and Szuchman said the consistency of these interventions matters as much as their accuracy. Over-warning leads to “click-through culture,” where users reflexively ignore prompts, which is exactly what scammers are counting on.
Trust as the Ultimate Metric
Perhaps the most telling statistic is not about fraud rates at all. Forty-two percent of scam victims considered switching banks afterward, and 19% actually did. Scams don’t just steal money; they drive people away from digital finance altogether.
“When someone goes through an experience like that, they lose sight of who they can trust,” Boates said. “Re-highlighting the security features … and having that reimbursement program ultimately shows that we do have their back.”
Despite best efforts, some scams will still succeed. What happens next can determine whether trust is permanently broken or cautiously rebuilt.
PayPal’s long-standing dispute resolution centers serve a dual purpose: customer protection and intelligence gathering. It’s important to think globally, Szuchman said.
“These crimes are not being committed in one jurisdiction,” he said. “If you’re a global company, your ingestion points have to be global.”
Block has invested in simplifying this process. Boates described an enhanced reporting flow that allows customers to flag suspected scams and submit evidence. When investigations confirm an imposter or deception scam, eligible customers can be reimbursed.
The investigation data feeds directly back into Block’s machine learning models, improving future detection.
“The whole machinery is kind of a closed-loop feedback loop,” Boates said.
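For those who think in code, a hedged sketch of what that loop could look like in principle: confirmed investigation outcomes become training labels, and those labels refresh the risk model that decides which payments get a warning. Every field name and the model.update hook below are hypothetical; this is not a description of Block’s actual pipeline.

```python
# Hypothetical closed-loop sketch: report fields and the retraining hook are
# illustrative assumptions, not Block's actual reporting or training system.
from typing import Dict, List, Tuple


def label_closed_reports(reports: List[Dict]) -> List[Tuple[Dict, int]]:
    """Turn investigated scam reports into labeled examples:
    1 = confirmed imposter/deception scam, 0 = cleared as legitimate."""
    return [
        (r["transaction_features"], 1 if r["confirmed_scam"] else 0)
        for r in reports
        if r["investigation_status"] == "closed"
    ]


def feedback_cycle(model, new_reports: List[Dict]):
    """One turn of the loop: customer reports are investigated, confirmed
    outcomes become labels, and the risk model is refreshed so the next
    round of payment warnings reflects the newest scam patterns."""
    labeled = label_closed_reports(new_reports)
    if labeled:
        features, labels = zip(*labeled)
        model.update(features, labels)  # stand-in for whatever retraining step applies
    return model
```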
The fight against scams isn’t over. But the conversation is evolving: toward redesigned systems, from isolated defenses to ecosystem collaboration, and from silent losses to explicit trust-building.
In an economy increasingly defined by digital relationships, that shift may be the most important security upgrade of all.