
The Social Engineering Economy

By Clyde Williamson
Nov 20, 2025

Summary

  • Impersonation ROI is exploding:
    AI-polished scams turn “personal touch” PII into precision fraud; U.S. losses hit $12.5B (+25% YoY) on roughly flat report volume, evidence that attackers are earning more per engagement.

  • Treat PII as critical infrastructure:
    Minimize collection, default it private, and protect at the data layer (tokenization, encryption) while training people for context-based verification, not just typo-spotting, and pushing for stronger privacy standards. A minimal tokenization sketch follows this list.
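
To make the data-layer point concrete, here is a minimal sketch of vault-style tokenization. Every name in it (the TokenVault class, the tok_ prefix) is illustrative, not a real product’s API; real deployments add access control, audit logging, and a hardened vault store.

```python
# Minimal sketch of vault-style tokenization: real values are swapped
# for random tokens at ingest, so a leaked spreadsheet exposes only
# tokens. All names here are illustrative, not a real product's API.
import secrets

class TokenVault:
    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}  # only place real values live
        self._value_to_token: dict[str, str] = {}  # reuse tokens for repeats

    def tokenize(self, value: str) -> str:
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # In production this call sits behind authorization, auditing,
        # and rate limits; most consumers never need the real value.
        return self._token_to_value[token]

vault = TokenVault()
row = {"name": "Ann Smith", "mobile": "+1-555-0134"}
safe_row = {key: vault.tokenize(val) for key, val in row.items()}
print(safe_row)  # e.g. {'name': 'tok_3f9c1a2b...', 'mobile': 'tok_b7e4...'}
```

Tokenized this way, an exported spreadsheet becomes a list of meaningless tokens rather than two thousand families’ details.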

It started the way most heists do, with a file that shouldn’t exist.

Bob, the vendor account manager, exported a spreadsheet to send holiday greeting cards to his customers’ families. Bob always takes copious notes on his customers and their families: how many kids, their names and ages, which colleges they got accepted to. It’s that personal touch that makes Bob a success. Two thousand names from across the country, with emails, addresses, mobile numbers, and titles. Then his new agentic business tools put all that data together. His only mistake was accidentally leaving the public setting switched on. Nothing dramatic at risk, nothing at all; just the scaffolding of trust, the ingredients of his “personal touch.”

Somewhere else, with a thrumming beat and a synth-rich tune, a crew gets to work. Well, a year ago it would have been a crew; now Danny Ocean and his gang have been replaced by ChatGPT and one lone criminal.

By Friday, they have everything they need. On Monday morning, a finance officer wires $400,000 to a “new account.” The money is gone before lunch. No firewalls were breached. No malware was deployed. Just the alchemy of leaked personal data turned into profit; the digital equivalent of a casino scam that never trips an alarm.

Two months later, hundreds of Bob’s customers get calls about their college kid being in some emergency that requires money to be wired immediately. Most people know it’s a scam. A few very busy parents, though, are overwhelmed by the social engineers and their terrifying imagery.

Welcome to the social-engineering economy, where your email address is a skeleton key and your family’s personal information is a slim jim. In Europe, regulators have at least closed the garage. In the United States, we leave the keys in the ignition and call it “innovation.”

Don’t believe me? Here’s a wild statistic: Americans lost $12.5 billion to scams last year, according to the FTC. Using the median household income for the US, that’s roughly 150,000 households’ worth of annual income wiped out. That $12.5 billion is a 25% increase over 2023, yet the number of reported events stayed roughly flat. What this means is that the social-engineering economy is growing in a very specific way, which I refer to as a strong ROI: Return on Impersonation. We could call this optimization, profitability per engagement, or maybe process maturity. Whatever it is, it’s a sign of massive growth, really a booming new economy.
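
That households figure is easy to reproduce. A back-of-the-envelope check, assuming the U.S. Census Bureau’s 2023 median household income of roughly $80,600 (the FTC report doesn’t specify which median to use, so treat the income figure as an assumption):

```python
# Back-of-the-envelope: FTC-reported scam losses expressed in units of
# median household income. The income figure is an assumption (U.S.
# Census Bureau, 2023); swap in a different median and re-run.
ftc_losses = 12.5e9               # FTC-reported consumer losses (USD)
median_household_income = 80_600  # assumed U.S. median household income (USD)

households = ftc_losses / median_household_income
print(f"~{households:,.0f} households' worth of annual income")
# -> ~155,087 households' worth of annual income
```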

I’m not an economist, but I do have a ChatGPT account. So, I asked it to search the web and give me a referenceable number for how fast economies grow in a year, globally. It pulled IMF and World Bank statistics and found that even high growth is measured in single digits, averaging 4.2% for the IMF’s emerging economies (India, maybe around 7%). Either way, it concluded that our social-engineering economy grew 8-10x faster than the world economy, with about the same amount of “effort” (reported events).
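
The multiplier is simple enough to check by hand. A sketch using the rates cited above (the 2.5-3.2% world-economy range is my assumption for recent IMF global estimates, not a quote):

```python
# Sanity check on the "8-10x faster" claim: 25% YoY scam-loss growth
# versus typical annual GDP growth. Benchmark rates approximate the
# IMF/World Bank figures cited above and are assumptions, not quotes.
scam_growth = 0.25                    # 25% YoY growth in FTC-reported losses
world_low, world_high = 0.025, 0.032  # assumed recent IMF global range

print(f"vs world economy: {scam_growth/world_high:.1f}x-{scam_growth/world_low:.1f}x faster")
print(f"vs IMF emerging economies (4.2%): {scam_growth/0.042:.1f}x faster")
print(f"vs India (~7%): {scam_growth/0.07:.1f}x faster")
# -> 7.8x-10.0x, 6.0x, and 3.6x respectively
```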

This is a perfect storm. First, the United States has few rules on how to handle PII. Healthcare specifically, yes. Credit cards specifically, well, Visa and Mastercard made their own global rules about those. But the data that was accessed in our example? Twenty states have some kind of law; the company would have to issue a breach notification, at a minimum, if anyone ever discovered the leak. Maybe it would offer a year of credit protection and monitoring.

Of course, our crew never once pulled on the victim’s credit; they pulled on their hearts with that personal touch.

Second, AI is turning poorly written scripts, badly formed emails, and questionable graphics into perfect copies of what we expect from professional communication. It can weave multiple data sources into rich, detailed profiles. Imagine the top 10 PII datasets on the dark web, right now. Then imagine a fine-tuned LLM sitting on top of a RAG pipeline and some good analytic agent tools. A compromised account that exposed your kid, another that exposed you, another that exposed your great aunt, and suddenly Grandma gets a terrifying call. This isn’t fiction; it’s happening right now. My parents get calls from people purporting to be me, or claiming I’m in an emergency. I have co-workers with similar stories. At this point it’s common enough that almost everyone knows a friend or relative who was targeted. Many of us have likely received a “sorry, wrong number,” “you seem nice, can we be friends?” text message. It screams scam to most of us, but those people wouldn’t be fishing if no one ever took the bait.

So, what do we do about this? In Europe, GDPR sets very specific rules around what kinds of PII can be kept, how it must be handled, what permissions must be given by the data subject (i.e., you and me), the right to have one’s PII removed from organizations, and consequences for not complying. Canada has laws strictly governing how PII can be handled; so does Singapore; so does, well, as of 2024, 70% of the world’s governments had national privacy laws in place.

In the United States, apparently, we’re relying on the individual states and the free market to sort this out. That means, if you tend to lean toward the free market, you need to vote with your wallet when companies fail to uphold the social contract of protecting the information we give them. If you believe in regulations, figure out where your state’s regulations stand. Make some noise at the national level. Organizations face stiff penalties for mishandling your credit cards; why is your personal information any less worthy of protection?