Things Are Always Scarier When These Stories Happen to Someone You Know 😱
A Miami Beach realtor becomes the unwitting star of an AI-powered romance scam that proves deepfakes aren’t just the stuff of sci-fi nightmares—they’re here, and they’re personal.
I heard this story yesterday and I'm still in absolute shock. I have a client who owns a cybersecurity company, and over drinks he's told me so many horror stories about the worst online deception scams he's uncovered and the awful things still ahead for us online. His stories are a weekly real-life Black Mirror episode for me. Most of them happen to people I've never met, living all around the world. Yesterday's story hit way too close to home for me.
Andres Asion didn’t sign up to be a catfish. But that didn’t stop some unknown scammer from turning the well-known Miami Beach realtor and businessman into the unwitting poster boy of a deepfake love story that duped a woman in the U.K. for over a year.
Yes, a year.
This wasn’t your run-of-the-mill Nigerian prince email or poorly Photoshopped Instagram DM. No, this scam had Hollywood-level production—complete with eerily realistic videos of Asion’s face and voice addressing the victim by name. The woman thought she had found love. What she actually found was the cutting edge of AI-fueled deception.
Asion only found out about his unwanted AI clone when the woman flew across the Atlantic to meet him—because nothing says love like a surprise visit. She called his office line expecting a warm reunion. Instead, she reached the real Andres, who had no idea who she was or why she believed they were in a transcontinental relationship.
"I was shocked," Asion said, noting that even his own friends were fooled by the videos. "They looked real. They sounded real. But I never made them."
Let that sink in: If your own friends can't tell the difference between you and a digital imposter, how is the average person supposed to stand a chance?
The scammer, naturally, vanished into the digital ether: phone number disconnected, identity unknown. But the fallout remains: a deeply shaken woman, a violated public image, and a renewed urgency around the dangers of AI technology in the wrong hands. Asion is now raising awareness and hoping to get laws passed against these deepfake scams.
This isn’t a one-off horror story—it’s a preview of where we're heading. Deepfakes used to be quirky internet experiments. Now, they’re tools in the scammer’s toolkit. And the scary part? They're getting better. Faster. Cheaper. More personal.
This story hits differently because it didn't happen to a faceless stranger. It happened to someone from my community, someone from my hometown whom I've known for over 20 years. That's the gut-punch: deepfakes have moved off our social media feeds and into our neighborhoods.
So next time someone sends you a video that feels a little too perfectly curated, maybe pause before catching flights or feelings. Because in 2025, your soulmate might be an algorithm—and your heartbreak, a scam. Be careful out there.
The full story is here:
This is insane!!!