FBI PSA: The Safe Bet Is to Assume It’s Fake

Remember when the only person you worried might fall prey to scammers was your favorite aunt, who had only her Welsh Corgi at home with her during the day? “Now, Trixie,” you’d say, “don’t agree to anything and always call me first.”

Those days are over. Forget your late Aunt Trixie. Worry about yourself. Imagine receiving a phone call from a close friend, a family member, or even your spouse that was actually an utterly convincing AI-generated version of that person’s voice, urgently begging you to provide a credit card number to spring them out of a filthy jail in Veracruz or pay an emergency room hospital bill.

The age of AI augurs many things, we are told. But while we’re waiting for flying taxis and the end of mundane tasks, get ready to question the veracity of every form of media you encounter, be it text, image, audio, or video. In what is sure to be the first of many such public service announcements, the FBI is warning that the era of AI-powered fraud hasn’t just dawned; it is fully upon us.

The theme of the FBI’s announcement is “believability.” It used to be that scams were easy to spot: the writing was laughably bad, or the video and audio were noticeably “off” or even a little creepy, a phenomenon known as the Uncanny Valley effect. The newfound power of generative AI to produce realistic versions of traditional media has put an end to such reliable tells. Anyone who thinks they’re immune to such trickery misunderstands the nature of generative AI. Consider:
Whenever a friend or family member sends a video that clearly shows them in need of help (stranded on vacation, perhaps, or robbed of their wallet at a nightclub), don’t automatically assume it’s real, no matter how convincing it looks. And thanks to generative AI’s “vocal cloning” ability, a straight-up phone call is even easier to fake.

So, what can we do? The FBI advises: agree on a secret password, phrase, or story that only you and your family members know. Do the same with your friend groups. Then stick to your guns. No matter how close your heartstrings come to breaking, if the caller doesn’t know the secret answer, it’s a scam-in-waiting.

The FBI also recommends limiting “online content of your image or voice” and making social media accounts private. Fraudsters scrape the online world for these artifacts to produce their deepfake masterpieces. All generative AI needs to create a convincing representation of you is a few seconds of audio or video and a handful of images.

Rest in peace, Aunt Trixie. We miss her, and the good old days when all we had to do was warn her not to give her personal information to a caller claiming to be from the Corgi Rescue Fund. Today, if an AI scammer wanted to, he could have Aunt Trixie call you from the grave, needing money, of course.