Fake Kidnappings: AI Voices Trick Parents into Paying Ransom
Burien, USA, Friday, October 11, 2024
DeStefano was terrified and tried to contact friends, family, and police. Meanwhile, her husband found Briana safe at home. The scammers had used AI to replicate Briana's voice so convincingly that DeStefano thought her daughter was in real danger.
Beenu Arora, CEO of the cybersecurity company Cyble, explains how scammers collect voice data: they harvest it either from calls to unknown numbers or from public videos online. The more you speak, the more data they collect. Arora suggests being cautious when answering calls from unknown numbers and saying as little as possible.
The National Institutes of Health (NIH) advises people to be wary of calls from unfamiliar numbers demanding ransom. They recommend slowing down, asking to speak directly with the victim, and trying to contact them separately.
AI is making it harder to distinguish real voices from fake ones. Arora warns that society needs to be more skeptical and pause to think before acting in seemingly urgent situations. Anyone who believes they have been targeted should contact their local police.