In 2024, scammers increasingly used AI-generated voice and video to impersonate real people during calls. Authorities confirm a sharp rise in AI-enabled impersonation fraud (Federal Bureau of Investigation public warnings, 2024, https://www.ic3.gov/Media/Y2024/PSA240402).
The Scenario
Cases have emerged where victims receive urgent calls appearing to be from family members in distress, requesting money. These scams rely on cloned voices and, in some instances, manipulated or AI-generated video. Law enforcement has documented similar "emergency impersonation" scams using AI voice cloning (Federal Trade Commission, consumer alerts).
The Technology Shift
Advances in generative AI have lowered the barrier to impersonation:
- AI voice cloning can replicate a person's speech patterns from short audio samples (capabilities demonstrated by commercial tools such as ElevenLabs)
- Synthetic video tools can generate realistic human avatars, though fully real-time, indistinguishable deepfake video calls at scale remain limited in verified reporting
- These tools reduce language barriers and increase the believability of scams (Europol, 2023–2024 threat assessments)
By the Numbers
Authorities report rising incidents of impersonation and investment scams, many incorporating AI elements, but:
- Precise figures such as "monthly incident counts," "success rates," or "age breakdowns" vary widely and are not confirmed in public datasets
- Overall fraud losses (including impersonation scams) reach billions annually, reflecting the scale of the broader problem (Federal Trade Commission, 2023–2024 data)
Variants of Use
AI-enhanced impersonation appears in multiple scam types:
- Family emergency scams requesting urgent transfers
- Romance scams escalating trust with voice or video interaction
- Business email/CEO fraud using cloned voices
- Investment scams leveraging perceived identity verification
Why It's Harder to Detect
AI-generated content is improving rapidly, making traditional cues (e.g., unnatural speech or visual artifacts) less reliable. Combined with compression and background noise in phone calls, distinguishing real from synthetic speech is challenging, especially in real time (INTERPOL cybercrime reports).
The Psychological Factor
Seeing and hearing a familiar person increases trust. Scammers exploit this by creating urgency and emotional pressure, reducing the likelihood that victims verify the request independently.
Practical Defense
Security agencies consistently recommend one defense: before sending money, verify any urgent request through a separate, trusted channel, such as calling the person back on a number you already know.