AI voice-cloning scams: A persistent threat with limited guardrails

With just seconds of audio, artificial intelligence voice-cloning programs can make a copy of a voice that is virtually indistinguishable from the original to the human ear.
Why it matters: That tech can have legitimate accessibility and automation benefits, but it can also be an easy-to-use tool for scammers. Despite that threat, many products’ guardrails can be easily sidestepped, a new assessment found.
The "granny scam," experts’ term for imposter scams that weaponize voice-cloning tech to trick people with their loved ones’ voices, is not a new phenomenon.
However, "the pace at which it’s now happening and the believability of the voice has fundamentally changed," says Rahul Sood, the chief product officer at Pindrop , a security company that develops authentication and fraud detection tools.
It’s not just individuals who are at risk, he notes. The corporate sector faces many cyber threats of its own, from account takeover scams targeting call centers to recruiting impersonation.
Zoom in: A study out this week from Consumer Reports found that many leading voice-cloning products lack meaningful safeguards to prevent fraud or misuse.
For four of the six products tested, researchers were able to "easily create" a voice clone from publicly accessible audio, with no technical mechanism to confirm the speaker’s consent or to limit cloning to the user’s own voice.
For four of those services, creating a custom voice clone was free.
By the numbers: While the Federal Trade Commission does not have specific data on voice-cloning scams, over 845,000 imposter scams were reported in the U.S. in 2024.
The intrigue: Scams and spoofs using AI voice cloning and deepfake technology also often impersonate well-known individuals, like celebrities, CEOs and politicians. After former President Biden’s voice was cloned using AI in fake robocalls discouraging voting in the New Hampshire primary, the Federal Communications Commission unanimously outlawed the use of AI-generated voices in scam robocalls.
In July, Elon Musk shared a fake Kamala Harris ad featuring a phony voice that sounded just […]