Senator Hassan Presses Leading AI Voice Cloning Companies to Prevent Exploitation by Scammers

FOR IMMEDIATE RELEASE 

April 16, 2026  

Contact: minority_jecpress@jec.senate.gov

Requests Come as FBI Newly Reports Victims Lost $893 Million to AI-Related Scams in 2025 

WASHINGTON – U.S. Senator Maggie Hassan (D-NH), as part of her comprehensive investigations into scams as Ranking Member of the Joint Economic Committee, today pressed four companies with leading AI voice cloning services to increase efforts to prevent the exploitation of their products by scammers. Senator Hassan requested information from ElevenLabs, LOVO, Speechify, and VEED about what steps the companies currently take to deter scammers from abusing their tools to defraud Americans.

In recent years, AI voice-generating tools have enabled global criminal networks to produce deepfake materials to target more people with increasingly personalized and believable digital scams, including fictitious voices and calls used for imposter and romance scams. These scams can involve impersonations of elected officials and celebrities – and even family, friends, and loved ones. This month, the FBI reported that scams involving AI, in general, accounted for more than $893 million in reported losses in 2025, and experts predict that generative AI tools could enable up to $40 billion in annual fraud losses in the United States by 2027. 

“In recent years, global criminal networks have used deepfake voice programs, along with other new AI tools, to target more people with increasingly personalized and believable digital scams, fueling a booming scam industry that surpasses the global drug trade as an illicit industry,” wrote Senator Hassan in her requests. “Protecting Americans from these financial losses will require collaboration between the public and private sectors, and AI companies [including yours] are on the frontlines of this effort.” 

In her requests, Senator Hassan detailed the need for AI voice cloning services to strengthen scam prevention efforts: “Evidence suggests that technology companies could do more to prevent the misuse of their AI voice generation tools. In 2024, investigative news outlet Proof examined eight popular AI voice cloning platforms and found few safeguards to prevent nonconsensual voice cloning…A March 2025 Consumer Reports investigation similarly found that most of the leading AI voice cloning products had no ‘technical mechanism’ in place to prevent users from replicating voices without permission.” 

Senator Hassan also pointed to examples of some of the harrowing impacts of AI voice-generated scams. She wrote, “In June 2025, a New York man was sentenced to prison for his role in a high-tech, ‘elaborate grandparent scam’ in which he stole around $20,000 from three New Hampshire families after convincing them that their loved ones were in trouble. According to the Union Leader, ‘[v]ictims say the scam involved the use of artificial intelligence mimicking a loved one’s voice to trick them into turning over money to bail that person out of jail.’ One victim described how ‘[t]he story was presented so convincingly with my son’s voice full of terror’ on the deepfake call. In 2024, police in Merrimack County, New Hampshire, similarly investigated dozens of reports of scam calls targeting residents in which criminals allegedly used AI to manipulate their voices to sound like law enforcement or the family members of victims.” 

Senator Hassan’s requests issued today are part of her ongoing comprehensive effort to combat scams. She has opened investigations into the roles that AI companies, federal agencies, satellite internet providers, and online dating platforms have in protecting Americans from criminal fraud. 

Read Senator Hassan’s letters to ElevenLabs, LOVO, Speechify, and VEED here.

###