
The Federal Communications Commission for months has wrestled with how to get control of the gnarly issue of scammers using AI-generated voices in unsolicited robocalls, particularly if they sound like well-known people or family members.

This week, FCC Chairwoman Jessica Rosenworcel announced that the plan is to make such calls illegal.

The plan is to have the commission deem AI-generated voice as “artificial” under the Telephone Consumer Protection Act (TCPA), which would make it illegal to use generative AI-based voice cloning technology in robocalls aimed at consumers.

“AI-generated voice cloning and images are already sowing confusion by tricking consumers into thinking scams and frauds are legitimate,” Rosenworcel said in a statement. “No matter what celebrity or politician you favor, or what your relationship is with your kin when they call for help, it is possible we could all be a target of these faked calls.”

Given that, the FCC “is taking steps to recognize this emerging technology as illegal under existing law, giving our partners at State Attorneys General offices across the country new tools they can use to crack down on these scams and protect consumers.”

Fake Biden Call a Turning Point

The announcement comes less than two weeks after robocalls were sent to Democratic voters in the runup to the New Hampshire presidential primary telling them – in the AI-created voice of Joe Biden – not to vote.

The FCC in the statement noted that calls using AI-based voice-cloning technology have been on the rise over the past few years, making it difficult for people to know whether they’re talking to a real person or a computer. Some AI technologies can take a snippet of a person’s voice and use that voice when creating full messages.

This can lead to a broad range of problems in society, from misinformation and disinformation campaigns – such as the one using Biden’s voice – to so-called “grandparent” scams, where the fraudster uses generative AI to mimic the voice of a child to trick victims into believing their grandchild is in trouble and needs money to get out of it.

The Federal Trade Commission (FTC) a year ago published an advisory about such scams, warning that a “scammer could use AI to clone the voice of your loved one. All he needs is a short audio clip of your family member’s voice — which he could get from content posted online — and a voice-cloning program. When the scammer calls you, he’ll sound just like your loved one.”

None of these frauds are new. However, the accelerated innovation of text-to-speech (TTS) technology is making it easier and cheaper for scammers to run these schemes and more difficult for those targeted to tell what’s real and what isn’t.

Using the TCPA

The TCPA was enacted in 1991 to help the FCC crack down on junk calls by restricting the use by telemarketers of artificial or prerecorded voice messages and automatic phone dialing systems. If the agency enacts the proposal, AI-generated voice calls will be included within those standards.

Such a move will give law enforcement across the country new capabilities for investigating and charging people behind robocalls, the FCC said. The full commission reportedly is expected to vote on the proposal in the next few weeks.

The agency is working with 48 state attorneys general to fight against robocalls.

A Fast-Growing Market

The FCC in November issued a Notice of Inquiry (NOI) to collect information about how AI tools are being used in such calls and text messages, at a time when the global AI voice generator market could reach almost $5 billion by 2032 and the voice-cloning market could hit almost $1.8 billion by 2029.

Much of this is being driven by the rapid development of generative AI, natural language processing, and deep learning techniques.

The FCC isn’t the only agency looking into voice cloning. The FTC late last year launched its Voice Cloning Challenge to promote ideas for preventing, monitoring, and evaluating malicious uses of such technology in hopes of protecting consumers against AI-based scams. Earlier last month, the agency opened a 10-day window for people and organizations to submit ideas that address policy, products, or procedures.

Also in the wake of the fake Biden calls in New Hampshire, House Democrats reportedly last month proposed legislation to address AI-powered scams, including robocalls that use voice-cloning technologies. Among other points, the bill would require that the use of AI in robocalls be disclosed.