February 24, 2024

FCC Bans AI-Generated Robocalls Ahead Of The US Elections

On Thursday, the FCC (Federal Communications Commission) declared that robocalls using AI-generated voices count as “artificial” voices under existing law, making them illegal.

This statement comes shortly after several fake robocalls circulated around the country, imitating President Joe Biden and urging people not to vote in the upcoming election.

In an update to the case earlier this week, New Hampshire Attorney General John Formella said the calls had been traced back to Texas-based Life Corp. The company is run by Walter Monk, who has already received a cease-and-desist letter, and a criminal investigation has been opened to uncover more details.

“Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters. We’re putting the fraudsters behind these robocalls on notice.”

Jessica Rosenworcel, FCC Chairwoman

She also stated that this declaration gives state attorneys general the tools and authority they need to go after the scammers who have been troubling people for too long.

This isn’t the first time that people have used robocalls to disrupt elections. In fact, in 2023, the FCC imposed a fine of $5.1 million on two conservative activists who made 1,100+ illegal robocalls ahead of the 2020 elections.

What Was The Law And How Has It Changed?

Under the previous rules, state attorneys general could only go after robocallers if their calls had led to actual harm. The new rules, however, make the act of using AI voices in robocalls illegal in itself.

Although this might seem like a new rule, it isn’t entirely: it’s an extension of an existing one.

Under the Telephone Consumer Protection Act, certain categories of automated calls are already prohibited. The open question was whether robocalls that use AI-generated voices fall under those categories.

Each unwanted robocall can attract a fine of up to $23,000, and the recipients of those calls can also take legal action and claim up to $1,500 in damages per call.

While the answer might seem obvious to some, neither the government nor the FCC can act unless the terms are clearly covered by the Act.

That’s why the FCC decided to investigate the matter itself and consult experts on whether AI-generated robocalls could be declared illegal.

This is not to say that impersonating Biden was legal before the new rule. It was already illegal and could have attracted charges for voter suppression and fraud. However, since such cases must be backed by evidence in court, it is important to distinguish the contexts in which AI can be used legally in automated calls.

For example, if your doctor’s office uses an AI-generated call to confirm your appointment, that’s entirely allowed. But if someone uses AI to impersonate your doctor when contacting you, it is illegal regardless of what they say.

This small addition to the rule should make court trials speedier. Instead of untangling the complexities of context, all that needs to be proved is that the voice was AI-generated and fake.

This is a timely change, with the US elections around the corner and the authorities doing everything they can to ensure a fair vote.

