Photo by Markus Spiske
The Associated Press broke a story that’s lighting up conversations across music, tech, and media. On Wednesday last week, a mix of tech executives, record label heads, and none other than country legend Martina McBride sat down in front of Congress with a message that’s hard to ignore – AI deepfakes aren’t just a funny internet trend anymore. They’re starting to cross serious lines. It’s one thing when someone messes around with voice filters on TikTok. It’s another when an artist’s voice—or face—is being dropped into songs or videos they never made.
That’s why a new piece of legislation called the No Fakes Act is gaining traction fast.
At its core, the bill would make it unlawful to digitally clone a person's voice, image, or performance without their permission. And it doesn't matter who's behind the fake, whether a solo hobbyist or a billion-dollar AI company. If the result feels real and the person never agreed to it, the bill says that's not okay, and there are consequences.
Martina McBride made a forceful case. She told lawmakers the tech itself isn't the enemy; it's how people are using it that's going off the rails. She didn't hold back, calling out everything from fake explicit content targeting girls to scammers cloning voices for fraud to musicians being digitally hijacked. It's happening now, and she's not the only one raising the alarm. Nearly 400 performers, including Missy Elliott, Scarlett Johansson, and Bette Midler, are backing the No Fakes Act through the Human Artistry Campaign.
One of the big shakeups this bill could bring? It would make platforms like YouTube and TikTok legally responsible if they knowingly host deepfake content. If the law passes, they’ll have to take down those fakes fast—or be ready for a courtroom. It also introduces a takedown process for victims. No lawyers, no legal gymnastics. You notify the platform, and they’re required to remove the content. That simple.
Even the tech world is getting behind it—at least the parts that know what’s at stake. Suzana Carlos, who oversees music policy at YouTube, said they support the bill because AI needs structure. She was clear: the tools have potential, but without real rules, they’re just as likely to cause harm. Right now, it’s chaos. One day it’s a fake Drake track. The next, it’s a video that could swing an election. The No Fakes Act offers a way to rein it in.
And this isn’t happening in a vacuum. Just days before the hearing, President Trump signed the Take It Down Act, focused on fighting revenge porn and AI-made explicit deepfakes. The No Fakes Act picks up that thread and stretches it further—into voice clones, deepfake performances, and impersonations that undercut trust, creativity, and consent.
Mitch Glazier of the RIAA said it best: the bill would give people a way to act now, not months down the line. And that's the point. This isn't a ban on AI. It's about protecting artists, creators, and everyone else from being digitally copied and exploited without recourse.
AI is not going anywhere – but with laws like this, maybe the people abusing it will finally stop getting away with everything.