AI advances have already given us all sorts of crazy, useful, even mind-boggling new tools in a staggeringly short time. Even one of the most controversial AI technologies, deepfakes, has an upside, like giving digital "replica" voices to people who've lost their own voices through illness.
But the dark side of deepfake systems, which can create content or manipulate existing videos, photos, and audio clips to make it seem like people are saying or doing things they never really did, is very, very dark indeed. From sordid sex crimes involving manipulated images, to huge financial crimes, to faked robocalls from President Biden designed to disrupt elections, deepfakes worry the U.S. Copyright Office enough that it has issued a report calling for swift action to rein in the tech.
In a statement accompanying the report, the office's director, Shira Perlmutter, explained the scope of the threat, TechCrunch reports. It's not just celebrities and politicians whom "digital replicas" threaten, but also "private citizens," Perlmutter said. The office believes "there is an urgent need for effective nationwide protection against the harms that can be caused to reputations and livelihoods." Perlmutter's point that deepfakes can impact anyone is borne out by recent events, like the deepfake video of likely Democratic presidential nominee Kamala Harris that Elon Musk just shared on X, and an attempted deepfake scam that recently hit Ferrari.
Perlmutter taps into a rich vein of worry among government and industry figures. AI experts have sounded warning bells about deepfakes and called for regulation. The threat of even "legitimate" deepfake tech has led the actors' union SAG-AFTRA to change its contracts to clarify that voice-acting roles in animated TV shows can only be performed by real humans.
The Copyright Office's report includes ideas policymakers may want to consider: liability for deepfakes shouldn't be limited to the act of creating the replicas but should also extend to sharing them, and shouldn't be confined to commercial cases, since "the harms caused are often personal in nature." Protection against deepfakes should also last a person's entire life, and […]