Modern Artifacts

Blogging the Contemporary Artifacts of Tomorrow


After Taylor Swift controversy, lawmakers propose anti-nonconsensual AI porn bill.


US lawmakers have proposed letting people sue over faked pornographic images of themselves, following the spread of AI-generated explicit photos of Taylor Swift. The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would add a civil right of action for intimate “digital forgeries” depicting an identifiable person without their consent, letting victims collect financial damages from anyone who “knowingly produced or possessed” the image with the intent to spread it.

The bill was introduced by Senate Majority Whip Dick Durbin (D-IL), joined by Sens. Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO). It builds on a provision in the Violence Against Women Act Reauthorization Act of 2022, which added an equivalent right of action for non-faked explicit images. In a summary, the sponsors described it as a response to an “exponentially” growing volume of digitally manipulated explicit AI images, referencing Swift’s case as an example of how the fakes can be “used to exploit and harass women — particularly public figures, politicians, and celebrities.”

Pornographic AI-manipulated pictures, often known as deepfakes, have become more common and more sophisticated since the term was coined in 2017. Off-the-shelf generative AI tools have made them far easier to produce, even on systems with guardrails against explicit imagery or impersonation, and they’ve been used for harassment and blackmail. But so far, there’s no clear legal redress in many parts of the US. Nearly all US states have passed laws banning unsimulated nonconsensual pornography, though it’s been a slow process, and far fewer have laws addressing simulated imagery. (There’s no federal criminal law directly banning either type.) Still, the issue is part of President Joe Biden’s AI regulation agenda, and White House press secretary Karine Jean-Pierre called on Congress to pass new laws in response to the Taylor Swift incident last week.

The DEFIANCE Act was introduced in response to AI-generated images, but it’s not limited to them. It defines a forgery as any “intimate” sexual image (a term defined in the underlying rule) created by “software, machine learning, artificial intelligence, or any other computer-generated or technological means … to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual.” That includes real pictures that have been modified to look sexually explicit. Its language appears to apply to older software like Photoshop as long as the result is realistic enough. Adding a label marking the image as inauthentic doesn’t remove the liability, either.

Members of Congress have proposed numerous bills addressing AI and nonconsensual sexual imagery, but none have passed. Earlier this week, legislators introduced the No AI Fraud Act, an extremely broad ban on using technology to imitate someone without permission. A blanket impersonation law raises serious questions about artistic expression, since it could allow powerful figures to sue over fictional treatments, political parodies, or reenactments. The DEFIANCE Act could raise some of the same questions, but it’s significantly more limited, although it still faces an uphill battle to passage.
