Lawmakers propose anti-nonconsensual AI porn bill after Taylor Swift controversy

Photo by Kathryn Riley/Getty Images

US lawmakers have proposed letting people sue over faked pornographic images of themselves, following the spread of AI-generated explicit photographs of Taylor Swift. The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would add a civil right of action for intimate “digital forgeries” depicting an identifiable person without their consent, letting victims collect financial damages from anyone who “knowingly produced or possessed” the image with the intent to spread it.

The bill was introduced by Senate Majority Whip Dick Durbin (D-IL), joined by Sens. Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO). It builds on a provision in the Violence Against Women Act Reauthorization Act of 2022, which added a similar right of action for non-faked explicit images. In a summary, the sponsors described it as a response to an “exponentially” growing volume of digitally manipulated explicit AI images, referencing Swift’s case as an example of how the fakes can be “used to exploit and harass women — particularly public figures, politicians, and celebrities.”

Pornographic AI-manipulated images, frequently referred to as deepfakes, have grown in popularity and sophistication since the term was coined in 2017. Off-the-shelf generative AI tools have made them far easier to produce, even on systems with guardrails against explicit imagery or impersonation, and they’ve been used for harassment and blackmail. But so far, there’s no clear legal redress in many parts of the US. Nearly all states have passed laws banning unsimulated nonconsensual pornography, though it’s been a slow process, and far fewer have laws addressing simulated imagery. (There’s no federal criminal law directly banning either type.) Combating nonconsensual intimate imagery is part of President Joe Biden’s AI regulation agenda, however, and White House press secretary Karine Jean-Pierre called on Congress to pass new laws in response to the Taylor Swift incident last week.

The DEFIANCE Act was introduced in response to AI-generated images, but it’s not limited to them. It defines a digital forgery as any “intimate” sexual image (a term defined in the underlying rule) created by “software, machine learning, artificial intelligence, or any other computer-generated or technological means … to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual.” That includes real pictures that have been modified to look sexually explicit, and the language seemingly covers older tools like Photoshop, as long as the result is sufficiently realistic. Adding a label marking the image as inauthentic doesn’t remove liability, either.

Members of Congress have floated numerous bills addressing AI and nonconsensual pornography, and most have yet to pass. Earlier this month, lawmakers introduced the No AI FRAUD Act, an extremely broad ban on using technology to imitate someone without permission. A blanket impersonation rule raises huge questions about artistic expression, though; it could let powerful figures sue over political parodies, reenactments, or creative fictional treatments. The DEFIANCE Act could raise some of the same questions, but it’s significantly more limited, although it still faces an uphill battle to passage.
