
The image wasn’t real, but the damage certainly was. When a 14-year-old girl discovered last year that a classmate had used a “nudify” website to create an AI-generated fake nude photo of her from a picture she had posted on Instagram, she was devastated and humiliated. Although she tried to have the photo removed from social media, it took weeks to disappear.

A growing number of AI-driven platforms offer users the ability to turn personal photos into nude images that convincingly appear to show their subjects undressed. AI-generated explicit images of pop superstar Taylor Swift have even been posted across social media. But the typical users of these platforms are young males who obtain photos, and even videos, of female classmates or friends and run them through a website service without the subject’s consent. Some of the websites offer the first photo for free.

The rise of AI-driven image manipulation has unleashed a troubling new frontier in digital harassment, one that goes beyond traditional forms of revenge porn, also known as Non-Consensual Intimate Images (NCII). Previously, perpetrators relied on sharing explicit photos taken during relationships as a form of retaliation post-breakup. But now, AI has made it disturbingly easy for anyone to create fabricated nudes sourced from social media profiles. With hundreds of millions of photos readily available online, the potential for abuse is vast.

But the U.S. Congress appears close to passing the Take It Down Act, which would criminalize explicit AI deepfake posts. The bill was introduced on June 18, 2024, by Sen. Ted Cruz (R-Texas) and Sen. Amy Klobuchar (D-Minn.), members of the Senate Committee on Commerce, Science, and Transportation, which oversees communications and technology. On December 3, 2024, the Senate passed the bill unanimously, and it moved to the House for consideration.

The Committee issued a joint statement: “Disturbingly, this trend is increasingly affecting minors. A number of high-profile cases have involved young girls targeted by their classmates with deepfake NCII. Up to 95% of all internet deepfake videos depict NCII, with the vast majority targeting women and girls. The spread of these images—possibly in perpetuity if allowed to remain online—is having a profoundly traumatic impact on victims.”

Sen. Ted Budd (R-N.C.), also a member of the Committee, said, “The shocking growth in online sexual exploitation and blackmailing requires a national response. The Take it Down Act builds on existing federal law, accounts for the growth in technologies that make it easier to create fake images, and establishes a requirement for websites to respond to victims and take down explicit material. I am proud to join my colleagues in the bipartisan effort to protect Americans from this growing crime and to bring those who perpetrate it to justice.”

Almost every state has a law on the books protecting people from NCII, and 31 states even cover deepfake images, but the laws vary in penalties and prosecution and have been unevenly applied at best, according to the Committee. Victims often struggle to have the images removed. Congress did pass legislation in 2022 creating a way for victims to file civil actions, but doing so can be time-consuming, expensive, and may force the victim to relive trauma. And it isn’t always clear who is responsible for publishing NCII.

According to the Committee, the Act would make it “unlawful for a person to knowingly publish NCII on social media and other online platforms. NCII is defined to include realistic, computer-generated pornographic images and videos that depict identifiable, real people. The bill also clarifies that a victim consenting to the creation of an authentic image does not mean that the victim has consented to its publication.”

The bill would also require websites to expedite the deletion of NCII. “Social media and other websites would be required to have in place procedures to remove NCII, pursuant to a valid request from a victim, within 48 hours. Websites must also make reasonable efforts to remove copies of the images.”

The National Organization for Women (NOW) is backing the bill. “Non-consensual image-based sexual abuse is a problem that is affecting women and children across the country at an alarming rate,” the organization stated. “One in four women have reported that they have been subjected to non-consensual disclosure of a private or intimate image and researchers have noted that this type of sexual abuse is often linked to other major types of abuse, such as domestic violence and sexual assault. With the growth of AI and public websites, this type of abuse invades the privacy of all those that experience it and can cause detrimental problems to individuals’ mental health. There needs to be lasting and sustainable change enacted in order to ensure that with the continued growth of technology, there are protections in place to ensure that no woman or child endures the pain that non-consensual image-based sexual abuse can inflict.”

NOW urges citizens to call their members of Congress and ask them to support the Act.
