1 in 6 Members of Congress Targeted by Sexually Explicit AI-Generated Deepfakes

A recent study by the American Sunlight Project (ASP) revealed that more than two dozen members of Congress have fallen victim to sexually explicit deepfakes, with the majority of those impacted being women. The study highlighted the stark gender disparity in how this technology is used and the growing risks it poses to women in politics and civic engagement.

The ASP identified over 35,000 instances of nonconsensual intimate imagery (NCII) depicting 26 members of Congress, 25 of them women and one a man. Most of the imagery was swiftly removed after researchers shared their findings with the affected lawmakers. The study emphasized the urgent need to address the disproportionate harm targeting women and marginalized communities in the digital age.

Nonconsensual intimate imagery, often referred to as deepfake porn, can be produced using generative AI or by superimposing headshots onto adult media. Current policies to restrict the creation and dissemination of such content are limited.

The study, shared exclusively with The 19th, found that gender was the primary factor in determining who was targeted for abuse: women members of Congress were 70 times more likely to be victimized than men. ASP did not disclose the names of the affected lawmakers to avoid prompting further searches for the imagery, but it did reach out to offer support and resources to those impacted.

Why most of the imagery was so quickly removed from deepfake sites remains unexplained, and researchers noted that the material could still be shared or uploaded elsewhere. The study underscored the privilege that enabled such a rapid response, highlighting the steeper challenges faced by those without the resources to combat deepfake abuse.


The study also revealed that nearly 16 percent of women in Congress have been victims of AI-generated nonconsensual intimate imagery, which can have devastating mental health effects on those targeted. The lack of federal laws establishing criminal penalties for creating and distributing such content has led some states to enact their own legislation, though concerns around free speech and how to define harm remain obstacles to passing comprehensive laws.

Efforts are underway to push for federal legislation, such as the DEFIANCE Act and the Take It Down Act, which aim to provide legal recourse for victims of image-based sexual abuse. Addressing this issue is paramount to protecting the privacy and dignity of individuals, particularly women in public life.

In light of the ongoing threats posed by deepfake technology, collaboration between government and private sector entities is crucial to developing effective solutions. The impact of image-based sexual abuse extends beyond individual victims, sending a chilling message to women everywhere about the risks of speaking out in public.