"The mere possession of a human form is sufficient to become a victim," attorney Carrie Goldberg articulates the perils of deepfake pornography in the era of artificial intelligence. While the phenomenon of revenge pornography—unconsented distribution of intimate images—has persisted nearly as long as the internet itself, the widespread availability of AI tools has expanded the reach of this harassment to anyone, regardless of whether they have ever shared or taken an explicit photograph. Today, AI capabilities enable the superimposition of an individual's face onto a nude body or the manipulation of existing images to create the illusion that a person is nude.
Over the past year, victims of AI-generated, nonconsensual explicit content have ranged from high-profile figures like Taylor Swift and Congresswoman Alexandria Ocasio-Cortez to teenage girls. For those who discover that they or their children have become the subjects of deepfake pornography, the experience is often terrifying and overwhelming, according to Goldberg, who runs the New York-based firm C.A. Goldberg Law, which advocates for victims of sexual offenses and online harassment. "Especially for the young, who may not know how to handle such situations and perceive the internet as an immense, amorphous entity," she remarked.
However, there are measures that people targeted by this form of harassment can take to protect themselves, and resources available to help, Goldberg shared in an interview for the tech-focused podcast Terms of Service with Clare Duffy. Terms of Service aims to demystify the new and emerging technologies that listeners encounter in their daily lives.

Goldberg advised that for those targeted by AI-generated explicit images, the first step—though it may seem counterintuitive—should be to capture screenshots. "The instinctive response is to remove this content from the internet as quickly as possible," Goldberg explained. "However, if you wish to preserve the option of reporting it criminally, you require the evidence."

Victims can then use the forms provided by platforms such as Google, Meta, and Snapchat to request the removal of explicit images. Nonprofit organizations like StopNCII.org and Take It Down can also help facilitate the removal of such images across multiple platforms simultaneously, although not all sites cooperate with these organizations. In August, a bipartisan group of senators penned an open letter urging nearly a dozen tech companies, including X and Discord, to participate in these programs. The struggle to tackle