Rising threat of deepfake pornography to women
Deepfake pornography increasingly endangers women’s privacy, with an estimated 98% of deepfake videos online being pornographic.
As deepfake pornography becomes a growing threat to women online, lawmakers at home and abroad are struggling to create effective protections for victims. The issue gained prominence through cases like that of Amy Smith, a student in Paris who was targeted with manipulated nude images and harassed by an anonymous perpetrator. Despite reporting the crime to multiple authorities, Smith found little support, largely because of how difficult it is to track faceless offenders across borders.
Recent data shows that deepfake technology is used overwhelmingly for malicious purposes, with an estimated 98% of deepfake videos online being pornographic. The FBI has identified a rise in “sextortion schemes,” in which altered images are used for blackmail. High-profile cases often heighten public awareness of these crimes, but most victims are not celebrities and face immense challenges in seeking justice.
Efforts are underway to address these harms through new legislation. In the US, proposed bills would hold perpetrators accountable and require the prompt removal of deepfake content from the internet. President Biden’s recent executive order also calls for developing technology to detect and track deepfake images. In Europe, the AI Act introduces regulations for AI systems but has drawn criticism for its limited scope. While these measures represent progress, experts caution that they may not fully prevent future misuse of deepfake technology.