This instance highlights the need to regulate deepfakes.
Popular streamer Atrioc posted an emotional apology video last week in response to backlash for viewing non-consensual Deepfake videos involving other popular streamers. His apology drew mixed responses, but the incident has shed light on the growing debate surrounding the regulation of such technology, especially when the people depicted in these videos have not given their consent.
Lately, there has been a great deal of media attention surrounding AI-integrated technologies, with much of the coverage regarding Deepfake being particularly negative.
Regrettably, numerous young female streamers have fallen victim to this technology as pornographic videos featuring them have surfaced on various Deepfake websites.
This has prompted calls for laws providing legal protection to individuals affected by the controversies arising from such content.
Despite the people in these videos never agreeing to participate or even being aware of their involvement, the creators present the videos as if the individuals had willingly participated in the depicted acts.
The portrayal of individuals in such content without their consent can not only lead to embarrassment, shame, and trauma, but can also have a detrimental impact on their careers and reputations, as well as their mental health.
QTCinderella, a popular streamer, recently posted an emotional video where she pleaded for viewers to avoid watching fake explicit content involving her. Despite the original creator of the content eventually removing it after QTCinderella threatened legal action, the unfortunate reality is that many other influencers have also fallen victim to the malicious use of Deepfake technology.
British Twitch influencer “Sweet Anita” was another one of the affected individuals and expressed a mixture of emotions, saying she didn’t know whether to “cry, break things, or laugh” in light of the situation. She shared that she was once offered “millions” to enter the porn industry but refused, only for her image to be used without her consent through Deepfake technology.
This story was how I found out that I’m on this website. I literally choose to pass up millions by not going into sex work and some random cheeto encrusted porn addict solicits my body without my consent instead. Don’t know whether to cry, break stuff or laugh at this point. https://t.co/voNoxRyVBd
— Sweet Anita (@sweetanita) January 30, 2023
Despite the ordeals faced by popular streamers, some still debate the validity of their experiences. Some people even believe that these women have waived their right to privacy simply because they have chosen to broadcast themselves, in non-explicit videos, to a global audience.
There is no data on how many people have actually reported Deepfake videos to the authorities. Even those who do, however, face a lengthy legal battle, as the law remains ambiguous on such matters. This technology has been used to exploit and profit from individuals without their consent, denying them control over the distribution of such material.
At present, there is no federal legislation in the US to protect victims of Deepfake porn. Only two out of the fifty states have declared the use of Deepfake technology in porn as illegal.
There is an ongoing debate over the appropriate response to the use of Deepfake technology in porn. A notable development came in 2019, when Yvette Clarke, a Democrat from New York, introduced the DEEP FAKES Accountability Act with the goal of holding those who engage in unethical practices responsible under the law.
Greater regulation and clarity on Deepfake technology in porn is needed to protect people’s rights and privacy and prevent exploitation. This requires legal and technological advancements and collaboration between government, industry, and society.