Conservative influencers using AI to cover up sex worker photos

AI that has been used to create fake nude photos of women is now being used to cover up women wearing revealing clothing, in a movement called "DignifAI."

Artificial intelligence technology that has been used to create fake, nonconsensual nude photos of women is now being used to cover up women so their clothing is less revealing.

Conservative personalities like Ian Miles Cheong have shared viral before-and-after examples online. One viral post from Cheong displayed an AI-edited image of Isla David, a sex worker and sex educator based in Canada, placed in a modest knee-length white dress alongside fake children. Her body was shrunken, and Cheong wrote, “When given pictures of thirst traps, AI imagines what could’ve been if they’d been raised by strong fathers.” Cheong’s post has been viewed over 7 million times. Cheong did not respond to a request for comment.

In the original photo, David posed with a glass of whiskey, wearing a sheer white shirt and underwear, with no children present.

Numerous other viral posts have featured the same practice. One account devoted to the AI trend, which is being called “DignifAI,” has amassed more than 28,500 followers since it began posting on Jan. 31. That account has directly replied to women whose photos it has manipulated, posting the edited photos of them with mocking captions like “keep your dignity.”

More often, the technology has been used to victimize women like Taylor Swift and high school-age girls around the world in efforts to portray them nude, but David told NBC News that the AI-edited image of her was similarly an attempt to shame and humiliate her.

https://www.nbcnews.com/tech/internet/conservative-influencers-are-using-ai-cover-photos-sex-workers-rcna137341
