Microsoft engineer warns company’s AI tool creates violent, sexual images, ignores copyrights

On a late night in December, Shane Jones, an artificial intelligence engineer at Microsoft, felt sickened by the images popping up on his computer.

Jones was noodling with Copilot Designer, the AI image generator that Microsoft debuted in March 2023, powered by OpenAI’s technology. As with OpenAI’s DALL-E, users enter text prompts to create pictures. Creativity is encouraged to run wild.

Since the month prior, Jones had been actively testing the product for vulnerabilities, a practice known as red-teaming. In that time, he saw the tool generate images that ran far afoul of Microsoft’s oft-cited responsible AI principles.

The AI service has depicted demons and monsters alongside terminology related to abortion rights, teenagers with assault rifles, sexualized images of women in violent tableaus, and underage drinking and drug use. All of those scenes, generated in the past three months, have been recreated by CNBC this week using the Copilot tool, which was originally called Bing Image Creator.

“It was an eye-opening moment,” Jones, who continues to test the image generator, told CNBC in an interview. “It’s when I first realized, wow, this is really not a safe model.”

https://www.nbcnews.com/tech/tech-news/microsoft-engineer-warns-companys-ai-tool-creates-violent-sexual-image-rcna142012

