Microsoft has patched the vulnerability that let its AI image generator create lewd images of celebrities like Taylor Swift. After graphic AI-generated images of Taylor Swift went viral on X, 404 Media published a piece detailing how people were using Microsoft’s Designer AI image generator to create and share similar images. Users could get around straightforward name-based filters by tweaking their prompts. Following CEO Satya Nadella’s assertion last week that guardrails are “our responsibility,” the loopholes have reportedly been closed, according to 404 Media. Sarah Bird, Microsoft’s Responsible AI Engineering Lead, confirmed the changes.
Microsoft said that although it is actively investigating the issue and working to stop misuse of the service, it could not verify that the images of Swift posted on X were made with Designer. Microsoft CEO Satya Nadella said in an interview with NBC News on Friday that “it’s our responsibility” to tighten the “guardrails” around AI technologies to prevent them from creating damaging content. Over the weekend, X blocked searches for “Taylor Swift” entirely.
“I think we can govern a lot more than we give ourselves credit for,” Nadella remarked. “It’s about global, societal convergence on certain norms, and we can do it, especially when law and law enforcement and tech platforms can come together.”
Designer prohibited users from generating images with text prompts like “Taylor Swift nude” even before the Swift AI-generated images went viral last week. However, individuals on 4chan and a Telegram channel found a way around those protections by slightly misspelling celebrity names and describing scenes that ended up looking sexually suggestive without using any explicitly sexual language. 404 Media verified that these workarounds were still active on Thursday, before Microsoft’s changes, and that they stopped working after its article was published.
Although that channel focused on Microsoft Designer, an easily accessible tool, many other online methods exist for producing similarly harmful content. It is difficult, for example, to stop someone from downloading a model of a famous person from Civitai and running it locally on their own computer to generate offensive material.
Featured Image Credit: Rosa Rafael; Unsplash
The post Microsoft closes the loophole that allowed the Taylor Swift incident appeared first on ReadWrite.