Other Technology | February 7, 2024
Microsoft is taking steps to prevent users from generating sexually explicit AI images of Taylor Swift and other celebrities after 404 Media discovered that viral photos of the singer were created with Designer, a Microsoft tool that generates images from a brief text description.
Specifically, 404 Media traced the AI-generated images of Taylor Swift to a 4chan forum and a Telegram channel where users were producing explicit photos with Designer. 404 Media says it reported the activity to Microsoft, which responded that it is “taking appropriate action” to stop it.
Microsoft later told 404 Media that it could not verify whether the AI-generated images of Taylor Swift that went viral on Twitter came from Designer. Nonetheless, the company restricted the tool to prevent the creation of such images by blocking explicit keywords.
Forum users, however, found ways to keep producing suggestive images of the singer and other celebrities: misspelling her name, or writing prompts that avoid explicit sexual terms while remaining suggestive. According to 404 Media, Microsoft has since blocked this workaround as well. The outlet reports that the Telegram channel where the Taylor Swift images were shared now contains only messages complaining that the AI produces nothing but generic images.
Microsoft is not the only company acting to stop the spread of explicit images like those of Taylor Swift. Twitter has restricted certain searches on the platform to keep users from finding and sharing the deepfakes en masse. The restriction is described as temporary, though it is unclear when searches for Taylor Swift will be restored.
The platform has also removed the viral images of the singer and other celebrities.
However, 404 Media reports that despite the removal of the AI-generated photos of Taylor Swift and the measures taken by these companies, users continue to produce explicit images of celebrities with other AI tools and share them through the Telegram channel. The outlet has not revealed the channel's name but notes that it has thousands of members and remains active.