
Instead of a «dream vacation», an advertising AI depicts a 50-year-old woman naked on a bed

Published by
Ігор Шелудченко

The Washington Lottery, an American state lottery, was forced to take down its AI-powered advertising app after it mistakenly generated a nude image in response to a rather mundane request.

The unusual incident was reported by Ars Technica.

AI generates «dream vacation»

The lottery website Test Drive a Win was designed to help visitors visualize the vacations they could theoretically take if they won the lottery.

To do this, visitors uploaded a headshot and added a short description, and the AI integrated the photo into a generated picture of what they might look like on that vacation.

Swim with sharks

Megan, a 50-year-old woman from the suburbs of Washington, D.C., decided to use the app and asked the AI to generate an image of her «swimming with sharks».

Instead, the AI produced an image of a naked woman sitting on a bed, with fish and algae behind her.

The Washington Lottery logo was modestly displayed in the corner.

Developers’ response

Washington Lottery representatives stated that they had «worked closely with AI developers to set strict parameters» for the tool, and that within about a month of operation the app had generated «thousands of» appropriate images.

However, a single generation in response to a mundane query was enough to shut the app down.

The developers did not specify which artificial intelligence models were used to build the site, but they had set rules requiring that people in the generated images be fully clothed.

After the incident, all parameters were thoroughly reviewed. Although the lottery «was satisfied with» the settings, it «decided to close the site as a precaution, as we don’t want something like this to happen again».

The lottery representative noted that controlling AI-generated content is not that easy. Although models such as Stable Diffusion and DALL-E have filters intended to prevent the creation of sexual or violent imagery, researchers have found that these models still produce unacceptable images from time to time.
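For illustration only, and not taken from the article or the lottery's actual setup: a minimal sketch of how such a filter works when Stable Diffusion is run through the open-source diffusers library, which classifies each generated image and blanks out anything it flags as NSFW. The model name, prompt, and file name below are assumptions.

```python
# Illustrative sketch only: diffusers' built-in NSFW safety checker,
# not the Washington Lottery's actual pipeline. Model, prompt, and
# output file name are assumptions for the example.
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
result = pipe("a woman swimming with sharks")

# The pipeline reports which images its safety checker considered unsafe;
# flagged images are returned blacked out instead of being shown to the user.
for image, flagged in zip(result.images, result.nsfw_content_detected):
    if flagged:
        print("Blocked by the safety checker")
    else:
        image.save("dream_vacation.png")
```

Because this kind of check runs on the finished image, it is only as reliable as the classifier behind it, which helps explain why unacceptable outputs can still slip through.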

Besides, with a little effort, the safeguards themselves can be bypassed. For example, Shane Jones, an AI engineer at Microsoft, recently discovered a security flaw and forced the DALL-E neural network to draw explicit and violent images.
