Midjourney’s founder, David Holz, says it’s banning these words as a stopgap measure to stop people from generating shocking or gory content while the company “improves things on the AI side.” Holz says moderators watch how words are being used and what kinds of images are being generated, and adjust the bans periodically. The firm has a community guidelines page that lists the type of content it blocks in this way, including sexual imagery, gore, and even the 🍑 emoji, which is commonly used as a symbol for the buttocks.
AI models such as Midjourney, DALL-E 2, and Stable Diffusion are trained on billions of images that have been scraped from the internet. Research by a team at the University of Washington has found that such models learn biases that sexually objectify women, which are then reflected in the images they produce. The sheer size of the data set makes it almost impossible to remove unwanted images, such as those of a sexual or violent nature, or those that could produce biased results. The more often something appears in the data set, the stronger the association the AI model makes, which means it is more likely to appear in the images the model generates.
Midjourney’s word bans are a piecemeal attempt to address this problem. Some terms relating to the male reproductive system, such as “sperm” and “testicles,” are blocked too, but the list of banned words appears to skew predominantly female.
The prompt ban was first spotted by Julia Rockwell, a clinical data analyst at Datafy Clinical, and her friend Madeline Keenen, a cell biologist at the University of North Carolina at Chapel Hill. Rockwell used Midjourney to try to generate a fun image of the placenta for Keenen, who studies them. To her surprise, Rockwell found that using “placenta” as a prompt was banned. She then started experimenting with other words related to the human reproductive system, and found the same.
However, the pair also showed how it is possible to work around these bans and create sexualized images by using alternative spellings of words, or other euphemisms for sexual or gory content.
In findings they shared with MIT Technology Review, they found that the prompt “gynaecological examination”—using the British spelling—generated some deeply creepy images: one of two naked women in a doctor’s office, and another of a bald three-limbed person cutting up their own stomach.

[Image credit: Julia Rockwell]
Midjourney’s crude banning of prompts relating to reproductive biology highlights how difficult it is to moderate content around generative AI systems. It also demonstrates how the tendency for AI systems to sexualize women extends all the way to their internal organs, says Rockwell.