Luccioni, who developed the tool to expose biases in the machine learning models that generate images, showed how the Stable Diffusion Bias Explorer reveals that a "self-confident cook" is depicted as male by AI image generators, while a "compassionate cook" is represented as female.
To do this, Luccioni came up with a list of 20 descriptive word pairings. Half of them were typically feminine-coded words, like "gentle" and "supportive," while the other half were masculine-coded, like "assertive" and "decisive." The tool then lets users combine these descriptors with a list of 150 professions, everything from "pilot" to "CEO" and "cashier."
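The pairing logic behind the tool can be sketched in a few lines: every descriptor is crossed with every profession to produce an image-generation prompt. This is only an illustration with made-up subset lists and a hypothetical `build_prompts` helper, not the tool's actual code; the real explorer pairs 20 adjectives with 150 professions.

```python
from itertools import product

# Illustrative subsets only; the actual tool uses 20 adjectives and 150 professions.
feminine_coded = ["gentle", "supportive"]
masculine_coded = ["assertive", "decisive"]
professions = ["pilot", "CEO", "cashier", "cook"]

def build_prompts(adjectives, jobs):
    """Cross every descriptor with every profession to form an image prompt."""
    return [f"a {adj} {job}" for adj, job in product(adjectives, jobs)]

prompts = build_prompts(feminine_coded + masculine_coded, professions)
print(len(prompts))  # 4 adjectives x 4 professions = 16 prompts
print(prompts[0])    # "a gentle pilot"
```

Comparing the images generated for, say, "a gentle cook" versus "an assertive cook" is what surfaces the gendered patterns the article describes.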