AI’s Narrow Vision of Beauty: How Image Generators Perpetuate Harmful Stereotypes
Remember those cheesy sci-fi flicks where robots try to understand human concepts like “love” or “art” and end up hilariously off-base? Yeah, well, buckle up, buttercup, because the future is now, and it’s got some serious beauty standards issues.
AI image generators, those fancy algorithms that can whip up pictures out of thin air (well, code, but you get the idea), are popping up everywhere. Need a profile pic for your new social media account? Boom, AI’s got you. Want a snazzy illustration for your blog post? Easy peasy, AI to the rescue!
But here’s the catch: a recent investigation by The Washington Post dropped some seriously unflattering truth bombs. Turns out, these AI artists have a pretty warped idea of what a “beautiful woman” looks like, and spoiler alert: it’s not exactly a celebration of diversity.
AI’s Limited Definition of “Beautiful Women”
Okay, let’s break this down. The Washington Post, those intrepid digital detectives, decided to put these AI image generators to the test. They fed them simple prompts like “beautiful women” and then sat back to see what masterpieces these digital Da Vincis would conjure up.
The results? Well, let’s just say it was like stepping into a digital fashion magazine from the early aughts – and not in a good way.
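For the curious (and the skeptical), here’s roughly what an audit like this could look like as code. To be clear, this is not the Post’s actual methodology: the prompt list and run count below are made up for illustration, and the sketch assumes OpenAI’s Python SDK with an API key sitting in your environment.

```python
# A minimal sketch of a prompt audit, assuming OpenAI's Python SDK.
# The prompts and run count are illustrative, not the Post's methodology.
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPTS = ["beautiful women", "normal women"]
RUNS_PER_PROMPT = 20  # repeat each prompt: you want a distribution, not one lucky draw

for prompt in PROMPTS:
    for i in range(RUNS_PER_PROMPT):
        resp = client.images.generate(
            model="dall-e-3",
            prompt=prompt,
            n=1,  # dall-e-3 only supports one image per request
            size="1024x1024",
            response_format="b64_json",
        )
        image_bytes = base64.b64decode(resp.data[0].b64_json)
        # Save each image so humans (or scripts) can code the results later.
        with open(f"{prompt.replace(' ', '_')}_{i}.png", "wb") as f:
            f.write(image_bytes)
```

Generate enough images, sort them into piles, and patterns start to jump out. That’s the basic idea behind this kind of audit.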
Body Size
First up, let’s talk body image, because, oh boy, does the AI have some opinions. When asked to show “beautiful women,” these algorithms overwhelmingly generated images of extremely thin women. How thin? Let’s just say “sample size” would be pushing it.
And age? Forget about it! Apparently, for AI, beauty has an expiration date, because only a tiny fraction of the images showed any visible signs of aging.
The researchers even tried to throw the AI a bone by changing up the prompt to “normal women,” thinking that might shake things up a bit. But nope, the AI doubled down, churning out even more images of women who looked like they lived on a diet of kale smoothies and air.
Skin Tone
Now, let’s talk about the rainbow – or rather, the lack thereof. About a third of the images sported medium skin tones, but darker skin tones? Crickets. Okay, not quite crickets, but a measly percentage that’s downright embarrassing in this day and age.
And here’s the kicker: this bias towards lighter skin tones persisted across various prompts, like a digital echo chamber of outdated beauty standards.
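If you’re wondering how anyone puts numbers on something like this, here’s a deliberately crude toy sketch. Everything about it is an assumption: the generated_images folder, the lightness thresholds, the bucket names. Real audits rely on human raters or calibrated references like Google’s Monk Skin Tone scale, not average pixel brightness, but the mechanics of “tally the distribution” look something like this:

```python
# A deliberately crude toy, NOT how researchers actually classify skin tone.
# It buckets images by average lightness in a center crop, purely to show
# what tallying a tone distribution means mechanically.
from collections import Counter
from pathlib import Path
from PIL import Image

def rough_tone_bucket(path: Path) -> str:
    img = Image.open(path).convert("L")  # grayscale: 0 = black, 255 = white
    w, h = img.size
    # Sample a center crop, on the (shaky) assumption the face is centered.
    crop = img.crop((w // 4, h // 4, 3 * w // 4, 3 * h // 4))
    mean = sum(crop.getdata()) / (crop.width * crop.height)
    # Thresholds are arbitrary round numbers, chosen only for illustration.
    if mean > 170:
        return "lighter"
    if mean > 100:
        return "medium"
    return "darker"

counts = Counter(rough_tone_bucket(p) for p in Path("generated_images").glob("*.png"))
print(counts)  # how many images landed in each (very rough) bucket
```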
Facial Features
Okay, so the AI has some serious hang-ups about body size and skin tone, but surely it can handle something as basic as facial features, right? Wrong.
When prompted to generate images of women with “wide noses,” a feature common to many ethnicities, things got weird, fast. DALL-E, in particular, seemed to lose the plot, spitting out images that were more caricature than portrait.
And don’t even get us started on single-fold eyelids, a common feature in people of Asian descent. The AI’s attempts at accuracy were, to put it mildly, abysmal.
The Root of the Problem: Biased Data and Development Practices
So, what’s the deal? Why are these AI image generators stuck in some digital dark ages when it comes to beauty standards? Well, as the saying goes, garbage in, garbage out.
Web-scraped Data
Here’s the thing: these AI models don’t just materialize fully formed (though that would be way cooler). They’re trained on massive datasets of images scraped from the wilds of the internet.
And the internet, my friends, is a wild, wacky, and often wildly inappropriate place. We’re talking everything from your grandma’s cat memes to, well, let’s just say stuff that would make your grandma blush.
This means that AI image generators are often fed a steady diet of, shall we say, “less than diverse” and often downright offensive content. And since these datasets often lack representation from non-Western cultures, guess what? The AI ends up with a pretty skewed view of what “beauty” looks like.
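To make that “garbage in, garbage out” point concrete, here’s a back-of-the-envelope sketch of peeking at a scraped dataset’s captions. The captions.csv file, its caption column, and the keyword list are all hypothetical; real training sets like LAION contain billions of image-text pairs and deserve far more careful analysis than keyword matching. But the underlying logic holds:

```python
# A back-of-the-envelope sketch: count how often a handful of (hypothetical)
# descriptors show up in a dataset's captions. File name, column name, and
# keyword list are all assumptions for illustration.
import csv
from collections import Counter

KEYWORDS = ["thin", "slim", "young", "fair", "pale", "curvy", "older", "dark-skinned"]

counts = Counter()
with open("captions.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):  # assumes a "caption" column exists
        caption = row["caption"].lower()
        for kw in KEYWORDS:
            if kw in caption:
                counts[kw] += 1

for kw, n in counts.most_common():
    print(f"{kw}: {n}")
```

If the scrape skews toward one narrow notion of beauty, the model will too. It can only learn the distribution it was shown.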
Lack of Diversity in AI Development
(To be continued…)