
What a bikini photo of Alexandria Ocasio-Cortez tells us about the ominous future of AI

New research on image generation algorithms has revealed alarming evidence of bias. It's time to tackle the problem of discrimination embedded in technology before it's too late

Want to see a half-naked woman? Well, you're in luck! The internet is full of pictures of scantily clad women. There are so many of these photos online, in fact, that artificial intelligence (AI) now seems to assume that women just don't like wearing clothes.

At least, that is the conclusion suggested by a new study of image generation algorithms, which work like autocomplete, but for images. The researchers fed these algorithms photos of a man cropped below the neck: in 43% of cases, the image was autocompleted with the man wearing a suit. When they fed the same algorithm a photo of a woman cropped in the same way, the image was autocompleted with the woman wearing a low-cut top or bikini 53% of the time. The researchers also gave the algorithm a photo of Democratic congresswoman Alexandria Ocasio-Cortez and found that it, too, was automatically completed with an image of her in a bikini (after ethical concerns were raised on Twitter, the researchers removed the computer-generated image of AOC in a swimsuit from the research paper).
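For readers curious about what such a measurement might look like in practice, here is a minimal sketch in Python of the general procedure the study describes: crop a photo below the neck, let a model autocomplete it several times, and count what kind of clothing appears in the completions. The helpers `complete_image` and `classify_clothing`, the category labels, and the folder layout are hypothetical placeholders, not the researchers' actual code or any real library's API.

```python
# Hypothetical sketch of the bias measurement described above.
# `complete_image` and `classify_clothing` are stand-ins for a real
# image-completion model and a labelling step; they are assumptions,
# not actual library functions.

from collections import Counter
from pathlib import Path

def measure_completion_bias(photo_dir, complete_image, classify_clothing, n_samples=8):
    """Crop each photo below the neck, ask the model to autocomplete it,
    and count what kind of clothing appears in the generated half."""
    counts = Counter()
    total = 0
    for photo_path in Path(photo_dir).glob("*.jpg"):
        for _ in range(n_samples):  # draw several completions per photo
            completion = complete_image(photo_path, crop="below_neck")
            counts[classify_clothing(completion)] += 1  # e.g. "suit", "bikini", "other"
            total += 1
    # Share of completions falling into each clothing category
    return {label: count / total for label, count in counts.items()}

# Illustrative usage: compare the distributions for two sets of photos.
# men_stats = measure_completion_bias("photos/men", complete_image, classify_clothing)
# women_stats = measure_completion_bias("photos/women", complete_image, classify_clothing)
# print(men_stats.get("suit", 0.0), women_stats.get("bikini", 0.0))
```

The point of the sketch is only to show why the 43% and 53% figures are comparable: the same cropping, completion and counting procedure is applied to both sets of photos, so any difference in the resulting clothing categories reflects the model, not the measurement.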

Why was the algorithm so fond of bikini photos? Well, because "garbage in" means "garbage out": the AI "learned" what a typical woman looks like by consuming an online dataset that contained a lot of photos of half-naked women. This study is yet another reminder that AI is often riddled with prejudice. And that is not just an academic concern: as algorithms control more and more of our lives, it is a problem with devastating consequences in the real world. In 2015, for example, Amazon discovered that the secret AI recruiting tool it was using treated any mention of the word "women" as a red flag. Racist facial recognition algorithms have also led to Black people being arrested for crimes they did not commit. And, last year, an algorithm used to determine the grades of A-level and GCSE students in England appeared to disproportionately downgrade disadvantaged students.

As for those image-generating algorithms that believe women should wear bikinis? They are used in everything from digital job-interview platforms to photo editing. They are also used to create huge amounts of fake pornography. A computer-generated AOC in a bikini is just the tip of the iceberg: unless we start talking about algorithmic bias, the internet is going to become an unbearable place for women.
