Google image and Getty Images search results are both racist — we’ve mentioned it before — but other people are starting to notice, and it’s a problem.
For her final project, entitled “World White Web,” Swedish design student Johanna Burai noted, “When you search for images of the word ‘hand,’ all you see on Google are White hands, regardless of where you are in the world. World White Web is an initiative that wants to put an end to the norm of whiteness on the Internet.”
As I once said here on Unicorn Booty:
“… whenever I open a newspaper, comic book, website, or visit a magazine rack and see mostly white faces staring back at me. On this site I’ve made a conscious effort to try and use images that feature people of color and differently sized bodies; I’m surprised how hard I often have to search just to find good ones, when White faces and bodies pop up so effortlessly on Google image search.”
I once ran an informal experiment on Google and Getty image search to test this theory. I searched for two terms, “woman large dog” and “Black woman large dog,” and in both instances a majority of the images showed White women.
Granted, there have been far more images taken of White women than of Black ones, and so naturally Google and Getty will both display the most trafficked images above all others. But when the world’s largest search engine and photo service regularly serve up White women even when users request Black ones, it’s a problem.
The problem is that it renders White people as the default and shrinks our imagination about what non-White people can become. For example, I had a friend who admitted that whenever he sees a name in a book or hears a voice on the radio, he always imagines the person as White. Similarly, I thought the band TV on the Radio was all-White until I saw pictures of them. The reason we imagine they’re White is because the media tells us that the world is mostly White, even though the population tells a different story.
By serving up consistently White results, these companies imply that Black women don’t have dogs, that Black people don’t appear in books, aren’t indie musicians and just don’t exist in all the ways that White people do. And so our racial imagination atrophies, diminishing our concept of what our non-White brothers and sisters can be and where they belong.
Burai is asking folks to share and use the images of non-White hands on her site so that those images start appearing higher in Google’s image search. But her campaign isn’t an ideal solution, because it alone won’t fix the problem. It’s a bit like telling television viewers to increase the number of non-White actors on TV by only watching shows with non-White casts: yeah, it helps, but it does little to influence the predominantly White gatekeepers behind the scenes keeping the airwaves lily-white in the first place.
According to 2013 statistics filed with the U.S. Department of Labor, Google’s workforce is 61 percent White, 30 percent Asian and only 2 percent Black. That’s 686 Black employees out of Google’s 34,311 total. We haven’t been able to get racial statistics on Getty’s workforce, but the company provides images for some of the biggest magazines and newspapers around the world. By displaying White results for racially neutral image searches, both companies reinforce a media world that continues to disproportionately favor White faces above other races.
One solution would be for Google to start an “affirmative action” campaign of sorts by ranking images of non-White people a little higher in its image search results. It could tweak its algorithm to better serve Black users whenever someone searches for terms like “Black woman large dog.” It could also offer filtering for non-White skin tones, though people who identify as White and those who don’t can often share similar skin tones.
It’s Google’s responsibility every bit as much as it’s ours to pressure the company, because for the world’s largest search engine, presenting a White world based on “traffic algorithms” isn’t only racist; it’s inaccurate.