Google Maps Is Racist Because the Internet Is Racist

A slur recently showed up on Google Maps, and it wasn't the work of trolls. It was the work of algorithms.

Earlier this week, Google Maps suffered the latest in a series of embarrassing incidents. It was discovered that searching for "n***a house" and "n***a king" returned a surprising location: the White House. A search for "slut's house" led to an Indiana women's dorm. Initially, you may have suspected this to be the work of a lone vandal, or even a coordinated campaign. But Google Maps gave racist, degrading results not because it was compromised, but because the internet itself is racist and degrading.

That revelation comes from a Google statement posted yesterday evening. "Certain offensive search terms were triggering unexpected maps results, typically because people had used the offensive term in online discussions of the place," wrote Jen Fitzpatrick, VP of Engineering & Product Management. "This surfaced inappropriate results that users likely weren't looking for."

It's as remarkable as it is disheartening. What it shouldn't be, though, is surprising.

"This is not new by any means," says University of Michigan professor and co-editor of Race After the Internet Lisa Nakamura. "This is just a higher-profile example of something that's been happening a really long time, which is that user-generated content used to answer questions reflects those pervasive attitudes that most people don't want to think about, but are really common." And in this case, are exceptionally racist.

That also makes it different from other prominent Maps gaffes, like the image of a giant Android mascot urinating on an Apple logo that surfaced in late April. That previous embarrassment was the result of abuse of Google's Map Maker tool, which allowed users to create entries for far-flung places in an effort to crowdsource its digital cartography. Those pranks were manufactured mischief: lone actors tweaking the system.

That sort of prank is also easier to prevent. The abuse those deliberate spammers caused prompted Google to shut down Map Maker on May 12, a Google spokesperson confirmed to WIRED: a nuclear option that shuttered a service that had been running since 2008.

The type of invective that led to this more recent Google Maps grotesqueness, though, isn't something you can simply flip a switch to turn off, because it's woven into the fabric of the internet itself. Essentially, we're making internet algorithms racist.

Bomb Dot Com

What happened with the White House and other victims of untoward Google Maps results is most reminiscent of the popular practice known as "Googlebombing." You may remember how in the mid-2000s a search for "santorum" led to a site that attempted to define the then-Senator's name as a sex act, or that a search for "miserable failure" returned George W. Bush's official page as a top result.

A Googlebomb was simple enough to produce; practitioners gamed Google's search-rank algorithm through heavy, tactical linking of specific phrases. The more a site is linked to with a given phrase, generally speaking, the higher Google will rank it for that phrase. It's a system that's easily gamed if you have enough time and HTML on your hands.
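The mechanics can be illustrated with a toy sketch. This is not Google's actual algorithm (real ranking uses hundreds of signals, PageRank among them); it simply shows how a crude "count the anchor text" scheme lets coordinated linkers decide what ranks first. All URLs and phrases here are hypothetical.

```python
from collections import Counter

# Hypothetical crawled link graph: (anchor_text, target_url) pairs.
# A Googlebomb campaign floods the graph with one phrase pointing at one target.
links = [
    ("miserable failure", "whitehouse.gov/president"),
    ("miserable failure", "whitehouse.gov/president"),
    ("miserable failure", "whitehouse.gov/president"),
    ("miserable failure", "example.com/essay"),
    ("great leader", "example.org/bio"),
]

def rank_for_query(query, links):
    """Rank URLs by how often they're linked with the query as anchor text.

    The target's own content never enters the calculation, which is exactly
    what a Googlebomb exploits.
    """
    scores = Counter(url for anchor, url in links if anchor == query)
    return [url for url, _ in scores.most_common()]

print(rank_for_query("miserable failure", links))
# The most heavily linked target ranks first, regardless of what its page says.
```

Because the page being "bombed" has no say in who links to it, the victim of the campaign ends up as the top result.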

What happened with Google Maps appears to have the same technical foundation as those concerted campaigns. As search guru Danny Sullivan points out at SearchEngineLand, Google applied similar ranking logic to Maps late last year, incorporating mentions of locations from across the Web to surface them more accurately in searches and to provide richer descriptions when they appear. In theory, this helps customers find shops and services near them that might otherwise be labeled too vaguely to be helpful. Which is nice!

In practice, though, it also means that if enough people online refer to a specific place using vile epithets, even one of the most recognizable landmarks in the United States can be reduced to racist garbage. And it's important to understand that while the technical mechanism producing the recent racist results is similar to how a Googlebomb works, there's one fundamental difference: a Googlebomb is calculated. A group of people decided they wanted to game the "santorum" results and made it happen. In the case of the White House and other offensive Maps searches, the algorithm wasn't subject to a coordinated effort; it just gathered up all the data the internet could provide, and the internet provided trash.
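A second toy sketch shows the Maps side of this: no campaign, no coordination, just an index built from how people happen to talk about places online. The discussion snippets and place names below are invented stand-ins; the point is that the query never touches the place's official name, only the words others have attached to it.

```python
from collections import Counter, defaultdict

# Hypothetical snippets of online discussion, each mentioning a known place.
discussions = [
    ("the big white mansion", "White House"),
    ("the big white mansion", "White House"),
    ("a historic landmark", "Lincoln Memorial"),
]

def build_index(discussions):
    """Map each word people use to the places it was used about, with counts."""
    index = defaultdict(Counter)
    for phrase, place in discussions:
        for word in phrase.split():
            index[word][place] += 1
    return index

def search(query, index):
    """Score places by how often web discussion pairs them with the query words."""
    scores = Counter()
    for word in query.split():
        scores.update(index.get(word, Counter()))
    return [place for place, _ in scores.most_common()]

index = build_index(discussions)
print(search("big mansion", index))
# → ['White House']: the landmark matches a query containing none of its own words,
#   purely because that's how online discussion described it.
```

Swap the innocuous phrases above for slurs and you get exactly the failure mode Fitzpatrick described: the index faithfully reflects whatever language the crowd supplies.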

This is also what makes what happened to Google Maps different from Flickr's similar algorithmic issues. Recently, the photo site launched new auto-tagging features that intelligently label photos based on their contents. Unfortunately, it was discovered it had labeled photos of concentration camps as "jungle gyms," and at least two photos of human beings (one man, one woman) as "ape." Those errors, embarrassing and unfortunate as they are, stem not from a critical mass of offending users but from an algorithm that's more easily confused than advertised or intended.

Garbage In, Garbage Out

It's easy to forget how much grossness lurks online, especially now that our browsing habits are largely dictated by social channels like Twitter and Facebook (the latter of which works overtime, literally, to scrub its feeds of filth). We rarely see tabs that aren't presented to us either by friends or trusted sources; our paths to content are rigidly defined enough that it's almost impossible to find yourself lost in a bad part of town.

That effect isn't limited to social networks, either. "The fact that most people believe Google results are a reflection of reality is the real problem," says S. Shyam Sundar, Co-Director of Penn State University's Media Effects Research Laboratory. "It's like believing that TV news is an unbiased mirror of society." Google doesn't show us the world; just a curated version that it thinks we want to see.

Meanwhile, somewhere not far from your Chrome cul-de-sac there are enough ongoing conversations in which the White House is casually, consistently, and pervasively called this horrible thing, that the world's largest arbiter of information also identifies it as such. Lone, manipulative racists can certainly cause damage, but at least they can be dismissed as outliers. Google doesn't do outliers; it does zeitgeist. And the zeitgeist of the internet, the real one, the one that you don't normally see, turns out to be disgusting.

"The Web, from the very beginning, has been a haven for the explicitly racist speech by white supremacists," says Charlton McIlwain, New York University Associate Professor of Media, Culture, and Communication. "But it has also become a haven for people who fancy themselves as egalitarian to express the kind of racial resentments, anger, and mistrust that they know is not publicly acceptable." Both McIlwain and Nakamura note that if anything, the problem has gotten worse in recent years; McIlwain points to research that shows a "dramatic increase in racist speech online" since Obama's 2008 election, while Nakamura attributes the uptick to the resentment that comes from "a feeling of entitlement that straight white men had for a long time that they don't have anymore."

Google plans to fix this recent flare-up the same way it did the Googlebombing of the aughts: by making the results disappear. "Building upon a key algorithmic change we developed for Google Search, we've started to update our ranking system to address the majority of these searches," explained Fitzpatrick in her statement. "Simply put, you shouldn't see these kinds of results in Google Maps, and we're taking steps to make sure you don't."

So even though these results might technically be what the internet dictates, Google will do its best to obscure them from view. One could argue that's just as well; just because you live near a cesspool doesn't mean you ever have to visit. But everyone we spoke with agreed that pretending the internet's a more civilized place than it really is ultimately does more harm than good.

Facing up to the darker realities of the online experience, McIlwain says, "helps us realize that the Web itself is fraught with the same kinds of racial problems, controversies, and politics that we've dealt with offline for most of our country's history." Sundar believes that "the solution is to increase media literacy among internet users, not to give more editorial control to Google."

A big part of that literacy? Understanding that while Google calls its surfacing of racism a "mess up," it's arguably the opposite. "People see [these results] as a glitch, a malfunction, but it's not," says Nakamura. "If anything, it works too well."