Google Instant, the real-time search feature that lets lazy people find cat photos without having to press "Enter," famously excludes words like "lesbian" and "bisexual," while terms like "rentboy" and "prostitution" pull up all sorts of fun results. How come? Because Google has a master list of banned words that, of course, it used an algorithm to create. The Daily Beast explains:
And since its debut, the hackers at The Hacker Quarterly have been reverse-engineering search terms and making a list of words that the company’s algorithms have blacklisted, in an attempt to sanitize its real-time results. In other words, as you enter these words or phrases into your search bar, letter-by-letter, you’ll eventually be faced with a blank page.
One of the first search terms they realized Google had blacklisted was “teen.” That led to other words, like “adult,” and then certain ethnic and religious groups, but, then, not others. “Like, lesbian, why is that in there?” wondered Emmanuel Goldstein, the editor.
Let’s go to the source.
So, how did Google come up with these words in the first place? The answer, not surprisingly, is hidden in an algorithm that Google never quite explains.
Responding to a user query in a help forum post on September 14, a Google employee named "Kelly F" addressed the specific case of the word "lesbian." It was the result of a bug, she said, and the company was working to see if it could be fixed. "These results are available but require you to hit Enter," she added, clarifying that the problem struck only when using Google Instant.
“It’s important to note that removing queries from autocomplete is a hard problem, and not as simple as blacklisting particular terms and phrases,” said a Google spokesperson.
Here’s a solution: PRESS ENTER.