The decade is off to a bizarre start for Google, the company that's trying to synthesize the sum of human knowledge and run little ads next to it.
There was the small matter of going to war with China. Last week, Google threatened to pull out of the country, chafing at censorship laws and all but accusing the Chinese government of hacking its networks to persecute human-rights activists. This quickly escalated into an international incident, except that one of the nations wasn't a nation, it was a company.
But before Google's swagger into geopolitics, another, smaller story had caught the attention of the Internet's commentariat. A feature called Google Suggest appeared to be self-censoring results that would have disparaged Islam - and Google found itself accused of cowardice. If ever you needed an illustration of the bind Google has worked its way into, here it be.
People have been having fun with Google Suggest, the search engine's auto-complete feature.
When you start typing a query, the search engine tries to complete your sentence, popping up suggestions as you type. It draws them from what other people have searched for in the past, ranking them by a secret-sauce combination of popularity and longevity. (It will also take your own search history into account.)
In other words, it's a picture of what others are searching for. Oftentimes, the results are useful. However, it didn't take users long to figure out that entertaining results can be had by entering leading phrases. When you punch in the words "why does my," for instance, Google Suggest pops up options such as "Why does my cat bite me?" and "Why does my belly button smell?" When you punch in "all I want to do is," we learn that others have been searching for "All I want to do is eat your brains." It's fun, in a hell-is-other-people sort of way.
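The mechanism described above, matching what you've typed so far against past queries and ranking the hits by popularity, can be sketched in a few lines. This is a toy illustration, not Google's actual system: the sample query log and the popularity-only ranking are assumptions, and the real ranking (the "secret sauce" blend of popularity, longevity and personal history) is not public.

```python
from collections import Counter

# Toy query log: past searches and how often they were made.
# The queries and counts are invented for illustration.
query_log = Counter({
    "why does my cat bite me": 120,
    "why does my belly button smell": 95,
    "why does my dog stare at me": 60,
    "all i want to do is eat your brains": 40,
})

def suggest(prefix, k=3):
    """Return up to k past queries starting with prefix, most popular first."""
    matches = [(q, n) for q, n in query_log.items()
               if q.startswith(prefix.lower())]
    matches.sort(key=lambda qn: -qn[1])   # rank purely by popularity here
    return [q for q, _ in matches[:k]]
```

Typing "why does my" into this sketch surfaces the cat and belly-button queries first, for exactly the reason the column describes: that's what other people searched for most.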
The trouble started when the trick was tried on the world's religions. It was soon discovered that typing in leading questions about different faiths turns up a series of uniformly disgruntled suggestions: "Christianity is a lie," "Judaism is a cult," "Buddhism is not what you think," "Hinduism is wrong" and so on.
However, if you type "Islam is," Google Suggest suggests - nothing. Google Suggest happily reflects users' derisive queries about every other religion, but when it comes to Islam, it's mute. It gives the appearance that Google is self-censoring to avoid trouble with a religion whose extremists are increasingly militant about slights.
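To make concrete what "self-censoring" would mean mechanically, here is a purely hypothetical sketch of a prefix blocklist. The column only reports the observed behaviour, and Google attributes the gap to a bug; the blocklist below is an assumption for illustration, not a claim about how Google's code works.

```python
# Hypothetical suppression filter -- assumed for illustration only.
# Nothing in the record says Google uses a list like this.
BLOCKED_PREFIXES = {"islam is"}

def filtered_suggest(prefix, suggestions):
    """Return the suggestions unless the typed prefix itself is blocked."""
    if prefix.lower().strip() in BLOCKED_PREFIXES:
        return []          # mute: no completions at all
    return suggestions
```

The tell-tale signature of such a filter matches what users reported: every other "<religion> is" prefix passes through untouched, while the blocked one returns nothing at all.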
The discovery caused a minor sensation online. "Google, now serving cowardliness," read a missive that went to the top of the charts at Digg, the influential portal. News outlets left and right picked up the story. Was Google - that bastion of honesty - fudging its own results for political expediency?
It's the kind of accusation that most companies wouldn't blink at, but it's a serious one for Google. One of Google's foundational principles is that it does not tinker with search results. The algorithms it uses to generate them might change, but Google promises that they'll be applied consistently and neutrally. This isn't just for public relations: Google's entire business of selling ads next to search results relies on advertisers trusting it not to play favourites.
Google says that the Islam omission is "a bug." This explanation has been greeted with more than a little skepticism. To offer suggestions for terms from "Alabama" to "zygote" and then choke on "Islam is" is one oddball technical difficulty. Google employees are known for many things - being bad at programming is not one of them. (As of press time, the quirk is still there.)
The flap over its "Islam" non-suggestion will come and go. But the controversy illustrates the bind that Google finds itself in again and again, as it tries to play the role of reality's honest broker. Real life is a dirty, tarry thing to have on your hands. As Google inserts itself into every corner of our information-consuming lives, it has to play an ever-bendier game of Twister to satisfy competing demands.
As it looks up our information, suggests our search terms, pulls up our maps, photographs our streets and stores our books, Google also has to flex to meet national copyright laws, local taboos, rules about hate speech, various censorship regimes and now perhaps even religious sensibilities. And it has to do all this while still saying it's neutral.
Google Suggest puts the firm in a tighter spot still. Google can legitimately wash its hands of much of the awful content it has to index by pointing out that it's just a conduit by which to reach other people's information. But Google Suggest is more active. Even if its suggestions come from other people's searches, the uninformed (and the willfully ignorant) may not draw that distinction.
It's a pickle. For all its devotion to openness, Google can't afford to hold up a perfect mirror in which to see ourselves, because between blasphemy and felony, the view isn't pretty. At the same time, Google's users demand transparency as a prerequisite to their trust. When Google even gives the appearance of mucking with its search results, its users are up in arms, because we fear skewed results will distort our view of the world.
You don't have to go to China to learn that small differences in search results can make a big difference. The best we can hope for is fairness and consistency. The sooner Google fixes this "bug," the better.