By Vanessa Rychlinski, University of Michigan
Imagine living on an island with no technology. Simple inventions like hammers and knives are the only real tools. The knowledge you possess has been passed down to you by the elder generation, and it is limited to information relevant to surviving on this particular island. News arrives only occasionally, carried by those who leave or come to visit; for the most part the community remains solitary.
In a talk at TED in 2011, Eli Pariser explains his research on Google and the filter bubble. With this “invisible algorithmic editing of the web,” as he calls it, Google logs each user’s searches and click history in order to better anticipate what kind of content a person may be looking for. Pariser noticed that this tailoring of content created some alarming disparities among users. One friend who travelled frequently received tourist information when searching “Egypt,” while another friend’s search surfaced news about the protests; liberals and conservatives were steered toward different news sources, and so on.
Not only are results uncannily user-specific, they are also potentially harmful: these algorithms do not take into account that familiar devil known as procrastination. This brings to mind what I’ll call here the Netflix dilemma: the long queue on your account, composed of important, culturally relevant films that you tell yourself you’ll watch one day but never do. Pariser includes the Netflix dilemma in his argument, pointing out that algorithms would inherently prevent us from having both “some information vegetables and some information dessert.” It’s easy to click on the news story about the Red Wings instead of yet another article about the auto industry, but a stream of information based on our “soft” interests would indeed be weak.
I asked Sugih Jamin, an Associate Professor of Computer Science at the University of Michigan, some questions about filtering. He explained that Google uses a mining algorithm that categorizes your data by looking at the repetition of key words; the algorithm then matches you with content according to relevancy, in a process that is essentially “a match-making service.” As he pointed out, a search engine ought to filter results; otherwise one would have to wade through waves of content before finding anything germane.
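Professor Jamin’s description can be sketched in a few lines of code. The following Python is a toy illustration under my own assumptions, not Google’s actual algorithm: it builds an interest “profile” by counting repeated key words in a user’s past searches, then scores new content by how strongly it echoes that profile.

```python
from collections import Counter

def keyword_profile(history):
    """Toy interest profile: count repeated key words across past searches.
    (A stand-in for the key-word mining Professor Jamin describes.)"""
    words = []
    for query in history:
        words.extend(query.lower().split())
    return Counter(words)

def relevance(profile, document):
    """Hypothetical relevance score: how often a document's words
    already appear in the user's search history."""
    return sum(profile[w] for w in document.lower().split())

def rank(profile, documents):
    """The 'match-making service': order content by relevance to this user."""
    return sorted(documents, key=lambda d: relevance(profile, d), reverse=True)

# A frequent traveller's history tilts the ranking toward tourism pages.
history = ["egypt travel tips", "best beaches egypt", "cheap flights cairo"]
docs = ["egypt protest news coverage", "egypt travel guide and beaches"]
print(rank(keyword_profile(history), docs)[0])
# prints "egypt travel guide and beaches"
```

For this traveller, the tourism page outscores the protest coverage; a user whose history were full of political queries would see the opposite ordering, which is exactly the disparity Pariser observed.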
However, Google’s sheer size and the attractiveness of its user data allow for what Professor Jamin says is a “large potential for abuse.” According to Internet World Stats, around 30 percent of the world’s population, some two billion people, used the internet in 2010, and in the decade since 2000 the number of internet users grew by 480.4 percent. Google has been estimated to operate more than a million servers, employing some three million computers, and a CNN article estimated that by 2009 the search engine was processing over one billion search requests per day. Much as it filters search results, Google already uses your browsing data to feed you targeted advertising. The company thus has one finger dipped into the sea of information and another agitating the free market. Finally, it has just announced a new social network called Google+, which some of the cheekier speculators predict will make Facebook “the new MySpace.” If that prophecy comes to pass, the company will have its eyes and servers documenting even more of its consumers’ everyday lives.
DuckDuckGo.com is a search engine that capitalizes on such fears. The site preserves a user’s control over filter streams and abstains from “bubbling” search results. At dontbubble.us, DuckDuckGo promotes itself with an illustrated guide to bubble filtering. The guide includes examples from the TED talk as well as a few others; a search for “climate change” returned informational sites for one user and climate-action sites for another. As the guide puts it, “Since you often click on things you agree with, you keep getting more and more of what you already agree with.” Instead of being distributed somewhat equally, the sea of information swirls around different points: a piece of code decides to promote content where it “knows” it will already be welcome.
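The feedback loop the guide describes can be made concrete with a small simulation. This is a deliberately simplified model of my own devising, not how any real engine is implemented: each round, the user clicks the agreeable result, the filter’s weight for that viewpoint grows, and the share of agreeable content in the next result set climbs.

```python
def simulate_bubble(click_boost, rounds):
    """Toy model of the filter-bubble feedback loop: every click on an
    agreeable result raises its weight, tilting future results further
    toward what the user already agrees with."""
    weights = {"agree": 1.0, "disagree": 1.0}  # start with a balanced mix
    shares = []
    for _ in range(rounds):
        total = weights["agree"] + weights["disagree"]
        shares.append(weights["agree"] / total)  # agreeable share this round
        weights["agree"] += click_boost  # the user clicks; the filter learns
    return shares

shares = simulate_bubble(click_boost=1.0, rounds=5)
print([round(s, 2) for s in shares])
# prints [0.5, 0.67, 0.75, 0.8, 0.83]
```

Even with a modest boost per click, the agreeable share only ever rises: the mix starts balanced and drifts monotonically toward the user’s existing views, which is the “more and more of what you already agree with” dynamic in miniature.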
Google believes itself capable of deciding for us what is right. With “smart” code, it escapes the “Big Brother” label because the company can point to data that we handed over ourselves. But there is more than one truth, and I, for one, would like to discover as many as possible. The greatness of the internet is the possibility of true connectivity between our comfortably isolated islands. Marshall McLuhan’s idea of a “global village” is as relevant today as it was in the 1960s, when he coined it amid the rise of electronic media. Yet instead of the flow of content being accessible to many, as Pariser says, we are each contained “in a web of one.” It is imperative that Google, in acknowledgement of the vast reach of its services, assume some ethical responsibility. In the interest of keeping the web open and neutral, the powerful turbine that is Google needs to let us see, and help decide, which way the current is flowing.