In case you have not noticed, since 2009 Google has been customizing its search results for every user (“Personalized search for everyone”). Even if you are signed out, the results you get after launching a search may differ from the results your mum or your best friend would get, or even from the results you yourself would get in a different situation. To predict what you are most likely to click on, Google applies a filtering algorithm that takes into account all the personal and contextual information it has about you at a given moment.
In a video posted last August 25th (“Another look under the hood of search”), which attracted considerable interest, one of the search scientists at Google explains:
The Google Search algorithm is made up of several hundred signals that we try to put together to serve the best results for the user
Google is not the only one using personalization filters to customize the information presented to us. Every major online service is flirting with the technology. Personalization algorithms are the magic invoked to deal with information overload and make our lives easier. And they are a clear advantage if what you are looking for is a pizza, or a drugstore where you can buy a toothbrush. But when you want broad and objective information about a given subject, should you get information that has been cooked only for you? Is that what you expect?
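To make the “several hundred signals” idea concrete, a personalization filter can be pictured as a weighted combination of per-user signals. The sketch below is a toy illustration, not Google’s actual algorithm: the signal names, weights, and data are all invented for the example.

```python
# Toy illustration of signal-based personalized ranking.
# All signals, weights, and data are invented; a real engine
# combines hundreds of signals, not three.

def personalized_score(result, user):
    """Blend a few hypothetical signals into one relevance score."""
    signals = {
        "text_match":    result["text_match"],                        # query relevance
        "click_history": user["click_history"].get(result["site"], 0.0),
        "location":      1.0 if result["region"] == user["region"] else 0.0,
    }
    weights = {"text_match": 0.6, "click_history": 0.3, "location": 0.1}
    return sum(weights[name] * value for name, value in signals.items())

results = [
    {"site": "pizzeria.example", "text_match": 0.7, "region": "EU"},
    {"site": "wiki.example",     "text_match": 0.9, "region": "US"},
]
user = {"region": "EU", "click_history": {"pizzeria.example": 0.8}}

# A user with a different history or region would get a different ordering,
# even for the same query: that is the personalization effect in miniature.
ranked = sorted(results, key=lambda r: personalized_score(r, user), reverse=True)
```

Note how the page with the weaker text match still ranks first for this particular user, purely because of past clicks and location: exactly the behavior that makes your results differ from your mum’s.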
For more than ten years, I have been a customer and absolute fan of Amazon. I enjoy knowing what other people have bought after buying such and such, and it is always worth having a look at its recommendations. But even today, I cannot help smiling when I start looking for a book about, let’s say, personalization filtering, and the recommendation engine insists on “A Primate’s Memoir” because the last book I bought was “Why Zebras Don’t Get Ulcers”, by Robert Sapolsky. (A book which, by the way, I wouldn’t recommend if you are even slightly hypochondriac.)
After a few days of heavy use of Zite, the iPad reader app recently acquired by CNN, I am astonished. Starting with nothing but your Twitter, Facebook, or Google Reader account, you are up and running in a few minutes, and from the very beginning the choices it offers seem quite sensible, while still wide and fresh enough to keep the promise of serendipitous discovery alive. Again, this is the result of putting together semantic and social-tagging search algorithms. But can we trust algorithms? What are they hiding from us?
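A reader like Zite presumably blends at least two kinds of signal: a “semantic” one (how well an article matches the topics you read) and a “social” one (how much your network is sharing it). Zite’s actual algorithm is not public, so the following is only a hedged sketch with invented scoring rules and data:

```python
# Toy sketch of blending a "semantic" signal (overlap between an
# article's tags and the user's interests) with a "social" signal
# (how many followed accounts shared it). All rules are invented;
# this is not Zite's real algorithm.

def blend_score(article, interests, semantic_weight=0.7):
    tags = set(article["tags"])
    semantic = len(tags & interests) / len(tags)           # fraction of matching tags
    social = min(1.0, article["shares_by_followed"] / 5)   # saturates at 5 shares
    return semantic_weight * semantic + (1 - semantic_weight) * social

interests = {"science", "technology"}
article = {"tags": ["science", "politics"], "shares_by_followed": 3}
score = blend_score(article, interests)
```

Lowering `semantic_weight` is one way such an app could keep serendipity alive: articles outside your usual topics can still surface if enough of your network shares them.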
Algorithms are perceived as neutral mathematical entities, but they are created by humans and can be vehicles for hidden ideologies, as Kevin Slavin explains in a recent interview with New Scientist:
The pernicious thing about algorithms is that they have the mathematical quality of truth – you have the sense that they are neutral – and yet, of course, they have authorship. For example, Google’s search engine is composed entirely of fancy mathematics, but its algorithms, like everybody’s, are all based on an ideology (Kevin Slavin, New Scientist, “Game Developer: Beware Algorithms Running Your Life”)
Even if algorithms act with the best of intentions, there is a growing risk of getting trapped in a universe of personal information: a sort of Hotel California in which we are increasingly typecast and fed only news that is pleasant, familiar, and confirms our beliefs, and where we don’t know what is being hidden from us. This is the thesis put forward by Eli Pariser in his book “The Filter Bubble”:
Our past interests will determine what we are exposed to in the future, leaving less room for the unexpected encounters that spark creativity, innovation, and the democratic exchange of ideas.
The advance of personalization is both a necessity and a cause for concern. What we, internet users, need is a transparent way to stay in control and be sure that filtering algorithms are working for us, and not the other way around. For example, it is easy to imagine a nice “view search results as…” feature that would let us leave the cage of our persona: put in your mum or your best friend and see the results they would normally get. Okay, I know that is probably not privacy-compatible. Let’s settle at least for “view search results as if you don’t know who the devil I am”.
I am pretty sure that the so-called “internet giants” can offer clear and easy mechanisms to help us understand and choose what we see and what we don’t see, so that we will be happy to join Isaac Newton in saying: “If I have seen further, it is only by standing on the shoulders of (“the internet”) giants.”
Fully agree with you.
I started noticing this search-results customization early last year, and at first I very much appreciated the advance, because I was able to find what I was looking for much faster…
But for the past couple of months this has been getting worse. As an open innovator, I need a dose of serendipity when I am researching a general topic, where the objective is to learn about different views, and this has become almost impossible.
The solution I’ve found is opening a private browsing window in a browser where I’m not logged in, so I can avoid both my profile and all the cookies…
[…] Political scientists and social theorists have long fretted about the Internet’s potential to flatten and polarize democratic discourse. Exposure to news and opinion increasingly occurs through social media. How social networks choose to filter and personalize our news feeds influences exposure to perspectives that cut across ideological lines. Can we trust the algorithms they use? Or are they hiding relevant information from us? […]
[…] The new adage is “don’t search, don’t compare, don’t even try. We know better than you what you need…” In other words, you can check out any time you like, but you can never leave. […]