It's the future, Jim, but not as we know it...

There's more to tomorrow than robots, flying cars, and a faster internet.
22C+ is all about Deep Futures, futures that matter. Welcome to futures fantastic, unexpected, profound, but most of all deeply meaningful...

Sunday, October 16, 2011

Surfing the Reality Filter


Personalisation has given us something very different: a public sphere sorted and manipulated by algorithms, fragmented by design, and hostile to dialogue.

Eli Pariser, The Filter Bubble


The internet has been heralded by many as part of the great democratisation of knowledge. Almost anything you want to know can be found on the net, thus ending the great hegemony of governments and the powerful, and their long history of manipulation and control over what information we can and should be exposed to.

However, the situation is not quite as neutral as many might think, as is made clear in Eli Pariser's book The Filter Bubble: What the Internet Is Hiding from You. This is a book that should be read by anyone interested in the future of the internet and knowledge. This is one of my ‘standard’ book reviews, as opposed to my intuitive reviews.

The Filter Bubble begins with the revelation that in December 2009 Google changed its search engine to make it more ‘personalised’, and it is the phenomenon of personalisation that Pariser rails against in this book. The internet is now constructed such that the most dominant sites tend to feed back to us our own worldview, and our own construct of reality.

It is now true that two people doing the same web search will not necessarily get the same results, because the software "knows" your web history, tailors the results, and feeds back to you the information it 'thinks' you want. In recent times Facebook, Amazon, Yahoo, YouTube and many other major internet organisations have followed suit.
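As a rough illustration of what ‘tailoring the results’ means in practice, here is a minimal sketch of a personalised ranker. Everything in it (the scoring function, the history_weight parameter, the example data) is hypothetical and not drawn from Google's or anyone else's actual code; it simply shows how two people issuing the same query can receive differently ordered results once their histories are factored in.

```python
# Hypothetical sketch of personalised ranking: generic relevance scores
# are re-weighted by overlap with the user's recorded history.
# None of this reflects any real search engine's internals.

def personalised_rank(results, user_history, history_weight=0.5):
    """Re-order search results so pages resembling past clicks float to the top."""
    def score(result):
        base = result["relevance"]                              # query-only relevance
        overlap = len(set(result["topics"]) & set(user_history))  # match with history
        return base + history_weight * overlap
    return sorted(results, key=score, reverse=True)

results = [
    {"url": "news-site.example/climate", "relevance": 0.8, "topics": ["climate", "science"]},
    {"url": "gadget-blog.example/phone", "relevance": 0.7, "topics": ["gadgets", "shopping"]},
]

# Two users issue the same query but have different histories,
# so they see the results in a different order.
print(personalised_rank(results, user_history=["science"]))
print(personalised_rank(results, user_history=["shopping", "gadgets"]))
```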

The top fifty internet sites install an average of 64 cookies and tracking beacons on your computer. This is how these sites keep track of you.
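For the curious, here is a small illustration of how you might count the cookies a single page asks your browser to store. It only inspects the page's own Set-Cookie headers; the figure of 64 above also includes third-party trackers and beacons loaded by the page, which this simple check does not follow.

```python
# Rough illustration only: count the cookies one page tries to set.
import urllib.request

def count_cookies(url):
    request = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(request) as response:
        # Each Set-Cookie header asks the browser to store one value
        # that can later be used to recognise and track you.
        return len(response.headers.get_all("Set-Cookie") or [])

print(count_cookies("https://www.example.com"))
```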

The result is that you are now being fed a less diversified diet of information when you surf the net. The ‘real’ (at least on the net) is beginning to look like a circle of ever-diminishing size. This is not what the internet was supposed to be! Pariser argues that this runs against the spirit of democracy, which requires that we are exposed to ideas and opinions that run counter to our own. Instead, we are becoming enclosed in “the filter bubble”.

There are 'choices' being made which affect what we "perceive" in the world, and they are not being made by human beings. They are increasingly being made by machines, and those machines are owned and operated by gargantuan internet corporations – such as Facebook, run by 26-year-old Mark Zuckerberg.

The greatest concern we should have, argues Pariser, is the motivation behind personalisation. Put simply, it exists so that these giant corporations can sell stuff to you.

There is also the issue of personal data. The internet’s dominant, leviathan-like companies hold a wealth of information about you and your online habits. Companies like BlueKai and Acxiom specialise in the acquisition of your personal data. Acxiom holds on average 1500 pieces of data on every person. This corporation has data on 96% of American households and half a billion people in total, including the names of their family members, their current and past addresses, how often they pay their credit card bills, whether they own a dog or a cat, their dominant hand, and even their medications, based on pharmacy records. This creates a strong possibility of privacy infringement and various forms of manipulation.

How can you be sure that this information is safe, and that it will not be sold to the highest bidder? There are already instances of this occurring, and it raises numerous ethical questions.

Similar personalisation principles now apply to many news services. These companies often check which stories are going viral and create more stories around the same topic. The point is that what is alluring (sex, violence, scandal) may not be the same as what is important. The frightening realisation, Pariser writes, is that now “the power to shape news rests in the hands of bits of code, not professional human editors”. Not all news services do this (at least, not all will admit to it), but many do. Yahoo, for example, presents you with a certain number of stories based on your web surfing habits. It pays to note, as Pariser does, that 36% of Americans under 30 get their news through social networking sites.
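To make the point concrete, here is a hypothetical sketch of the kind of editorial logic Pariser is describing: stories ranked by engagement numbers and by their match with a reader's browsing interests, with no human judgement of importance anywhere in the loop. The scoring weights and example stories are invented for illustration.

```python
# Hypothetical "editor made of code": stories are chosen by engagement
# figures and by match with the reader's interests, not by importance.

def pick_front_page(stories, user_interests, n=3):
    def score(story):
        virality = story["shares"] + 2 * story["clicks"]          # raw engagement
        personal = 1.5 if story["topic"] in user_interests else 1.0  # filter-bubble boost
        return virality * personal
    return sorted(stories, key=score, reverse=True)[:n]

stories = [
    {"title": "Famine worsens in the Horn of Africa", "topic": "world", "clicks": 200, "shares": 50},
    {"title": "World's ugliest dog crowned", "topic": "viral", "clicks": 5000, "shares": 3000},
    {"title": "Local election results", "topic": "politics", "clicks": 400, "shares": 120},
]

for story in pick_front_page(stories, user_interests={"viral", "politics"}):
    print(story["title"])
```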

The result is that certain kinds of news stories are becoming less common. The ones most likely to succeed and go viral are those that are “more practically useful, surprising, affect-laden, and positively valenced.” The top news story in 2005 for the Seattle Times was about a man who died after having sex with a horse. The LA Times’ top story in 2007 was an article about the world’s ugliest dog.

What chance does a story about people dying of AIDS in Africa have of being clicked on at a high rate? Or of being “liked”? Things that are important but unattractive will tend to become less prominent in the filter bubble.

Pariser notes another potential issue here:

In the filter bubble, there’s less room for the chance encounters that bring insight and learning. Creativity is often sparked by the collision of ideas from different disciplines and cultures.

I have noticed this issue of culture among many of my own ‘friends’ on Facebook. How many of your ‘friends’ are from countries and cultures different from your own? Personally, I have quite a mixture, but I notice that some of my friends of a particular ethnicity have friends only from their own culture. One of my Hong Kong FB friends has many hundreds of friends, and as far as I can tell, I am the only non-Chinese person on her friends list.

Personalisation may actually retard learning. Think about this point, made by Pariser:

By definition, a world constructed from the familiar is a world in which there’s nothing to learn. …(personalization) could prevent us from coming into contact with the mind-blowing, preconception-shattering experiences and ideas that change how we think about the world and ourselves.

Could the internet be dumbing us down?

Important questions emerge. What is the role of ethics in such an automated system? What is the responsibility of the internet giants to the development of a civil society?

As Pariser notes, the technologies behind personalisation are still in their infancy. They will only become more powerful in the future, and most likely far, far more powerful. In the future, computing will become truly pervasive, and our public and personal spaces will be constantly interconnected with virtual worlds. Advertisers will certainly develop ever more sophisticated and pervasive means to access our lives. Do we want those future worlds to be controlled by algorithms created by internet corporate giants?

Yet what concerns me most about the way information is disseminated on the internet is that it is beginning to take away the deep connection with intuitive intelligence. In Discover Your Soul Template, I write about the rich inner world that we all have, and how it can be tapped into. This Integrated Intelligence is a crucial component of the human experience, but it requires an inner journey to activate it to its full potential. It necessitates that we learn to be still and develop an intimate understanding of the relationship between the conscious mind and spiritual guidance. In the online world, many of our cognitive choices are being dictated by huge corporations. Machines are telling us where to place our attention, and what knowledge we can access. And that focus is never upon our inner world.

This is not just a small problem. It is huge. Yesterday, in a coffee shop not far from my home, I sat next to a young Hong Kong woman of about 18 years of age, with her laptop sitting on her lap. I am interested in what people pay attention to, so I glanced at her screen from time to time. In the 90 minutes I sat beside her reading my book, she flipped through page after page of social networks, never resting on any page for more than a few seconds, in between pulling out her mobile phone and texting. What, I wonder, did she learn during that time? What did she come to understand about herself, and about the way her mind, body and spirit interrelate? Did she notice how the machine on her lap was making choices for her?

As Eli Pariser writes:

If we want to know what the world really looks like, we have to understand how filters shape and skew our view of it.


Marcus

2 comments:

  1. I already know you from Rob and Trish MacGregor's blog www.synchrosecrets.com/synchrosecrets. Actually, I've already written a comment somewhere on this blog. I had a similar idea to the one described in this text when I was creating the group http://www.facebook.com/groups/222084301195017/ (it's about social networks and possible ways to create a more or less intelligent, interactive and stable collective thinking entity). I found this blog post while searching in Google for "reality filter". I am interested in sustainability in the broadest sense and think that the last thing we need in order to figure out the new and ever-changing reality is stupefied media and virtual reality (decadence and hyperactivity instead of new ambitions and focus).

  2. Hi Aleksandar. Thanks for commenting - I'd forgotten I ever wrote this post! Glad you are interested in the topic, and are doing something about it. It is a vital area of human futures.

    Kind regards,

    Marcus
