As we continue to barrel toward a world operated almost completely by digital technologies – and the very real possibility of a collective future that offers untold possibilities for “social, economic, practical, artistic and even spiritual progress,” as Douglas Rushkoff recently wrote – many academics, sociologists, marketers and technologists are offering their takes on what this all means for humanity (and therefore business) and what avenues we should or shouldn’t take.
And of course, the Internet – how we interact with it, and how it interacts with us – resides at the center of this discussion/debate. For a quick rundown see: Sherry Turkle, the Director of MIT’s Initiative on Technology & Self; author and journalist Nicholas Carr; author, teacher and consultant Clay Shirky; and media theorist and author Neil Postman.
This whole debate fascinates me, and I think that as citizens living through this transformative era we should be absorbing and thinking about ALL views – not just the ones we are apt to agree with.
On a related note – and the impetus for this post – I was recently listening to an audio interview with MoveOn.org’s former executive director Eli Pariser on the daily TV/radio news program Democracy Now! and heard something quite interesting that I will get to in a minute.
Pariser was on the show to be interviewed about his new book, “The Filter Bubble: What the Internet Is Hiding from You.” His thesis, in a nutshell, is that the Internet is increasingly becoming an echo chamber in which sites tailor info according to the preferences they detect in each viewer. As an example he talks about two of his friends who both google “Egypt” from their respective computers and get two vastly different results – one gets results about the protests and revolution, while the other receives travel-related results.
He posits that Google uses the accumulated data on these two friends to deliver the results on Egypt it thinks each will click on – the motive being more page views and ad dollars. Therefore not everyone receives the same results, and a case can be (and is) made that Google is departing from its original philosophy and algorithm, in which users and their pointed links to sites are the arbiters of authority and therefore determine the best, most useful search results.
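To make the contrast concrete, here is a minimal sketch – emphatically not Google’s actual algorithm – of the two ranking philosophies: link-based authority, where everyone sees the same order, versus personalization, where results are re-ranked by each user’s predicted likelihood of clicking. All scores, topics and profiles below are illustrative assumptions.

```python
def authority_rank(results):
    """Same ordering for everyone: rank by link-based authority alone."""
    return sorted(results, key=lambda r: r["authority"], reverse=True)

def personalized_rank(results, user_profile):
    """Re-rank by predicted clicks: authority weighted by the user's
    past interest in each result's topic."""
    def predicted_click(r):
        interest = user_profile.get(r["topic"], 0.1)  # default: low interest
        return r["authority"] * interest
    return sorted(results, key=predicted_click, reverse=True)

# Two hypothetical friends searching "Egypt"
results = [
    {"title": "Egypt protests and revolution", "topic": "news",   "authority": 0.9},
    {"title": "Egypt travel deals",            "topic": "travel", "authority": 0.7},
]
news_reader  = {"news": 0.9, "travel": 0.1}
vacation_fan = {"news": 0.1, "travel": 0.9}

print(personalized_rank(results, news_reader)[0]["title"])   # protests first
print(personalized_rank(results, vacation_fan)[0]["title"])  # travel first
```

Same query, same pages – but because each friend’s profile weights the scores differently, the two of them live in different result pages, which is exactly Pariser’s point.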
A Facebook “Important Button.”
But what really caught my attention is what Pariser begins to discuss about 38:30 into the show in relation to this perceived echo chamber (confirmation bias) phenomenon and the social networking giant Facebook. He points out that the way info is passed around Facebook, and therefore consumed by the community, is through the “Like” button.
“The ‘Like’ button has a very particular valence. It is easy to click ‘Like’ on ‘I just ran a marathon,’ or ‘I baked a really awesome cake,’ but it’s very hard to click ‘Like’ on ‘war in Afghanistan enters its tenth year,’” says Pariser.
Therefore, “info that is ‘Likable’ gets transmitted; information that is not ‘Likable’ falls out,” he adds. His suggestions to begin to remedy this and take back some control?
1) We need to be aware of what’s happening – that these filters operate invisibly – a.k.a. “Use Your Head.” 2) And the idea that inspired this post: a grassroots campaign to develop an “Important Button.” This would be a way to signal that something is not only “Likable” but also important.
Consequently, and most importantly, different, more varied information and stories would then begin to be viewed and consumed by more people.
“This [campaign/idea] can start to remind these companies that there are ways that they can begin to build more civic value into what they’re doing.”
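For the curious, the mechanics of the proposal can be sketched in a few lines. This is a hypothetical illustration of the “Important Button” idea, not any real Facebook feature or API: a feed ranked by Likes alone buries hard news, while blending in an “Important” signal lets civic stories surface. The story counts and the 50/50 weighting are invented for the example.

```python
def rank_feed(stories, like_weight=1.0, important_weight=0.0):
    """Rank feed stories by a weighted blend of 'Like' and 'Important' clicks."""
    def score(s):
        return like_weight * s["likes"] + important_weight * s["importants"]
    return sorted(stories, key=score, reverse=True)

stories = [
    {"headline": "I baked a really awesome cake",        "likes": 120, "importants": 2},
    {"headline": "War in Afghanistan enters tenth year", "likes": 15,  "importants": 140},
]

# Likes only: the cake wins the top of the feed.
print(rank_feed(stories)[0]["headline"])

# With the "Important" signal weighted in, the news story surfaces.
print(rank_feed(stories, like_weight=0.5, important_weight=0.5)[0]["headline"])
```

The design choice is the point Pariser is making: which stories circulate is a direct function of which buttons exist and how their signals are weighted.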
So, what do you think about an “Important Button”? I think it’s certainly an interesting and viable idea and would love to hear your thoughts in the comments.
Here is the link to the video interview.