the filter bubble: the invisible self-idolatry of the internet
This is a fascinating and disturbing TED Talk by Eli Pariser on the ways that Facebook, Google, Yahoo, and many other online sites sort and individualize our "personal internet experience," with the mostly invisible (to us) result that we are shown only what we WANT to see rather than what we NEED to see.
Not only do we only see what is let INTO our "filter bubble"; we don't get to see what is filtered out.
In the video, the speaker shows two friends' simultaneous Google search results for "Egypt" during the revolution there. One friend's results were full of news and politics and revolution; the other's included nothing related to it.
We used to have human editors serving as gatekeepers to public information. Now we have programs (algorithms) that feed us back ourselves. Additionally, these programs don't have the kind of embedded ethics that human editors did.
What are we missing when we can't see beyond our individual horizon?
We lose a sense of public life and civic responsibility - and we need some controls over what gets through and what doesn't.
We need programming (algorithms) keyed not just to relevance, but also to things that are uncomfortable, challenging, and important.
I challenge you to watch and comment on what you think we are missing or losing, and on how we can address these problems through changed habits, different programming, a rediscovery of ethics and the public good, etc. Share your thoughts!
Update: on my FB post linking to this article, I received an excellent comment from Grant Sutphin, who is a Presbyterian pastor in Statesville, NC. Here is a part of his post I'd like to share and then respond to:
I think you're principally promoting the presenting issue of this talk rather than its conclusions, but I feel I should point this out:
Internet companies are just that: companies. They are market-driven entities that will make whatever decisions maximize their market-determined value to their shareholders. Imploring corporations to be altruistic and have at their core a desire toward building a more perfect society is a distraction from what we as pastors know to be the real issue.
It is the individual who shows a steady disdain toward world events in their internet behavior who determines how the Google algorithm will then lead them away from news stories in the future. Pariser's own newsfeed begins to weed out conservative opinion not randomly but after he's clicked exclusively on progressive articles for a period of time. He might want to read the conservative headlines, but it would seem he doesn't feel the need to look any deeper by clicking the links. If he did, the algorithm would continue to promote them.
Sin will always draw the individual away from community, away from otherness, away from confrontation. The sin isn't in the corporation, it's in the people, and that's where you have to go (Christ before you and beside you) to get it out. Regulating internet companies so they more readily show content unrelated to previous searches won't solve the strong desire we have toward alienation. We should never believe that a more altruistic Google or Yahoo will cure us of this.
I responded:

I agree, which is probably why I focused more on the presenting issue than on the speaker's solution. I don't think pleading with corporations to employ programmers who encode ethics is the solution (or reasonable in any way). The speaker also notes that while he holds up human newspaper editors as capable of gatekeeping, they were historically no better - they, too, served the needs of their employers.
My intent was to note the problem - our tendency toward self-preoccupation, made significantly easier and more accessible by the Internet. It's not unlike what having Star Trek-style food replicators in the home might do for the gluttonous among us. (Lord, help us!) We are all the more in trouble when we allow technology to feed and enable our sin.
You rightly point us toward what is needed - a spiritual remedy for sin and for our bent towards sinning.