Filter Bubbles – Eli Pariser TED Talk
May 15, 2011
I started watching this video because the link said something rude about Facebook, and I’ll read anything that has something rude to say about Facebook. One of the earlier iterations of this blog had much more on my burning antipathy for Facebook; unfortunately those posts died in a 408 death-of-server fire. The basic gist is that Facebook decided I was a troll because I’d been researching trolls and was friends with/interacted with some trolls on my research accounts. They disabled my research accounts, no big deal, but then around Christmastime did the same to my rl account because apparently I’m some sort of criminal cyberbully. At first I was like hey, but then I realized how much I didn’t care.

I still needed access to RIP pages, though, so I immediately set up a new, and this time totally legit, research account: I used my real name and gave my phone number and everything. For the last few months I used that account to observe and connect with some troll contacts, and never posted a single thing to a single place save the research group I was admin’ing, yet recently I was b& again because………no idea. In this particular case I was not in violation of their stupid TOS, so who knows what kinds of weird algorithms they were running or why my account got tangled in the tripwire.
tl;dr, I can’t stand Facebook as a company or a platform, and was like YOU GO GIRL when Pariser started talking about the ways in which FB limits the kinds and amount of information users encounter — an analysis based on Facebook’s recent push to magically edit users’ news feeds for them (without asking first). As Zuckerberg explained to a reporter, “a squirrel dying in your front yard may be more relevant to your life than people dying in Africa,” implying that people shouldn’t have to deal with news that might challenge or undermine or even simply diverge from their chosen spheres of interest.
The fascinating (and unsettling) thing is that Facebook is hardly alone; Google is also guilty, as is Yahoo News, as is any company that, in the service of “personalization,” ends up cordoning users into their own respective echo chambers. Pariser isn’t arguing that algorithms should be banished from the web or that they’re “bad,” but that developers need to give users more control and, in effect, embed some sort of social ethic into the code itself.
So I guess hating Facebook really does pay off; I learn all kinds of interesting things in the process.