Shoot the Algorithm

A few days back, Michael Brown, an unarmed black man, was shot dead by cops in a small town called Ferguson in the US. The incident sparked riots and protests, to which the cops then responded with a shock-and-awe approach: they came out dressed like soldiers, sniper tripods and all! Unbelievable.

Then I came across sociologist Zeynep Tufekci's take on it:
“Ferguson is about many things, starting first with race and policing in America. But it’s also about internet, net neutrality and algorithmic filtering.”
Say what??? How did the Internet get dragged into this?

Tufekci said one of her friends wondered whether it would become national news. Another friend responded:
“Yes Ferguson will make news, another friend tweeted, because… well, here you go: Twitter.”
Of course, the friend was right. After all:
“Now, we expect documentation, live-feeds, streaming video, real time Tweets.”
Soon enough, Tufekci found her Twitter feed filled with Ferguson. But not on Facebook:
“And then I switched to non net-neutral Internet to see what was up. I mostly have a similar composition of friends on Facebook as I do on Twitter.
Nada, zip, nada.
No Ferguson on Facebook last night. I scrolled. Refreshed.”
She blames Facebook’s newsfeed algorithm for that: only once Ferguson was trending heavily on Twitter did it start showing up on Facebook as well. That led her to wonder:
“What if Ferguson had started to bubble, but there was no Twitter to catch on nationally? Would it ever make it through the algorithmic filtering on Facebook?...(or) Would Ferguson be buried in algorithmic censorship?”
And so she reminds us:
“Algorithms have consequences.”

But this is nothing new. Eli Pariser coined a term for this very phenomenon, “filter bubble”, and wrote a book of the same name. I even wrote a blog on it last year! Here’s the Wikipedia definition of the term:
“A filter bubble is a result state in which a website algorithm selectively guesses what information a user would like to see based on information about the user (such as location, past click behaviour and search history) and, as a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. Prime examples are Google's personalized search results and Facebook's personalized news stream.”
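The mechanism in that definition is easy to sketch. Here is a toy example (all names and numbers are hypothetical, and a real newsfeed uses far more signals): rank posts by overlap with topics the user engaged with before, and anything outside the user's past interests sinks to the bottom of the feed.

```python
def rank_feed(posts, click_history):
    """Order posts by how many of their topics the user clicked on before.

    This is a deliberately crude stand-in for engagement-based ranking:
    predicted interest is estimated purely from past behaviour.
    """
    def score(post):
        return sum(1 for topic in post["topics"] if topic in click_history)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "ice-bucket", "topics": {"viral", "charity"}},
    {"id": "cat-video",  "topics": {"viral", "pets"}},
    {"id": "ferguson",   "topics": {"news", "protest"}},
]
history = {"viral", "pets", "charity"}  # what this user clicked on before

feed = rank_feed(posts, history)
print([p["id"] for p in feed])  # ['ice-bucket', 'cat-video', 'ferguson']
```

Nothing here "censors" the Ferguson story; it simply never scores well for a user whose history is all viral fluff, which is exactly the bubble being described.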
But I wonder how this bubble is any different from the one created by newspapers, magazines and news channels. Except, of course, this one is created by that Great Evil called Technology. Maybe it’s just me, but if your only source of news is Facebook or Twitter (or social media in general), the problem is with you… but hey, let’s burn the technology witch anyway. Nicholas Carr famously asked:
“Is Google making us stupid?”
Tufekci and Alan Jacobs would have us add Facebook and Twitter to that question.

All this reminds me of Facebook’s trial of adding a “Satire” tag to articles that people link to. People on Slashdot seem to share my view of that idea. A few sample comments:
1) “This is the new media. Clearly label satire; obfuscate native advertising.”
2) “Can they add a "blatant politically motivated lie" tag while they're at it?”
3) “And who determines if the content at that URL is satirical in nature?”

I so agree with this comment on Techdirt on all such criticism of algorithms:
“As a programmer, ranking algorithms are notoriously hard. There will always be bias and mistakes. There can't be a perfect algorithm.”
Just as editors of newspapers and magazines have their own bias and mistakes. But I am sure someone will come up with a reason as to why the slant of Old Media is not only different, but also better…
