It's the Algorithm, Stupid!
Like it or not, as everything goes digital, algorithms (software rules) will make more and more decisions for us. Every now and then, those rules will misfire. Note that this is totally different from good old bugs in the software; the issue is the algorithm not being appropriate in a specific context or time period. Let's take two very recent examples of this problem.
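Before the examples, a toy sketch of that distinction might help (everything below is invented for illustration, not any real platform's code): the rule executes exactly as written, with no bug in sight, yet it misfires because context never enters the decision.

```python
# Hypothetical content-moderation rule -- all names and thresholds invented.
# The code is bug-free; the problem is that the rule itself ignores context.

def should_remove(image_scores: dict, context: str) -> bool:
    """Remove any image whose classifier score crosses a fixed threshold."""
    NUDITY_MINOR_THRESHOLD = 0.8  # invented value

    score = image_scores.get("nudity_minor", 0.0)
    # `context` is accepted but never consulted, so a historic war photo
    # and genuinely abusive imagery get exactly the same verdict.
    return score > NUDITY_MINOR_THRESHOLD

# Same score, wildly different contexts, identical outcome:
print(should_remove({"nudity_minor": 0.93}, context="1972 war photojournalism"))  # True
print(should_remove({"nudity_minor": 0.93}, context="abuse imagery"))             # True
```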
Remember that famous 1972 pic from the Vietnam War of a naked, 9-year-old girl fleeing a napalm attack? When the Norwegian newspaper Aftenposten posted that pic on Facebook (in the context of the horrors of war), Facebook's algorithms removed it because it was (mis)identified as child porn. An honest mistake, obviously. But Aftenposten's editor, Espen Egil Hansen, wrote an open letter to Mark Zuckerberg:
“Please try to envision a new war where
children will be the victims of barrel bombs or nerve gas. Would you once again
intercept the documentation of cruelties, just because a tiny minority might
possibly be offended by images of naked children, or because a paedophile
person somewhere might see the picture as pornography?”
OK, Hansen, let's roll back the rhetoric. After all, it's not as if Mark Z personally pulled down that pic on Facebook. And, as Hansen himself says later in his letter, to take or not to
Facebook. And, as Hansen himself says later in his letter, to take or not to
take offense varies across cultures, religions and regions. Citing the Mohammed
cartoons example, Hansen rightly says:
“It was – and remains – different in Oslo
and Karachi.”
Exactly. So how
can Facebook possibly make a call that works for everyone?!
Personally, I
think Facebook screwed this up. But I don’t see any solution: such things will
happen now and then. If Facebook screws up too often, market forces will punish
it: that’s the way the public and capitalism work. Europe repeatedly has
trouble understanding this basic principle, tends to get on its moral high
horse, and doesn’t mind if
countries (as opposed to companies) censor:
“Under the proposed law, the "site
manager" of Italian media… would be obliged to censor "mockery"
based on "the personal and social condition" of the victim -- that
is, anything the recipient felt was personally insulting… Truthfulness is not a
defense in suits under this law -- the
standard is personal insult, not falsehood.”
Apparently
Hansen is OK with views being suppressed, just not when it’s done by Facebook.
The second example of algorithms creating bad press came after the recent explosion in New York. Surge pricing kicked in, and people accused Uber of profiting from a bomb explosion. Uber quickly deactivated the algorithm for that area.
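Uber's real dispatch system isn't public, but as a rough sketch of what "deactivating the algorithm for that area" might look like, here is a hypothetical geofence override that forces the multiplier back to 1.0 inside a designated emergency zone. All names, coordinates, and formulas are invented for illustration.

```python
import math

# Hypothetical geofenced surge override -- not Uber's actual logic.
# Coordinates, radii, and the pricing formula are all invented.

EMERGENCY_ZONES = [
    # (latitude, longitude, radius in km) -- e.g. around the blast site
    (40.7440, -73.9980, 2.0),
]

def km_between(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, via the haversine formula."""
    earth_radius_km = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

def surge_multiplier(demand, supply, lat, lon):
    """Naive demand/supply surge, forced to 1.0 inside any emergency zone."""
    for zone_lat, zone_lon, radius_km in EMERGENCY_ZONES:
        if km_between(lat, lon, zone_lat, zone_lon) <= radius_km:
            return 1.0  # emergency override: no surge here
    # Invented baseline: price scales with the demand/supply ratio, capped at 3x.
    return min(3.0, max(1.0, demand / max(supply, 1)))

# Near the blast: the override wins even though demand swamps supply.
print(surge_multiplier(demand=500, supply=40, lat=40.7450, lon=-73.9970))  # 1.0
# A few miles away: normal surge pricing applies.
print(surge_multiplier(demand=500, supply=40, lat=40.6800, lon=-73.9400))  # 3.0
```

Of course, someone still has to decide when and where to flip that switch, which is exactly the kind of context call the algorithm can't make on its own.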
algorithm for that area. As a wag
on Slashdot suggested, perhaps Uber could have tried putting a positive
spin on the episode:
“Headline should read "Uber
Increases Driver Pay to Help Meet Emergency Demand."”