Studies Without Consent
A few weeks back, Facebook published a paper describing a study it had performed on 600,000 of its users without informing them. Facebook increased (or decreased) the number of positive or negative terms showing up in certain users' feeds, then monitored those users' own posts to see whether their mood was affected by what they saw. (The answer: it was.)

The overwhelming reaction, as expected, was fury. John Gruber ranted: "Yes, this is creepy as hell, and indicates a complete and utter lack of respect for their users' privacy or the integrity of their feed content." But Jesse Singal wondered what Facebook had actually done differently: "So the folks who are outraged about Facebook's complicity in this experiment seem to basically be arguing that it's okay when Facebook manipulates their emotions to get them to click on stuff more, or for the sake of in-house experiments about how to make content 'more engaging' (that is, to find out how to get them to click on stuff more)."