Posted by Jeremy Ryan, Daily Advantage, July 2, 2014

Facebook tries to make users sad, makes them mad instead

Facebook, much like cable companies, is one of the most popular services to complain about and then use anyway (and the complaining about Facebook is probably done on Facebook). Despite that complaining, or perhaps because of it, Facebook feeds are jammed with information. The sheer volume of posts on the service means Facebook has gotten in the habit of curating what appears in users' feeds (choosing to display some posts by some people and not others), and that's what got it into trouble last week.

Facebook revealed that it had treated 700,000 of its users like lab rats, intentionally trying to manipulate their emotions. For one week in 2012, according to a study Facebook probably now wishes it hadn't allowed, the company deliberately manipulated timelines to show negative emotional posts to one group of users and positive emotional posts to another group, then waited a week to see how that emotional shading affected the users' subsequent posts. The goal was to study what it called "emotional contagion."

What a weird week that must have been for the unknowing participants. I wonder how often they thought, "Is it just me, or is the world ending this week?"

We're part of studies all the time without our knowledge (for example, a study of commute times might include everyone driving on a certain highway). But this was a study designed to alter the emotional state of individuals, a much different sort of beast. Facebook literally tried to make people sad, a weird thing for a service to do when every Facebook user knows there are already plenty of people in their feed trying to do the same thing. Facebook said the research was allowed because users had agreed to it in the terms of service, a claim we'll probably hear more and more as companies are tempted to put the massive amounts of data they gather about customers to new uses.
In my view, agreeing to an internet service's terms of service in general should never be allowed to convey informed consent to a psychological research study in particular. Even worse, at the time Facebook carried out the study, the data use policy users agreed to in the terms of service didn't cover research at all. As Forbes' Kashmir Hill reported, the research provision wasn't added until four months later.

Maybe this hits home for me because I'm married to a neuropsychologist who's done her share of research. For all I know, I'm part of some secret experimental project my wife hasn't told me about yet. That might explain those release forms I always have to fill out after dinner.
Posted on: Sat, 05 Jul 2014 14:26:17 +0000
