"For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement." -- from Facebook's Data Use Policy, https://facebook.com/about/privacy/your-info

Every Facebook user agrees to this policy when they accept the social networking giant's Terms and Conditions: https://facebook.com/legal/terms

This weekend, millions of people became aware of what that line actually means, in practice: Facebook staff can and do use your data for internal research, including showing users differing combinations of a reported 1,600-odd elements in the 2014 newsfeed. Robinson Meyer wrote one of the best short summaries:

For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves.

This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine "emotional contagion," as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.

The experiment is almost certainly legal. In the company's current terms of service, Facebook users relinquish the use of their data for "data analysis, testing, [and] research." Is it ethical, though? Since news of the study first emerged, I've seen and heard both privacy advocates and casual users express surprise at the audacity of the experiment.
theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/

In sum, data scientists at the world's largest, most powerful social media company experimented on human subjects (aka users) without their knowledge -- without the informed consent required of social scientists or the internal check of an institutional review board -- and published an academic paper about it. The ethical issues are at the heart of the reaction that has followed. A new technology hasn't changed existing standards, just the ease with which a powerful entity -- in this case, a technology company -- and the people within it can make choices that encroach upon or violate existing norms. techrepublic.com/article/disruptive-technologies-pose-difficult-ethical-questions-for-society/

My sense is that the public outrage that has followed stems from a feeling of betrayal, both from people whose fears about Facebook's manipulation of what they see in the newsfeed have been confirmed and from others who were at least confident that there wasn't intentional manipulation of their emotions by the platform in question. It's a story where academic and corporate ethical worlds have collided, as Professor Ed Felten explained: https://freedom-to-tinker.com/blog/felten/facebooks-emotional-manipulation-study-when-ethical-worlds-collide

…the core of the objection to the research is that the researchers should not have been the ones deciding whether those benefits justified exposing subjects to the experiment's possible side-effects. The gap between industry and research ethics frameworks won't disappear, and it will continue to cause trouble. There are at least two problems it might cause. First, it could drive a wedge between company researchers and the outside research community, where company researchers have trouble finding collaborators or publishing their work because they fail to meet research-community ethics standards.
Second, it could lead to IRB laundering, where academic researchers evade formal ethics-review processes by collaborating with corporate researchers who do experiments and collect data within a company where ethics review processes are looser. Will this lead to a useful conversation about how to draw ethical lines that make sense across the full spectrum from research to practice? Maybe. It might also breathe some life into the discussion about the implications of this kind of manipulation outside of the research setting. Both are conversations worth having.

I agree.
Posted on: Mon, 30 Jun 2014 19:16:17 +0000
