Facebook promises to review user experiments more carefully

After raising a storm of controversy for manipulating its users' emotions in the name of scientific research, Facebook (FB) announced new steps to consider future studies more carefully.

Facebook was widely criticized in June after researchers published the results of a study that measured the impact of showing almost 700,000 users more positive or negative stories in their news feeds. The study found a tiny but possibly significant impact on users’ happiness from the manipulations.

“We were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism,” Mike Schroepfer, Facebook’s Chief Technology Officer, said in a blog post on Thursday. “It is clear now that there are things we should have done differently.”

Under the new policy, Facebook researchers will have to seek approval for sensitive projects from a new committee that will include members from the company’s engineering, research, legal, privacy and policy teams. Notably absent are any reviewers from outside Facebook.

The company also promised to educate employees better during training about research policies. And Facebook will disclose all research derived from experiments on its users on a new section of its website.

But the new policy did not add a requirement to get users' consent before altering their feeds, one of the most criticized aspects of the happiness study. Facebook's lengthy terms of service, which all users must approve before joining, include consent for "research," though that term was added in 2012, after the controversial emotions study was conducted.

"The policy is a good start, but it falls well short of what Facebook is legally required to do," says James Grimmelmann, a law professor at the University of Maryland. "It still does not treat users as people who are entitled to make their own decisions about whether to take part in research projects."

A member of Facebook’s core data science team and two researchers from Cornell University conducted the study, called "Experimental evidence of massive-scale emotional contagion through social networks," that set off the controversy. For one week in 2012, they altered how many positive or negative items appeared in the news feeds of some 689,003 Facebook users. They then measured how many positive and negative words appeared in users’ own posts. Users who saw fewer positive posts used 0.1% fewer positive words in their own posts, and users who saw fewer negative posts wrote posts with 0.07% fewer negative words, the study found.

Critics said Facebook should have gotten explicit consent from users involved in the study, and noted the potential harm of such manipulations, given that one in ten Americans suffers from a mood disorder that could lead to depression.

Facebook is hardly the only Internet company that runs experiments on its users, though most such efforts are directed at improving business results, not academic research. Twitter (TWTR) announced this week that it was funding a new research center at the Massachusetts Institute of Technology and would give the university access to its entire archive of tweets. The deal did not involve any manipulation of users' Twitter streams.
