Facebook announced on Thursday that future research involving its more than 1.3 billion users will be carefully reviewed and monitored by company executives.
In June, the social networking giant released the results of an experiment in which researchers manipulated the posts displayed on 689,003 users' News Feeds to study how people's emotions were affected by the posts of others. The experiment ran for a week in January 2012 and covered at least 3 million posts containing 122 million words. The psychological experiment drew criticism because users were unaware that they had been included in such large-scale research, and it also raised privacy concerns.
The recent announcement promised that the company's engineers and researchers will undergo research ethics training as part of a six-week boot camp, USA Today reported. But the statement seemed lacking, as Facebook did not involve an outside party to review the changes and presented no clear guidelines. It also did not clarify whether users will be given the option to participate in or opt out of experiments.
"It is clear now that there are things we should have done differently. For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it," wrote Facebook Chief Technology Officer Mike Schroepfer.
Academics and consumer advocates said the plan could be a positive step toward transparency, but that Facebook needs to do more if it hopes to appease its critics.
"This is a company whose lifeblood is consumer data. So mistrust by the public, were it to reach too critical a point, would pose an existential threat to the company," Ryan Calo, assistant professor at the University of Washington School of Law, told the New York Times.
"Facebook needs to reassure its users they can trust them," he added.