A smartphone user logs into his Facebook account in Rio de Janeiro, April 15, 2013. (Reuters)
Canada's Privacy Commissioner will press Facebook Inc. for more details about a controversial experiment that manipulated the social network's content to influence users' emotions, saying the study "raises some questions."
The federal watchdog's response follows reports that the British Information Commissioner's Office is probing whether the study on "emotional contagion" violated the United Kingdom's data-protection laws by performing a psychological test on Facebook's users.
The study, published in collaboration with two researchers from Cornell University, has angered Facebook users and put the company on the defensive. In January, 2012, Facebook filtered either positive or negative posts from the news feeds of nearly 700,000 randomly chosen users, without telling them, to see whether it could make those users more or less happy.
"We are aware of the issue; it raises some questions that we are following up on," Valerie Lawton, a spokesperson for the Privacy Commissioner of Canada said in an e-mail. "We will be contacting Facebook to seek further details related to this research and we will be in touch with some of our international counterparts about the matter."
In a paper published last month, Facebook data scientist Adam Kramer and two university researchers, Jamie E. Guillory and Jeffrey T. Hancock, concluded that users who saw more positive content as a result of filtering were more upbeat in their own postings, while the inverse was also true. The researchers analyzed a total of three million posts containing more than 122 million words. "The results show emotional contagion," the paper says.
The aim of the study, Facebook explained, was to understand Facebook's emotional impact and test the popular notion that seeing friends posting positive messages left some users feeling negative or left out.
The social networking giant maintains it was within its rights to conduct the study thanks to provisions in its Data Use Policy, which users agree to when they sign up. The policy says user information can be used "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."
But in an e-mail on Wednesday, a Facebook spokesperson acknowledged, "It's clear that people were upset by this study and we take responsibility for it," saying the company is "improving our process based on this feedback."
The spokesperson also said, "We are happy to answer any questions regulators may have."
Facebook's chief operating officer, Sheryl Sandberg, told television network NDTV in India that "we clearly communicated really badly about this and that we really regret." Later she added: "Facebook has apologized and certainly we never want to do anything that upsets users."
On Sunday, data scientist Mr. Kramer also apologized online. "My co-authors and I are very sorry for the way the paper described the research and any anxiety it caused," he wrote in a Facebook post.
"In hindsight, the research benefits of the paper may not have justified all of this anxiety."
But his apology has done little to quell outrage from users and privacy advocates who have vented their anger on Facebook. And the controversy has led some academics to question whether the study adhered to ethical standards.
In a statement, Cornell University distanced itself from the study, noting that because Prof. Hancock had access only to results and not to individual data, the university's Institutional Review Board decided the research did not need to be reviewed by its Human Research Protection Program.
With files from Globe and Mail reporter Tu Thanh Ha, Reuters and Associated Press
Editor's note: A previous version of this story said Facebook's experiment took place in January. In fact it took place in January, 2012.