Facebook’s News Feed Experiment Probed by U.K. Regulators
Facebook Inc. (FB) is being investigated by the U.K.’s data-protection authority after a published study revealed a psychological experiment that manipulated what users saw in their news feeds, raising fresh privacy concerns.
A company researcher apologized on June 29 for a test in January 2012 that altered the number of positive and negative comments that almost 700,000 users saw on their online feeds of articles and photos. Disclosure of the experiment prompted some members to express outrage on Twitter, describing the research as a breach of privacy.
Regulators may want to examine whether Facebook users should have been informed of the experiment and what the company’s purpose was in collecting information, said Paul Van den Bulck, a lawyer at McGuireWoods LLP in Brussels.
“If there was consent it was very broad in the terms and conditions,” said Van den Bulck. “Is that a true consent? I’m not sure. Fair and lawful processing means it must be transparent.”
The U.K. Information Commissioner’s Office, or ICO, said yesterday it will speak with Facebook and work with the Irish Data Protection Commissioner, the company’s lead regulator in Europe, to learn more about the circumstances.
The Irish regulator, which governs Facebook’s compliance with EU privacy law, “has been in contact with Facebook on privacy issues including consent in relation to the research” and expects a comprehensive report from the company, said John O’Dwyer, a spokesman for Ireland’s authority.
Facebook “communicated poorly” about the experiment, Chief Operating Officer Sheryl Sandberg said today at a New Delhi event to promote her book “Lean In: Women, Work and the Will to Lead.”
The probe of the social network was reported earlier by the Financial Times. The ICO is investigating whether the company broke data-protection laws, though it’s too early to tell what part of the law Facebook may have infringed, the paper reported.
“It’s clear that people were upset by this study and we take responsibility for it,” said Richard Allan, a spokesman for Facebook in the U.K., in an e-mailed statement. “We want to do better in the future and are improving our process based on this feedback. The study was done with appropriate protections for people’s information and we are happy to answer any questions regulators may have.”
According to a study published June 17 in the Proceedings of the National Academy of Sciences, the number of positive and negative comments that users saw on their news feeds was changed in January 2012. People shown fewer positive words were found to write more negative posts, while the reverse happened with those exposed to fewer negative terms, according to the trial, which was run on randomly selected Facebook users.
The data showed that online messages influence readers’ “experience of emotions,” which may affect offline behavior, the researchers said.
In a statement on June 29, Facebook said that none of the data in the study was associated with a specific person’s account. Research is intended to make content relevant and engaging, and part of that is understanding how people respond to various content, the Menlo Park, California-based company said.
“We carefully consider what research we do and have a strong internal review process,” Facebook said at the time. “There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”