Back in 2012 we reported on a surprising anti-Big Brother surveillance patent that Apple was granted by the US Patent Office. At the time we noted that Apple's patent read like a science fiction novel borrowing from George Orwell's 1949 book "Nineteen Eighty-Four." It also had shades of the 1982 film "Blade Runner," in which false memories were implanted into replicants so as to provide them with a confident self-image. Now leaping to last night's report from the Wall Street Journal, we learn that a social-network furor has erupted over news that in 2012 Facebook conducted a massive psychological experiment on nearly 700,000 unwitting users. Society has all but accepted that governmental organizations and agencies like the NSA are capable of such deception and manipulation. Yet when we hear that Facebook, a somewhat respected social media pioneer, has resorted to such tactics, we see an abuse of power and privilege aimed at future monetary gain. There's absolutely no justification for this kind of study, and it makes you wonder how many other tech companies are doing the very same thing under the guise of doing it for our benefit.
The Wall Street Journal reports that "To determine whether it could alter the emotional state of its users and prompt them to post either more positive or negative content, the site's data scientists enabled an algorithm, for one week, to automatically omit content that contained words associated with either positive or negative emotions from the central news feeds of 689,003 users.
The research, published in the March issue of the Proceedings of the National Academy of Sciences, sparked a different emotion—outrage—among some people who say Facebook toyed with its users' emotions and used members as guinea pigs.
"What many of us feared is already a reality: Facebook is using us as lab rats, and not just to figure out which ads we'll respond to but actually change our emotions," wrote Animalnewyork.com in a blog post that drew attention to the study Friday morning.
Facebook has long run social experiments. Its Data Science Team is tasked with turning the reams of information created by the more than 800 million people who log on every day into usable scientific research.
On Sunday, the Facebook data scientist who led the study in question, Adam Kramer, said he was having second thoughts about this particular project. "In hindsight, the research benefits of the paper may not have justified all of this anxiety," he wrote on his Facebook page.
Comments from Facebook users poured in Sunday evening on Mr. Kramer's Facebook page. The comments were wide-ranging, from people who had no problem with the content, to those who thought Facebook should respond by donating money to help people who struggle with mental health issues."
According to Business Insider, Facebook says it does research like this experiment to figure out how to make the content people see on Facebook as relevant as possible. A spokesperson sent Business Insider the following comment:
"This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."
As one Facebook user said, "I appreciate the statement, but emotional manipulation is emotional manipulation, no matter how small of a sample it affected."
CNBC's report posted earlier today noted a tweet from privacy activist Lauren Weinstein: "I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it's possible." Although that's a little over the top, it's just part of the backlash that's bound to increase as the story reaches more eyes.