Oh, look, that DFH conspiracy website Quartz says that “The US military is already using Facebook to track your mood“:
Critics have targeted a recent study on how emotions spread on the popular social network site Facebook, complaining that some 600,000 Facebook users did not know that they were taking part in an experiment. Somewhat more disturbing, the researchers deliberately manipulated users’ feelings to measure an effect called emotional contagion.
Cornell University, home to at least one of the researchers, said the study received no external funding, but it turns out that the university is currently receiving Defense Department money for some extremely similar-sounding research—the analysis of social network posts for “sentiment,” i.e. how people are feeling, in the hopes of identifying social “tipping points.”
The tipping points in question include “the 2011 Egyptian revolution, the 2011 Russian Duma elections, the 2012 Nigerian fuel subsidy crisis and the 2013 Gazi park protests in Turkey,” according to the website of the Minerva Initiative, a Defense Department social science project…
If the idea of the government monitoring and even manipulating you on Facebook gives you a cold, creeping feeling, the bad news is that you can expect the intelligence community to spend a great deal more time and money researching sentiment and relationships via social networks like Facebook. In fact, defense contractors and high-level US intelligence officials say that social network data has become one of the most important tools they use in collecting intelligence….
The growth of social media has not just changed day-to-day life at agencies like DIA, it’s also given rise to a mini gold rush in defense contracting. The military will be spending an increasing amount of the $50 billion intelligence budget on private contractors to perform open-source intelligence gathering and analysis, according to Flynn. That’s evidenced by the rise in companies eager to provide those services…
Lots of exciting anecdotes on the “One tweet and they can find you” theme at the link.
Meanwhile, Facebook hunts for a medium-strong frowny-face emoticon and says it is sorry, not sorry:
On Wednesday, Facebook’s second-in-command, Sheryl Sandberg, expressed regret over how the company communicated its 2012 mood manipulation study of 700,000 unwitting users, but she did not apologize for conducting the controversial experiment. It’s just what companies do, she said.
“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated,” Sandberg, Facebook’s chief operating officer, told the Wall Street Journal while travelling in New Delhi. “And for that communication we apologize. We never meant to upset you.”…
(I’m sure FB has done the research establishing that a major corporation should always shove a chick out front under these circumstances, because women do the best sorry-not-sorrys.)
And the Wall Street Journal helpfully explains that, hey, FB’s been doing this for years, and there’s nothing you can do about it, citizen:
Thousands of Facebook Inc. users received an unsettling message two years ago: They were being locked out of the social network because Facebook believed they were robots or using fake names. To get back in, the users had to prove they were real.
In fact, Facebook knew most of the users were legitimate. The message was a test designed to help improve Facebook’s antifraud measures. In the end, no users lost access permanently.
The experiment was the work of Facebook’s Data Science team, a group of about three dozen researchers with unique access to one of the world’s richest data troves: the movements, musings and emotions of Facebook’s 1.3 billion users….
Until recently, the Data Science group operated with few boundaries, according to a former member of the team and outside researchers. At a university, researchers likely would have been required to obtain consent from participants in such a study. But Facebook relied on users’ agreement to its Terms of Service, which at the time said data could be used to improve Facebook’s products. Those terms now say that user data may be used for research.
“There’s no review process, per se,” said Andrew Ledvina, a Facebook data scientist from February 2012 to July 2013. “Anyone on that team could run a test,” Mr. Ledvina said. “They’re always trying to alter people’s behavior.”…
Since its creation in 2007, Facebook’s Data Science group has run hundreds of tests. One published study deconstructed how families communicate, another delved into the causes of loneliness. One test looked at how social behaviors spread through networks. In 2010, the group measured how “political mobilization messages” sent to 61 million people caused people in social networks to vote in the 2010 congressional elections…
“Facebook deserves a lot of credit for pushing as much research into the public domain as they do,” said Clifford Lampe, an associate professor at the University of Michigan’s School of Information who has worked on about 10 studies with Facebook researchers. If Facebook stopped publishing studies, he said, “It would be a real loss for science.”…
Hey, as long as “no users lost access permanently”, amirite?