Is Facebook Messing with You?

November 18, 2017

       

Photo Source: Flickr

     In 2014, Facebook landed on TV and cable news and on the front pages of many newspapers following a publication in the Proceedings of the National Academy of Sciences detailing the results of an experiment conducted on Facebook users. The article, titled “Experimental evidence of massive-scale emotional contagion through social networks” (PNAS, 2014), describes how researchers at Facebook, Cornell University, and the University of California, San Francisco used computer algorithms to manipulate individual News Feeds on Facebook to see whether they could affect users’ emotions. Previous sociology research had shown that emotional states like depression or happiness spread through personal social networks in a process called “emotional contagion.” Although researchers accepted that emotional contagion occurs among people interacting directly in the physical world, they did not know whether it also worked over social media. Facebook, one of the world’s largest social networking and service companies with over two billion users worldwide, commanded a unique position to test whether emotional contagion would work through social networks accessed on computers and smartphones.

     Facebook users post messages, links, photos, and videos that get shared with their friends. Their friends see these posts on what Facebook calls the News Feed, where people express their reactions to friends’ posts through written comments, shares, and likes. Facebook uses computer algorithms to decide what appears on each person’s News Feed, favoring topics it predicts will engage users based on their past preferences. In other words, Facebook already manipulates your News Feed to keep you engaged with Facebook. The researchers decided to intentionally manipulate the News Feeds of hundreds of thousands of users to see whether they could make those users happier or sadder by selecting more positive or more negative content for their individual News Feeds. The research showed that presenting more negative material on a News Feed produced more negative emotions and posts, whereas more positive material elicited more positive subsequent posts, demonstrating for the first time that emotional contagion works across social media.
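     To make the mechanism concrete, below is a minimal sketch of how a feed could be tilted toward one emotional tone by silently omitting a fraction of posts classified as positive or negative. It is an illustration only: the post records, the classify_sentiment() helper, and the omission rate are hypothetical stand-ins, not Facebook’s actual News Feed code (the published study reportedly classified posts by counting emotion words and withheld a percentage of emotional posts from each feed).

# Minimal illustrative sketch, NOT Facebook's actual code, of biasing a feed
# toward one emotional tone by omitting some posts of the other tone.
# The post records, word lists, and omission rate are hypothetical.
import random

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "awful"}

def classify_sentiment(text):
    # Crude stand-in for a real classifier; shown only to make the idea concrete.
    words = set(text.lower().split())
    if words & NEGATIVE_WORDS:
        return "negative"
    if words & POSITIVE_WORDS:
        return "positive"
    return "neutral"

def build_feed(candidate_posts, suppress="negative", omission_rate=0.5):
    """Return a feed with a fraction of posts of the suppressed tone left out."""
    feed = []
    for post in candidate_posts:
        if (classify_sentiment(post["text"]) == suppress
                and random.random() < omission_rate):
            continue  # silently drop this post from the user's feed
        feed.append(post)
    return feed

posts = [
    {"id": 1, "text": "What a wonderful day with friends"},
    {"id": 2, "text": "Feeling sad and angry about the news"},
    {"id": 3, "text": "Lunch at noon, anyone?"},
]
print(build_feed(posts, suppress="negative", omission_rate=1.0))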

 

     A number of researchers and pundits criticized Facebook for conducting research on its users without disclosing the program. Ordinarily, research on human subjects must be reviewed by an ethics committee, and the subjects must consent to the investigation. However, as a private company, Facebook claims it is not bound by the same rules as public institutions like universities, and that Facebook users consented to the use of their data for research when they signed up with the company. The emotional contagion study was neither the first nor the last time Facebook performed research on its vast social network.

     In 2012, Facebook published a paper in the prestigious journal Nature detailing an experiment it ran to understand how to influence voter turnout in the 2010 United States midterm elections. The paper, titled “A 61-million-person experiment in social influence and political mobilization” (Nature, 2012), details a simple experiment. On election day, Facebook placed one of two messages at the top of the News Feed of every eligible US voter on Facebook, more than 61 million people at the time. One message, called the “Informational Message,” noted that it was election day, provided a link to polling information, and included a button labeled “I voted.” The other message, called the “Social Message,” had the same information and button but added pictures of the user’s friends who had already clicked the “I voted” button. The simple addition of pictures of friends who had already voted nudged an additional 340,000 US citizens to vote in that election. The authors suggest that this single message, shown on a single day, may have accounted for much of the 0.6% growth in voter turnout between 2006 and 2010.

     Facebook has also studied reader self-censorship. Anya Zhukova reported on MUO that Facebook analyzed the comments and posts that people write but never submit (MUO, 2017). Facebook found that 71% of its users self-censor; in other words, an overwhelming majority write things that they feel but never post. It is also worth knowing that Facebook saves everything: even when you don’t send it, it gets recorded.
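     For a sense of scale of the voter-turnout experiment described above, here is a hypothetical sketch of the kind of randomized message assignment such a study involves, along with the back-of-the-envelope arithmetic: 340,000 extra voters out of roughly 61 million recipients works out to about half a percent of the people shown a message. The field names, the 50/50 split, and the helper functions are assumptions for illustration, not the paper’s actual implementation.

# Hypothetical sketch of randomized assignment in a turnout experiment.
# The 50/50 split, banner fields, and helpers are illustrative only.
import random

def assign_message(user_id, friends_who_voted):
    """Deterministically assign a user to one of the two banner variants."""
    variant_is_social = random.Random(user_id).random() < 0.5  # illustrative split
    banner = {
        "text": "Today is Election Day. Find your polling place.",
        "button": "I voted",
    }
    if variant_is_social:
        banner["variant"] = "social"
        banner["friend_faces"] = friends_who_voted[:6]  # show a few friends' photos
    else:
        banner["variant"] = "informational"
    return banner

def extra_voters_as_share(extra_voters=340_000, users_shown=61_000_000):
    """Back-of-the-envelope: added voters as a share of message recipients."""
    return extra_voters / users_shown

print(assign_message(42, ["Alice", "Bob", "Carol"]))
print(f"{extra_voters_as_share():.2%} of the 61 million recipients")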

 

     Facebook commands a dominant position as one of the world’s largest social networking providers. Setting aside, for now, the ethics of experimenting on its users, Facebook has found through its research that it can have a measurable impact on society by manipulating the information it presents to its users. The voter-turnout experiment pushed 340,000 voters to the polls who would not otherwise have participated. Considering that close elections have been decided by fewer votes than that, it is essential to recognize the power and reach of Facebook. From the beginning, Facebook has used increasingly sophisticated computer programs to sculpt what individuals see on their News Feeds; coupled with the demonstrated effects of emotional contagion, that is reason to exercise some skepticism about what you see on Facebook. A recent study by the Pew Research Center found that roughly 40% of adults in the United States get their news from Facebook. Considering that Facebook’s algorithms strive to keep your attention by showing you what they think you will like, and that Facebook is also motivated to have you click on advertisements, going outside the world of Facebook for entertainment, social contact, and news will open your eyes to a different reality than the one Facebook presents.

 

 
