Topic 4: The Influence of Facebook on Your Opinions, Emotions and Actions (Ethical Issues of Social Media)

Social media sites such as Facebook are a large part of many people’s everyday lives. Of these sites, Facebook is by far the most popular, with more than 1 billion daily active users on average (Facebook, 2015). Facebook has seen massive growth in the last decade: according to research by the Pew Research Center, in 2005 only 10% of online adults used at least one social networking site, with Facebook being just one of many (Perrin, 2015). By 2015, 72% of online adults surveyed used Facebook (Duggan, 2015). Facebook now plays many roles, including social network, source of entertainment, video sharing platform, and news publisher.

40% of all news traffic now originates from Facebook (Timm, 2016).

That is a massive share, and it gives Facebook a huge amount of power over the distribution of news globally, but especially in the U.S.A.


See this video I made on the Facebook News Feed algorithm, why it exists and how it works:
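As a rough illustration of the kind of ranking the video describes, here is a highly simplified sketch in the spirit of Facebook’s publicly discussed “EdgeRank” factors: affinity (how close you are to the poster), content-type weight, and time decay (Luckerson, 2015; Oremus, 2016). The real News Feed algorithm uses thousands of signals; every name, category, and weight below is a purely illustrative assumption, not Facebook’s actual values.

```python
import math
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    kind: str        # e.g. "photo", "status", "link"
    hours_old: float

# Illustrative assumptions only -- not Facebook's real weights:
AFFINITY = {"close_friend": 1.0, "acquaintance": 0.4, "page": 0.2}
WEIGHT = {"photo": 1.5, "status": 1.0, "link": 0.8}

def score(post: Post, relationship: str) -> float:
    """EdgeRank-style score: affinity * content weight * time decay."""
    affinity = AFFINITY.get(relationship, 0.1)
    weight = WEIGHT.get(post.kind, 1.0)
    decay = math.exp(-0.1 * post.hours_old)  # newer posts score higher
    return affinity * weight * decay

feed = [
    (Post("Alice", "photo", 2.0), "close_friend"),
    (Post("BrandPage", "link", 0.5), "page"),
    (Post("Bob", "status", 24.0), "acquaintance"),
]
# Rank the feed: highest score (most "relevant") first
ranked = sorted(feed, key=lambda pr: score(*pr), reverse=True)
```

Even in this toy version, the point of the post is visible: nudging a single weight (say, boosting or suppressing one content type or source) silently reorders what every user sees first.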

History has shown the influence Facebook can have on its users simply by making small changes to this algorithm. One study by Kramer, Guillory, and Hancock (2014) demonstrated emotional contagion: by adjusting the News Feed so that more positive or more negative posts appeared, Facebook could measurably shift users’ emotional states. Facebook can also turn this power and influence to more political matters. In 2010, Facebook conducted a test in which it increased voter turnout around the country by an estimated 340,000 votes on election day, just by showing users a photo of someone they knew saying “I voted” (Timm, 2016; Markoff, 2012).

With this in mind, the influence Facebook can exert on emotions, opinions and actions, the power it holds as a news publisher, and its ability to filter this news through its algorithm without anyone even knowing it has happened should be cause for concern. Facebook has the theoretical ability to create ideological bubbles by limiting the news it decides to show, without anyone being any the wiser (Oremus, 2016). This has come to the foreground in recent discussions of the 2016 U.S. Presidential Election (Nunez, 2016a, 2016b). As Nunez (2016a) puts it,

“[Facebook is] where we get our news, share our political views, and interact with politicians. It’s also where those politicians are spending a greater share of their budgets.”


This issue is so important because Facebook is protected by the First Amendment right to freedom of speech, so it is legally entitled to publish and filter whatever news it wishes, as long as it is not working directly with a political candidate. This makes it an ethical issue determined entirely by what Facebook chooses to do. Facebook has declared that:

“we as a company are neutral – we have not and will not use our products in a way that attempts to influence how people vote” (Nunez, 2016b; Ranj, 2016).

 

References

Anderle, M. (2015, October 15). How Facebook and Google’s Algorithms Are Affecting Our Political Viewpoints. Retrieved April 2016, from The Huffington Post: http://www.huffingtonpost.com/megan-anderle/how-facebook-and-googles-_b_8282612.html

Duggan, M. (2015, August 19). Mobile Messaging and Social Media 2015. Retrieved April 2016, from Pew Research Center: http://www.pewinternet.org/2015/08/19/mobile-messaging-and-social-media-2015/

Facebook. (2015, December 31). Company Info. Retrieved April 2016, from http://newsroom.fb.com/company-info/

Facebook. (n.d.). How News Feed Works. Retrieved from Facebook Help Center: https://www.facebook.com/help/327131014036297/

Kramer, A., Guillory, J., & Hancock, J. (2014). Experimental evidence of massive-scale emotional contagion through social networks. PNAS, 111(24), 8788–8790.

Luckerson, V. (2015, July 9). Here’s How Facebook’s News Feed Actually Works. Retrieved April 2016, from Time: http://time.com/3950525/facebook-news-feed-algorithm/

Markoff, J. (2012, September 12). Social Networks Can Affect Voter Turnout, Study Says. Retrieved April 2016, from The New York Times: http://www.nytimes.com/2012/09/13/us/politics/social-networks-affect-voter-turnout-study-finds.html?_r=0

Nunez, M. (2016a, April 15). Facebook Employees Asked Mark Zuckerberg If They Should Try to Stop a Donald Trump Presidency. Retrieved April 2016, from Gizmodo: http://gizmodo.com/facebook-employees-asked-mark-zuckerberg-if-they-should-1771012990

Nunez, M. (2016b, April 15). Facebook Says it Doesn’t Try to Influence How People Vote. Retrieved April 2016, from Gizmodo: http://gizmodo.com/facebook-says-it-doesnt-try-to-influence-how-people-vot-1771276946?rev=1460755179651

Oremus, W. (2016, January 3). Who Controls Your Facebook Feed. Retrieved April 2016, from Slate: http://www.slate.com/articles/technology/cover_story/2016/01/how_facebook_s_news_feed_algorithm_works.html

Perrin, A. (2015, October 8). Social Media Usage: 2005-2015. Retrieved April 2016, from Pew Research Center: http://www.pewinternet.org/2015/10/08/2015/Social-Networking-Usage-2005-2015/

Ranj, B. (2016, April 15). Facebook promises not to use its product to influence the US presidential election. Retrieved April 2016, from Business Insider: http://uk.businessinsider.com/facebook-promises-not-to-use-its-product-to-influence-the-us-presidential-election-2016-4?r=US&IR=T

Timm, T. (2016, April 19). You may hate Donald Trump. But do you want Facebook to rig the election against him? Retrieved April 2016, from The Guardian: http://www.theguardian.com/commentisfree/2016/apr/19/donald-trump-facebook-election-manipulate-behavior?CMP=fb_gu

 

Image: pixabay.com – Public domain, infographic images: http://www.mushroomnetworks.com/infographics/social-media-news-outlets-vs.-traditional-news-infographic


9 Comments

  1. theo says:

    Hi Ellie,
    I found your post really interesting, as Facebook is all over the world nowadays, each and every one of us uses it, and it is always good to read about. Although I had some idea of how the whole news-feed system probably works, I had never actually researched the topic in concrete detail before. Watching your video was stimulating, and made me realise how complex and smart this algorithm behind the news feed is. Facebook seems to have some super power with all its influence on people’s emotions, opinions and actions, as you write. Facebook, however, seems to work for good causes, as it showed when the recent terror attacks happened and it introduced the “safety check-in” feature. As long as Facebook sticks to the news-delivery role only, and stays neutral in every dubious case it is possibly involved in, such as politics, too much harm shouldn’t threaten us or the concerned parties. However, have you ever thought about what would happen if Facebook’s direction got into the hands of terrorists…?
    Looking forward to reading your future posts. X


    1. Hi Theo,
      Thanks for your comment, I’m really glad you found the video and post interesting and informative! I think you are right that Facebook does have some great features, such as the safety check-in; the problem is what happens if they don’t stay neutral. This is problematic because they could change their policy, and if they did we would have no real way of knowing. With regards to your question about terrorists, I think whilst they can and do use Facebook, it would be from a user perspective, for communication purposes. I do not think it at all likely that they would take control of Facebook and use the News Feed to influence people, for a number of reasons, including Facebook’s extensive security measures and the fact that the algorithm is so complicated that they would not be able to change it. So to answer your question, I don’t think that would ever happen.

      Thanks again for your comment!


  2. Clayton says:

    Hi Ellie,

    As you know I am very passionate when talking about the power and potential that Facebook can have. I truly feel like it can make big changes in people’s opinions and behaviours and with a case like this that feeling is verified.

    The fact that Facebook has the potential to influence voting results/turnout is both amazing – in that there is even a way this is possible – and understandably scary – in that a private, for-profit company has the ability to do so, unconfined by law.

    When we discussed this with Susan, she asked whether people would be concerned by this and stop using Facebook. She also expressed that she thought this was a scary prospect. Whilst I do agree somewhat, for the reason I expressed above, I can’t help but feel that this is the same power held by newspapers and news organisations, from which a lot of people get their daily information. The only difference I see is that Facebook says that it does not sway results, whilst newspapers overtly support or slander political parties.
    Also many companies do have somewhat of a political stance and employ people to lobby governments but would Facebook doing such a thing be a step too far?


    1. Hi Clayton,
      Thanks for your comment. I do think it is scary, but you are right that newspapers etc. do this and Facebook says it doesn’t. I think the concern with Facebook is the pretty much unchallenged power it has; for example, if it swayed left, there isn’t an equivalent site which sways right. Perhaps if they were more explicit about their leanings and the parties they supported, this would help, as it would allow users to see the potential biases in what they see. But other than that, I think we just have to take them at their word and hope they don’t change their policy.


  3. Hey Ellie,

    Great blog post again! I was aware of Facebook’s marketing approaches to tailor what we see on FB, influenced by our recent Google searches and interests, however I wasn’t aware of the full extent to which they control our media. It all makes sense that I will only see popular posts or posts that have already got more attention. Your pictures and videos were very useful and clear, and really emphasise the extent of who is affected. With so many FB users and so much data being used (with many sites linking to FB accounts these days), do you think the use of our data will be something to be concerned about in the near future? We have talked about data collection in previous topics, but FB takes it a step further in deciding and dictating what we see. Although FB is only showing posts that already exist, I feel like discretion should be used – e.g. if a bad event happens or if someone is getting bullied?

    I look forward to the next post!


    1. Hi,
      Thanks for your comment! I’m really glad that you found my post informative. I think the use of our data is definitely a concern and has been for a while, and I think you are right that Facebook takes it even further. You are also right that some posts, such as bullying posts and graphic videos, become popular when they shouldn’t, and as much as Facebook tries to deal with this, there are too many for them to handle, so they rely on self-reporting by users marking the posts as inappropriate.

