
Emily Moore
March 22, 2018
Bednar
COM

Research Object Analysis Paper

Introduction

During the backlash against Facebook for allowing fake news to circulate during the 2016 election, CEO Mark Zuckerberg made multiple posts responding to the accusations and eventually put policy changes in place at Facebook. This has prompted a discussion on how internet platforms should be regulated, since platforms like Facebook have become a more frequent source of information than most TV news channels. Owners like Zuckerberg refuse to accept blame for situations like this, and the internet has become the only media source not heavily regulated by the government. Should the government step in and regulate these platforms? Is that even possible? And what would that mean for our society and our freedom of speech? In my analysis of Zuckerberg's posts and the surrounding discourse, it has become clear that we are approaching the climax of this argument; the moment when the government and the FCC will either intervene or decline to is coming soon. This debate over platform regulation is unfolding day by day and feeding a larger debate over freedom of speech and the definition of a news source.

A few themes have emerged during my analysis of Zuckerberg's posts and the surrounding discourse. The first, in Zuckerberg's own posts, is a broad lack of accountability. In the larger surrounding discourse, a prominent theme is the questioning of what defines a platform.

Object Description

After the 2016 American presidential election, Facebook faced backlash from the American public over the false news articles it allowed to circulate on its platform during the election. These fake news articles spread inaccurate information and misled the public, and many believe they influenced the election results. As a result, many people who had once relied on Facebook as their source of news lost trust in the platform and demanded something be done about it. On November 12th, 2016, Zuckerberg made his first post in response to the accusations Facebook was receiving over fake news. He begins by mentioning that Facebook is about giving people a voice, and that "Sometimes when people use their voice though, they say things that seem wrong and they support people you disagree with." As for Facebook's part in the spread of fake news, he denies it played a large role, saying, "Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other." He then goes on to outline in detail all the good Facebook has done by existing and by being the place where discussions and conversations about the election happened that could not have happened without the platform.

On November 18th, 2016, Zuckerberg made a follow-up post addressing questions about what Facebook was going to do about the misinformation. He explains the difficulty of making policy changes without infringing on users' voices: "The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible. We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties." Zuckerberg then outlines some small changes Facebook will make to see what works and what doesn't, and stresses that they are working on fixing this problem. On January 19th, 2018, Zuckerberg made a post about the major change they finally made to their policy on the spread of fake news. He explains that Facebook has been struggling with "how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you -- the community -- and have your feedback determine the ranking." Under the new policy, the Facebook community will fill out a survey to determine which news sources are trustworthy enough to be shared virally on Facebook. On March 21st, 2018, Zuckerberg made another post, this one about the Cambridge Analytica situation. He outlines the situation, explaining that in 2013, before Facebook had stricter rules about what information developers could access through a user's profile, a man

named Aleksandr Kogan had acquired detailed information about users and their friends. In 2014, Facebook made new rules that prevented developers like Kogan from getting such detailed information and restricted it to just a user's name and . In 2015, Facebook learned from journalists at The Guardian that Kogan had shared data from his app with Cambridge Analytica; they banned Kogan from Facebook and required both Kogan and Cambridge Analytica to delete the information they had acquired on Facebook's users. On March 17th, 2018, The New York Times and other news sources reported that Cambridge Analytica had not deleted the data acquired from Facebook and had in fact used this illegally acquired data while working for the Trump campaign. Zuckerberg goes on to say they are working on fixing this problem and that "this was a breach of trust between Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that."

Analysis of Themes

Zuckerberg's language in these posts displays many examples of his and his company's lack of accountability when it comes to the content of its users. In his first post responding to the issue, he denies that there is a problem at all and discredits the public's concerns with statistics, saying "more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes" (Zuckerberg, 2016). However, when the concern and backlash did not die down after his first address, he followed up with minor changes that many question ever made a difference. Zuckerberg promised that Facebook would crack down on fakery in a number of ways, including making it easier for people to report fake stories, better software for detecting likely fakes, and making

fakery less lucrative. However, while his second post did give in to the demand for changes, Zuckerberg never admits fault on the platform's part. In fact, Zuckerberg makes it clear that Facebook does "not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties" (Zuckerberg, 2016). In both this post and his previous one, Zuckerberg focuses on giving people a voice and repeats similar phrases multiple times. He is reminding the reader that Facebook, to him, is a platform for social networking and not an information source, and therefore should not be held accountable as one. He again reinforces Facebook's lack of accountability by saying, "We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties" (Zuckerberg, 2016). Zuckerberg uses this language to distance Facebook from other information sources like TV news so that Facebook will not be regulated like them. He shifts the blame onto the users and away from the platform, essentially saying Facebook has little control over content. In his first post, Zuckerberg says, "Sometimes when people use their voice though, they say things that seem wrong and they support people you disagree with" (Zuckerberg, 2016). This once again places blame on the users for the content on the platform and falls in line with the theme of lack of accountability. Zuckerberg uses the angle of protecting his users' freedom of speech to avoid taking responsibility for the fake news articles on Facebook. He is saying that Facebook users have the right to be wrong, and therefore Facebook cannot regulate fake news, since the definition of truth can differ depending on the person and their beliefs.

The lack of accountability continues into this year. Zuckerberg's post from January 2018, made shortly after major changes to Facebook's policies, blatantly demonstrates Facebook placing blame and responsibility on its users. After he explains the problem they face in deciding which news sources are trustworthy, he admits that they would rather have someone else make the decision for them: "We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you -- the community -- and have your feedback determine the ranking" (Zuckerberg, 2018). They are putting the decision-making on their users, so if this change doesn't work, it is not Facebook's fault. However, there is a problem with Zuckerberg's argument that Facebook isn't a news outlet, that its intended purpose is social communication, and that they do not want to be "arbiters of truth." According to the theory of affordances, how this platform asks to be used is different from how it is actually being used. An affordance refers to "the mutuality of actor intentions and technology capabilities that provide the potential for a particular action" (Schrock, 1230). More specifically, a communicative affordance "frames the practices through which technologies come to be involved in the weave of ordinary conduct" (Schrock, 1234). Between Zuckerberg's language reinforcing Facebook's social purpose and his avoidance of accountability in responding to fake news, Facebook is desperately trying to distance itself from being considered a news outlet in order to avoid regulation. News outlets on television and radio are heavily regulated by the FCC, and the internet is the only remaining medium that isn't intensely regulated. This sparks a debate: if a similar regulation system were put in place on platforms like Facebook, would that infringe on our freedom of speech?

This raises the question of whether Facebook could actually be regulated in a manner similar to television and radio. "Regulating Facebook is harder, though, in part because TV's traditional approach, holding control of the airwaves, doesn't really apply when it comes to the Internet. There's also the question of whether it's even possible for Facebook to address some of its problems. After all, it's much easier for TV networks to supervise a series of 30-minute sitcoms than it is for Facebook to police billions of pieces of user-submitted content, a point the social network likes to make when something goes wrong. The fundamental problem, then, may not be that Facebook can't fix its problems but that it won't. Human employees are expensive, and algorithms are cheap. Facebook directly employs only about 20,658 people, roughly one employee per 100,000 users. With so little human oversight and so much automation, public relations crises like the one that surrounded the ads for hate groups are inevitable" (Tufekci, 2017). Tufekci also notes that Facebook users won't simply go elsewhere, because Facebook is the only place where people can find everyone they know; it is the baseline for all generations and populations. This last point is important because it disproves the notion that internet companies must be treated differently than TV networks. Television regulation was written to curb the big broadcasters' monopoly on the airwaves. Facebook, similarly, has an even more powerful monopoly in the social media and internet business. The future of the internet and its regulation depends on how the government proceeds in regards to Facebook, its recurring problems, and how vastly it is affecting the American population. It is undecided whether broadband internet connections will be defined as an essential utility to which everyone has access and for which rates are controlled

(like water or electricity), or an information service for which internet service providers can charge as much as they wish (as with cable TV) (Campbell, 570).

Works Cited

Campbell, Richard, et al. Media & Culture: Mass Communication in a Digital Age. Bedford/St. Martin's.

Schrock, Andrew Richard. "Communicative Affordances of Mobile Media: Portability, Availability, Locatability, and Multimediality." International Journal of Communication, vol. 9, 2015.

Tufekci, Zeynep. "Facebook's Ad Scandal Isn't a 'Fail,' It's a Feature." The New York Times, 23 Sept. 2017.

Zuckerberg, Mark. Facebook post. Facebook, 12 Nov. 2016.

Zuckerberg, Mark. Facebook post. Facebook, 18 Nov. 2016.

Zuckerberg, Mark. Facebook post. Facebook, 19 Jan. 2018.

Zuckerberg, Mark. Facebook post. Facebook, 21 Mar. 2018.