Are Facebook and other social media platforms the new ‘arbiters of truth’?

Online presence project
May 30, 2021

A debate has been shifting the minds of many, covering the delicate issue of how social media platforms influence political and social discourse and questioning their responsibility for their content-regulation policies. Many users have re-evaluated the platforms’ past actions and believe that freedom of expression is a right that must be protected.

This so-called dilemma has surfaced in various instances, one of them being the Donald Trump incident. Facebook became the first major platform to indefinitely silence Donald Trump, banning him following the attack on the US Capitol last January. Mark Zuckerberg’s stated intention was to limit the risk of the platform being used to stoke further political violence in the wake of Joe Biden’s successful election win. Facebook’s Oversight Board later reviewed the move, upholding the decision and giving the company six months to settle the matter with an appropriate final penalty. This was not the first time Trump had infringed the company’s guidelines: in 2015 he uploaded a video calling for Muslims to be banned from entering the US. This year, however, proved the last straw, as Trump was suspended soon after posting a video addressed to the MAGA rioters at the Capitol. On Wednesday, January 6th, Trump’s Twitter account was locked for 12 hours for violating the platform’s content policy guidelines. This was followed by the suspension of related accounts such as @POTUS and @TeamTrump after attempts to circumvent the decision.

Later on, the Oversight Board stated that platforms, especially Facebook, need to amend their current policies and manage the material being published, particularly when the content is newsworthy, and that such rules must be clearly communicated to users to prevent future conflicts. Trump, for his part, described the decision as a “total disgrace” and warned that the companies involved would pay a “political price” for their actions.

Was this decision correct?

Jack Dorsey, Twitter’s chief executive, stated that the ban was the right thing to do for various reasons. Nonetheless, he expressed disappointment and sadness about the decision, describing it as the product of “extraordinary and untenable circumstances”. On reflection, the banning of Trump had long been called for, following repeated warnings over his previous statements and posts.

Dorsey’s statement on Twitter concluded his final remarks on the issue:

I do not celebrate or feel pride in our having to ban @realDonaldTrump from Twitter or how we got here. After a clear warning we would take this action, we made a decision with the best information we had based on threats to physical safety both on and off Twitter. Was this correct?

This created much commotion and conflict among activists, particularly freedom-of-speech advocates. Some criticised the move, arguing that removing posts and banning users infringes on the First Amendment right to freedom of speech. A spokesperson for the German Chancellor, Angela Merkel, called the action problematic, as Merkel believes that nobody should be excluded from sharing their opinions and beliefs. Joe Biden, on the other hand, firmly believes that such companies should make greater use of this power and has encouraged making it standard policy in order to curb harms such as fake news.

As a result, several human rights activists protested, arguing that the companies’ policies must be applied evenly and fairly, without discrimination. Facebook’s chief operating officer, Sheryl Sandberg, addressed this concern and stated that the policies are there to be followed by everyone; therefore, if someone uses such platforms for hate speech or to incite violence, action must be taken immediately (Satariano, 2021).

The policy is that you cannot incite violence, you cannot be part of inciting violence — Sheryl Sandberg.

I firmly believe that clear policies must be enforced. Regardless of the status of the person responsible, if a post is deemed to have adverse and harmful effects, action must be taken to prevent further damage. Nonetheless, I still support the right to freedom of expression, as I feel it shapes an individual’s identity; however, I think that in some instances this freedom, like any other, must be exercised with caution and within limits.

Rishab Bailey has described social media platforms acting as “arbiters of truth” for several reasons. The first is the lack of protection for freedom of speech: the Communications Decency Act empowers platforms to curate online content as they see fit, free from government interference. Secondly, users’ relationships with these platforms are governed by contracts that grant the companies enormous power when deciding what, and whom, they will authorise.

Platforms like Facebook and Twitter have long been perceived as spaces that reach diverse audiences (Deribe Damota, 2019). They also give people the opportunity to voice their opinions freely and to interact with users who share similar beliefs and interests. Scholars have described Facebook and other platforms as tools that raise awareness of, and encourage, political and social participation (Balkin, 2017; Plantin et al., 2016; Van Dijck et al., 2018).

Over time, however, these digital platforms have shifted, bringing increased risks that affect how individuals perceive and value things. Unfortunately, many experts believe that manipulative content such as hate speech and fake news will remain a recurring problem in the future. Throughout the years, social media has evolved into a global digital community that facilitates free speech, opens up new knowledge, and connects users around the world. On the other hand, social media has also produced adverse effects that harm society, such as discrimination, illegal content, hate speech, fake news, and violence.

One of the biggest challenges will be finding an appropriate balance between protecting anonymity and enforcing consequences for the abusive behaviour that has been allowed to characterise online discussions for far too long — Bailey Poland.

The reality is that Facebook and Twitter have become digital spaces that select posts for relevance, ultimately deciding what is deemed true and what is not (Swift, 2019). A clear example is the New York Post story detailing supposedly explosive allegations against Joe Biden and his son. The story created much controversy, which died down shortly after Facebook treated it as misinformation. As policy communications director Andy Stone stated, Facebook was “reducing its distribution on the platform”; the story was categorised as misinformation, and its exposure reportedly dropped by 80% (Blackburn, 2020).

Twitter went further, making it impossible for users to link to the story at all. Its explanation was that the story appeared to contain scammy content, which Twitter ultimately classed as misinformation and false content. This shows how much power social media companies hold over the spread of news and information. Although Facebook and Twitter do not produce the news themselves, they moderate how widely different content spreads; therefore, regardless of the story or information in question, companies that declare stories false without any justified reason are acting in a morally wrong way.


References

Balkin, J. M. (2017). Free Speech in the Algorithmic Society: Big Data, Private Governance, and New School Speech Regulation. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3038939

Blackburn, S. (2020, October 17). Facebook and Twitter Become Arbiters of Truth. Retrieved from https://www.ifs.org/blog/facebook-twitter-become-arbiters-of-truth/

Deribe Damota, M. (2019). The Effect of Social Media on Society. New Media and Mass Communication. https://doi.org/10.7176/nmmc/78-02

Keskin, B. (2018). Van Dijck, Poell, and de Waal, The Platform Society: Public Values in a Connective World (2018). Markets, Globalization & Development Review, 3(3). https://doi.org/10.23860/mgdr-2018-03-03-08

Plantin, J. C., Lagoze, C., Edwards, P. N., & Sandvig, C. (2016). Infrastructure studies meet platform studies in the age of Google and Facebook. New Media & Society, 20(1), 293–310. https://doi.org/10.1177/1461444816661553

Satariano, A. (2021, January 14). After Banning Trump, Facebook and Twitter Face Scrutiny About Inaction Abroad. New York Times. Retrieved from https://www-proquest-com.ejournals.um.edu.mt

Swift, J. (2019). Arbiters of truth, then and now. Science, 366(6469), 1081. https://doi.org/10.1126/science.aaz3045
