By HANNA BAJWA
A protest on Parliament Hill in Ottawa, Canada in 2018 in support of the Rohingya people, denouncing the horrendous actions committed by the Myanmar government.
Headlines around the world have reported that the Rohingya people are suing Facebook, now known as Meta, for $150 billion. To explain how Facebook has landed in trouble once again, we must go back to 1977, when the Myanmar military launched a campaign to register citizens and remove those deemed ‘foreigners’. Beginning in 2016, the Myanmar military waged a campaign of violence against the Rohingya community, in what the UN called a “textbook example of ethnic cleansing”. In 2017 alone, more than 10,000 people were killed, and over 150,000 were subjected to physical violence. Rohingya refugees are now suing Facebook, alleging it helped foster Myanmar’s genocide.
No precedent exists for such a case, at least when it comes to social-media companies; the closest parallel is Radio Mille Collines in Rwanda, which helped incite the 1994 Rwandan genocide. The outcome of this lawsuit remains to be seen.
Facebook has admitted it did not do enough to stop its platform from being misused. The allegations claim that Facebook’s algorithms amplified hate speech against the Rohingya, and that the company failed to invest in moderators and local fact-checkers familiar with the situation in Myanmar. Moreover, the social media giant neither took down specific posts inciting violence against or containing hate speech directed towards the Rohingya people, nor banned specific accounts that were being used to incite violence and propagate hate speech.
However, this is not the first accusation arising from Facebook’s involvement in Myanmar. In 2018, UN human rights investigators said Facebook had played an “extraordinary and outsized role” within the country, specifically in regard to the genocide. Despite being warned since 2013 by NGOs and media outlets about extensive anti-Rohingya posts, groups, and accounts on its platform, Facebook still failed to take timely, adequate action. To this day, Facebook’s recommendation algorithm still invites users to ‘like’ pages that share pro-military propaganda, in violation of the platform’s own rules. Additionally, associates and proxies of the Myanmar military regime continue to use Facebook, and given that the platform is used by almost half of Myanmar’s citizens, their influence is enormous.
Is this evidence that social media and technology have become too influential? The role Facebook played in the Rohingya genocide is just another example of how algorithms are becoming more powerful, persuasive, and potentially dangerous. To put this into a more political context: many people are unaware of the underlying algorithms that shape the media we consume online. If people believe that what they see in their feed is ‘news’ rather than content curated specifically for them, and if they engage only with people who share their beliefs, a bubble forms. This bubble is also known as an echo chamber, which reinforces people’s existing beliefs through confirmation bias.
The main outcome of this situation is political polarisation. Since 2000, the percentage of Americans who consistently hold liberal or conservative beliefs, rather than a mix of the two, has jumped from 10% to over 20%, and the number of Americans who see the opposing political party as a threat to “the nation’s well-being” has doubled. Because of these echo chambers, we are no longer listening to the ‘other side’, leading to serious consequences such as the Myanmar genocide, or the poisoning of political campaigning and manipulation of voters seen with Cambridge Analytica, the Brexit vote, and the 2016 American presidential election. Democracy is facing a threat from something almost everyone uses today – social media.
However, it is not all doom and gloom. If we want to eradicate, or at least lessen, the negative effects of social media on politics, we need to rethink how our online communities operate. The solution is not to eliminate echo chambers, but to be intentional about the social networks within them. The more equity there is in people’s social networks, the less biased and more informed groups will become. We need an online environment that reflects the way a healthy society naturally behaves, rather than an algorithm designed to monetise our attention.
Image: Flickr (Mike Gifford)