This article first appeared in our First Issue from Spring 2019.
By: Nimalen Sivapalan and Shoumik Jamil
The Rohingya Crisis is one of the worst large-scale genocides in modern history: a mass ethnic cleansing in Myanmar (formerly Burma) with many underlying causes behind such deep violations of fundamental human rights. One considerable factor, however, stands out despite receiving little discussion on a global level: the role of technology. Specifically, hate speech was allowed to run rampant on social media platforms, rumors and false information spread like wildfire, and the technology companies involved did little to prevent what turned out to be an ongoing genocide. Public opinion and its accessibility played a crucial role in the making of this crisis, which is why a closer inspection of social media can help explain how the Rohingya have been subjugated for so long.

Social media, on the surface, is seen as a utility for good. One can connect with friends across the world and keep in close touch, follow the latest news second by second, and voice opinions from the comfort of home. It allows for greater transparency and free speech, and it makes the world feel smaller and closer. Given these features that appeal to our nature as social animals, social media has gained mass adoption over the last ten years, even in a developing country like Myanmar, where nearly 93% of social media users are on Facebook and the internet itself is often synonymous with Facebook. “Among Myanmar’s 53 million residents, less than 1% had internet access in 2014. But by 2016, the country appeared to have more Facebook users than any other south Asian country. Today, more than 14 million of its citizens use Facebook (26%).” According to one Burmese citizen, “when people buy their first smartphone, it (Facebook) just comes preinstalled.”
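As a quick sanity check, the parenthetical 26% follows directly from the two figures quoted above:

\[
\frac{14 \text{ million Facebook users}}{53 \text{ million residents}} \approx 0.264 \approx 26\%
\]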
A Brief History of the Rohingya Crisis
When Myanmar (then Burma) obtained independence in 1948, certain ethnicities were deemed “indigenous”, and the Rohingya were not among them. “In 1974, all citizens in Burma were required to get national registration cards, but the Rohingya were only allowed to obtain foreign registration cards.” “Before the massacres, there were thought to be around 1.1 million Rohingya living in the country. The Rohingya have existed in Myanmar—a Buddhist majority country—for centuries.”
The Rohingya, as an ethnic minority in Myanmar, are not well represented in the country’s government. The issues they face only came to light through external investigations by organizations including the United Nations, the New York Times, The Guardian, and Reuters. “International aid workers and journalists have been barred from the region and even arrested for trying to cover the crisis.” The military, it was shown, played a huge role in the ethnic cleansing, displacing hundreds of thousands of Rohingya. Even Nobel laureate Aung San Suu Kyi, the de facto leader of the Myanmar government, refused to issue a condemnation or take any significant action.
Many, including the top UN Human Rights Official and the US Secretary of State, have called the ongoing catastrophe a “textbook example of ethnic cleansing”. Hate speech, rumors, and violence all served to exacerbate this ethnic cleansing by the country’s Buddhist majority, who viewed the Rohingya as ‘Bengali terrorists’ and illegal immigrants. The crisis has worsened in recent years with increased violence, and the Rohingya, forced out of Myanmar, have turned to neighboring Bangladesh as refugees after being turned away by Southeast Asian countries including Indonesia, Malaysia, and Thailand.
The Weaponization of Social Media
Social media, and Facebook in particular, is so vast and open to anyone with an internet-connected device that policing and moderation become much harder. “For years, Facebook – which reported net income of $15.9 billion in 2017 – devoted scant resources to combat hate speech in Myanmar, a market it dominates and in which there have been regular outbreaks of ethnic violence. In early 2015, there were only two people at Facebook who could speak Burmese reviewing problematic posts. Before that, most of the people reviewing Burmese content spoke English”. It’s easy to see why hate speech and false information were allowed to run rampant with so few resources dedicated to the issue.
For its part, Facebook, following a New York Times investigation and a UN report on the ongoing ethnic cleansing, removed 13 pages and 10 accounts associated with the Myanmar military that were accused of spreading hate and propaganda targeting the Rohingya. While this may seem like a small number, Facebook’s reach is so broad that those pages had 1.35 million unique followers.
It is easy to see, then, how Facebook becomes a tool that can be weaponized by those with mass reach, like the Myanmar military. Access to the attention of millions of citizens, whose feeds can be filled with state-sponsored news and propaganda, allows such content to spread before it can even be verified. As the saying goes, “A lie gets halfway around the world before the truth has a chance to get its pants on.” This sets a dangerous precedent, not just for Myanmar but for social media in general, as bad actors can add fuel to the fire and spread incendiary, false rumors.
Looking Forward and Learning From the Past
Looking at Facebook’s actions, it is apparent that more work could and should be done in response to the manipulation of its platform by malicious actors. “Even now, Facebook doesn’t have a single employee in the country of some 50 million people. Instead, it monitors hate speech from abroad. This is mainly done through a secretive operation in Kuala Lumpur that’s outsourced to Accenture, the professional services firm, and codenamed ‘Project Honey Badger’”. For one, if Facebook wants to operate in a country where its platform is used by and affects so many people, it should devote enough resources to localization, employing native Burmese speakers who can navigate the array of posts in that language.
In addition, regulation should be considered when situations like these threaten basic human rights. For example, when Facebook was used in a similar way in Sri Lanka to incite mass looting and riots against Muslims, the government temporarily blocked most social media, including Facebook, to bring the situation under control after pleas to Facebook went nowhere. More stringent regulation could go further, issuing an ultimatum to tech companies: devote adequate resources and commit to fairness and equality, or be banned entirely.
Lastly, users themselves should be held accountable for the communities they are a part of. There should be consequences both for users who post false information and for those who choose to share and spread it. While social media itself is not a direct cause of the mass ethnic cleansing in Myanmar, it is undoubtedly a tool that can easily be misused by those wishing to sow discord, spread hate, and act in their own self-interest.
Conclusion
The ongoing ethnic cleansing of the Rohingya in Myanmar was enabled by many underlying conditions: decades of rejection of the Rohingya and their treatment as second-class citizens, entrenched religious discrimination, and a lack of reliable infrastructure devoted to accurate information and fairness. Technology and social media have helped give everyone a voice and provide a convenient way for the layman to access the news, whether “fake news” or not, but they have also morphed the discourse from debate over actual issues into demagoguery and hate speech.
Technology companies should learn from the past, not only to stop the ongoing ethnic cleansing being furthered through platforms like Facebook, but also to take proactive measures against a vocal minority hijacking mainstream discourse. Given that these companies make billions of dollars off of their users’ data, it seems only reasonable to demand accountability and action when things go wrong. It will require soul-searching, deliberation, and perhaps a change in how business is done, but this active discussion (and action) should take place before it is too late for any meaningful change to occur.
Sources
- A Genocide Incited on Facebook, With Posts From Myanmar’s Military, NY Times, 2018
- Revealed: Facebook hate speech exploded in Myanmar during Rohingya crisis, The Guardian, 2018
- Why Facebook is losing the war on hate speech in Myanmar, Reuters, 2018
- Weaponizing social media: The Rohingya crisis, CBS News, February 2018
- Removing Myanmar Military Officials From Facebook, Facebook, 2018
- Forget Washington. Facebook’s Problems Abroad Are Far More Disturbing, NY Times, 2017
- ‘Have You No Shame?’ Myanmar Is Flogged for Violence Against Rohingya, NY Times, 2018
- Persecution of the Rohingya Muslims: Is Genocide Occurring in Myanmar’s Rakhine State?, Yale Law School, October 2015
- Mark Zuckerberg on Facebook’s role in ethnic cleansing in Myanmar: “It’s a real issue”, Vox, 2018
- A War of Words Puts Facebook at the Center of Myanmar’s Rohingya Crisis, NY Times, October 2017
- Social Media Stats Myanmar, StatCounter
- A brief history of the word “Rohingya” at the heart of a humanitarian crisis, Quartz, October 2017
- On Instagram, 11,696 Examples of How Hate Thrives on Social Media, NY Times, October 2018
Supplementary sources
- Where Countries Are Tinderboxes and Facebook Is a Match, NY Times, 2017
- How WhatsApp is being abused in Brazil’s elections, BBC, 2018