Meta should immediately pay reparations to the Rohingya for the role that Facebook played in the ethnic cleansing of the persecuted minority group, Amnesty International said today, on the sixth anniversary of the Myanmar military’s brutal operation, during which its forces raped Rohingya women and girls, burned down entire villages, and killed thousands.
Facebook’s algorithms and Meta’s ruthless pursuit of profit created an echo chamber that helped foment hatred of the Rohingya people and contributed to the conditions which forced the ethnic group to flee Myanmar en masse.
Although this stands out as one of the most egregious examples of a social media company’s involvement in a human rights crisis, the Rohingya are still awaiting reparations from Meta.
Pat de Brún, Head of Big Tech Accountability at Amnesty International
“Six years have gone by since Meta contributed to the terrible atrocities perpetrated against the Rohingya people. Yet although this stands out as one of the most egregious examples of a social media company’s involvement in a human rights crisis, the Rohingya are still awaiting reparations from Meta,” said Pat de Brún, Head of Big Tech Accountability at Amnesty International.
“Our investigations have made it clear that Facebook’s dangerous algorithms, which are hard-wired to drive ‘engagement’ and corporate profits at all costs, actively fanned the flames of hate and contributed to mass violence as well as the forced displacement of over half the Rohingya population of Myanmar into neighbouring Bangladesh.
“It is high time Meta faced its responsibilities by paying reparations to the Rohingya and by fixing its business model to prevent this from happening again.”
Coincidentally, 25th August also marks an important step in holding Big Tech to account for its human rights impacts: it is the day key provisions of the Digital Services Act (DSA) come into force for major online platforms in the European Union. The DSA is a landmark piece of legislation aimed at strengthening rights in the digital age, and it could create ripple effects far beyond the EU.
A personal plea to Meta and Mark Zuckerberg
Today, Amnesty International and Al Jazeera publish a searing first-person account by Rohingya refugee Maung Sawyeddollah, who was forced to flee his village in Myanmar when he was just a teenager. He fled through torched villages and fields filled with dead bodies and now lives in the world’s biggest refugee camp, Cox’s Bazar in Bangladesh, with around a million of his people.
I’d like to meet Mark Zuckerberg and his team. Maybe they’d like to come and spend a night or two in the refugee camp?
Maung Sawyeddollah
As a child, before the hate took root with the help of Facebook, he and his mostly Muslim Rohingya friends played happily with the mostly Buddhist Rakhine children from the neighbouring village — but that all changed when the military moved in.
“I’d like to meet Mark Zuckerberg and his team. Maybe they’d like to come and spend a night or two in the refugee camp?” Sawyeddollah writes. “I’d tell them: ‘Can’t you see your role in our suffering? We asked you, repeatedly, to try and help make things better for us… Yet you ignore our pleas. Tell me, do you feel anything for us? Is it only about the data, is it only about the dollars?’”
Background
Last year, Amnesty International published a report detailing Meta’s role in the atrocities committed against the Rohingya people by the Myanmar military in 2017. It revealed that Meta’s own internal studies, dating back to 2012, indicated that the company knew its algorithms could result in serious real-world harms. In 2016, the company’s own research clearly acknowledged that “our recommendation systems grow the problem” of extremism.
Beginning in August 2017, the Myanmar security forces undertook a brutal campaign of ethnic cleansing against Rohingya Muslims in Myanmar’s Rakhine State. They unlawfully killed thousands of Rohingya, including young children; raped and committed other sexual violence against Rohingya women and girls; tortured Rohingya men and boys in detention sites; and burned down hundreds of Rohingya villages. The violence pushed over 700,000 Rohingya — more than half the Rohingya population living in northern Rakhine State at the beginning of the crisis — into neighbouring Bangladesh.
Meta contributed to serious adverse human rights impacts suffered by the Rohingya in the context of the 2017 atrocities in Rakhine State and therefore has a responsibility under international human rights standards to provide an effective remedy to the community. This includes making the changes to its business model needed to ensure this never happens again. All companies have a responsibility to respect all human rights wherever they operate in the world and throughout their operations. This is a widely recognized standard of expected conduct, as set out in international business and human rights standards including the UN Guiding Principles on Business and Human Rights (UN Guiding Principles) and the OECD Guidelines for Multinational Enterprises (OECD Guidelines).
Time for Facebook to Pay Reparations
Six years ago today, on 25 August 2017, the ethnic cleansing of the Rohingya people of Myanmar began in earnest. Thousands died and over half the Rohingya population had to flee into neighbouring Bangladesh. Here Maung Sawyeddollah, a 22-year-old Rohingya refugee, puts the blame, at least in part, firmly at the feet of Facebook.
I’d like to tell you about my home.
I’m from a village called Nga Yent Change in western Myanmar. My father had a thriving store there, and I lived with my parents and six younger siblings in a large house in a spacious compound surrounded by mango, coconut, and banana trees. Sometimes elephants would meander into the village and then out again into the forest.
I had many friends in the village next door. It didn’t matter that they were Rakhine (mostly Buddhist) and we were Rohingya (mostly Muslim). We were just kids who’d meet in a shared field to play chinlone (a popular team game using a woven ball). We had a lot of fun together, same as any other kids.
Life is a daily struggle to find even food and water. There have been fires, there have been killings.
Maung Sawyeddollah
Now, six years since the Myanmar military’s ‘clearance operations’, here I am in Cox’s Bazar: the biggest refugee camp in the world, across the border in Bangladesh. Around a million of my people are now crammed into this place, living in tiny shelters made from bamboo and tarpaulin. Life is a daily struggle to find even food and water. There have been fires, there have been killings.
How did we end up here?
I blame Mark Zuckerberg, Facebook, and the people who run Meta for helping to create the conditions that allowed the Myanmar military to unleash hell upon us. The company’s vast wealth is generated, at least in part, through the human misery suffered by the Rohingya.
I was only 11 when I first saw a rise in hate speech against my people on Facebook. It came after a group of Rohingya was accused of raping and killing a Buddhist girl in 2012. That crime, to my knowledge, was never solved. Around then my warm friendship with my Rakhine neighbours began to cool.
There had been a long history of tension between communities in the area, but I had experienced no substantial day-to-day animosity until Facebook and smartphones came along. Facebook became a tool for politicians, bigots, and opportunists to propagate and escalate hate against my people, which then translated into real-life harm.
I saw many hateful and Islamophobic messages against Rohingya on Facebook.
Maung Sawyeddollah
In late 2016, the persecution began to have a direct impact on my family. My father and some other financially stable Rohingya were falsely accused of attacking a police station and handed big fines. My uncle Abusufian and his son Busha were arrested for not paying their fine and were jailed without trial; they spent over four years in prison.
Between 2016 and 2017, I saw many hateful and Islamophobic messages against Rohingya on Facebook. One message incited people to get together to “save the country and kick out the illegal ‘Bengalis’”, while another stated that “the birth rate of the illegals is very high. If we let it continue, soon the president of our country will have a beard.” The days of playing chinlone in a field with my Rakhine friends were now over.
I reported this to Facebook, but they did nothing, telling me: “It doesn’t contravene our community standards.”
Then the killing began.
I was only 15 at the time and a good student; I hoped to become a lawyer someday. On 25th August 2017 I got up early to study for my matriculation exams. Suddenly, I heard guns firing from the police station. It went on for about three hours, and then the military arrived; they killed Mohammad Shomim, a villager who owned a shop in the local market. I didn’t see him die, but I saw his body in the street. Then they secretly laid explosives in the street, and I saw a villager called Hussin Ahmed die in an explosion. Everyone was scared and many people went to hide in the forest.
We’d heard what had happened in other villages and were sure they would kill us.
Maung Sawyeddollah
We heard that the authorities had started slaughtering Rohingya in other villages, and some people started fleeing to Bangladesh the next day. We stayed in our home until 30th August. Later, the military announced that everyone must gather in a field in our village beside a Red Crescent office. We didn’t go. We’d heard what had happened in other villages and were sure they would kill us.
We escaped to the homes of family friends in other villages for a few nights and briefly went home, but our village was deserted. Everyone knew there had been widespread violence in the area – many people had been killed, and women and girls had been raped.
My family and I then decided to escape to Bangladesh on foot. We saw many dead bodies in the villages, on the road, and in the paddy fields along the way. Houses had been burned to the ground. We trekked on through the jungle and across the mountains in the cold and rain. We didn’t eat for days. After 15 days, we arrived in Bangladesh.
Today, exactly six years later and living in Cox’s Bazar refugee camp, I still yearn for home. I refuse to give up on my dream of becoming a lawyer, but there are very few opportunities for young Rohingya to escape the camp; we have no right to education.
I still believe in a better future, one in which I can live in a secure and peaceful world. Do the people who run Facebook feel the same way? They failed to address this hateful content, which violated even their own policies.
I’d like to meet Mark Zuckerberg and his team; maybe they’d like to come and spend a night or two in the refugee camp. I’d tell them: “Can’t you see your role in our suffering? We asked you, repeatedly, to try and help make things better for us. Funding education to help young people can’t ever reverse what happened, but it would, at least, help us build a brighter future. Yet you ignore our pleas. Tell me, do you feel anything for us? Is it only about the data, is it only about the dollars?”
Effective regulation of Meta ‘a matter of life and death’
By Pat de Brún, Head of Big Tech Accountability at Amnesty International
Meta’s disastrous role in the persecution of the Rohingya people must act as a stark reminder to EU regulators: effective regulation of Big Tech is a matter of life and death.
Today is not only the sixth anniversary of the Rohingya’s darkest day. It also marks the coming into force of key provisions of the Digital Services Act (DSA) – the EU’s landmark new legislation governing the Big Tech industry. This law imposes significant constraints on Big Tech, including minimum safety standards for algorithmic recommender systems. If properly enforced, it has the potential to prevent or mitigate any recurrence of what happened to the Rohingya.
Facebook became an echo chamber of hate and incitement targeting the long-persecuted minority group.
Pat de Brún
I have absolutely no doubt that Facebook’s dangerous algorithms – hard-wired to drive “engagement” and corporate profits at all costs – actively fanned the flames of hate, ultimately contributing to mass violence and the forced displacement of most of the Rohingya population of Myanmar into neighbouring Bangladesh six years ago.
In the years and months leading up to these atrocities in 2017, Facebook became an echo chamber of hate and incitement targeting the long-persecuted minority group. And this happened in a context where “Facebook [was] the Internet”, according to a UN investigation.
What’s more, The Facebook Papers, leaked by whistleblower Frances Haugen in 2021, laid bare the inner workings of the company, one stunning revelation after another. These leaks made it clear that Meta had long been aware that its algorithms were responsible for disproportionately spreading hate and disinformation, and that its business model was fuelling serious real-world harms, particularly in communities affected by conflict.
The leaks made it evident that, even when presented with this information, the company had maintained its ‘business as usual’ approach. They also revealed that Meta’s narrative about the company’s supposedly passive role in Myanmar did not hold true. This prompted us at Amnesty International to launch an investigation into the company’s role in the ethnic cleansing of the Rohingya.
Far from being a neutral actor faced with an unprecedented crisis, Meta was an active contributor to the horrors faced by the Rohingya.
Pat de Brún
Last year, Amnesty published the findings of this investigation. It revealed that Meta had dramatically underplayed the true nature and extent of its contribution to the suffering of the Rohingya and found that, far from being a neutral actor faced with an unprecedented crisis, Meta was an active contributor to the horrors faced by the Rohingya.
We can now authoritatively conclude that the algorithms which power the Facebook platform fuelled the spread of hate and violence like wildfire, proactively pushing content which incited violence and disproportionately amplifying the most inflammatory content in the lead-up to the horrors of 2017.
Facebook was an enabler of the violence and atrocities to come.
Meanwhile, as its algorithms fanned the flames of hate, Meta staff ignored repeated warnings from human rights activists, academics, and other experts. Between 2012 and 2017, senior Meta staff received at least 15 direct warnings stating that the Facebook platform risked contributing to an outbreak of mass violence against the Rohingya.
There is little doubt: Meta contributed to serious human rights violations and, therefore, has a responsibility under international human rights law and standards to provide reparations to the Rohingya.
We are calling for the Rohingya to be compensated and for Meta to take steps to ensure this never happens again by altering its business model – a business model that profits from the proliferation of hate.
We presented our findings to Meta, and tens of thousands of people joined our campaign. Yet, so far, little has changed. Meta’s toxic business model remains hardwired for engagement above all else.
We at Amnesty International will stand with the Rohingya until justice is done.
Pat de Brún
Meta, one of the wealthiest companies on the planet, has even refused the community’s modest request to provide $1 million as partial reparations for an education fund for displaced Rohingya youth struggling to realize their potential in the sprawling refugee camps of Cox’s Bazar. Meta does not, so it says, engage in “philanthropic activities”. But this was no request for charity; it was a demand that Meta fulfil its human rights responsibilities.
Despite the enormous power and wealth wielded by Meta, the Rohingya community has refused to give up hope and remains steadfast in its determination to secure accountability from the company. We at Amnesty International will stand with the Rohingya until justice is done.
The DSA’s entry into force today marks a historic and vital step forward in efforts to rein in Big Tech. Yet much is still at stake. Robust enforcement and implementation are critical if the DSA is to fulfil its promise and protect people from Big Tech’s destructive business practices.
The European Commission and EU member states now have a pivotal role to play in ensuring that the DSA is more than just a paper tiger. It is essential that EU regulators learn from history and commit to ensuring that we never again see a repeat of Meta’s role in the Rohingya crisis.