Meta, the parent company of Facebook, contributed to serious human rights abuses against Ethiopia’s Tigrayan community, Amnesty International said in a new report published today.
The report, “A death sentence for my father: Meta’s contribution to human rights abuses in northern Ethiopia”, shows how Meta has once again failed to adequately curb the spread of content advocating hatred and violence, this time targeting Tigrayans during the armed conflict in northern Ethiopia from November 2020 to November 2022.
Amnesty International has previously highlighted Meta’s contribution to human rights violations against the Rohingya in Myanmar and warned against the recurrence of these harms if Meta’s business model and content-shaping algorithms were not fundamentally reformed.
“Three years after its staggering failures in Myanmar, Meta has once again – through its content-shaping algorithms and data-hungry business model – contributed to serious human rights abuses. Even before the outbreak of the conflict in northern Ethiopia, civil society organizations and human rights experts repeatedly warned that Meta risked contributing to violence in the country, and pleaded with the company to take meaningful action,” said Agnès Callamard, Amnesty International’s Secretary General.
“However, Meta ignored these warnings and did not take appropriate mitigation measures, even after the conflict had broken out. As a result, Meta has again contributed to serious human rights abuses, this time perpetrated against the Tigrayan community.”
The Facebook platform is a major source of information for many Ethiopians and is considered a trustworthy news source. During the armed conflict, Facebook’s algorithms fueled devastating human rights impacts by amplifying harmful content targeting the Tigrayan community.
Meta’s contribution to human rights abuses
Amnesty International’s research established that Facebook’s algorithmic systems supercharged the spread of harmful rhetoric targeting the Tigrayan community, while the platform’s content moderation systems failed to detect and respond appropriately to such content.
These failures ultimately contributed to the killing of Meareg Amare, a Tigrayan professor of chemistry. Amare was killed by a group of men on 3 November 2021, weeks after posts targeting him had been published on Facebook.
The Facebook posts contained his name, photo, place of work, house address and claimed that he was a supporter of the Tigrayan People’s Liberation Front (TPLF), accusing him of stealing large sums of money. These allegations were denied by his family.
His son Abrham Meareg believes that these hostile Facebook posts contributed to his father’s death.
“I knew it would be a death sentence for my father,” Abrham Meareg told Amnesty International.
At the time, the armed conflict in northern Ethiopia involving the TPLF, Ethiopian Federal Forces, and other armed groups was raging.
Internal Meta documents reviewed by Amnesty International show that Meta knew of the inadequacies of its mitigation measures in Ethiopia and the risks this presented in a country that the company itself considered to be at a high risk of violence. An internal Meta document from 2020 warned that “current mitigation strategies are not enough” to stop the spread of harmful content on the Facebook platform in Ethiopia.
Alongside amplifying harmful content, Meta’s slow response times and refusal to take down reported content left many of those interviewed by Amnesty International feeling that there was no point in reporting content to the company.
Meta received multiple warnings both before and during the conflict from civil society organizations, human rights experts and its own Facebook Oversight Board, which recommended Meta undertake an independent human rights impact assessment on Ethiopia in 2021.
“The mass dissemination of these posts incited violence and discrimination targeting the Tigrayan community, pouring fuel on what was already an inflamed situation with significant ethnic tensions,” said Agnès Callamard.
*Gelila, a member of Ethiopian civil society who was part of Meta’s ‘Trusted Partner’ program – an initiative that aims to provide selected civil society groups with a designated channel to alert Meta to harmful content – said Facebook’s failure to act on alerts made the human rights situation in the country worse.
“As someone who has been in Ethiopia for a long time, I can say that Facebook is making communities more vulnerable to conflict with each other,” said Gelila.
“They are extremely slow in reacting to things. They are not sensitive to what is said – I think they have standards which are very far from what is happening on the ground.”
Meta should take urgent measures to adequately mitigate the risks posed by the Facebook platform in Ethiopia, which is facing another security crisis in the Amhara region.
This is essential, given that UN-appointed investigators have warned about potential future atrocity crimes, a concern later echoed by the UN Special Adviser on the Prevention of Genocide, who warned of a heightened risk of genocide and related atrocity crimes in Tigray, Amhara, Afar and Oromia regions.
A recurrent failure to respect human rights
Meta’s content-shaping algorithms are designed to maximize user engagement for the purpose of serving targeted ads, with the result that they boost inflammatory, harmful, and divisive content, which tends to attract the most attention from users.
In 2018, Meta re-centred the Facebook news feed algorithm on a new metric called “MSI”, or “Meaningful Social Interactions”, in a supposed attempt to “fix Facebook”.
However, Amnesty International’s analysis of evidence from the Facebook Papers – the internal Meta documents disclosed by whistleblower Frances Haugen in 2021 – shows that this shift did not “fix” the problems associated with Facebook’s algorithms. Instead, the algorithms remained hard-wired for maximum engagement, disproportionately favouring inflammatory content, including advocacy of hatred.
A 2021 document from the Facebook Papers suggests that Meta CEO Mark Zuckerberg personally intervened to stop mitigation measures from being applied in high-risk countries like Ethiopia, because the measures might have interfered with the MSI metric.
As the fundamentals of Meta’s engagement-centric business model have not changed, the company continues to present a significant and ongoing danger to human rights, particularly in conflict-affected settings.
Making it right – Meta’s responsibility to provide remedy
Meta has a responsibility to provide remedy for the human rights abuses it has contributed to in Ethiopia.
Urgent, wide-ranging reforms are needed to ensure that Meta does not contribute again to these harms in Ethiopia or in yet another country.
This includes deploying ‘break the glass’ measures – the steps Meta can take in crisis contexts to reduce the power of algorithmic amplification – in high-risk situations as soon as they arise, and ensuring that content moderation, policy and human rights teams are resourced equally and consistently across jurisdictions.
States must fulfil their obligation to protect human rights by introducing and enforcing legislation to effectively rein in Big Tech’s business model. This includes prohibiting targeted advertising on the basis of invasive tracking practices.
Equally, Big Tech companies have a responsibility to respect human rights independent of states’ obligations, and where they fail to do so, they must be held accountable for the violations they have caused or contributed to.

Meta disputed the findings; the company’s response is reflected in the report.
“My father didn’t have a Facebook account, but he was murdered on Facebook.”
This is his story.
Abrham Meareg is the son of a Tigrayan academic who was killed after being targeted in Facebook posts. He is one of the petitioners suing Facebook in the Kenyan High Court. Abrham reported the threats against his father to Facebook and asked for the posts to be taken down. Facebook only removed the posts eight days after his father’s death.
My name is Abrham Meareg, and my father was Professor Meareg Amare Abrha of Bahir Dar University, Ethiopia. He was a well-respected university professor and a family man with a wife and four children. We were a happy family living among the Amharic-speaking community even though we were Tigrayan. My father had lived among Amharic speakers for 40 years in absolute peace and respect. His national stature made him a target.

On 9 and 10 October 2021, two posts appeared on the “BDU STAFF” Facebook page, which has 50 000 followers, showing pictures of my father and our home address. The posts and the comments they garnered called him corrupt, accused him of supporting the Tigrayan People’s Liberation Front (TPLF), and claimed he had fled to the USA after relocating his family to Addis Ababa. All those claims were false.
Armed conflict in northern Ethiopia
At the time, Ethiopia was in the throes of a raging armed conflict in the north of the country involving Ethiopian Federal Forces, the TPLF and other armed groups. The conflict was characterised by a rise in online posts inciting violence and in the use of racial slurs against Tigrayans. As soon as I saw the posts, I knew they were a death sentence for my father. Facebook is a prime public platform in Ethiopia; it is the main source of information in many parts of the country. I immediately reported the posts to Facebook and asked for them to be taken down. No action was taken, despite my multiple reports through their onsite reporting tools.
My father’s death
Facebook finally replied on 11 November 2021, eight days after my father had been killed, saying the posts on BDU STAFF were against its community standards, and removed them. It was too late. Three weeks later, eyewitnesses told me that on the morning of 3 November 2021, my father was followed home from the university by gunmen wearing the uniform of the Amhara Special Forces, a part of the regional force. He was shot twice as he opened the gate. Bullets were sprayed into the house, the fence, the gate, and the rooms inside. The eyewitnesses told me neighbours were ordered not to give him first aid, take him to hospital or cover his body, because, the gunmen said, he was a traitor. These accounts were corroborated by the Bahir Dar division of the Ethiopian Human Rights Commission.
“He is a Junta! He is a Tigre! He supports the TPLF! He is their agent hiding and living with us here! We warn you not to help or cover the body,”
the armed militias chanted, quoting the Facebook posts that had made my father a target.
Two other eyewitnesses told me my father was buried in an unmarked grave. That same day, the killers took his private car, vandalized our property and forced my mother to relocate to Addis Ababa; they remain in full control of our family home to this day.
We were in the dark as all this happened, and learned of his burial only through a report by the Ethiopian Human Rights Commission. This makes our grief even more bitter. So, now you know my story and that of my father.
Facebook accountability
I believe in my heart that Facebook contributed to his murder. The platform has contributed to the spread of hate and violence that has led to the deaths of thousands of my countrymen. My father didn’t even have a Facebook account – but he was slandered on Facebook, doxed on Facebook, and murdered on Facebook. That’s why I’m bringing a legal case in the Kenyan High Court – Kenya being the country where the posts that incited my father’s murderers were moderated.
It’s not only posts from Ethiopia that are moderated in Kenya, but posts from all of Eastern and Southern Africa, where more than 500 million people live. Don’t those lives matter? Don’t all African lives matter?

I demand a public apology from Meta. I demand that Facebook invest in stopping these tragedies from happening. I am asking the court in Kenya to order Facebook to fix its safety systems and to hire many more moderators so that violence and hate do not keep spreading. Finally, I am asking the Court to create a restitution fund of 250 billion Kenyan shillings (about $1.5 billion USD) for posts leading to violence, and another 50 billion Kenyan shillings (about $335 million USD) for sponsored posts doing the same. None of this will bring back my father and many other people’s loved ones, but it will help them rebuild their broken lives. It is, after all, just a fraction of the vast profits Facebook has made from viral hate. Mark Zuckerberg can stop what happened to my father from happening again. He hasn’t done enough, and that’s why I’m doing this to say, ‘never again.’