On the sixth anniversary of the Myanmar military’s brutal operation, Meta should pay reparations to the Rohingya for the role that Facebook played in the ethnic cleansing of the persecuted minority group, says Amnesty.
Facebook’s algorithms and Meta’s ruthless pursuit of profit created an echo chamber that helped foment hatred of the Rohingya people and contributed to the conditions which forced the ethnic group to flee Myanmar en masse, Amnesty International reports.
“Six years have gone by since Meta contributed to the terrible atrocities perpetrated against the Rohingya people. Yet although this stands out as one of the most egregious examples of a social media company’s involvement in a human rights crisis, the Rohingya are still awaiting reparations from Meta,” said Pat de Brún, Head of Big Tech Accountability at Amnesty International.
“Our investigations have made it clear that Facebook’s dangerous algorithms, which are hard-wired to drive ‘engagement’ and corporate profits at all costs, actively fanned the flames of hate and contributed to mass violence as well as the forced displacement of over half the Rohingya population of Myanmar into neighbouring Bangladesh.
“It is high time Meta faced its responsibilities by paying reparations to the Rohingya and by fixing its business model to prevent this from happening again.”
Coincidentally, 25th August also marks an important step in holding Big Tech to account for its human rights impacts as it is when key provisions of the Digital Services Act (DSA) come into force for major online platforms in the European Union. The DSA is a landmark piece of legislation aimed at strengthening rights in the digital age, which could create ripple effects far beyond the EU.
A personal plea to Meta and Mark Zuckerberg
Amnesty International and Al Jazeera published a searing first-person account by Rohingya refugee Maung Sawyeddollah, who was forced to flee his village in Myanmar when he was just a teenager. He fled through torched villages and fields filled with dead bodies and now lives in the world’s biggest refugee camp, Cox’s Bazar in Bangladesh, with around a million of his people.
Tell me, do you feel anything for us? Is it only about the data, is it only about the dollars?
Rohingya refugee Maung Sawyeddollah, addressing Meta
As a child, before the hate took root with the help of Facebook, he and his mostly Muslim Rohingya friends played happily with the mostly Buddhist Rakhine children from the neighbouring village — but that all changed when the military moved in.
“I’d like to meet Mark Zuckerberg and his team. Maybe they’d like to come and spend a night or two in the refugee camp?” Sawyeddollah writes. “I’d tell them: ‘Can’t you see your role in our suffering? We asked you, repeatedly, to try and help make things better for us… Yet you ignore our pleas. Tell me, do you feel anything for us? Is it only about the data, is it only about the dollars?’”
“I’m from a village called Nga Yent Change in western Myanmar. My father had a thriving store there, and I lived with my parents and six younger siblings in a large house in a spacious compound surrounded by mango, coconut, and banana trees. Sometimes elephants would meander into the village and then out again into the forest,” he says.
Sawyeddollah continues, “I had many friends in the village next door. It didn’t matter that they were Rakhine (mostly Buddhist) and we were Rohingya (mostly Muslim). We were just kids who’d meet in a shared field to play chinlone (a popular team game using a woven ball). We had a lot of fun together, same as any other kids.

“Now, six years since the Myanmar military’s ‘clearance operations’, here I am in Cox’s Bazar: the biggest refugee camp in the world, across the border in Bangladesh. Around a million of my people are now crammed into this place, living in tiny shelters made from bamboo and tarpaulin. Life is a daily struggle to find even food and water. There have been fires, there have been killings.”
Sawyeddollah says he blames Mark Zuckerberg, Facebook, and the people who run Meta for helping to create the conditions that allowed the Myanmar military to unleash hell upon them. The company’s vast wealth is generated, at least in part, through the human misery suffered by the Rohingya, he adds.
“There had been a long history of tension between communities in the area, but I had experienced no substantial day-to-day animosity until Facebook and smartphones came along. Facebook became a tool for politicians, bigots, and opportunists to propagate and escalate hate against my people, which was then translated into real-life harm.
“In late 2016, the persecution began to have a direct impact on my family. My father and some other financially stable Rohingya were falsely accused of attacking a police station and handed big fines. My uncle Abusufian and his son Busha were arrested for not paying their fine and were jailed without trial; they spent over four years in prison. Between 2016 and 2017, I saw many hateful and Islamophobic messages against Rohingya on Facebook. One message incited people to get together to ‘save the country and kick out the illegal Bengalis’, while another stated that ‘the birth rate of the illegals is very high. If we let it continue, soon the president of our country will have a beard.’”
Sawyeddollah says he reported this to Facebook, but the company did nothing, telling him: “It doesn’t contravene our community standards.”
He repeats that he would like to meet Mark Zuckerberg and his team, suggesting they come and spend a night or two in the refugee camp, so he can ask them: “Can’t you see your role in our suffering?
“We asked you, repeatedly, to try and help make things better for us. Funding education to help young people can’t ever reverse what happened, but it would, at least, help us build a brighter future. Yet you ignore our pleas. Tell me, do you feel anything for us? Is it only about the data, is it only about the dollars?”
Beginning in August 2017, the Myanmar security forces undertook a brutal campaign of ethnic cleansing against Rohingya Muslims in Myanmar’s Rakhine State. They unlawfully killed thousands of Rohingya, including young children; raped and committed other sexual violence against Rohingya women and girls; tortured Rohingya men and boys in detention sites; and burned down hundreds of Rohingya villages. The violence pushed over 700,000 Rohingya — more than half the Rohingya population living in northern Rakhine State at the beginning of the crisis — into neighbouring Bangladesh.
Meta contributed to serious adverse human rights impacts suffered by the Rohingya in the context of the 2017 atrocities in Rakhine State, and it therefore has a responsibility under international human rights standards to provide an effective remedy to the community. This includes making the necessary changes to its business model to ensure this never happens again. All companies have a responsibility to respect all human rights wherever they operate in the world and throughout their operations. This is a widely recognized standard of expected conduct set out in international business and human rights standards, including the UN Guiding Principles on Business and Human Rights (UN Guiding Principles) and the OECD Guidelines for Multinational Enterprises (OECD Guidelines).
Last year, Amnesty International published a report detailing Meta’s role in the atrocities committed against the Rohingya people by the Myanmar military in 2017. It revealed that even Facebook’s internal studies dating back to 2012 indicated that Meta knew its algorithms could result in serious real-world harms. In 2016, Meta’s own research clearly acknowledged that “our recommendation systems grow the problem” of extremism.