Report indicates Meta’s role in Ethiopian conflict

Meta accused of aiding conflict in Tigray

from ADANE BIKILA in Addis Ababa, Ethiopia
Ethiopia Bureau
ADDIS ABABA, (CAJ News) – META, Facebook’s parent company, has lurched into a fresh crisis on the continent after Amnesty International said it contributed to serious human rights abuses against Ethiopia’s Tigrayan community.

A report released by the human rights group suggests Meta failed to adequately curb the spread of content advocating hatred and violence, this time targeting Tigrayans during the November 2020 to November 2022 armed conflict in northern Ethiopia.

Amnesty said it had previously highlighted Meta’s contribution to human rights violations against the Rohingya in Myanmar and warned against the recurrence of these harms if Meta’s business model and content-shaping algorithms were not fundamentally reformed.

“Three years after its staggering failures in Myanmar, Meta has once again – through its content-shaping algorithms and data-hungry business model – contributed to serious human rights abuses,” said Agnès Callamard, Amnesty Secretary General.

She said even before the outbreak of the conflict in northern Ethiopia, civil society organizations and human rights experts repeatedly warned that Meta risked contributing to violence in the country, and pleaded with the company to take meaningful action.

However, Amnesty contends that Meta ignored these warnings and failed to take appropriate mitigation measures, even after the conflict had broken out.

“As a result, Meta has again contributed to serious human rights abuses, this time perpetrated against the Tigrayan community,” Callamard said.

The Facebook platform is a major source of information for many Ethiopians and is considered a trustworthy news source.

Amnesty reports that Facebook’s algorithms fueled devastating human rights impacts by amplifying harmful content targeting the Tigrayan community on the platform during the armed conflict.

Amnesty research established that Facebook’s algorithmic systems supercharged the spread of harmful rhetoric targeting the Tigrayan community, while the platform’s content moderation systems failed to detect and respond appropriately to such content.

Meta has also run into problems in Kenya, where some 184 moderators sacked by a company contracted by Meta to review Facebook posts have sued the company over unfair labour practices.

– CAJ News