The rights organisation Amnesty International has raised concerns about Facebook’s alleged role in exacerbating violence during the two-year conflict in Ethiopia’s northern Tigray region.
In a critical report, Amnesty accuses the social media giant of significantly amplifying the spread of harmful content through its algorithms, asserting that the company inadequately addressed the dissemination of such content.
These allegations present another challenge for Meta, Facebook’s parent company, which has previously denied similar claims.
Meta has emphasised its substantial investments in content moderation and the removal of hateful content from its platform. Facebook remains a vital source of information for many Ethiopians.
However, as the conflict between the federal government and its allied forces on one side and the Tigrayan forces on the other continued, Facebook’s role in allegedly promoting hate speech came under increased scrutiny.
The African Union’s peace envoy, former Nigerian President Olusegun Obasanjo, estimated that approximately 600,000 people lost their lives during the conflict, with causes of death attributed to combat, starvation, and inadequate healthcare.
The conflict ended in a ceasefire nearly a year ago following a peace agreement between the federal government and the Tigray People’s Liberation Front (TPLF), the dominant party in the Tigray region. Nevertheless, Ethiopia continues to grapple with other conflicts, including those in the expansive Oromia and Amhara regions.
Amnesty International’s report underscores Meta’s “data-hungry business model,” which, according to the report, still poses “significant dangers” to human rights in areas affected by conflict.
Facebook has faced previous accusations of spreading incitement messages against ethnic Tigrayans. Currently, Meta is facing a lawsuit alleging its failure to address harmful content, with two petitioners seeking over $1.5 billion (£1.2 billion) in damages.
Amnesty’s investigation involved reviewing internal documents from Meta, including communications received by the company from 2019 to 2022.
The rights group claims that despite repeated warnings and a history of contributing to violence in other nations, Meta failed to implement necessary measures.
According to Amnesty, “Facebook’s algorithmic systems amplified the dissemination of harmful rhetoric targeting the Tigrayan community, while the platform’s content moderation systems failed to detect and respond appropriately to such content.”
Meta responded by informing the BBC that it was actively improving its capabilities to combat “violating content” published in widely spoken Ethiopian languages.
Ethiopia, Africa’s second-most populous country with 113.6 million people, recognises Amharic as its official working language, although other languages such as Afaan Oromoo, Tigrinya, Somali, and Afar are also spoken in the country.