Facebook has failed to capture Islamic State and al-Shabaab extremist content in posts targeting East Africa, a new study has found, as the region remains threatened with violence and Kenya prepares to vote in a competitive national election.
Last year, an Associated Press series drew on leaked documents shared by Facebook whistleblowers, showing how the platform has repeatedly failed to act on sensitive content, including hate speech in many parts of the world.
The new, unrelated two-year study by the Institute for Strategic Dialogue found that Facebook posts openly supporting IS or Somalia-based al-Shabaab — including posts bearing al-Shabaab branding and calling for violence in languages such as Swahili, Somali and Arabic — are allowed to be shared widely.
The report is particularly concerned with narratives linked to extremist groups that accuse Kenyan government officials and politicians of being enemies of Muslims, who make up a significant portion of the East African country’s population. “Xenophobia has long been pervasive in the Kenyan Somali community,” the report states.
Al-Qaeda-linked al-Shabaab, described as Africa’s deadliest extremist group, has carried out high-profile attacks in Kenya, far from its bases in neighbouring Somalia, in recent years.
The new study found no evidence that Facebook posts planned specific attacks, but its authors and Kenyan experts warned that allowing even widespread calls for violence poses a threat ahead of the competitive August presidential election. Concerns about hate speech surrounding the vote have grown, both online and offline.
“They erode trust in democratic institutions,” Moustafa Ayad, an author of the report, told the AP of the extremist posts.
The Institute for Strategic Dialogue found 445 public profiles, some with duplicate accounts, sharing content related to both extremist groups, and flagged more than 17,000 other accounts. The accounts shared narratives accusing Kenya and the US of being enemies of Islam, while postings from al-Shabaab’s state media arm praised the killing of Kenyan soldiers.
“Why didn’t they act on the rampant content posted by al-Shabaab?” he asked. “You would think that after 20 years of dealing with al-Qaeda, they would have a good understanding of the language and symbolism they use.”
The authors have discussed their findings with Facebook, and some of the accounts have since been removed, he said. The authors also plan to share the findings with the Kenyan government.
Ayad said civil society and government agencies such as Kenya’s National Counter Terrorism Centre should be aware of the problem and encourage Facebook to do more.
Asked for comment, Facebook requested a copy of the report before its release, but the authors declined.
The company then responded with an emailed statement.
“We have removed some of these pages and profiles and will continue to investigate once we have the full findings,” Facebook wrote on Tuesday, without giving any names, citing security concerns. “We don’t allow terrorist groups to use Facebook, and when we become aware of these groups, we remove content that praises or supports them. We have dedicated teams — including native Arabic, Somali and Swahili speakers — devoted to this work.”
Critics say concerns about Facebook’s content moderation are global.
“As we’ve seen in India, the US, the Philippines, Eastern Europe and elsewhere, the consequences of failing to moderate content posted by extremist groups and supporters can be deadly and can push democracy to the brink,” the watchdog group The Real Facebook Oversight Board said of the new report, adding that Kenya is currently “the epitome of everything wrong” for Facebook owner Meta.
“The question is, who should be asking Facebook to come forward and do its job?” asked Leah Kimathi, a consultant on governance, peace and security in Kenya, who suggested that government agencies, civil society and consumers all have a role to play. “Facebook is a business. The least they can do is make sure that what they sell us doesn’t kill us.”