Kenya's National Cohesion and Integration Commission (NCIC), a government agency that aims to eradicate ethnic and racial discrimination among the country's 45 tribes, has given Facebook seven days to tackle hate speech related to next month's election on its platform. If the social network fails to do so, it faces suspension in the country. The agency's warning comes shortly after international NGO Global Witness and legal non-profit Foxglove released a report detailing how Facebook approved ads written to instigate ethnic violence in both English and Swahili.
The organizations joined forces to conduct a study testing Facebook's ability to detect hate speech and calls for ethnic-based violence ahead of the Kenyan elections. As Global Witness explained in its report, the country's politics are polarized and ethnically driven — after the 2007 elections, for instance, 1,300 people were killed and hundreds of thousands more had to flee their homes. A lot more people use social media today compared to 2007, and over 20 percent of the Kenyan population is on Facebook, where hate speech and misinformation are major issues.
The groups decided not to publish the exact ads they submitted for the test because they were highly offensive, but they used real-life examples of hate speech commonly used in Kenya. They include comparisons of specific tribal groups to animals and calls for their members' rape, slaughter and beheading. "Much to our surprise and concern," Global Witness reported, "all hate speech examples in both [English and Swahili] were approved." The NCIC said the NGOs' report corroborates its own findings.
After the organizations asked Facebook for comment on their findings, thereby making it aware of the study, Meta published a post detailing how it is preparing for Kenya's election. In it, the company said it has built more advanced content detection technology and has hired dedicated teams of Swahili speakers to help it "remove harmful content quickly and at scale." To see whether Facebook had truly implemented changes that improved its detection system, the organizations resubmitted their test ads. They were approved yet again.
In a statement sent to both Global Witness and Gizmodo, Meta said it has taken "extensive steps" to "catch hate speech and inflammatory content in Kenya" and that the company is "intensifying these efforts ahead of the election." It also said, however, that there will be instances where it misses things, "as both machines and people make mistakes."
Global Witness said its study's findings follow a similar pattern it previously uncovered in Myanmar, where Facebook played a role in enabling calls for ethnic cleansing against Rohingya Muslims. It also follows a similar pattern the organization unearthed in Ethiopia, where bad actors used Facebook to incite violence. The organizations and Facebook whistleblower Frances Haugen are now calling on Facebook to implement the "Break the Glass" package of emergency measures it took after the January 6th, 2021 attack on the US Capitol. They're also asking the social network to suspend paid digital advertisements in Kenya until the end of the elections on August 9th.