Facebook to Police Content in African Languages
Facebook has announced that it will set up its first center to monitor content in African languages. The review center will handle content in several of the continent's languages, including Swahili, Oromo, and Somali. The move is expected to enhance safety and security on the platform.
To improve the user experience, Facebook already supports multiple languages on its platform, so a user can, for example, switch the interface from English to French. The platform also offers translation services: a user can post in Zulu or Somali and use Facebook's built-in tools to translate the content into English.
More importantly, the launch of a content review center for Africa is motivated by the need to police harmful content, such as hate speech. In 2018, during anti-Muslim riots in Sri Lanka that were fueled by online vitriol, Facebook had to hire Sinhala speakers to identify inflammatory content on the platform.
The company faced a similar challenge in Myanmar, where Facebook posts were used to incite ethnic violence. In response, the company began removing fake and harmful accounts used to spread misinformation and hate, and banned groups in Myanmar that it considered dangerous from its platform.
The Sri Lanka and Myanmar incidents underscore the need for social media giants to include native speakers on their moderation teams. Because Facebook had too few Sinhala speakers during the Sri Lanka riots, extreme content could spread undetected. The new Facebook center in Africa aims to prevent the platform from being used to fuel such hatred.
Facebook in Nairobi
The new content review center for African languages will be located in Nairobi, Kenya. The social media giant will partner with Samasource in the venture and will hire around 100 reviewers. Beyond improving safety and security on the platform, the new center signals Facebook's commitment to continued investment in Africa and in African people; Mark Zuckerberg visited Nairobi in 2016 to learn about mobile money.
Reviewers will check content for hate speech, nudity, misrepresentation, and other forms of abuse. In addition to the 100 human reviewers, Facebook will use machine learning and artificial intelligence to detect harmful content.
Although this will be the first such center in Africa, Facebook already employs content reviewers in other regions of the world. Reviewers are assigned a queue of reported posts, which they evaluate one by one. According to the company, there is no quota or limit on the amount of content reviewers must scrutinize in a given period.