Meta, the owner of Facebook, Instagram, and WhatsApp, says it has set up a cross-functional team spread across the world to monitor posts by Nigerians before, during, and after the country's 2023 elections, particularly on Facebook and Instagram.
According to Meta, the team was put together to tackle hate speech, misinformation, and fake news throughout the election period.
It highlighted several other steps it has taken to protect the integrity of the Nigerian elections.
Composition of the team: Without disclosing details about the team members, Meta said the team includes many Nigerians and others who have spent time living in the country.
- “The team also includes individuals with global expertise in misinformation, hate speech, elections and disinformation. These teams are working hard to prevent any abuse of our services before, during, and after Nigeria’s 2023 general elections. Locally, we also have staff who reside in Nigeria and work in public policy, public policy programmes, and communications,” it added.
Why the move: Addressing a press conference in Lagos on Wednesday, Meta’s Head of Public Policy for Anglophone West Africa, Adaora Ikenze, said the company’s approach to the coming elections was informed by its experience from past elections in Sub-Saharan Africa and its conversations with human rights groups, NGOs, local civil society organisations, regional experts and local election authorities.
- “We know we have an important responsibility when it comes to helping keep people safe during the elections. Using lessons from the past including input from experts and policymakers across the national spectrum, we’ve made substantial investments in people and technology to reduce misinformation, remove harmful content on our platforms, fight voter interference and promote civic engagement during the elections. We continue to work closely with election authorities and local partners in Nigeria to ensure we’re preparing for the specific challenges in Nigeria and taking appropriate steps to stay ahead of emerging threats,” she said.
She further disclosed that since 2016, Meta has quadrupled the size of the global teams working on safety and security to about 40,000 people, and has invested more than $16 billion in teams and technology in this area. This also includes over 15,000 content reviewers located across the globe in every major timezone.
She said these reviewers can review content in more than 70 languages, including Yoruba, Igbo and Hausa.
WhatsApp misinformation: While WhatsApp is widely used to spread misinformation through forwarded messages, Ikenze said Meta has introduced measures to limit the spread of viral messages. According to her, any message that has already been forwarded once can now only be forwarded to one group at a time, down from the previous limit of five.
- “When we introduced the same feature for highly forwarded messages, it reduced the number of these messages sent on WhatsApp by over 70%. We also label ‘forwarded’ and ‘highly forwarded’ messages to highlight when something has been shared multiple times,” she said.
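The forwarding rules Ikenze describes can be modelled as a simple policy check. The sketch below is purely illustrative and is not WhatsApp's actual implementation; the `Message` class, the group-count limits, and the threshold for the "highly forwarded" label are assumptions made for the example:

```python
# Illustrative model of WhatsApp-style forwarding limits (not Meta's code).
# Assumptions: a message tracks how many times it has been forwarded, and
# "highly forwarded" is treated here as 5+ forwards.

from dataclasses import dataclass

HIGHLY_FORWARDED_THRESHOLD = 5  # assumed cutoff for the "highly forwarded" label


@dataclass
class Message:
    text: str
    forward_count: int = 0  # times this message has already been forwarded


def max_groups_per_forward(msg: Message) -> int:
    """How many groups one forward action can target at once."""
    if msg.forward_count >= 1:
        return 1  # already-forwarded messages: one group at a time
    return 5      # fresh messages keep the older limit of five


def forward_label(msg: Message) -> str:
    """Label shown alongside the message, if any."""
    if msg.forward_count >= HIGHLY_FORWARDED_THRESHOLD:
        return "highly forwarded"
    if msg.forward_count >= 1:
        return "forwarded"
    return ""


fresh = Message("Polling stations open at 8am")
viral = Message("Unverified claim", forward_count=7)
print(max_groups_per_forward(fresh))  # 5
print(max_groups_per_forward(viral))  # 1
print(forward_label(viral))           # highly forwarded
```

The point of the design is friction, not blocking: a sufficiently forwarded message can still travel, but only one group at a time, which is the mechanism Meta credits with the 70% drop in highly forwarded messages.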