Meta’s Policy Restrictions and Civil Unrest
We asked Meta to clarify that its policy on restricting the accounts of public figures should apply not only in contexts of civil unrest or incidents of violence, but also where political expression is preemptively suppressed, or met with violence or the threat of violence from the state, using Meta’s platforms. The question is: what should we consider civil unrest? Does it have to be an isolated incident of violence, or an ongoing one? When violence preemptively suppresses political opposition and political discourse through the use of Meta’s platforms, should that also be considered civil unrest? For the board, it should.
Emergency Decisions and Democratic Processes
WIRED: We saw the board deal with its first emergency decisions around the Israel–Hamas conflict late last year. The case dealt with posts that had been improperly removed from Meta’s platforms for violating its policies, but the board felt they were important for the public to understand the conflict. Do you anticipate that this is a mechanism the board may need to rely on to render judgments in time spans that can have a meaningful effect on the democratic process?
I think the exercise we had with the Israel–Hamas conflict was successful, and I expect us to use it again this year, maybe in election-related issues. And I say “maybe,” because when you are trying to protect elections and democratic processes, you have to prepare ahead of time. We asked Meta, for example, to establish what its election integrity efforts would be, and what it expected to achieve with them, because you need planning to put in place the measures to address what can result from an election. There can, of course, be things that have to be addressed at a specific moment.
When Meta prepares for elections, for example, it establishes what it calls the EPOC, the Election Operations Center, with enough lead time to implement the measures that will be adopted throughout the election. We expect Meta to prepare in the same way for the possibility of an expedited decision: to take those steps preemptively, not to wait until it has a decision that must be addressed.
Company Layoffs and Election Preparedness
WIRED: We’ve seen a lot of layoffs across the sector, and many of the people who were in charge of election efforts at Meta have been laid off in the past year. Do you have concerns about the company’s preparedness for such a major year for democracy, particularly given their track record in the past?
A context in which you have huge layoffs is a concern. It can’t just be the countries with the most users, or those that generate the most revenue, that get prioritized. We still have problems with inadequate staffing in underinvested countries, many of which will have elections this year. We are living through worldwide democratic backsliding, and in that context Meta has a heightened responsibility, especially in the global south, where its track record of living up to these expectations has been poor.
I acknowledge that Meta has already set up, or knows how to set up, risk-evaluation and mitigation measures that can be applied to elections. It has also run election-specific initiatives in different countries: working with electoral authorities, labeling election-related posts, directing people to reliable information, prohibiting paid advertising that calls the legitimacy of an election into question, and limiting message forwarding on WhatsApp. But the board has found that, in enforcing its community standards, Meta sometimes fails to consider the wider political and digital context. That has often led to disproportionate restrictions on freedom of expression, or to underenforcement of content promoting or inciting violence. Meta must have adequate linguistic and cultural knowledge, and the necessary tools and channels to escalate potentially violating content.
Meta, formerly known as Facebook, has clearly been gearing up for the biggest election year in history. Under scrutiny for its role in spreading false information, the company has introduced fact-checking and content-moderation measures to combat misinformation, and has updated its advertising policies to deter foreign interference in elections. But given the platform’s track record, many critics remain skeptical of its readiness, and it remains to be seen whether these measures will be enough to ensure a fair and transparent electoral process.