Věra Jourová, Vice President of the European Commission for Values and Transparency, oversees the implementation of the Code of Practice on Disinformation. This self-regulatory initiative, established in 2018, brings together tech companies, advertisers, and major online platforms to combat disinformation through shared voluntary standards. Signatories to the Code include Microsoft, Meta, Google, Twitch, TikTok, Adobe, and Reporters Without Borders, among others.
We’ll be watching you
While X was initially a signatory to the Code, it withdrew from the agreement in May 2023. Nevertheless, the company remains subject to the EU’s Digital Services Act, which mandates safer online environments for users of online platforms.
In response to X’s withdrawal, Jourová issued a warning, stating that leaving the Code of Practice doesn’t exempt X, as it has now been designated as a very large online platform. She emphasized the obligations imposed by legal frameworks and asserted that the European Commission would closely monitor the platform’s compliance.
Each signatory has submitted a report to the European Commission outlining its efforts to combat disinformation over the past six months. With national elections approaching in Poland and Slovakia, and the European Parliament election due next year, the Commission is particularly concerned that Russian interference could spread disinformation online and influence voters.
It’s important to note that the Code of Practice was developed in response to the Facebook-Cambridge Analytica scandal and allegations of Russian interference in the 2016 US elections.
Jourová stressed that while disinformation is not a new phenomenon, increasing digitalization has given malicious actors new tools to undermine democracies. She pointed to the Russia-Ukraine war and the upcoming EU elections as contexts where the risk of disinformation is particularly severe.
Disinformation on X is nothing new
X has a history of disinformation problems. In November 2022, when X was still known as Twitter, the platform laid off staff responsible for combating misinformation. A research report published in early 2023 found a surge in antisemitic posts on X following Elon Musk's takeover of the company on October 27, 2022.
In December 2022, Twitter introduced a feature called Community Notes, which relies on crowd-sourced fact-checking to combat misinformation. Users can leave notes on posts they find suspicious, and if these notes receive sufficient endorsements from contributors with diverse perspectives, they become publicly visible on the post.
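The gating logic described above can be sketched in code. Note that the real Community Notes system scores notes with matrix factorization over the full rating history; the function below is a deliberately simplified, hypothetical illustration in which each rater carries a precomputed viewpoint-cluster label, and a note surfaces only when its helpful ratings are both numerous enough and drawn from more than one cluster.

```python
def note_is_visible(ratings, min_helpful=5, min_clusters=2, min_ratio=0.7):
    """Hypothetical simplified visibility rule for a crowd-sourced note.

    ratings: list of (rater_cluster, is_helpful) tuples, where
    rater_cluster is a viewpoint label (e.g. "A", "B") assumed to be
    precomputed elsewhere. All thresholds are illustrative, not X's.
    """
    helpful_clusters = [cluster for cluster, is_helpful in ratings if is_helpful]
    if len(helpful_clusters) < min_helpful:
        return False  # not enough endorsements yet
    if len(set(helpful_clusters)) < min_clusters:
        return False  # endorsements must span diverse perspectives
    return len(helpful_clusters) / len(ratings) >= min_ratio

# Many helpful votes, but all from one viewpoint cluster: stays hidden.
one_sided = [("A", True)] * 6
# Helpful votes from two clusters with a high helpfulness ratio: shown.
diverse = [("A", True)] * 4 + [("B", True)] * 3 + [("A", False)]

print(note_is_visible(one_sided))  # False
print(note_is_visible(diverse))    # True
```

The key design point the sketch captures is that raw vote counts alone are not enough: agreement across otherwise-disagreeing groups is what promotes a note to public visibility.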
However, concerns have arisen about the effectiveness of crowd-sourced fact-checking at this scale, given that over 6,000 posts are uploaded to X every second. X has also stated that it takes measures to limit the amplification of misleading content and removes it from the platform, particularly when it could cause immediate and severe offline harm.