Digital Wildfire of Disinformation in the Netherlands


Executive Summary

Foundation The London Story, a diaspora-led think-tank committed to human rights and transnational issues, presents its findings on how Facebook’s business model is enabling the COVID-19 “infodemic” in the Netherlands.

In recent years, headlines have focused on Facebook’s role in undermining democratic elections, amplifying hate speech and spreading public health misinformation in large democracies such as India, the US and the European Union. However, less attention has been paid to its effects in smaller countries, particularly those which are not English-speaking. This is the first analysis of its kind to extensively cover disinformation and misinformation in the Netherlands, and to highlight the different roles of political actors, inconsistent platform policies, and social influencers in igniting and sustaining a wildfire of disinformation.

The study uses three time periods to provide a holistic view of the disinformation problem and how Facebook has contributed to it. The period between 17th March 2019 and 17th March 2021 is used to investigate the growth of political parties on Facebook, particularly through paid Facebook content. The period between 17th March 2020 and 17th March 2021 is used to study the growth of COVID-19 disinformation in Netherlands-based Dutch-language pages and groups. And the period between 17th December 2020 and 17th March 2021 is used to highlight how Facebook’s business model allows political parties to push COVID-19 disinformation.

The study confirms that politically motivated social media campaigns that originated in the United States of America influenced the social and political behavior of people in the Netherlands. The study defines this person-to-person transmission of misinformation narratives and content between countries as a “silent influence” of social media. The role of social media in silently influencing people’s opinions and breaking down social cohesion is a transnational problem that needs to be addressed in a transnational environment.

Despite Facebook’s policy against COVID-19 misinformation and users who support QAnon, weak enforcement in the case of political actors and a lack of appropriate checks contributed to an acceleration (and rationalizing) of COVID-19 disinformation, especially during the election campaign period of December 2020 to March 2021.

The study traces 170 public groups and pages with a total of 1.3 million ‘page likes’. These groups and pages are infested with COVID-19 disinformation and QAnon-related conspiracy content (see Annex 1 for the list of groups and pages investigated). The number of Dutch-language QAnon and COVID-19 pages on Facebook is rising. Between 17th March 2020 and 17th March 2021, Dutch-language pages and groups generated nearly 21.33 million interactions and around 93.13 million video views. Entities flagged in this report have grown by +188.9% (conspiracy groups) and +155.45% (conspiracy pages) over the year-long time period, and have proliferated on Telegram and the MeWe app by actively recruiting on Facebook.

The study highlights how COVID-19 disinformation created distrust in the COVID-19 measures adopted by the Dutch government. In turn, this led to anti-COVID-19 measure protests across the Netherlands, reluctance to use facemasks in public places, and hesitancy towards COVID-19 vaccines, as documented by Eurofound and the Rijksinstituut voor Volksgezondheid en Milieu (RIVM).

The investigation also found that the official Dutch COVID-19 communication channel has a far smaller page following and far fewer interactions than the conspiracy groups and pages on Facebook. While these conspiracy groups and pages generated close to 21.33 million interactions between 17th March 2020 and 17th March 2021, the official RIVM channel on Facebook generated only 608,000 interactions.

Members of the Facebook groups traced during this study organized and participated in sometimes violent anti-lockdown protests across the Netherlands. There were also threats levelled against officials, such as the director of the Centrum Infectieziektebestrijding (CIb), the national centre for infectious disease control.

The study shows Facebook’s claim that the company does not make “any meaningful profits” from political ads is factually incorrect in the Netherlands. Between 17th March 2019 and 17th March 2021, Facebook’s total ad revenue in the Netherlands was approximately 15 million Euros, of which political ads constituted around 25%, making them the largest contributing group. The primary Facebook pages of the 18 political parties tracked in the study spent close to 3.3 million Euros on Facebook ads during this time.

The far-right party FvD ranked among the top five Facebook ad buyers in the Netherlands. In this period the 18 parties we tracked ran a total of 32,125 political ads. Of these, at least 1,095 political ads ran without a disclaimer, in spite of Facebook’s Ad policy requirements.

The study found that Dutch far-right parties and their members actively promote sensationalist content that is not permitted under Facebook’s own ad and community standards. The study further found that, in the run-up to the election, the far right actively promoted disinformation on Facebook through both organically grown content and paid political ads. These political ads contained COVID-19 disinformation and QAnon-style conspiracy theories which Facebook has specifically committed to remove and reduce. Facebook’s approval process clearly allowed these advertisements, and the platform generated revenue from this polarizing content. Over three months, between 17th December 2020 and 17th March 2021, Facebook earned at least 199,300 Euros through COVID-19 disinformation ads.

Over a period of 7 weeks (from February 1st 2021 to 21st March 2021), the team of researchers from The London Story flagged 938 posts (including political ads and comments) that violated community standards to Facebook. Only 12 of our flagged posts were deleted. Despite Facebook’s commitment to countering COVID-19 disinformation and removing QAnon content, Facebook’s content moderation fails miserably when the content is not in English.

The study findings underscore what Mark Zuckerberg admitted in 2018: the more “borderline” the content, the more likely it is to receive engagement. Far-right parties consistently receive high engagement on social media. This is not reflective of Dutch society and its political preferences, but of Facebook’s algorithms which amplify polarizing narratives. The study also found that Facebook’s lack of clear policy on global and transnational issues, such as COVID-19 and climate change, creates an unchecked space in which the unfounded skepticism and denialism of far-right groups such as the FvD flourish. While political parties may differ on issues like climate policy and regulation, the study authors argue that political engagement and advertisements should be evaluated in a context-specific setting. This means that Facebook needs to invest heavily to improve its moderation policy, process, and outcomes.

Facebook maintains it is a community that is representative of human society. The study contends that if this is the case, this community is patriarchal, exclusionary, and saturated with gossip and populist beliefs. The alleged “community” does not reflect the advancement of human morality and ethics, but is over-reliant on an unevolved AI. Facebook is wrong to assert that its online community is representative of human society, as it has done numerous times previously, including in meetings with us. Rather, Facebook’s algorithm and business model distort the public sphere, making extreme and polarizing views mainstream in a way that is not representative of Dutch society.

Key recommendations

For Facebook:

The study recommends that Facebook move beyond the baseline of the corporate commitments to human rights. Rather than relying on “ethical AI” and a generalized sense of fairness, it should embed its models in universal human rights standards and sustainable development goals. This process should start with inscribing a firm and compulsory human rights commitment into their policy documents.

If Facebook truly wants to represent itself as a “community”, it should accept the moral and legal responsibility to ensure that its platforms channel life-saving information instead of spreading life-threatening noise, be it regarding health, society or otherwise. This means a stronger commitment to basic rights such as the Right to Health, the Right to Information, the Right to Life and the Right to Dignity, which should be exhibited not only through Corporate Social Responsibility reports, but through proactive measures to combat bullying, disinformation, misinformation and hate speech on its platforms. Facebook’s Community and Ad policies currently make no mention of commitments to basic human rights.

The study strongly recommends that Facebook invest in technologies which are inclusive and recognize the intersectionality of human experience. As an auxiliary finding, the study also revealed that leading female Dutch politicians like Sigrid Kaag had only a 1% “share of the total voice” on the platform. Facebook determines “share of total voice” based on the share of interactions a profile has received within a CrowdTangle list. The CrowdTangle list for this study consisted of all the Dutch candidate pages available in the public domain on Facebook.
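To make the metric concrete, the "share of total voice" described above can be sketched as each page's interactions divided by the total interactions across all pages in the list. This is an illustrative reconstruction, not CrowdTangle's actual implementation; the function name and the interaction counts below are made up for the example.

```python
def share_of_voice(interactions_by_page):
    """Return each page's percentage share of total interactions
    across a list of pages (illustrative sketch of the metric)."""
    total = sum(interactions_by_page.values())
    return {page: round(100 * count / total, 1)
            for page, count in interactions_by_page.items()}

# Hypothetical interaction counts for three candidate pages:
pages = {"Candidate A": 450_000, "Candidate B": 45_000, "Candidate C": 5_000}
print(share_of_voice(pages))
# {'Candidate A': 90.0, 'Candidate B': 9.0, 'Candidate C': 1.0}
```

Under this definition, a page can have a large absolute following and still register only a 1% share when a handful of high-engagement pages dominate the list's total interactions.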

Facebook must pay attention to reports such as this one and address the content that is flagged. If Facebook is genuinely interested in making its “community safe for all”, then content flagged in research made by identifiable civil society actors, academics, and independent non-partisan organizations should be taken seriously.

Facebook must align its policies, including those on transparency and accountability, with the basic tenets of procedural, evidentiary and administrative law, while remaining open to challenges and modifications. This would mean investing in Law and Policy teams whose responsibility is not to defend Facebook in litigation, but rather to create sound and coherent rules.

For the Dutch Government:

The Dutch government must strengthen scientific communication as well as communication around civic responsibility. The Netherlands has already taken welcome steps by instituting funds and commitments towards both of these goals. However, this communication must also embed an empathetic view of the problems faced by Dutch society. The recent Child Benefit Scandal has revealed a certain structural apathy towards the country’s citizens. The Dutch government must incorporate sensitivity training and improve empathy in its public health communications.

The Netherlands should proactively adopt a Digital Services Act while the European Union’s proposal for legislation remains in the pipeline. The act would ensure that platforms used by Dutch citizens, irrespective of the geography of the service provider, have appropriate checks in place to filter out misinformation and disinformation. In effect, platforms would have to build consistent mechanisms for self-regulation of content or face penalties where they fail to set standards in a timely manner or arbitrarily deviate from them. It may also include reliance on the principles of universal jurisdiction to protect the fundamental rights of Dutch citizens across the world, including the right to correct information.

Finally, tackling digital spaces in a future-oriented way means looking beyond national boundaries and notional ideas of company registration. European nations should build on the foundation laid by GDPR, to collectively ensure that digital spaces are inclusive and, regardless of location, conform to universal human rights.

For Civil Society:

Walk out and stage a boycott. We invite organizations who buy Facebook Ads to organize and regularly participate in Facebook Ad Boycott actions.

Report Overview

Chapter 1, The Dutch elections, introduces the non-Dutch readers to the multi-cultural, vibrant political landscape of the Netherlands. The authors make brief remarks on the 2021 election result and introduce readers to the research objectives and methodology.

Chapter 2, The ad will shock you, digs deeper into the content of ads purchased by far-right political parties in November 2019. This month was selected as it marked the beginning of the accelerated online growth of the far-right parties. It highlights that while all Dutch political parties and candidates continue to spend heavily on Facebook ads, only those parties whose ads are shocking, or contain borderline content or disinformation, gather higher reach, engagement and follower growth.

Chapter 3, STOP DE LOCKDOWN, highlights how political ads laden with COVID-19 disinformation were widely shared on Facebook by far-right parties and enriched Facebook by at least 199,300 Euros between December 2020 and February 2021. The chapter highlights how COVID-19 disinformation led to a lack of trust in the COVID-19 measures adopted by the Dutch government. In turn, this led to anti-COVID-19 measure protests across the Netherlands, reluctance to use facemasks in public places, and hesitancy towards vaccines. For example, a recent report by Avaaz discussed how Facebook is neglecting Europe’s infodemic, and a Dutch study by Eurofound indicated that 54.9% of respondents believe that the risk of COVID-19 is exaggerated (a claim in line with the far-right FvD), with 24.9% indicating they are very unlikely to be vaccinated against COVID-19. The study findings from the Netherlands confirm this infodemic.

Chapter 4, Chasing QAnon down the rabbit hole, shows that despite Facebook’s clear policy against platforming QAnon and QAnon-linked conspiracy theories, there are at least 170 Dutch language pages and groups with a membership of 1.3 million users (and growing) actively spreading disinformation, including on COVID. We also identify intricately connected actor-networks, in which Dutch celebrities with verified Facebook profiles actively spread conspiracy theories.

Chapter 5, The post does not violate community standards, presents the result of flagging 938 posts to Facebook between February 2021 and March 2021. Of these posts, 98% remain on Facebook. Several of these posts were fact-checked by Facebook’s third-party fact-checking program. Despite Facebook’s policy to label COVID-19 disinformation, none of the fact-checked posts reported to Facebook were labelled at the time of reporting. This implies that Facebook’s policy on COVID-19 disinformation is far from effective in a non-English-speaking context. Despite its public commitments, the study finds that Facebook does not follow through on upholding election integrity in non-English-speaking countries. Chapter 6 makes recommendations to Facebook, the Dutch government and civil society based on these findings.
