Tackling Disinformation the European Way

Ulrik Trolle Smed

European Political Strategy Centre

Ulrik Trolle Smed is a Policy Analyst at the European Political Strategy Centre with a focus on foreign, security, and defense affairs. He has a special interest in European sovereignty and advises on hybrid threats and interference in democracies as well as EU-NATO cooperation and European defense initiatives.

Prior to joining the EPSC, Ulrik was Head of Section for the Sahel region with the Africa Department at the Danish Ministry of Foreign Affairs. Earlier, he was Research Assistant at the Centre for Military Studies at the University of Copenhagen with a focus on maritime security and development in Africa. Ulrik is also an active member of the Youth Atlantic Treaty Association.

Mr. Smed graduated from the University of Copenhagen with an MSc and BSc in Political Science with a focus on international security and studied intelligence and diplomacy as a Visiting Student at Boston University. During his studies, Ulrik also worked as think tank liaison and defense policy trainee at the Danish Embassy to the United States in Washington, DC.

Ahead of European Parliament elections in May, Ulrik Trolle Smed outlines how Europe can counteract disinformation campaigns in this essay from AGI’s new report “Defending Democracy in the Cybersphere.”

Between May 23 and 26, 2019, more than 300 million voters across twenty-seven European nations will head to the polls and, in doing so, will participate in one of the world’s largest democratic exercises.

Free and open elections are the foundation of our democratic societies. They make Europe what it is: a place where you can speak your mind without fear of being arrested or prosecuted, and a place where voters trust that election results reflect open and transparent public debate.

Yet today, more than seven out of ten EU citizens are concerned that disinformation may interfere with parliamentary elections or other polls in their country, while six out of ten are worried that cyberattacks can manipulate election results.[1]

Restoring public trust—and safeguarding our elections without risking further polarization—requires a credible but also balanced approach in these special times for Europe as well as the world.

An Era of Disruption 

We live in an era of disruption to politics and society: there are increased levels of anxiety and perceptions of threat, a loss of trust, and growing polarization. Democracy—as a mechanism to mediate between different groups in society for the common good—is coming under pressure from within and from without, even in Europe.

What factors account for that? Partly, it is the legacy of the crisis and the concern about the level of protection that can be expected from the public domain. Given the pressure on public finances and demographic trends, the more widespread assumption has been to expect less, rather than more, from government.

The most important factor has to do with technology. The digital age is a new layer added to an already quickly changing world. At the onset of the digital revolution, there was significant hope—and indeed an expectation—that digital technologies would be a boon to democracy and freedom, and would support more social and political engagement by enhancing deliberation, enabling more voices to be heard, and making information more available. But we are increasingly aware of the challenges to democratic debate posed by technological innovations that can result in the “filter bubble” effect, which allows only the voices of those within a narrow demographic to be heard or displayed.

The polarization that we see driven by new digital tools leads people not just to disagree with each other, but often also to rule out compromise, which curtails the possibility of dialogue. Consensus-building has always been a cornerstone of democracy, and it becomes a daunting challenge in such an environment.

Defining the Problem of Disinformation

Finally, pressures from external forces have come to the fore in recent years, not least due to the ability of third country actors and private companies to spread propaganda and disinformation on new online media outlets.

Disinformation—which includes the intentional creation, presentation, and dissemination of verifiably false or misleading information—has become a growing challenge in the past couple of years, which have been marked by a series of attempts to manipulate electoral processes in at least eighteen countries, including in the EU. The most visible and worrying examples from recent years include Russian-sourced influence operations in the U.S., Germany, France, Britain, Catalonia, and many more.

Protecting the integrity of our elections is therefore an absolute priority—for the European Union, for the member states, and for all citizens on both sides of the Atlantic. The threat can be split into two vectors: attacks that target systems and data to interfere with the electoral process or voting technology, and threats that manipulate voting behavior.

In terms of the first, although this approach is relatively crude, even the suggestion that it has happened or could happen is corrosive to public trust and confidence in institutions. For the second, we can break it down further into three categories: targeted hacks and leaks to change public opinion; fake news to influence the results; and the use of psychometrically targeted messaging based on mined user data—such as that used in the Cambridge Analytica case. Finally, other purposes of disinformation include manipulation of web traffic on social media and other online platforms for economic gain.

This is all happening against a backdrop of declining societal trust in institutions, at a time when the world has experienced twelve consecutive years of decline in democracy and freedom while what might be dubbed “digital authoritarianism” is increasing abroad.

That is why member states—together with the EU institutions—have decided to take action to combat disinformation.

The European Response: A State of Play

With a view to protecting European public debate and elections against influence campaigns by third countries and non-state actors, the EU has boosted its resources and engaged with member states as well as private stakeholders such as Google and Facebook to ensure a joint effort to create a secure, credible, and responsible online ecosystem.

With a new and strengthened budget to fight disinformation, the EU institutions and member states decided in December 2018 to provide critical units with new staff for digital analytics, intelligence gathering, and strategic communications before the European Parliament elections in May—with an even greater boost expected at European External Action Service headquarters and in delegations in the EU’s neighboring regions over the next two years.

Meanwhile, online platforms have committed to a new code of practice to help reduce fake accounts and disinformation and to increase transparency around automated accounts and political advertising on their services. This follows an understanding that if implementation and impact prove unsatisfactory, the Commission may propose further action, including legislation.

Finally, member states are organizing themselves in a new EU rapid alert system to share “real time” warnings about disinformation campaigns and election interference, to react, and to ensure coordination between the capitals and Brussels. Together with improved efforts to educate EU citizens in digital media literacy, this system can provide relevant and timely counters to rising disinformation campaigns and boost the resilience of European democracies in the digital age.

While We Are Waiting for the European Elections

These steps are taken in close coordination and complementarity with NATO as well as the G7 and other institutional partners in a growing defense of the democratic way of life and governance in the twenty-first century. And these will certainly not be the last institutions to join this effort.

Touching upon a wide range of issues—from free speech in a new era to the workings of a growing digital economy—disinformation has proved to be a politically sensitive topic.

Nevertheless, we must push forward to create a harmonious and credible European system if we want online democratic debate as well as markets to grow in a safe and sustainable manner. Waiting will only risk fragmenting policy responses across member states even further.

Therefore, here are three ideas that Europe could explore to further improve its response to disinformation.

Develop a New Disclosure Policy for Election Interference

The last few days before any election are critical, but multiple democratic governments have faced challenges in disclosing foreign influence campaigns without risking (further) polarization of the public debate, as was the case for President Obama when a potential Russian influence campaign was reported. Governments can also be accused of seeking political gain from such attacks.

Meanwhile, incoming governments tend to have little interest in investigating such campaigns following an election, as the Mueller investigation in the United States shows. The same could have happened in France—had President Macron not won—in the aftermath of the Macron leaks. Therefore, we are in dire need of a new “trip wire system” to ensure that the relevant authorities know exactly who will be disclosing and speaking about such cases and what the threshold is for doing so.

Create a Fund for Future Fact Checkers to Increase Democracy in the Digital Age

Since 2001, newspapers’ share of global advertising revenue has plummeted as the market turned digital in search of wider reach and cheaper, more flexible, and more targeted services. Today, Google and Facebook dominate the market, capturing over 90 percent of all spending on digital advertising. While the money moved to a new outlet, fact-checking did not, and it needs a new and innovative business model.

Without the necessary financial resources, Europe can be sure that fact-checking will continue to run on a “smoke and steam” basis. And with an increasing need to understand online behavior, to gather analytics according to rigorous methodological standards, and to guarantee the quality of data provided to decision-makers as well as the public, there is a need to take this challenge seriously.

Today, less than half a year before the European Parliament elections, the European network of fact checkers still has a presence in only twelve of twenty-eight member states, and financial resources are a big part of the problem. Creating a fund—with the necessary reach—to foster new partnerships between entrepreneurs, journalists, and the broader media industry could become a new European “showcase,” if done and promoted properly, helping to build wider European momentum.

Set Up a New Ombudsman Position for Data and Algorithms

The number of new algorithms is climbing at lightning speed, while public understanding of their impact on our societies remains extremely limited.

According to an MIT group, the ratio of neural-network research papers introducing new algorithms to those studying existing ones was more than 10 to 1 in 2017. This gives us little opportunity to address unintended consequences in time. Public authorities already investigate everything from video games to slot machines and financial trading algorithms without infringing intellectual property rights. So why not data and algorithms?

There is a need for a new ecosystem to support democratic values and “clean lines” in the digital age. This can be done in the roaring tech industry, in the same way as in other sectors. In the wake of the General Data Protection Regulation (GDPR) and recent EU initiatives, there is an opportunity for us to help define a set of new norms that can make an impact not only in Europe, but also in the rest of the world.


The views expressed in this article are those of the author and do not necessarily correspond to those of the European Commission. The European Political Strategy Centre is the European Commission’s in-house think tank.

[1] European Commission, “Special Eurobarometer 477 – Report: Democracy and Elections,” fieldwork September 2018, published November 2018, pp. 36 and 56.

The views expressed are those of the author(s) alone. They do not necessarily reflect the views of the American-German Institute.