The Parliamentary View: Protecting Our Societies from Propaganda and Disinformation
Andreas Nick
German Bundestag
Andreas Nick has been a CDU member of the German Bundestag since 2013. He serves on the Committee on Foreign Affairs and is rapporteur for the Council of Europe, the United Nations, issues of global order and cybersecurity, as well as regional rapporteur for Turkey, Hungary, and South America. In addition, he is a substitute member of the Finance Committee and the Digital Agenda Committee. Dr. Nick was elected to the Bundestag following a professional career in banking. His final positions were as head of M&A at Sal. Oppenheim Jr. and Cie. and as professor of corporate finance at the Frankfurt School of Finance and Management. Dr. Nick holds a Master’s Degree and a Doctorate in business administration from WHU Otto Beisheim School of Management in Vallendar, as well as a Master of International Public Policy (MIPP) from the Paul H. Nitze School of Advanced International Studies (SAIS) of the Johns Hopkins University in Washington, DC.
Inger-Luise Heilmann
German Bundestag
Inger-Luise Heilmann is a parliamentary advisor to Dr. Andreas Nick, member of the German Bundestag. Her responsibilities include preparing the deputy’s work in the Foreign Affairs Committee and the Committee on Digitalization, as well as in the Subcommittee on United Nations, International Organizations and Globalization. She holds a Master’s Degree in International Relations with a specialization in Security from Rijksuniversiteit Groningen.
Understanding Disinformation and Digital Propaganda
Today’s interconnected societies have benefited greatly from the Internet. The World Wide Web enables information sharing, communication, and transactions on an unprecedented scale. Some argue that data has replaced oil as the world’s most valuable resource.[1] A whole new data-driven economy has emerged. Ad-based social media platforms and smart technologies such as Artificial Intelligence (AI) have a considerable impact on our societies—on business models, on media companies, and on the privacy of our citizens. From a purely technological viewpoint, digitization is not political as such.[2] Yet it enables new societal practices and thereby becomes political.
Policymakers on both sides of the Atlantic have become increasingly aware of the challenges that accompany the success and spread of social media. Digital propaganda and fake news in particular are regarded as harmful. The United States and the European Union share the experience of targeted disinformation campaigns, the majority of them conducted from outside our countries. According to a recent Eurobarometer poll, 85 percent of Europeans view fake news as a problem in their countries, and nearly as many consider it a threat to democracy.[3] As Christian Democrats in the German Bundestag, we consider the spread of fake news and targeted disinformation campaigns a challenge to the integrity of our liberal, democratic discourse.[4]
The long-term effects of social media on our society also need closer scrutiny. Historian Niall Ferguson has studied the history of social networks and recently compared the World Wide Web to the invention of the printing press more than 500 years ago, drawing a worrying analogy regarding its social consequences.[5] Ferguson notes that individualization and massification, the dissemination of fake news, hate messages, and incitement, as well as the rise of religiously motivated conflicts all increased in the first decades after the printing press was invented, before new rules and standards were established. In light of these dynamics, enhancing the resilience of our societies and complementing such measures with new forms of social media regulation should be the first priority from a parliamentary point of view. Moreover, social media platforms need to take on more responsibility and cooperate with the public sector and civil society in order to actively combat disinformation, election meddling, and digital propaganda.
Enhancing Resilience of Our Societies
Digital propaganda and disinformation campaigns aim at undermining institutions and the fabric of society.[6] The most critical task for our democracies is thus to increase resilience in order to be less vulnerable and better able to confront new challenges. Resilience denotes the ability to resist adversarial campaigns, to adjust flexibly, and to recover swiftly from disruptions. It needs to be rooted in our minds and in our democratic institutions. Enhancing resilience against online disinformation comprises measures by political institutions and governments, the security sector, the education system, civil society, and social media companies.
Groundwork for resilience entails a stronger focus on digital literacy. Students of all ages should be able to take comprehensive media and information literacy courses. They need to understand what algorithms are and how the ad-based business models of social media platforms function. Our society also needs to learn how to distinguish high-quality, trustworthy information from fake news or mere propaganda. Media and information literacy could even be measured as part of the OECD’s PISA rankings.[7]
Resilient societies also require strong democratic institutions. Political institutions, the media, and civil society have to increase their efforts to explain and discuss politics in a credible, transparent, and concise way. Politicians and civil servants need to work toward more effective and credible institutions. Public awareness of digital propaganda and disinformation should be raised in concert with the media. In acute cases of disinformation campaigns, both governments and the media should react in a timely and proportionate way without repeating the rumors themselves, which would only give them more attention. Additionally, security agencies, governments, and the media should follow a clearly defined procedure for responding to acute disinformation campaigns.
A broader understanding of resilience also takes into account technical and organizational resilience. As online campaigns frequently draw on leaks and hacks, both the public and the private sector need to enhance the security of their information systems. Germany, for instance, published a comprehensive cybersecurity strategy in 2016 and established an early-warning system for cyber espionage and attacks. Making societies as resilient as possible also entails secure election systems. In Germany, citizens vote on paper and no voting machines are used; the results are aggregated on computers that are not connected to the Internet. As a result, experts consider hacking of voting technology unlikely: “Voters’ heads are by far the more vulnerable target,” Brookings expert Constanze Stelzenmüller concluded in her testimony before the U.S. Senate Select Committee on Intelligence.[8]
Societal resilience can be further increased by supporting the media in delivering high-quality journalism and by sustaining their financing to make them less vulnerable. Moreover, tools such as source transparency indicators or verified-content labels could be established by media outlets to mark themselves as trustworthy and to empower citizens. Cooperation between the media and fact-checking institutions also contributes to high-quality journalism and to the debunking of fake news. Above all, reporters need continuous training. They should always check the trustworthiness of their sources and be aware of the agenda behind the information.
Addressing Disinformation and Digital Propaganda through Legislative Means
Measures for more resilient societies and democratic institutions need to be accompanied by clear rules for social media companies and for users posting online. When developing social media regulation and drafting legislative proposals in this realm, the main objective should be not to reinvent the wheel, but to draw on existing regulation in related domains.
One option policymakers in Germany and elsewhere should consider more seriously is to apply traditional media law more rigorously in the digital world as well. Social “media” platforms have so far been exempted from traditional media laws. But they must take on more responsibility as a filter for content quality assurance and therefore must prevent the spread of false information, enforced by law if necessary. As a suitable analogy, we should think of Facebook not as a word processing program or a telephone line, but as a medium such as television, radio, or a newspaper. Facebook has so far rejected the label of a media company, claiming that it merely hosts information rather than producing it, because the company does not want to be subject to media regulation. However, since Facebook’s algorithm selects the stories and pieces of information users see, it could well be considered a media company. Editorial rules should therefore apply: its responsibility should at least be comparable to that of newspaper publishers, who are liable if inappropriate content is published in the letters-to-the-editor section and they have not met their duty of examination. In Germany, the discussion on the right to rebuttal, a concept borrowed from German press law, has not yet come to a conclusion at the federal level because press law lies in the purview of the German states. Nevertheless, existing technological possibilities should be used to display rectifications to users who have seen posts containing fake news or disinformation.[9]
Initial concrete initiatives have already been launched to address disinformation and digital propaganda. The European Commission, for instance, convened a Multistakeholder Forum on Disinformation to develop a Code of Practice (CoP) intended to serve as a self-regulatory framework for online platforms and advertisers. It should include, inter alia, more transparency about sponsored content and political ads, detailed information on the algorithms that prioritize the display of content, as well as labels and rules for bots and the fight against fake accounts. It further asks social media platforms to improve “the findability of trustworthy content.”[10] The EU-wide CoP is supposed to produce tangible effects in the months following its publication. If the results are not satisfactory, the Commission plans to take further steps that might include regulatory measures.
With the Network Enforcement Act (NetzDG), passed in 2017, Germany has taken a first important step against hate speech on the Internet. The reasoning behind this law is that international social media companies have to comply with the German legal order if they make their services accessible to German users. Social media companies are thus responsible for what happens on their platforms. However, this can only be a first step in the field of social media regulation. Social media platforms should also provide more transparency about their business models and algorithms in order to help researchers close existing research gaps.[11] Furthermore, social media platforms should disclose more information about the sponsors of political advertising, the amounts spent on political ads, and the targeting parameters used.[12] Possible future legislation should also draw on anti-trust regulation so that large social media platforms cannot use their market power to respond only slowly to the demands of civil society and political institutions.
Establishing a Long-term Understanding about Disruption in Our Societies
New forms of regulation and measures to increase resilience are the two main areas in which national parliaments can act against disinformation and digital propaganda. Quick fixes, however, are not desirable; sustainable multi-stakeholder engagement is required instead. The long-term effects of social networks on our society as a whole need further investigation by both the academic and the political sphere. New regulation can only be developed on the basis of a deeper understanding of the impact of social media platforms. On that basis, legislation on media law and anti-trust regulation will need to be adopted.
Apart from disinformation and digital propaganda, other fields of social media regulation have been widely discussed in the wake of the Cambridge Analytica case. Data protection concerns have been central to the European reaction to the scandal. Data protection is a fundamental right enshrined in the EU Charter of Fundamental Rights (Article 8) and needs proper enforcement.[13] Another issue for consideration could be to subject social media platforms to the secrecy of telecommunications if they provide telecommunications-like services. This could also include applying data retention rules to social networks, implying that personal data may be stored for no longer than 90 days.
A wide array of areas needs legislative clarification—the transatlantic exchange on common challenges and possible solutions therefore remains vital. At the parliamentary level, we could foster cooperation through more institutionalized formats. Taking concerted measures to enhance resilience in our societies would make it harder for adversaries to undermine confidence in democratic institutions or to sow confusion via online campaigns.
[1] “Data is giving rise to a new economy. How is it shaping up?” The Economist, 6 May 2017. Online.
[2] Daniel Jacob and Thorsten Thiel, “Einleitung: Digitalisierung als politisches Phänomen,” in Politische Theorie und Digitalisierung, eds. Daniel Jacob and Thorsten Thiel (Baden-Baden: Nomos, 2017), p. 8.
[3] European Commission, “Final results of the Eurobarometer on fake news and online disinformation,” 12 March 2018. Online.
[4] CDU/CSU-Fraktion im Deutschen Bundestag, “Diskussion statt Diffamierung: Aktionsplan zur Sicherung eines freiheitlich demokratischen Diskurses in sozialen Medien,” CDUCSU.de, 24 January 2017, p. 2. Online.
[5] James Hohmann, “How Zuckerberg’s Facebook is like Gutenberg’s printing press,” The Washington Post, 28 March 2018. Online.
[6] Andrew Weisburd, Clint Watts, and JM Berger, “Trolling For Trump: How Russia Is Trying To Destroy Our Democracy,” War On The Rocks, 6 November 2016. Online.
[7] European Commission, “A multi-dimensional approach to disinformation: Report of the independent High level Group on fake news and online disinformation,” 12 March 2018, p. 27. Online.
[8] Constanze Stelzenmüller, “The impact of Russian interference on Germany’s 2017 elections,” Congressional Testimony, The Brookings Institution, 28 June 2017. Online.
[9] CDU/CSU-Fraktion im Deutschen Bundestag, “Diskussion statt Diffamierung: Aktionsplan zur Sicherung eines freiheitlich demokratischen Diskurses in sozialen Medien,” CDUCSU.de, 24 January 2017, p. 6. Online.
[10] European Commission, “Tackling online disinformation: a European Approach,” COM(2018) 236 final, 26 April 2018, Section 3.1.1. Online.
[11] Alexander Pirang, “Germany’s Half-Baked Approach to Fighting Disinformation,” GPPI Commentary, 12 April 2018. Online.
[12] Ben Scott and Dipayan Ghosh, “Digitale Werbung und politische Propaganda: Wie mit Technologien der digitalen Werbeindustrie Desinformation im Netz verbreitet wird,” Stiftung Neue Verantwortung, March 2018. Online.
[13] European Parliament, Council of the European Union, and European Commission, “Charter of Fundamental Rights of the European Union,” EUR-Lex, 26 October 2012. Online.