Policy approaches within Europe

Germany

Germany’s Netzwerkdurchsetzungsgesetz (NetzDG), the Act to Improve Enforcement of the Law in Social Networks, was enacted in November 2017 and came into force in January 2018. It was introduced to the Bundestag by Heiko Maas, the minister of justice, and is largely based on hate speech provisions enshrined in the German Criminal Code. It obliges social networks with more than two million registered users in Germany, such as Facebook, Twitter and YouTube, to remove ‘offensichtlich rechtswidrige Inhalte’ (manifestly unlawful content)10 within 24 hours of receiving a complaint, and within seven days if the content is not ‘manifestly’ unlawful.

The NetzDG defines unlawful content, in subsection 3 of its first section, by reference to the Strafgesetzbuch (StGB), the German Criminal Code: sections 86, 86a, 89a, 91, 100a, 111, 126, 129 to 129b, 130, 131, 140, 166, 184b in conjunction with 184d, 185 to 187, 201a, 241 and 269. As translated by Article 19, these relate to Section 86, ‘dissemination of propaganda material of unconstitutional organisations’; Section 86a, ‘using symbols of unconstitutional organisations’; Section 89a, ‘preparation of a serious violent offence endangering the state’; Section 91, ‘encouraging the commission of a serious violent offence endangering the state’; Section 100a, ‘treasonous forgery’; Section 111, ‘public incitement to crime’; Section 126, ‘breach of the public peace by threatening to commit offences’; Section 129, ‘forming criminal organisations’; Section 129a, ‘forming terrorist organisations’; Section 129b, ‘criminal and terrorist organisations abroad’; Section 130, ‘incitement to hatred’; Section 131, ‘dissemination of depictions of violence’; Section 140, ‘rewarding and approving of offences’; Section 166, ‘defamation of religions, religious and ideological associations’; Section 184b, ‘distribution, acquisition and possession of child pornography’, in conjunction with Section 184d, ‘distribution of pornographic performances by broadcasting, media services or telecommunications services’; Section 185, ‘insult’; Section 186, ‘defamation’; Section 187, ‘intentional defamation’; Section 201a, ‘violation of intimate privacy by taking photographs’; Section 241, ‘threatening the commission of a felony’; and Section 269, ‘forgery of data intended to provide proof’.11

The law shifts primary responsibility for user-generated content to social media platforms. The minister cannot issue a take-down order; instead, content must be removed on a self-regulatory basis by platforms when faced with complaints.12 Cases that cannot be decided within seven days may be referred to a self-regulatory body approved by the Ministry of Justice. Germany requires platforms to establish a clear complaints system for the reporting of unlawful content. The German law has more teeth than the French law, in the form of heavy fines of between €5 million and €50 million. It applies to social media networks with over two million users. However, unlike in France, fines are issued only to platforms and not to their users.

Due to Germany’s highly legalistic culture, its laws are very detailed, with little flexibility as to implementation. As Theil explains, ‘reporting obligations are quite detailed and include provisions that set out reviewer training and oversight requirements’ (2019:46). For this reason, he reports a sharp rise in the hiring of content moderators by Facebook and Twitter as a result of the law, with German speakers accounting for one-sixth of Twitter’s content team in 2018 (2019:49). The NetzDG obliges social media networks that receive more than 100 complaints per year to produce twice-yearly reports on content moderation (for analysis, see Heldt, 2019). Facebook was the first social media platform to be fined under the NetzDG, receiving a fine of €2 million in 2019.13 A new bill to update the NetzDG was proposed in April 2020, recommending increased transparency from social media networks and referencing the updated Audiovisual Media Services Directive.14

France

In 2018, the French National Assembly passed a law to combat the manipulation of information (National Assembly, 2018). The 2018 law is three-pronged. Firstly, it enables citizens, regulatory bodies, and political parties to report misinformation, allowing a judge to issue take-down orders. Implicitly, as in Germany, the law obliges social media platforms to take responsibility for user content published on their pages. Secondly, it demands increased financial transparency from social media platforms on sponsored content and political advertising: political sponsorship and the amount paid for it must be reported. Thirdly, the law grants powers to the Conseil supérieur de l’audiovisuel (CSA) to temporarily suspend licences for television and radio channels which show evidence of disinformation propagated by foreign states. As Craufurd-Smith points out, the French approach has focused on the threat of disinformation to democracy (2019:63).

France’s Manipulation of Information law was adopted in December 2018 following scrutiny and approval by the Conseil Constitutionnel (Assemblée Nationale, 2019).15 The law is underpinned by existing measures, most significantly Article 27 of the 1881 French Press Law, which prohibits ‘ “false news” or “articles fabricated, falsified or falsely attributed to others”, where this is done in bad faith and undermines, or could undermine, public order’ (Craufurd-Smith, 2019:55). It also draws on applicable provisions on genocide denial and crimes against humanity (Article 24) and defamation (Articles 29–35 of the 1881 Press Law), as well as the Electoral Code,16 under which (Article 97) ‘false news, calumnies, or other fraudulent means’ and commercial electoral advertising are prohibited (Dossier in Craufurd-Smith, 2019:55).

In addition to the laws mentioned previously, the 2018 law stipulated changes to the 1977 law on election to the European Parliament,17 the 1986 law on the freedom of communication (Léotard Law),18 the 2004 law on confidence in the digital economy,19 the Electoral Code,20 a decree from the Ministry of Culture on Article L111-7 of the Consumer Code,21 the Education Code,22 the 2018 law on filing candidacy for election,23 and the public order code overseas.24 Reflecting these different legal bases, implementation is conducted by judges for take-down orders, by the Conseil supérieur de l’audiovisuel (CSA) for licence suspension, and by the Autorité des Marchés Financiers (AMF) for the issue of fines. Decisions are subject to judicial review.

The key target is political advertising and political party campaigning. In the three months prior to an election, social media platforms are required to provide citizens with access to ‘fair, clear, and transparent’ information on the purpose of paid promotion of political debate, the identity (natural persons, or corporate name and registered office) of those paying for it, the use of their personal data, and the amount paid (over a given threshold). This is made available to the public in an aggregated public register. Platforms are required to establish a notification mechanism for users to alert them to false information. There are also requirements for platforms regarding transparency on the algorithms which organise content related to ‘a debate of national interest’ and publication of information on their functioning; the promotion of news publishers and agencies providing impartial, accurate information; and the deletion of disinformation accounts. Social media platforms must provide the CSA with annual reports. In particular, under Article 14 of the act, information on content, access to content (how often and when, based on platform recommendations), and referencing by algorithms must be provided. In turn, the CSA publishes regular reports on measures implemented by platforms.25 During election periods, fines of up to €75,000 and a possible prison sentence of one year can be imposed on social media platforms26 and their users if content is not removed within 48 hours.

 