Corrected oral evidence: Freedom of expression online
Tuesday 16 March 2021
Members present: Lord Gilbert of Panteg (The Chair); Baroness Bull; Baroness Buscombe; Viscount Colville of Culross; Baroness Featherstone; Baroness Grender; Lord Griffiths of Burry Port; Lord Lipsey; Lord McInnes of Kilwinning; Baroness Rebuck; Lord Stevenson of Balmacara; Lord Vaizey of Didcot; The Lord Bishop of Worcester.
Evidence Session No. 17 Virtual Proceeding Questions 147 - 152
I: Renate Künast MP, Alliance 90/The Greens.
USE OF THE TRANSCRIPT
This is a corrected transcript of evidence taken in public and webcast on www.parliamentlive.tv.
Renate Künast MP, Alliance 90/The Greens.
Q147 The Chair: I welcome Renate Künast, a German politician who is chair of the Alliance 90/The Greens Bundestag parliamentary group and was the Minister for Consumer Protection, Food and Agriculture from 2001 to 2005. Today, we are discussing the German Network Enforcement Act, known as NetzDG. The Act took effect in Germany in January 2018 and is intended to reduce the availability of illegal content online. It is fair to say that it has been strongly criticised in some quarters on freedom of expression grounds. Ms Künast has very kindly agreed to talk to us about the Act and its implementation today.
Ms Künast, thank you very much indeed for joining us. The session will be broadcast online and a transcript will be taken. We have a variety of questions from members of the committee. Can you start with a brief further introduction to yourself and your general thoughts on the broad issue of freedom of expression online? Then we will come to questions about the Act.
Renate Künast: I will give you an introduction to NetzDG, my thoughts and the law enforcement questions. To introduce myself, by profession I am a social worker and a lawyer. In about 2015, in Germany, we had a lot of debates on freedom of expression. A lot of organised right-wing people said, “In this country, we are not allowed to express our thoughts”. The curious thing was that, after this activity, hate speech started, especially, as I saw it, on Facebook.
In 2015, with a journalist from Der Spiegel, I started to visit people who had posted hateful comments about me. This was the start of my focusing on this whole issue. We went all over Germany, and what did we see? There were two different kinds of people. Some were manifestly right-wing extremists, either organised or individuals, who were, through the filters of an algorithm, focused on extreme right-wing, racist speech, talks and content. The other group was just using social media platforms to express their emotions, pushed by what was happening on the net.
With this, I started to focus on this issue. After some time, I went to the public prosecutor and was astonished by the reaction: “No, this is freedom of speech”. They did not see how this means of expression—hate speech, disinformation and fake quotes—is orchestrated systematically, especially by right-wing groups, including parties and little organisations, and people running with them.
I had to focus on what was happening there. Why? I thought that it was not acceptable to be in a situation where a public prosecutor says, “I will not go to court with this because you are a politician and you have to deal with it”. That is true even if you get comments including extreme sexual violence; for example, “I hope that you are raped by a group of north African men and then you will see”. It is at that level. I thought that this cannot be okay. We have the wrong balance between freedom of speech and, on the other hand, the general personal rights to develop in freedom. In Germany, we have both: “Meinungsfreiheit”, freedom of speech, and in our constitutional law “allgemeines Persönlichkeitsrecht”, which is a general personal right to develop yourself in freedom. Justitia has both in her hands.
In 2017, the Minister for Law Enforcement was fed up with the situation, because the right-wing extremists were more and more organised. He introduced the NetzDG, which you already mentioned. Then there was the next level of debate on freedom of speech and general personal rights. A lot of people said that this would be a problem and would harm freedom of speech.
I want to give you some cornerstones of the NetzDG (Network Enforcement Act). This is not the penal code. The penal code sits beside it; even the civil code sits beside it. It is a special law that forces big social networks such as Facebook, Twitter and YouTube (Google) to remove or block unlawful content, and to fulfil reporting and transparency obligations. What do they have to do? They do not have to search for unlawful content on their own, but there is a notice-and-take-down principle. There is a button for users to report possibly unlawful content and request its removal. Only then do platforms have to become active.
There are two ways for unlawful content to be removed. First, the platforms can take down content because it violates their community rules. We had a lot of cases in the civil courts looking at whether these community rules were okay and whether they had to remove the content. Secondly, we now have a notice-and-take-down mechanism for the removal of unlawful content in accordance with the NetzDG. When users report content, they can select whether they are reporting a violation of the community rules or a violation of the NetzDG.
If content is reported under the NetzDG, the platforms have to examine whether it is in fact illegal content violating the NetzDG. To do this, the platforms built offices, educated themselves and hired lawyers. They have a high level of information about important rulings of the criminal courts or the constitutional court. The NetzDG obliges them to remove “manifestly” unlawful content within 24 hours of receiving the complaint. Unlawful content that is not “manifestly” unlawful must be removed immediately, which generally means within seven days.
I went to one of the offices responsible for content removal and looked at how they operate. I asked the people working in content removal some questions. For example, I showed them hateful content that I had sent to the public prosecutor, and for which I had received replies, and asked them how they would decide. They always came to the same result as the public prosecutor. This is just an example to show that they are highly educated and informed by their lawyers. They have an internal supervision system if someone is not sure. Every half-year, they have to report how many cases they had, whether they had problems and so on.
The NetzDG deals only with illegal content, but the platforms can still remove content under their community rules. What do they have to remove according to the NetzDG? It does not cover every paragraph of the penal code. Among other things, it covers criminal content expressed in words: insults, threats, defamation, symbols of forbidden terrorist organisations and so on.
What have the problems and criticisms been? The critique started immediately. My party, the Green Party, said that freedom of speech was not protected sufficiently. We were fearful of overblocking, as were many others. But, until now, there has been no evidence of systematic overblocking. There are one or two cases where a civil court said that a removal under the community rules was overblocking. In the end, Facebook, Google and the others were mostly right in blocking or removing someone by way of the NetzDG.
Another criticism is that the complaint and reporting mechanisms are not user-friendly, and are too difficult to find and fulfil. Furthermore, there is no formal, transparent and comprehensible put-back procedure for users who want to complain about the removal of their content. There is still a debate in the federal Parliament about changing this so that you can complain and have a put-back procedure.
We have another problem in cases where you need more information about the perpetrator of unlawful content in order to make a civil law claim. I did this with my lawyer, for about 22 posts and tweets, against Facebook and Twitter. In September 2019, I used the paragraph of this law under which, if a court is convinced that content might be unlawful, Facebook or Twitter is allowed to give out contact information for the perpetrators. The problem is that it only allows social media to provide the information and does not oblige them to do so. And if someone who is systematically working against my personal rights is anonymous, I cannot go to the civil court.
In September 2019, the civil court in Berlin turned down my complaint and ruled that I have to deal with all this because I am a politician. I made complaints and managed to obtain the information and start civil court cases against these people. I won some of them. As an aside, we set up an organisation (“HateAid”) to help people go through the criminal and civil court process. If I win money from the court, I always give the money to this organisation to help others. The rest of this case is now at the constitutional court.
Social media is a global way of communicating, and you can operate there anonymously. I would underline that this is important, because you could not act like this in the analogue world. There is still one blind spot that we are discussing now. We do not have rules at a national level against disinformation and fake quotes. We may get a ruling at the EU level, where there are some legal proposals on the way, such as the Digital Services Act.
At the national level, we are discussing another extension of the NetzDG. We think that the platforms must report some of the cases where they removed unlawful content under the NetzDG to the federal police, so that the federal police can do more analysis on right-wing extremism. Since 1990, about 200 people have been killed here by right-wing extremists, so we want to force the federal police to analyse extreme right-wing organisation and mobilisation online. They would then give the cases to the public prosecutors for prosecution. That is the overview.
The Chair: It is a very comprehensive overview. Thank you very much for explaining in detail how it works. We read a great deal about it in the British and international media, but it is great to hear first hand what the intention of the law is and how it is working out. We have some detailed questions, some of them technical. We would welcome your guidance.
Q148 Viscount Colville of Culross: Good afternoon. Thank you very much indeed for coming to our Committee. It is very good to hear your overview. I want to push a bit more on the issue of freedom of speech. You said that in Germany there is a balance between freedom of speech and the general right of a person to develop in freedom. However, as you said, a number of people have criticised the NetzDG law because it does not have a penalty for deleting legal content, which might cause damage to freedom of expression. You said that, whenever overblocking is challenged, there is no proof of overblocking by the tech companies. However, surely the principle of making sure that there is a penalty for the tech companies for overblocking is correct.
You also said that there is a debate going on in the Bundestag at the moment about the put-back procedure. Could you shortly describe for me what that debate is and how that affects freedom of expression?
Renate Künast: There are cases where social media removes your post or tweet, you go to court against someone, and the court says that, in fact, it was not unlawful content. As the user and writer, I could then possibly go to the social media platform and ask that it please put back my post, since a court has found it to be lawful. Or, as a hate poster, I could go to the civil court myself to demand that my content be put back.
The problem here is that court cases take a long time and social media moves much faster. Who is interested in a post I made two years ago while waiting for criminal court decisions? But, for freedom of speech, there must be a simple put-back procedure.
The platforms are always fearful of receiving sanctions and bad reactions if they do too much. There could be fines if they do not analyse correctly or do not fulfil their transparency obligations, with half-year reports and so on, or if they do not have enough qualified personnel to check and remove unlawful content. The Parliament is now debating what a user-friendly, effective and fast put-back procedure, organised by the platforms themselves, should look like.
We think that both the mechanism to report content and the mechanism to demand the put-back of removed content need to be accessible through a button and handled quickly.
Viscount Colville of Culross: When you push the put-back button, who would actually make the decision of whether to put it back? You do not want to spend three years in the criminal court, as you said, by which time everyone has forgotten about your post. How do you get that instant response to your put-back button?
Renate Künast: The big platforms have commissioned offices specialised in the removal of content. The current draft of the put-back procedure would order those centres also to make the decisions in the put-back procedure, but it needs to be done by a different person from the one who ruled to remove the content in the first place. If I am not satisfied with the decision about the put-back, there is also the possibility of calling on an independent conciliation committee. And of course, the legal route is always open. Another possible solution would be independent oversight committees with experts.
Viscount Colville of Culross: Thank you very much indeed. That was very interesting.
Q149 Lord Stevenson of Balmacara: What you are saying is very interesting. My question is about the balance between the rights of the individual and the authority and power of the companies involved. You have described very well the link that has been put in place by the law to illegal activity. There is a variation on that if something is manifestly illegal or only just illegal. The test is whether there is a federal law that says that the action is illegal. Does that not give too much power back to the companies? If all they have to do is test whether, in their view, the speech or posting is illegal and act on it, there is no need, as you have been saying, to think about put-back or anything else. There is no debate and, even if there were, it would take too long. On the question of balance, do you think it is too much in favour of the companies, or is it about right?
Renate Künast: Inside this system, it is more or less okay. For example, when Facebook started to build up a group of staff to deal with these questions, it had a lot of contact with public prosecutors. It discussed this with the federal office of our State Secretary for Law Enforcement, which has oversight of whether the platforms are fulfilling their reporting and transparency duties. They are the ones who decide whether the social media companies pay a fine if they have done something wrong. There is always a debate with them, with companies asking what they have to do and how they can prepare.
Having criticised them a lot in the beginning, I have seen that they are very well informed. I did a lot of public work with the press, media and so on with my cases. I know that they discussed the cases I made public. Sometimes they tell me that they have discussed the latest judgments and whether I won or lost.
I see two problems. One is the situation of the people working there. This is extremely hard work. Content moderation is often outsourced to other countries and the workers have to look at extremely violent content all day, including sexual assault against children.
I asked what kind of psychological support they get, how they are paid and so on. This is important. Social media companies earn money from all this content and should pay for it. It is important to look at the workers' conditions and contracts. Do not let them end up with short-term contracts, bad pay, a lack of support and no breaks from the work. You can imagine that it is difficult. You can almost compare it with the federal police looking at paedophilia and sexual assault videos. We have to care for these people.
The other point is very important. We have a notice-and-take-down principle, which means that users can complain about violations of the NetzDG and ask platforms to check and remove content. But platforms do not need to look for unlawful content themselves. They do it a bit with software, but this is mostly focused on detecting and removing nudity, which is part of their community rules. In Germany, we are discussing whether platforms should look for unlawful content themselves, or whether the police should be more present. In “real” life, we can see police cars, showing their presence, observing crimes and prosecuting them. In the digital environment, we do not have that.
Maybe you know the “broken windows” theory from the United States. Years ago, people said that if you allow windows in one part of a city to be broken and cars to be destroyed, and do not do any community or social work or deal with the people, the whole quarter goes down. No one will go there; people will leave and so on. There is a policeman at a university for higher police education in Brandenburg who has applied this theory to the internet and calls it the “broken web” theory. If the internet is a wave of disinformation, hate speech, fake quotes and so on, and we do not have the NetzDG or a good system for the federal police to look at certain areas, it will be the same as in the city with broken windows: we will have a broken web.
Many studies show that organised right-wing extremists use the internet for mobilisation and organisation. We also have problems on the internet with cyber grooming and mobbing (bullying) in personal relationships. Violence against women has become increasingly digitalised, and the perpetrators use digital tools to exert it, such as installing spy apps on their partners' phones.
We know that hate speech and cyber-harassment affect women especially. In Germany, there was an opinion poll asking women about hate speech, the internet and so on, and 36% of the women said that they use the internet for a lot of things, but do not express their opinion there because they fear the “shitstorm” and the hate speech.
Lord Stevenson of Balmacara: We have had a lot of evidence about that. It is a very good point you make. Your point about having to have regard to the state of the web, and the ability of the internet to have a presence that is respectful for people using it, is a very good one. Thank you for that.
Q150 Lord McInnes of Kilwinning: Thank you very much for your very full answers. You raised the issue of misuse of the web. Clearly, the web is an international, global phenomenon. You have spoken about the safeguards available in Germany through monitoring by the courts and police engagement. We have had evidence that some authoritarian regimes—Belarus, Venezuela, Russia—have pointed to NetzDG as an example of online censorship that they may wish to apply, in states that perhaps do not have those same pillars of jurisprudence and protection for online expression. Do you understand how a system such as NetzDG could be misused by regimes? What can be done to avoid that happening?
Renate Künast: That is a really good question, and maybe no one has an answer. NetzDG is of course for Germany. We have some rules on the EU level—I think I mentioned that—such as the Digital Services Act for illegal content. There will be a democracy action plan, to avoid, for example, the misuse of advertising. In dealing with these problems, we also looked at the Trump election campaign and the Brexit campaign, and Cambridge Analytica. You will know about that better than I do, but I always follow this and try to learn from it. We are trying to get something at EU level.
In our country we have a constitution; you have case law. Our constitution, for me as a Member of Parliament, is an obligation to organise both. In our law, the rule is that you have basic rights, but no basic right is without borders; each touches another basic right, and we have to rule on the part where they touch each other. I am convinced that we can look at whether the removal rules harm freedom of speech.
If I am discussing asylum and refugees, and somebody writes to me, “You should be raped for hours by a big group of north African men”, or uses a lot of bad sexual words about me, what is it? Is it freedom of speech? Is it someone wanting to express opinions? Or is the only intention of this to produce fear in me and push me out of public discourses and silence me?
Sometimes the words used are meant only to discriminate against, devalue and insult someone. This is not freedom of speech. Years ago, I wrote a book, Hass ist keine Meinung (“hate is not an opinion”). Hate is not an opinion or freedom of expression. If someone says, “You are wrong as a Member of the House of Lords” or “You are wrong because you have the wrong opinion”, that is freedom of speech. But if they want to insult, mob, devalue and discriminate against you, that cannot be freedom of expression. As I said, when it comes to harmful content, there are the community rules set by the platforms. The NetzDG deals only with unlawful content. This is why it is also extremely important that we have this put-back process.
In some months, we will have the rule that they have to report extreme cases of unlawful content to the federal police, and then it will go to the public prosecutor. I think always as a lawyer; I love being a lawyer and making laws. You need to balance freedom of speech and the general right to develop yourself. You have to have tools around this that force everyone to deal with it very carefully.
Lord McInnes of Kilwinning: On that point, the legal structure in Germany determines NetzDG’s policy on hate speech. If it is illegal, it is unacceptable. On an international basis, rather than looking at national legal systems, would it be better to look at human rights law and international law, to set safeguards to stop authoritarian regimes being able to use this example to censor opinion?
Renate Künast: If I look to, for example, Myanmar or Russia, I see that it is a big problem. There is a “but”. I cannot deal with the question of what others are doing on this; I can only say that we in Germany cannot accept people not being able to use this digital world because they fear hate speech.
We need safeguard measures on this; for example, the transparency system and the obligation to produce a half-year report on what the cases are. We also want to force platforms by law to open up their data to researchers and universities. These are different measures by which we can have an overview and the possibility of figuring out whether things are going wrong. We have the civil and penal courts, the put-back mechanism, research and transparency reporting.
The Chair: Thank you very much for everything you have said about NetzDG, how it works and your perspective on it. While we have you, we would like to explore some other aspects of digital regulation and draw on your expertise, moving beyond NetzDG itself.
Q151 Lord Griffiths of Burry Port: Thank you for helping us understand the inner workings of the provision in Germany. My experience of these key debates about the culture we are currently living in is that they produce impeccable arguments for privacy and set against them impeccable arguments for freedom of expression. That is the problem. The context in which that fine balance has to be worked out differs from country to country. That perhaps is where the problem lies.
The Chair has already mentioned that NetzDG has been well explained by you. We notice how it serves as a model in a number of other places around the world, but has been adapted to suit the circumstances that prevail in those countries. We are on the point of bringing in a serious effort to produce a response to the need to find that balance we have all talked about: the online harms process that will produce a Bill, further conversations and so on. We still have this opportunity to look around us. Of course, we will look to NetzDG as offering insights and wisdom.
Can you think of other places we might look to? You have had experience of operating within the system you found yourself in. You might have a bit of wisdom about some of the faults in it. What about Australia, for example? Australia seems to have found a slightly different way of doing it, where the regulator looks at particular instances, rather than systemic things. We would like a little advice from Germany. In Britain, we have not cut ourselves off from Europe to the point where we do not want to listen to advice from people who know, because they have been there, as we move through this tendentious material.
Renate Künast: To give you an idea of whom you can look at, I say that you should learn from the best. At EU level, Frau Jourová, the Law Enforcement Commissioner, is working on the legal proposal for the Digital Services Act and the democracy action plan, which include the different things you might look at. Ms Jourová's director-general is a woman called Renate Nikolay, and I can send you a link to her. She has the best understanding, better than mine, of the different approaches to this in other countries.
In 2017, the Government implemented the structure of the NetzDG, so I am focused on how we can improve it. Renate Nikolay is the best at comparing what there is in other European countries and internationally. If I were going to ask someone, I would ask her.
The EU is also discussing what kind of advertising is allowed at election time and in what way. For example, there need to be transparency rules showing who has financed an advertisement. If you see an advert, you can click on it and see who is financing it. Digital campaigns are very different from the election campaigns and opinion polls of former times. Then, you saw a big screen or a poster with writing: “Green Party”, or “Christian Democratic Party”. Online, it sometimes remains unclear who is doing the advertising, who is financing it, and whether everyone can see the advert or you were especially targeted to see it. Ms Nikolay is dealing with all these points, such as what the rules should be in a network enforcement Act. From a broader perspective, she knows how to analyse internationally.
You mentioned that it differs from country to country. Yes, of course. We have rules in our penal code that are different from countries such as yours. The European Union’s Digital Services Act makes some basic rulings but does not decide the exact formulation of a penal code. Why? Because we have our history. For example, there is the German history of Hitler, so we do not allow someone using the Hakenkreuz to express a political opinion. We say that this always has to be sanctioned. Other countries do not have those rules. It is a question of the method. Everyone has their own sanctions and paragraphs concerning political opinions or things you say about other persons.
I mentioned that we do not have police patrolling the internet. It is different from the analogue world. I look at other countries that are trying to do this. If you have only a handful of people looking for misuse, there are billions of people on the internet and social networks, so this is more than David against Goliath. It is disproportionate.
If someone insults me on the street in Germany, it is up to me to decide whether to go to the police and the court, or whether to say, “Forget it. I do not want to. I am not even listening to him”. It makes sense that the same should apply in the digital realm. I decide whether to take the criminal or civil law route. I can block the perpetrator so that I cannot see him any more, or I can say that I want to go to court. This starts with pushing the complaint button under the NetzDG. It makes sense for me to decide. I might have to deal with the court. On the other hand, it is not only a question of what I decide. If someone goes for big shitstorms and political hate speech, whoever it is, and the victims push the button, we are more than three or 20 policemen.
The Chair: Thank you for that and the reference you gave us for further information. We are at the stage now where the UK Government are introducing online harms legislation shortly. Everything you have said has been very helpful, and I am sure the further references you have given us will be helpful too.
Q152 Baroness Buscombe: This has been truly fascinating and very helpful. You have already touched on my question in some ways. I want to ask you more about the EU’s proposed Digital Services Act. We know it is probably not going to become law until 2022. How does it fit with NetzDG? I want to focus on the point you made, quite rightly, that we do not all think the same. We all have different cultural mores. There is an incredibly fine balance between what some people might think is acceptable and others absolutely not, in terms of freedom of expression.
As a committee, we are a small group searching for the answers on this, as are others around the world. That is why it is so helpful to have your input. Clearly, the solutions are not all there. Does something such as the Digital Services Act help, having that uniform framework across the EU that we could tap into in adopting a similar approach? I would love to know your thoughts on where this could take us. Are you comfortable with that proposed Act? There is a Digital Markets Act due to come into law also in 2022. A few more thoughts on that would be really helpful.
Renate Künast: You mentioned a lot of legal proposals that are on the way in the European Union. For example, the old e‑commerce rules have now changed into the digital market rules. There are basic rules as to how the market and contracts work. Then we have the Digital Services Act, mainly dealing with illegal content. The point is that penal codes are at member-state level, not EU level. It might be interesting for you to connect with this.
Our green groups and other parliamentarians at an EU level have some ideas for changes to the Digital Services Act. For example, there is a debate about whether microtargeting is allowed going forward. Why? Because there is substantial misuse. The platforms try to get users to give their data and they have a business model based on that. This has been misused, in that they would sometimes push opinions in certain regions. If you go down a street where there is advertising, or you buy a newspaper, there is no microtargeting.
There is a big debate on whether microtargeting is allowed and what advertising has to show, especially advertising that is combined with opinion polls. What is the English word for the Brexit decision?
Baroness Buscombe: It was a referendum.
Renate Künast: For advertising during referendums or elections, you have to have special rules; for example, to show who is financing it. The platforms are global enterprises with global business models. Any of us doing politics, whether in the EU, a member state or Great Britain, cannot just say, “They are like a global Government, going around the world with their business models and kicking off every idea about rules in a region or a state”. How should we deal with it when they think they can allow themselves everything? How can we keep our communities and societies together if we can see beheadings everywhere, naked people, misuse and everything else?
Our law has to grow with new platforms or new ways of digitalisation. It is the same as globalisation and products. Consumer rights have to grow with globalisation, based on where and in what condition things are produced. Here, we need a basic framework. I am happy, as are a lot of people here, that the EU is going this way, so that we will have a legal framework in a bigger region.
Baroness Buscombe: Yes, you have a framework, but then it allows the flexibility to move. This year, a key issue could be microtargeting. In three years’ time, it could be something completely different that none of us has thought about yet. I speak to you as one lawyer to another. We know that the law is usually too slow and behind the curve. We need flexible codes that complement and work with it. If we keep sharing our ideas vis-à-vis those codes, that perhaps makes sense.
Renate Künast: Yes. For example, in this basic code, we might say that we can compel them. This was sometimes a problem in the past, with platforms saying that, because their headquarters are in the United States, they should not have to deal with law enforcement in Germany. We say that, if you deliver content in a country, you have to deal with the penal code in that country.
The next level is where they pay their taxes. This is a special EU problem. Member states have different taxes and levels of data protection, so companies try to go to the country that is best for them. The minimum is that our civil code and criminal code are complied with. We will change our society if we are too fearful to express our opinion. As the digital environment gets more and more important, we have to organise an intervention.
The Chair: We have to draw the session to a close. Renate Künast, thank you very much indeed for joining us, for your expertise and for explaining the history of the law and its implementation to us, as well as your wider thoughts on digital regulation. It has been very useful to the committee. Do send us any further information you have, and, likewise, we will keep in touch with you. We have very much appreciated your time today.
Renate Künast: I have sent you the contact details for Renate Nikolay, as I mentioned, for further information and an overview.
The Chair: Thank you very much indeed for your evidence. It is much appreciated.