The road to hell is paved with good intentions. Will the EU's fight against child sexual abuse lead to a massive surveillance programme?
Moritz Körner has been a Member of the European Parliament for the FDP (the German Free Democratic Party) since 2019. He sits on the Committee on Civil Liberties, Justice and Home Affairs and on the Committee on Budgets. He was a speaker at the “Beyond Budapest: data and privacy in criminal investigations” FIC Agora Symposium, which took place in Brussels on 22 March. inCyber seized this opportunity to interview this advocate of civil liberties on the threats to fundamental rights within the EU.
The EU’s proposal to fight child sexual abuse is controversial because it would impose mass surveillance on chat systems. Does this proposal strike the right balance between the protection of fundamental rights and the needs of law enforcement agencies?
No, absolutely not. This is a full-fledged attack on fundamental rights, one we have never seen before. Of course, everybody is against child sexual abuse and wants to see the perpetrators behind bars. But is this proposal really helping? What it aims to do is to detect new, previously unknown material with AI. We can already find known material with a pretty small error margin.
But with this new detection of unknown material and grooming, there will be something like 5 or 10% false positives, applied to a very large number of people being scanned. That means a huge volume of people and communications to be checked. So law enforcement will be overwhelmed, and the communications of many innocent citizens will end up on the desks of police officers. I think this does not help us fight child sexual abuse, and it is not in line with our fundamental rights.
So how should we fight child sexual abuse?
We should have better cooperation at the EU level. We should not be relying on the Americans; we could have built something, perhaps with Europol. But then we also really have to look at the legal framework. The proposal’s internal-market legal basis, Article 114 TFEU, says that when Member States have different regulations, there will be a problem for the common digital market. That is nonsense, because we have just applied the Digital Services Act. And when you look at what is actually in the proposal, blocking of URLs, which is not possible, and age verification everywhere, in app stores and so on, to block access for kids, you have no anonymity left in the digital world.
Are you afraid of potential abuses?
I don’t see that this is a good way forward. Of course, companies have to do more to protect children from risks such as grooming. They should have complaint mechanisms for children and should report directly. But building a huge Chinese-style surveillance state is not the way we should go. If we have this, it will not really help us fight child sexual abuse, and it will damage our rule of law and fundamental rights.
And it may also be only the first step: once the technology is there to look for pictures and to look for grooming, it will be used for terrorism, then it will be used for serious crimes. And then maybe someone will use it for the serious crime of being in the opposition. That is not the way we should do things in Europe.
The European Digital Wallet project might be adopted without a vote in the Parliament. What is your reaction as an MEP?
The position of the Parliament was adopted in committee by a fairly broad majority. This is actually something that happens regularly. Although I am always in favour of going to a plenary session, I don’t see a big scandal here, because it is a normal procedure.
This digital wallet might have very strong consequences for fundamental rights.
I’m not an expert on the issue, but as I understand it, we sign in to lots of different applications with our Facebook or Google credentials, which are more or less an ID. So do we want a European ID with fundamental rights protections, giving citizens the possibility to identify themselves in the digital sphere with data protection? That is what this discussion is about. Of course, there can be dangers and we have to look into safeguards, but as it stands, I don’t see a big danger.
Some citizens are concerned it might be a first step towards a social credit system.
Then they should be worried about the child sexual abuse proposal, because it really infringes on all our private communications. Every single private communication would be scanned by artificial intelligence. This is problematic. And they also want to find a way to look into encrypted files, or to break encryption in order to look into them. This is really problematic, and it is not being discussed much in some Member States. Of course, fighting child sexual abuse is something people approve of, but they don’t see the big changes behind it.