Serious Privacy

Tackling the Tough Topics: CSAM (Dr. Teresa Quintel and Alexander Hanff)

August 03, 2023 Paul Breitbarth and Dr. K Royal, Dr. Teresa Quintel, Alexander Hanff Season 4 Episode 28

On this week of Serious Privacy, Paul Breitbarth of Catawiki and Dr. K Royal tackle a topic that may trigger strong reactions: Child Sexual Abuse Material, or CSAM. Our guests today have worked a lot on this topic. Dr. Teresa Quintel is attached to Maastricht University and mainly works on issues related to data protection in the law enforcement sector. Alexander Hanff is a long-time privacy activist. Both were participants in a panel discussion on the same topic during this year’s CPDP Conference in Brussels, and during today’s episode, you will learn why.

Links: 

●      https://ec.europa.eu/commission/presscorner/detail/en/IP_22_2976 

●      https://techcrunch.com/2022/05/11/eu-csam-detection-plan/?guccounter=1

●      https://9to5mac.com/guides/csam/


If you have comments or questions, find us on LinkedIn and IG @seriousprivacy @podcastprivacy @euroPaulB @heartofprivacy and email podcast@seriousprivacy.eu. Rate and Review us!

Proudly sponsored by TrustArc. Learn more about NymityAI at https://trustarc.com/nymityai-beta/

#heartofprivacy #europaulb #seriousprivacy #privacy #dataprotection #cybersecuritylaw #CPO #DPO #CISO

S04E28 - CSAM

Please note this is largely an auto-transcription. For accuracy, listen to the audio. 

[00:00:00] Paul Breitbarth: Hi everyone. Let me start this introduction with a trigger warning. In today's episode, we'll discuss topics related to online sexual abuse of children. We will mainly address the topic from a data protection and a fundamental rights perspective, but if this is not a topic you are comfortable with, please stop listening now.

So there we go. On May 11th, 2022, the European Commission put forward a legislative proposal to prevent and combat child sexual abuse online, also known as CSAM. According to the Commission, with 85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone, and many more going unreported, child sexual abuse is pervasive. And although the goal of the legislation is of course laudable, a fierce debate has sparked around this proposal, especially because of the strict detection obligations that would be imposed on online platforms and electronic communications providers.

And we will talk about that today. Our guests have worked a lot on this topic. Dr. Teresa Quintel is attached to Maastricht University and mainly works on issues related to data protection in the law enforcement sector. And Alexander Hanff is a longtime privacy activist that you've also heard on the podcast before, and both were participants in a panel discussion on the same topic during this year's CPDP conference in Brussels.

And during today's episode, you will also learn why. My name is Paul Breitbarth.

[00:01:38] k: And I am K Royal and welcome to Serious Privacy. So I was trying to look up my, comedian that I follow, and I can't think of her name, but she does an episode on why children of the eighties are traumatized nowadays. And one of the things she said is, 'cause we didn't have trigger warnings. We would be in conversation with someone and then bam, a topic would come up that would slap us sideways.

So do we have PTSD? Yes, we do. So maybe we really should have trigger warnings. So I appreciate Paul putting that out there. Also, unexpected question time. So thank you both for coming in. Are you ready for the unexpected question?

[00:02:14] Teresa Quintel: Are you supposed to be ready for that?

[00:02:16] k: I just like to check.

[00:02:19] Alex: I'm always ready.

[00:02:20] k: Always ready. Okay. If you had an exotic pet, one of the non-traditional pets that you would have, what would you want? So if it wasn't illegal, what animal would you kidnap and try to make a pet out of? Theresa,

[00:02:35] Teresa Quintel: I would definitely get a giraffe.

[00:02:41] Paul Breitbarth: You must just be back from Africa. 

[00:02:43] k: she didn't even hesitate.

[00:02:45] Teresa Quintel: No, they're just so great. Uh, I was recently in Kenya for work. We had one afternoon off, uh, and went to a nearby national park, and I only wanted to see the giraffes. There were lions and stuff like that, but they are so beautiful, and when they walk it's like they're flying. It's just very majestic.

So it's definitely a giraffe.

[00:03:08] k: That is awesome. Alex?

[00:03:10] Alex: Oh, there's so many, so many. I'm only allowed one.

[00:03:14] k: One, or at least, you know, start with one, and if not that one, maybe these others.

[00:03:19] Alex: I mean, there's the stereotypical dolphin. But just 'cause they're so incredibly beautiful and intelligent. I mean, how can you not look at a dolphin and feel emotionally moved by them? And we swam with dolphins when we had our wedding in Mauritius. So it was a particularly special moment, in the Indian Ocean.

But also, I mean, things like arctic foxes I think are incredibly cute. Geckos, I'm a big fan of lizards, so any type of lizard. Snakes. I used to keep snakes when I was younger. Yeah. So I genuinely love animals. That's why I live in the middle of the countryside. So, you know, pretty much anything that comes across my way is gonna get a nice smile and a stroke if it's furry.

Right. So,

[00:03:56] k: Wow. Wow. Paul

[00:03:59] Paul Breitbarth: Well, the first thing that springs to mind is a hippogriff, but that's obviously also not a real animal.

[00:04:04] Alex: You've just busted a few hearts there by saying that.

[00:04:07] Paul Breitbarth: But I think it would be really, really nice to have an animal that you need to be polite to, because that also teaches you something, but then also one that flies you across the country and across the world. I think it's probably a much more comfortable way of traveling if you can travel on board your own animal than being stuck in an airplane.

So that's probably one. If not, then maybe a deer. There are some deer next to my mother's house, in the meadow and in the forest, and they come out at dusk and dawn. They're always very nice to watch from a distance.

[00:04:41] k: That is cool. I just saw a video of a mom deer who stopped traffic in the middle of the road, and when someone stopped and got out, led her back to her baby, who was caught in like a soccer net. And so the gentleman got the little baby free, and as they ran off, the little baby came back to him, like to say thank you, and then ran off again.

[00:05:03] Alex: We have a lot of deer around here.

[00:05:04] k: It was cool, but the mom came and asked for help. Right. That's cool. For me, I have to say, I wanted to think of something really, really exotic, and I'm just gonna go back to red pandas. I would love to have a red panda, although a capybara sounds really cool too.


[00:05:24] k: I just, I know they're awkward looking animals, but they're just cool. They're cool. Okay. Let's get on with the all-important topic of the episode: CSAM, or CSEM if it doesn't quite meet the definition of CSAM, and we'll learn all about that. Paul, I typically give you the honor of the first question. Do you have one teed up?

[00:05:45] Paul Breitbarth: Well, I think the basic question, and I think I'll address that to Teresa first, is: what is the discussion all about? Why are we talking about CSAM today?

[00:05:55] k: Or why are we not talking about CSAM like we should?

[00:05:58] Teresa Quintel: Well, first of all, I think I would like to make a comment about the trigger warning because I actually think that it's extremely important to talk about this topic and also that people listen to it because it's a very uncomfortable topic for many.

[00:06:14] Paul Breitbarth: Mm-hmm.

[00:06:14] k: Agreed. Agreed. And I actually told Paul, it's funny, I was of the opinion, it's a very uncomfortable topic, but if you're really uncomfortable, do something to change it.

[00:06:24] Teresa Quintel: Exactly, and I think by talking about this, and maybe that's actually a positive point about this proposal as well, people are actually discussing this now and also discussing solutions, maybe different ones than those proposed in this regulation. But I think it's a good thing that we are speaking about this because, in many instances during the past years, sexual abuse of children has happened, not only online, and no one really spoke about it.

It was never really worked through. If you think about the scandal in the church, for example, it was a big scandal and it was not resolved, right? So I think it's good that this is now being mentioned and being discussed. But coming back to your question, I think there are many different issues that need to be taken into account.

And of course, we are speaking about privacy and data protection here, but we need to also consider the more structural problems. If we look at the proposal, and maybe we will speak about this later during the podcast, but if we speak about data protection and privacy, then you already mentioned the detection orders.

Those were probably the topic that was discussed most, because it would involve the general monitoring of private communication. It would probably put an end to end-to-end encrypted communication, which of course then also leads to other risks for online users. But I think one of the things that is discussed insufficiently is that we also need to take into account the privacy rights of those children, of those survivors and victims of sexual abuse.

They, of course, will also be part of this proposal, and their images and their content will be reviewed by several instances at different levels: by service providers, by different authorities, by law enforcement, by the so-called EU Centre, maybe by Europol officers. And that, I think, very often is not taken into consideration, that we come into a situation where those that were victims and survivors of abuse and exploitation have to relive the situation again. So those are only a few concerns that I would see. But of course there are many more in this proposal. For example, one could argue that there are a lot of procedural safeguards, although they might not be very precise, but there's a lack of substantive safeguards in this proposal and a lack of limitations that it sets. And so there's of course also a risk that what is now being proposed is expanded at a later stage.

[00:09:39] k: And do you mind going into details as to the proposal, just so our audience is up to date?

[00:09:44] Teresa Quintel: So Paul already introduced it a little bit. Well, I don't know how much I should go back into history and set it into context, but to put it in a nutshell: the proposal follows a so-called interim regulation, which was adopted because the so-called European Electronic Communications Code made certain service providers subject to the ePrivacy Directive and its rules and obligations.

So the obligation of confidentiality and the prohibition of, how do you say this, can you please help me?

[00:10:19] Alex: Surveillance and interception.

[00:10:22] Teresa Quintel: It like this.

[00:10:22] k: Okay.

[00:10:23] Teresa Quintel: And so those service providers were previously scanning on their services for child sexual abuse content, or CSAM. And with the application of the ePrivacy rules, they would not be able to do this anymore. Before, they did it based on the legitimate interest legal basis, but,

[00:10:45] k: Oh, you're talking about actively monitoring? Was that what you were looking for? That the providers are actively monitoring for CSAM?

[00:10:51] Teresa Quintel: Exactly. And now, with the application of the ePrivacy rules, they would not be allowed to do this any longer. So there was this so-called interim regulation that would have a derogation in place for those service providers, to enable them to use the legitimate interest legal basis under the GDPR to continue this scanning on their services.

But this interim regulation will no longer apply as of August next year, and that's why the European Commission wanted to put in place a long-term solution.

And they proposed this regulation that we are speaking about today.

[00:11:34] Paul Breitbarth: And is there a big difference between the long-term solution and what the companies have already been doing for the last couple of years?

[00:11:43] Teresa Quintel: Well, I would argue that there is, because it's an obligation to put in place these measures. So for example, the risk assessment and the different orders. And the mainly discussed orders are the detection orders, which would oblige those service providers to scan the communication. And that also includes end-to-end encrypted communication, or rather the weakening of end-to-end encrypted communication, because you cannot really scan end-to-end encrypted communication.

[00:12:17] Paul Breitbarth: Is it really different from what's happening today?

[00:12:20] Teresa Quintel: Yes, it's different, because it includes all kinds of communications, and it's an obligation. It's supposed to be the legal basis, referring actually to the ePrivacy Directive, because under the ePrivacy Directive you need national measures that would allow for such kinds of measures, although probably that would also not really be allowed.

And so the proposed CSAM regulation argues that this would be such a kind of measure to allow for such scanning, and so

[00:12:56] Paul Breitbarth: So the scanning has happened, but it becomes much broader, at a much larger scale. And it becomes

[00:13:04] k: legitimized.

Alex, your thoughts on it? Not that I anticipate that you have many thoughts on it, especially around the ePrivacy Directive.

[00:13:12] Alex: I mean, first of all, I'm gonna go back to Teresa's comment at the beginning in relation to the trigger warning. I agree. I think we do need to talk about these things even though they're difficult. In my particular case, you know, my school was first investigated in 1992, with over a hundred complaints, and the Crown Prosecution Service swept it under the rug because it was too embarrassing for the British government. It was a very taboo issue, right?

So it was only in 2015, so you're talking what, 23 years later, after I came forward and gave my story to the press in 2012, that we saw convictions for three abusers of 27 boys at my school.

You know, so when you look at it from that perspective, it's really important that we talk about these things, because when we don't, those harms continue to manifest over an extended period of time. And we're finding that, right? So it's very, very important. These issues have been a concern of mine, and along with Patrick Breyer at the European Parliament, for a couple of years I gave a number of speeches at the European Parliament and elsewhere on these issues.

And politically it was a big problem, because no politician wants to be seen not to be defending the rights of children, right? So when it comes to this type of legislation, it's a very easy win for the politicians who push this type of legislation forward. What I find particularly interesting is, in relation to the European Electronic Communications Code, I was in a meeting in Brussels back in 2017, when the European Commission introduced the proposal for the ePrivacy Regulation, which, as you know, extensively argued that messaging services, Twitter, Facebook, et cetera, do not fall under the jurisdiction of the European Electronic Communications Code as over-the-top services, 'cause messaging is incidental as opposed to a core part of what they do. Yet they then still push through a piece of legislation which is based on the contrary definition of this, right?

So one has to wonder what the true motivation is, first and foremost, in this situation where they're, you know, arguing one point in one room, and then they're arguing for surveillance in another room for something which shouldn't actually even be in scope according to their interpretation. But yes, there's a lot more in this. The mandatory orders obviously turn into blanket surveillance, which we'll probably discuss at length given the relevance of jurisprudence here in the EU on those issues.

Then there's the amount of data and the scope, you know, the broad use of this across anything which is within scope of the ePrivacy Regulation. So all over-the-top services, all communications networks, digital communications. Even your telephone nowadays is digital, right? Your cell phone doesn't even use analog anymore, so that would still fall under the ePrivacy Regulation and would also fall under this new proposal. And then of course there's the range of data which is being targeted there. So not just images, but we're also talking about potential grooming conversations, phone calls, audio messages, et cetera, et cetera. Messages that you have with people in video games.

Right.

[00:16:15] k: Right. I was gonna say, to put it in perspective, we've had several video game companies get in trouble because of their chat capabilities. So, you know, a lot of grooming, a lot of solicitation, a lot of blackmail, non-consensual pornography going on across chats and video games. Mastodon just hit the news for being linked to quite a bit of CSAM as well. Not their fault.

I mean, it's not their fault, but basically just due to the structure of the services with the Fediverse, actually. And so Discord also just hit the news. Apple's trying to take measures. So just to put it in perspective, this is a lot of very common technologies that people use. So if you think we're talking something, let's go back to exotic, something exotic and not common that the average person doesn't use.

No, these are the most common tools that people would use.

[00:17:04] Paul Breitbarth: That would be the biggest problem with this legislation, that it is continuous surveillance of our most common tools, on an ongoing basis.

[00:17:14] Alex: Yeah.

[00:17:14] k: Yeah.

Threads has a new one already. Threads is being built on the Fediverse as well, so unlike most of the monitoring that Facebook would be able to do, or could do, technologically they're gonna have a hard time controlling it on Threads. So it's not an easy solution, and it's not an easy problem.

[00:17:29] Teresa Quintel: I think.

[00:17:30] Alex: Also, the scope of the data. Just to add this quickly: the scope of the data is massive. Under the voluntary derogation, so the derogation from the ePrivacy Directive as it currently stands, Article 15, under the current derogation it's a voluntary process, right? So we're seeing tens to hundreds of millions of communications being intercepted and scanned.

Under the new proposal, the mandatory proposal, we're talking hundreds of billions of communications every single day.

[00:17:55] k: And people may be thinking, that's fine, these types of communications need to be monitored, they need to be rooted out, they need to be stopped. But do you know how many more billions have to be monitored to catch the ones that are?

[00:18:12] Alex: Yep. And we've always said that we won't install cameras in people's homes because it's going too far against fundamental liberties and freedoms, right? So this has always been the benchmark: no, we can only do this in public spaces. What they're doing here is effectively putting a camera and a microphone in everybody's home.

'Cause it's the things that we do in our homes, on our communications devices, in our social spheres, in our social graphs and our social circles. So it's essentially putting the camera and microphone in the home, albeit indirectly.

[00:18:42] Teresa Quintel: I think, well, another very problematic aspect of this proposal is that it's not only about known CSAM, but it's also supposed to detect new or unknown CSAM. And of course, if we speak about accuracy rates of existing technologies there, they are much lower. And if we look at grooming or solicitation of children, we need to also take into consideration that, in order to figure out whether it is actually grooming, it's a behavior, right?

We need to put a certain text into context and analyze it. So we need someone to look at this. And detection technologies nowadays are mainly trained on the English language. So what are we doing if someone suddenly uses another language, or, for example, audio? Nowadays a lot of people use this audio messaging function in WhatsApp or any kind of messaging service.

Those are not included. So anyone who would like to groom a child could simply use this kind of function. So we also need to speak about effectiveness here, right?

[00:19:59] Alex: Yeah,

[00:20:00] k: Hmm. Yeah, and I think there was one of them, hold on. I think there was one that just said that they were going to, hold on. I was just looking. I, I'll come back with it. Go ahead

[00:20:11] Alex: So the other big elephant in the room here as well, and I don't know how long we've got for this podcast, 'cause we could literally talk about it all night: AI, synthetic media. How are we gonna cope?

[00:20:20] k: exactly where

[00:20:20] Alex: How the hell.

[00:20:22] k: has just blocked it.

[00:20:23] Alex: How the hell are we gonna cope with the production of synthetic media which is based around CSAM? And we know it's happening already.

There have been reports of this happening already. We're gonna see the internet flooded with these AI-generated images, which is gonna make it, effectively, trying to find a needle in a stack of needles.

[00:20:43] k: It is, and that's exactly what I was looking for, Alex: Discord updated its policy saying that they now explicitly prohibit AI-generated photorealistic CSAM. This came out July 12th. So that's exactly where I was going. What I was looking for is, you have to contend with the incredibly complicated additions of AI.

[00:21:04] Paul Breitbarth: How is, how is Discord going to enforce that policy? Because,

[00:21:08] Alex: Yeah.

[00:21:09] Paul Breitbarth: Even OpenAI themselves have confessed that they are not able to recognize AI-created content. So how is Discord gonna do that?

It's nice to have it in your policy.

[00:21:20] Alex: Yep. OpenAI just shut down their division specifically for detecting it, because they can't do it. Right.

So,

[00:21:27] k: Yeah. And so it's updating its rules. It's launched a family center tool. I'll give you the news to this as well. They've banned the teen dating servers; they used to support them, and now they've banned them. But this also followed an NBC News investigation that found that Discord servers advertised teen dating servers that solicited nude images from minors. So how are they gonna do it? I don't know.

[00:21:56] Alex: We're gonna see AI.

[00:21:58] k: That's where the really intrusive part comes in.

[00:22:00] Alex: We're gonna see AI not just create massive amounts of this media, we're also gonna see it weaponize this media, right? There is no way that AI will not be used to create synthetic media for the purpose of blackmail, for fraud, and for other criminal activities, right?

Particularly in the political and celebrity spaces. That we're gonna see this happen very quickly and on a massive scale is my prediction.

[00:22:23] k: Yeah.

[00:22:24] Paul Breitbarth: So, Alex, can you talk a bit about the risk to the end of end-to-end encryption here?

[00:22:30] Alex: I mean, the detection orders make it very clear that, you know, service providers must provide means, when they're under a detection order, to be able to meet the requirements of the proposal when it becomes law. So they will be in a situation where, if they have end-to-end encryption, they'll either need to leave the European market, as Apple are threatening to do in the UK under a similar piece of legislation, or they'll need to introduce backdoors or weaknesses within the end-to-end encryption to allow the scanning of communications to happen.

Scanning prior to encryption has effectively been ruled out. Apple attempted to do that a couple of years ago, as you know, and were hit very harshly with some very negative coverage as a result of that, and pretty much the politicians in Brussels have decided that that isn't a position they're willing to support. So we're not gonna see a situation where they're gonna be able to scan this media prior to encryption. So the only way they'll be able to do it is to actually weaken the encryption or introduce some sort of backdoor. Either way is not a good situation, because as we know, and as we've warned for decades now, when we start to do that, we're opening up weaknesses not just for law enforcement, but for anybody else who might seek to exploit them.

[00:23:37] k: Yeah,

[00:23:37] Alex: Right.

[00:23:38] k: It's gonna have a deleterious effect on legitimate uses of encryption.

[00:24:00] Alex: And then this is also the thin end of the wedge, right? 'Cause if these service providers now start to implement these changes because the EU have introduced this law, what about when Iraq or Turkey or China or some other country that may not have the same, you know, the same level of democracy, or the same intents and purposes and laws that we have within the EU with regards to fundamental human rights?

What happens then? Are we gonna then condemn these same companies for obeying those laws? Yeah.

I think we don't even have to go outside of the EU, right? I mean, we need to look at certain countries in the EU in terms of LGBT rights and what happens to children who are gay or lesbian. And there we don't have to go very far, I would say. Right. Exactly.

[00:24:31] Paul Breitbarth: Not even that. Just also our European security services, whether it is the Dutch or the French or some of the others, they also have very intrusive and active data collection programs, which will become far easier to use if there is a vector in the technology.

[00:24:47] Teresa Quintel: Yeah, I mean, we wouldn't have all this case law about data retention if there wasn't always a kind of push from the member states to actually go back to a little bit more data retention, right? So,

[00:24:59] Paul Breitbarth: So Teresa, as the scholar, do you think that, should the CSAM proposal be adopted by the member states and by the European Parliament, it will stand the test of the Court of Justice of the European Union, or the court in Strasbourg for that matter?

[00:25:14] Teresa Quintel: Well, actually, I think this is a very straightforward answer, and the answer is no. And this was actually recently confirmed by the Council Legal Service itself, which argued that even the essence of the right would be likely to be touched upon by this proposal. So I think we don't even have to go very much further into looking at proportionality and necessity here,

if this is already being argued by that service, and looking at the case law of the CJEU and whether or not the proposal, or mainly the proposed detection orders, would stand up to the court's case law.

[00:25:57] Alex: It may not even get to that point. There was another case in Germany last week, where a survivor has filed a lawsuit against service providers for using this technology under the derogation. So we could see that we have something in front of the European Court sooner rather than later.

[00:26:17] k: Would you mind making sure you send me the link to that?

[00:26:20] Alex: Yes, I got an update from Breyer. Yeah, yeah.

[00:26:22] Teresa Quintel: But I think even if it wouldn't stand the court's scrutiny, still, if it would be adopted and these measures would be put in place, it would take years to have a case, right? And you would have

[00:26:34] Paul Breitbarth: Yeah,

[00:26:35] Teresa Quintel: doing those.

[00:26:36] k: Yeah.

[00:26:37] Paul Breitbarth: Data retention was adopted in 2005 and it's still happening, so.

[00:26:40] k: Well, and I think the one thing we haven't touched on is all the other impacts that it's gonna have. We talked about the impact on legitimate, you know, valid encrypted communications. What about the people that have to be the ones to review all this content?

[00:26:53] Teresa Quintel: That's another very, very valid question, because what we see now, if we look at reviews, is that they are being outsourced mainly to the Global South, and we are basically destroying people over there because we want to protect ourselves from these kinds of images.

[00:27:13] k: And we don't mean just making 'em feel bad and making 'em sick daily. We're talking about nightmares and long-term trauma. We've seen studies coming out from some of the social media companies, you know, the people that review the banned images and things, a lot more than just CSAM. But still, they're starting to put time periods on 'em.

They can only do it for three months or something like that, but even then, they're traumatized for life by the things that they see. I mean, I know they're not traumatized more than the victims, but this is just creating a whole new set of complications in trying to control it.

[00:27:48] Alex: I wouldn't even agree that they're not traumatized as much as the victims. You know, these people are put in horrendous situations. I think it's disgusting that we're outsourcing this to, you know, African countries purely because it's cheap. Right? That's the reason why they're doing it, 'cause it's much cheaper. It's not because of anything else. For them to hire the tens of thousands of people they have doing this would cost them a magnitude more money, which they just don't wanna do. So this is a cost-cutting exercise. The PTSD that these people suffer is just extraordinary.

I mean, it's one thing, you know, being aware of a situation. It's another thing having to watch multiple individuals being abused in these environments over and over again, and not being able to do anything about it. Can you imagine the feelings of guilt that these people have, on top of everything else, having to sit and watch that?

I find that absolutely deplorable. I do not think that's okay at all. If we're gonna introduce laws which force this type of monitoring to exist, then we have to have the balls to do that ourselves. I'm sorry, but we do. We should.

[00:28:57] Paul Breitbarth: One of the people on the panel that you spoke at at CPDP was actually the chair of the Dutch hotline to report child pornography online. I think she shared similar stories from her staff. Right?

[00:29:11] Alex: Yeah.

[00:29:11] Teresa Quintel: Yes, yes, she did. And I think she's really one of the greatest speakers and also very knowledgeable on this topic. So I can only recommend watching this video and what she has to say about the topic as well.

[00:29:24] Paul Breitbarth: We talked about why this proposal is not good, but we also all recognize that this is a problem that we need to discuss, and that we need to solve. And I mean, we probably cannot solve everything all at once, all the problems regarding child sexual abuse, because it is a problem of all times.

So I think it would be an illusion to think that we can solve it once and for all in the next couple of months. But what would be the right approach, in your views, to counter this?

[00:29:55] Teresa Quintel: Well, I think first of all, we cannot look at the different types of CSAM, so known material, new material, and grooming, in the same way. We need to differentiate, also in terms of the technologies that exist. I think in terms of known CSAM, there's a much more straightforward way of dealing with it, and technology can be used to clean up the internet of known CSAM to a certain extent. But, and there we go back to this cost-cutting exercise, technology cannot solve all the structural and underlying issues. And I was really shocked when, during one of the workshops, I listened to one of the speakers who said, every one of you knows a child who's being sexually abused. And from that moment onwards, I'm constantly, I mean, I have children myself, and I'm looking differently at the caretakers and parents when I pick up my children from daycare, for example.

So I think there are also structural issues that need to be solved in a different way, in terms of awareness raising for parents and caretakers, but also for children, because it's also children who are very much involved. Also due to social media, and how much has changed during the last years due to the hypersexualization online, how children are portraying themselves and are being portrayed by their parents from an early age on.

I mean, we just need to look at parenting, and parents making money by showing their children online. Like, what will happen to those people at a later stage in their life? And of course there are additional ones, but those are underlying issues that need to be solved, in terms of how much we can have children use the internet, right?

It was made by adults, for adults, and yet we see children in front of YouTube or, I dunno, TikTok from an early age on, just because the parents want to eat in quiet, or I dunno. So I think this is something that needs to be taken into account here as well, because this is where it starts as well.

So this is not only, yeah, this is what I wanted to say.

[00:32:21] k: Well, and we know children at a young age can use it. Just look at the videos out there of seven-year-olds ordering pizza on their iPads with their parents not knowing. They know how to use computers. They know what it is they want to do. Early on in my professional career, I was a child mental health counselor, and my youngest victims of sexual abuse

[00:32:41] Teresa Quintel: Yeah, I mean, this is just so horrific. Like, sometimes when I read about it, I just want to cry, to be honest. And of course there needs to be a balance: children need to be exposed to a certain media knowledge and learn how to deal with this, and of course also be aware of the risks.

Or, for example, if your child wants to engage in sexual communication online, then maybe don't show your face. Because I think there needs to be the right balance between how much is good in order to learn, and where the limit is before I don't know anymore what my child is actually doing online.

[00:33:28] k: Right. 

[00:33:29] Paul Breitbarth: Alex, what in, in your view, would be the alternative to this proposal?

[00:33:33] Alex: I mean, I'm a survivor, but I'm also a computer scientist and also, you know, a privacy advocate. So it puts me in a dilemma that probably many people wouldn't find themselves in, because I have to, you know, balance the different sides of the arguments here. 'Cause it's what we need to do as a society when dealing with these difficult issues, right?

So we can't take a knee-jerk reaction. But technology does not solve the problems of the world, right? People solve the problems of the world. We use technology sometimes as a tool to help us solve those problems, but it's people that solve the problems. And I've said this so many times now: I've been studying this issue my entire life.

You know, it's something which impacted me from a very, very young age. I've studied it as a psychology student. You know, I ran the first online organization working with abuse victims and survivors, working with the police, tracking down this type of material, going way back to the 1990s, when the internet was still in its infancy.

And as I said then, and I say again now: you cannot prevent child sexual abuse by dealing with the consequences of that abuse. You know, there's a lot of talk about this proposal being to prevent child sexual abuse online. It's a misnomer. Child sexual abuse really doesn't take place online. Child sexual abuse takes place in the real world. It gets disseminated online, but the abuse happens in the real world.

So we need to tackle the abuse in the real world, not in the web browser. And that means that we need to put more money into research, more funding into research as to why people abuse. We need to hold industries more to account for the hypersexualization of minors, particularly the entertainment industry, the advertising industry, the gaming industry, et cetera.

And I think we need to have more awareness in the curriculum, not just for children, but for adults as well. Parents need to be aware. This is something that should be discussed frequently at parent evenings at schools, I believe. And the fearmongering needs to go out of the way, and we need to actually produce some real material which is gonna be helpful.

I don't believe in the stick approach.

[00:35:41] Paul Breitbarth: And probably to add to that also, also more mental healthcare.

[00:35:46] Alex: Yeah, more mental healthcare, more support for victims and survivors. I haven't had any support whatsoever. I haven't had any mental health support. I haven't had any financial support, nothing. And this is a big problem. You know, we see home secretaries saying, we're gonna do more, we're gonna set up funds, we're gonna set up services, et cetera, and nothing ever happens.

Nothing ever happens, because it all just gets lost in politics. Right? And this isn't a political issue. This is a societal issue, which we really have to start taking seriously, and stop treating as taboo, brushing it under the rug because it's too difficult to talk about. So it goes full circle back to the trigger warning at the beginning of this podcast: we have to talk about these issues. We're not gonna find the solutions until we do talk about these issues, and we're not gonna resolve those issues if we keep brushing 'em under the rug. Technology is not the answer. People are the answer.

Let's stop it at the root instead of trying to deal with the consequences.

[00:36:38] Paul Breitbarth: Well, thank you both for this important discussion. I do agree that we need to have this discussion. That's why we wanted to have both of you on the show. But I also realize that for some people it is still too difficult to have the conversation, hence the trigger warning. So thank you both.

I can imagine that there are people who want to join the conversation. We may do follow-up episodes,

[00:37:02] k: Charlie, the podcast, who has not tried to join a conversation for a year or two, now all of a sudden today he's incessant with it.

[00:37:10] Paul Breitbarth: We may do follow-up episodes if there is a desire to do so. If anybody wants to make the counter-argument, of course, they're also welcome to join the discussion on why it is inevitable that we would do this. We're open for the debate. For now, thank you all for listening. You know where to find us online.

K is Heart of Privacy. I'm EuroPaulB. The podcast is Serious Privacy on LinkedIn. Until next week, goodbye and thank you all.

[00:37:36] k: Bye, y'all.