Serious Privacy

Coulda woulda shoulda - a talk on Ethics

March 22, 2023 Dr. K Royal and Paul Breitbarth Season 4 Episode 9

In this episode of Serious Privacy, Paul Breitbarth of Catawiki and Dr. K Royal of Outschool tackle the tough topic of ethics in data protection and privacy. This week's episode may be a little bit more philosophical than our usual ones. No big updates on what's happening in the world of privacy, but instead a conversation about ethics: what does it mean to take an ethical approach to data processing? And how does it work on a daily basis?

This includes a perspective by Bojana Bellamy, President of Hunton Andrews Kurth LLP's Centre for Information Policy Leadership (CIPL), published with the IAPP. The article is "International data transfers: Time to rethink binding corporate rules."

As always, if you have comments or questions, let us know - LinkedIn, Twitter @podcastprivacy @euroPaulB @heartofprivacy @trustArc and email podcast@seriousprivacy.eu. Please do like and write comments on your favorite podcast app so other professionals can find us more easily.


If you have comments or questions, find us on LinkedIn and IG @seriousprivacy @podcastprivacy @euroPaulB @heartofprivacy and email podcast@seriousprivacy.eu. Rate and Review us!

Proudly sponsored by TrustArc. Learn more about NymityAI at https://trustarc.com/nymityai-beta/

#heartofprivacy #europaulb #seriousprivacy #privacy #dataprotection #cybersecuritylaw #CPO #DPO #CISO

S04E09 - Ethics

[00:00:00] paul_breitbarth: This week's episode may be a little bit more philosophical than our usual ones. No big updates on what's happening in the world of privacy, but instead a conversation about ethics: what does it mean to take an ethical approach to data processing? And how does it work on a daily basis? We'll probably talk a bit more about this topic later this year with some guests, but for now, sit down, relax and hear what K and I have to say about the topic.

My name is Paul Breitbarth.

[00:00:37] K Royal: And I'm K Royal and welcome to Serious Privacy. So I love it, Paul, we don't have anything truly critical or driving or up in the air right now. We have a few things, but not like usual. So I like the idea of today of maybe changing it up a little bit, talking about a couple of things, and I think you suggested privacy ethics.

[00:01:01] paul_breitbarth: Yeah, I'm very much into ethics at the moment, and that is mainly because, as you know, I'm teaching this course at Maastricht University this month on ethics, accountability, and corporate social responsibility. And last week was all about ethics. So last week Thursday, we had an amazing lecture by Katherine O'Keefe, who co-authored a book on data ethics together with Daragh O Brien, with a new edition coming in May, and we hope to get them on the podcast as well.

And then on Monday we had the tutorials, where we started off with the case study that will run through this whole course of four weeks, where ethics also played an important role. So ethics is front and center of my mind this week for everything that is not daily work related.

[00:01:45] K Royal: I love it. So let's first go to the unexpected question.

What is your favorite cartoon?

[00:01:52] paul_breitbarth: Hmm.

You know what, that would be a Belgian one, but not one that you would be familiar with.

[00:01:59] K Royal: Oh, what

[00:01:59] paul_breitbarth: It's called Suske en Wiske, or Bob et Bobette in French. I don't even think there is an official English translation of it, but it has been coming out since, what, the 1950s? And it's still being drawn. There are still four or five albums a year being released. And it's about a family in Antwerp. But they're partly Flemish, partly living in the Netherlands and

[00:02:27] K Royal: It sounds awesome.

[00:02:29] paul_breitbarth: all kinds of adventures. So, but you need to speak Dutch or French to be able to read

[00:02:34] K Royal: to be able to read it. I'm trying to do the French thing. I took French in high school. I'm trying to learn how to be more conversant in French as an adult. Marie Penot is helping me with it sometimes, but yeah, I'm probably not good enough in it yet to read that. So for me, I keep thinking like Calvin and Hobbes or Marmaduke or Garfield or Peanuts with Snoopy and Charlie Brown.

I'm gonna go Betty Boop. I am a Betty Boop freak. I even have a copy of the original Betty Boop trademark thing, framed and everything. And stamps and, ugh. Yeah, I'm definitely a Betty Boop freak. So let's move on. I think you had one thing that we needed to talk about, a decision that came out recently.

[00:03:18] paul_breitbarth: Yeah, actually this morning, and guess what? Facebook received another slap on the wrist.

[00:03:23] K Royal: Ah.

[00:03:24] paul_breitbarth: And this time in the Netherlands, where the District Court of Amsterdam decided that between April 2010 and January 2020, Facebook broke the law while processing personal data for advertising purposes. You're looking at my surprised face right now. The court said, and this is based on a mass claim action, that indeed Facebook used personal data for advertising purposes without a legal basis like consent. There also was no legal basis to process special categories of data like sexual preference or religion. And even though it was personal data that was provided by the users themselves, they did not have a proper legal basis to process it.

Also, Facebook did not sufficiently inform users about the sharing of their personal data with third parties. And in both situations, not only the data of the individuals themselves was shared, but also that of their Facebook connections.

[00:04:22] K Royal: Okay. Connections, not just people that maybe they had pictures of or anything like that in their pictures, but their actual connections cuz friends of friends.

[00:04:29] paul_breitbarth: Exactly. So the whole network. So, as said, this is a decision of the district court as part of a mass claim trajectory by the Data Privacy Association. And as a next step, they want to claim damages from Facebook. But before they could do that, first their guilt had to be established, and that has now been done.

So this was a civil procedure, not an administrative procedure, so there is no immediate effect. But it is interesting to see that also in a collective civil action, these kinds of decisions against Facebook, against Meta, have been reached.

So I'm sure they will appeal, but in the meantime the claim for damages will probably also move forward.

[00:05:09] K Royal: Interesting. How big was the penalty?

[00:05:11] paul_breitbarth: Well, there is no penalty yet

[00:05:14] K Royal: Okay.

[00:05:15] paul_breitbarth: because this is civil, and the damages part of the claim is still to be determined.

[00:05:20] K Royal: Okay. That will be exciting to watch then.

[00:05:24] paul_breitbarth: Yep. We'll keep you posted.

[00:05:25] K Royal: All right. I love it.

I don't really have anything mega, mega happening anywhere else. I mean, there are lots of things happening, don't get me wrong, lots of things happening, but nothing that I think we really need to update you on right now. But I thought it might be a good idea, actually it was Paul's good idea, to talk about privacy and ethics. Not to the depth that he might have gotten into in his class, but just as something that we all have to deal with on a daily basis, or maybe we should be dealing with on a daily basis and maybe we don't.

Paul, start us off there. What would you like to talk about? Because, okay, I sit here and say, Paul, start us off there, and then I sit...

[00:06:05] paul_breitbarth: Well, you actually started it already, because the first question is: do you actually use ethics as part of your decision making process when it comes to privacy discussions? I think that is a very valid question for all DPOs to answer. So how about it, K? Are you using ethics as part of your decision making?

[00:06:26] K Royal: Yes. I would love to be able to say yes, indubitably, at all times I'm always thinking ethics when it comes to privacy, but especially in the job that I'm in right now, where most of the data I'm dealing with is that of children.

So the ethics of what we should be doing with that data is always top of mind where we are. And I'd like to say that it's top of mind no matter what data we use, but you know, I'm human, and there are times you're probably thinking expediency or you're probably thinking something else. I can't think of a single example, but you know, maybe you're not specifically, overtly, actively thinking ethics when you're looking at the use of data; you're looking at maybe the legalities instead of just the ethics of it.

But I would say yes, I take ethics into consideration right now.

[00:07:17] paul_breitbarth: I think that's a very fair point. And when you look back at earlier discussions about this, we know that in, what was it, 2018, during the International Commissioners' Conference, Giovanni Buttarelli, then the European Data Protection Supervisor, said in his speech about data ethics that data should serve people, and not the other way around.

And I think that's a good mantra to keep in mind whenever you are discussing data processing operations with an organization or within an organization.

[00:07:53] K Royal: And like I said, I'd like to think that my standard method of operating is to take ethics into consideration.

[00:07:59] paul_breitbarth: Good. And I think that is important. I mean, I try to do it as well. And I think it's more difficult if you are new to an organization to also find out what the organizational ethics are, because of course you have your personal ethics. And I think for both you and me, ethics will always play a major part in our own thinking about compliance, in our own thinking about data protection and privacy issues, about fundamental rights in general, because that is how we were raised, in a certain legal tradition.

In a certain cultural tradition. It's part of who we are,

[00:08:38] K Royal: and it's part of probably what brought us into privacy because of the ethics of it. 

[00:08:42] paul_breitbarth: I think so, yeah. 

[00:08:43] K Royal: So yeah, I agree with you, but you know, I have to still say I'm human. I think I would, but I hate to be an absolutist at all times, so I have to say there's a possibility there's a time or two that, off the cuff, someone asks questions and you're not thinking ethics, you're just thinking legalities.

Which, by the way, people, the reason I keep saying that is there's a difference: just because something is legal does not mean it is ethical.

[00:09:09] paul_breitbarth: No, absolutely. And there are actually three questions that I'm trying to educate my colleagues and my students on, that you should ask yourself when assessing risk, when assessing a data processing operation more in general. First of all, could we do it?

Is there a positive business case that we can make to process the personal data? Then, may we do it? Is it allowed under applicable laws to undertake the planned activity? And then also, should we do it? And that is whether this planned activity is the right thing to do or whether there are other alternatives to consider.

And that is the ethical part. Now, some people claim that the final question, the "should we do it", should actually be the first question: you should first discuss whether you should do it, and only then whether you can and whether you may do it.

[00:09:59] K Royal: I can see the logic in that because, one, if it's illegal, that's just a hard no. Don't get your engineers and everyone all excited about the possibilities of new technology or new processes that they could build and get excited about, and then be told no. Because that's usually what happens: they usually go build it and then go, oh, we're going live this afternoon.

Can we do this? And then of course it could be that the laws say they can't do it, or it could be that the ethics say you shouldn't do it. So I do agree, maybe that legal question should come first and then the ethics, and then once it passes that, which is essentially, oh wait, an impact assessment, then look at building it and whether or not you can.

[00:10:41] paul_breitbarth: Yeah, but at the same time, doing all the legal analysis before you even know whether it is feasible is also difficult. So,

[00:10:48] K Royal: So what comes first, the chicken or the egg?

[00:10:51] paul_breitbarth: Basically, I think that is a big part of the discussion, and maybe the answer here is the legal answer to all questions, and that is: it depends,

[00:11:01] K Royal: exactly.

[00:11:02] paul_breitbarth: and it will depend on the situation, on the type of data that you will be using, where to start.

[00:11:09] K Royal: Well, yeah.

[00:11:09] paul_breitbarth: When I look at the whole ethical framework, and also at how to explain it to business colleagues,

I think it's also the discussion between business impact and individual impact that should be taken into account. Some projects will always be a no-brainer if they are low value but also have a low impact. Those are probably no-regret options that you can move forward with without very many assessments and very much documentation. Where it becomes interesting is for those projects that have a relatively low business value but a really high impact on the individual, or that have a high impact on the individual but also a high business impact, so a high return on investment.

[00:11:54] K Royal: That's where the trouble is, the high...

[00:11:57] paul_breitbarth: And that is indeed where the trouble is.

That is where you need the discussion because then you really need to start balancing between the business interest on the one hand and the individual rights on the other hand. And

[00:12:09] K Royal: And the business interest should never, never outweigh the individual interest. When it's a substantial risk. When it's a substantial risk. I did qualify that.

[00:12:18] paul_breitbarth: If it is a substantial risk. But then also qualify: what is a substantial risk?

[00:12:24] K Royal: Right,

[00:12:25] paul_breitbarth: Is it still a substantial risk if the likelihood of that risk is low? So there you have another dynamic.

[00:12:33] K Royal: A high impact but a low likelihood. And that's a basic diagram for looking at all risk. Even when you're looking at disaster recovery and risk to systems, you always want to look at what's the likelihood that something could happen. It could be a tornado, it could be a flood, it could be a pandemic. You look at the likelihood that it could happen, and then, if it did happen, what would the impact be?

And in this case, you'd look at the impact to the individuals and the impact to the business. But yes, that is a totally new way of looking at it: it might be a significant impact, but how often would this happen? Probably never.

[00:13:13] paul_breitbarth: No. And that makes it really interesting. And then you probably need to start wondering, how pragmatic or how fundamentalist do you want to be? And can you also be a pragmatic fundamentalist?

[00:13:30] K Royal: Can you be a pragmatic, fundamentalist, ethical innovator?

[00:13:34] paul_breitbarth: I think you can. I mean, I listen to other privacy podcasts as well as compliance podcasts, and there is a new one called Sustainable Compliance by Jacob Lak out of Copenhagen. And he spoke to Tori Alexandra Avala from Norway in his first episode. And she actually explained in that episode that she identifies as a pragmatic purist, where she is

quite a purist when it comes to her own data and her own data processing, but much less so when she gives recommendations to companies, because there is also the understanding that companies have many more things to take into account than just data protection. They have a lot of interests that need to be weighed, and sometimes data protection prevails, but sometimes the business interest will have to prevail.

And also then, as a privacy professional, you need to be able to stand behind that decision, but at least make sure that you take all the possible measures to mitigate the risks, and then reduce the likelihood of those risks materializing.

[00:14:40] K Royal: Right. And I agree that, just like HIPAA was never put in place to prevent medical care, privacy was never put in place to stop businesses from operating completely. Maybe some business processes from operating. Don't get me wrong, I'm not saying it's all fair game out there, but it was never put in place to stop innovation.

You should be able to innovate legally and ethically, and if you can only innovate illegally and unethically, maybe you shouldn't be doing it.

[00:15:15] paul_breitbarth: No, that's true.

[00:15:17] K Royal: I mean, that's a simple statement, but it seems to me to be a rather critical one.

[00:15:21] paul_breitbarth: It is a very simple statement, but there is so much behind that simple statement.

[00:15:27] K Royal: Are you saying there's too many nuances here? There's a lot of layers, like an onion.

[00:15:32] paul_breitbarth: Well, there's never too many nuances. I mean, nuance is my middle name. But

[00:15:38] K Royal: Oh,

[00:15:38] paul_breitbarth: it is something that you...

[00:15:39] K Royal: You said that you had three middle names. Nuance is one of them. Sorry.

[00:15:46] paul_breitbarth: It's one of many. But no, to be honest, I do believe in nuance, and especially data protection isn't black and white. It is many, many shades of gray, many more than the 50 in those books. And sometimes you move a bit more towards the compliance end of the spectrum, and sometimes you move a bit more to the business-friendly and non-compliance side of the spectrum, and all those shades in between. And for some companies, they can go completely black in some situations where they say, well, we really want to do this or we really want to try it, and for now we just don't care about

[00:16:24] K Royal: Well, and to be honest, I sit in the legal department, but not all privacy does. Sometimes privacy is its own department, sometimes it's part of compliance, sometimes it's part of the information security department, sometimes it's part of IT, sometimes it's part of marketing, whatever it is.

[00:16:38] paul_breitbarth: Or it's part of finance.

[00:16:40] K Royal: Yeah. It can. We all know privacy can sit pretty much anywhere. But when you look at it, we're not the final decision makers.

[00:16:50] paul_breitbarth: No, no, no. We are not. We give a...

[00:16:52] K Royal: Yes, we make a recommendation or give the reasons why, and quite often the business will not necessarily follow our recommendations. But that's your job. I mean, like I said, I sit in the legal department. My job is to tell them what the risks are and to tell them what the decisions should be.

They have the right to do something else. Now, under GDPR, the DPO is protected. The DPO can keep telling them to do something, and then if they go do something against the DPO's recommendations, that's a bad thing. And you can't terminate the DPO for doing their job, including making those recommendations.

[00:17:31] paul_breitbarth: No, but you can also not terminate a business colleague for doing what they consider is in the business interest, and for assuming the risks that have been identified by the DPO or the compliance team.

[00:17:45] K Royal: Well, and I will say that as well. I mean, none of us really likes documenting what we've advised and what the risks are, especially if the business decides to go the other way, because our job is not to take the business down and be like, gotcha, you didn't do this one. But on the other hand, you kind of have to; that's what an impact assessment is for.

[00:18:05] paul_breitbarth: Yeah, I never solved that problem. I think it is important to have that audit trail. I mean, if you believe as an organization that a risk is worth taking,

[00:18:16] K Royal: Yeah,

[00:18:17] paul_breitbarth: you also should not be afraid of documenting

[00:18:19] K Royal: That's right.

[00:18:20] paul_breitbarth: that you have taken that risk.

[00:18:21] K Royal: And I believe in privacy, or data protection, you should be documenting those risks, and in most compliance efforts, I would say it's worth it. And I know that I think differently about this than a lot of other compliance or privacy or data protection professionals do, but I believe you should, because then if you get in trouble or you get investigated for making a decision, you can show the documentation of how many different ways you looked at doing it and what the risks were, and that you made a deliberate decision to go one way versus another, or versus a hundred.

And if you can show an authority or a regulator or law enforcement that you did consider it and you went another direction, you're probably gonna get in a lot less trouble than if you can't show that you took it into consideration to begin with.

[00:19:10] paul_breitbarth: Yeah, because then you were just reckless.

[00:19:13] K Royal: Then you just had bad judgment, rather than being ignorant of what you should take into consideration.

[00:19:20] paul_breitbarth: Exactly. And I mean, I think that's only fair. I think it's important as an organization that you do take risks deliberately. And it's fine to take risks, don't get me wrong. It's even important to take risks, because otherwise there would not be any innovation. So taking risk is not the problem, but not knowing what risk you take and why you take it,

[00:19:47] K Royal: Right?

[00:19:47] paul_breitbarth: that is, in my opinion, the problem.

[00:19:50] K Royal: And make sure that ethics is part of that consideration. You need to always make sure that the ethics of what you're doing is part of that consideration. Like I said, given the caveat for my own decisions, I'm quite sure there are times that maybe I didn't think of the ethics. Maybe the legal answer was enough to put a halt to it.

Maybe the legal answer was so strong that, oh heck no, 15 different laws say you can't do that, I didn't need to get to the ethics of it.

[00:20:16] paul_breitbarth: Yeah. So that would also argue in favor of you saying, well, the "should we do it" comes last, because the "may you do it" has already preceded the assessment of the ethics.

[00:20:30] K Royal: Right. Well, and it could be that the ethical consideration is so blatant and so obvious, you don't need to look at the legal part of it. I mean, should I be installing cameras in the bathroom? No, there are laws against it, but I don't need to go look them up.

[00:20:46] paul_breitbarth: Well, if you asked a bunch of sauna complexes here in the Netherlands, the answer was yes for a number of years, until the Dutch DPA found out. Well, to be fair, some investigative journalists found out, and then the DPA intervened.

[00:20:59] K Royal: Which is usually what happens, the ones you don't want to find out are the first ones who do. But you don't need to go look up the laws on cameras in bathrooms, because ethically you shouldn't be doing it.

[00:21:09] paul_breitbarth: No, and you certainly shouldn't keep them on open servers or web-connected servers without changing the password.

[00:21:17] K Royal: Right, exactly. So sometimes the blatant, unethical aspects are so obvious that you don't need to look at the law. But most problems, as Paul said earlier, are not quite so black and white. There are usually various shades and nuances in there that you might have to take into consideration.

[00:21:37] paul_breitbarth: Yes, I would say so.

[00:21:39] K Royal: Are there examples you use with your students in your new course?

[00:21:42] paul_breitbarth: No, not specifically, because I'm not the one lecturing. I'm the course coordinator, so others do the actual lecturing, and we mainly use a case study. The case study is about an AI HR support system that will help organizations in sifting through job applications, but also in keeping track records of existing employees and making recommendations on that.

[00:22:09] K Royal: And you do know that AI and automated processing are used for job applications all the time. I mean, I'm quite sure our listeners have had the same experience: they apply for a job and they get rejected the same day, the next day, maybe within the next two business days, and yet they met every single qualification posted.

But somehow, somewhere in that automated processing, you were deemed to be not the person they wanted. Yeah.

[00:22:37] paul_breitbarth: Yeah, because you didn't tick one of the boxes, or maybe you did tick the box, but it was phrased differently on your CV, or for whatever reason. So that is the big case that runs through this course, to discuss it from different angles and from different perspectives: from an ethical perspective, but also from an accountability perspective, from an ESG perspective.

So that all comes together. And then of course we have a completely different case for the final exam that I won't talk about now, because I'm not going to give away anything before the students get the case to review. But also there, ethics will play a role, and the business decisions will also play a certain role and have an impact on what you can and cannot do as a DPO.

And what kind of recommendations are expected from you? Because oftentimes as a DPO, your gut feeling may be: don't do this. But if the board has already decided we are going to do this,

[00:23:32] K Royal: you gotta figure out how to put the guardrails around it.

[00:23:35] paul_breitbarth: Exactly. And that's a whole different ballgame, a whole different discussion. And that also brings me back to my initial point: you need to learn your organizational ethics when you join an organization, or ideally before you join the organization, to understand if this is something that you can work with and something that you can agree to.

Because if you don't like the organizational approach, if you are just hired for whitewashing policies that you don't like, then you will have a very difficult time.

[00:24:06] K Royal: Yeah, it will be. Well, and some of the pitfalls, shortcomings you might say, in looking at a corporate ethics program are, one, they don't tend to roll the ethics over to the privacy or data protection uses; the ethics are usually rolled over other parts of the business. So one of the problems that companies face is, how do you say this, isolating the ethics considerations.

You either have the ethics of the law, or the ethics of compliance, or the ethics of HR, or the ethics of something else, without looking at it across the board. And I would say one of the other weaknesses is looking at the data itself and not where you get the data from.

So going back to the original story that Paul opened with, the Facebook decision: that was looking at the data, not looking at the sources of the data. And that's important as well. I mean, it's the same thing as being ethical in ESG, the corporate social responsibility program: where are you sourcing your resources from? And that applies to data just as it applies to any other resources you might have, the sourcing lines that go into your products or services.

[00:25:14] paul_breitbarth: Mm-hmm. And that is especially true with all the organizations now introducing AI, and especially the ChatGPT variants, within their products. What is the legal basis on which ChatGPT was built? How did they obtain their data?

[00:25:31] K Royal: Yeah. 

[00:25:31] paul_breitbarth: How do you ensure that all the bias that is inherent within ChatGPT, because it was largely trained on US and Western data,

[00:25:41] K Royal: which as we all know is vastly different. 

[00:25:43] paul_breitbarth: how do you ensure that you can deal with that?

[00:25:45] K Royal: Exactly. And the way companies can source their data here is vastly different than how data can be sourced in many of the other countries that have privacy and data protection laws.

[00:25:56] paul_breitbarth: Yeah. And when you talk about other jurisdictions, ethics will also play a different role there, because the ethics that we are used to in the Western world are different from ethics, for example, in Southeast Asia,

[00:26:09] K Royal: Yep.

[00:26:10] paul_breitbarth: in Africa or in Latin America. And even European and American ethics are different.

[00:26:15] K Royal: You have to look at the culture and the cultural mores that are prevalent in that area, because they're different. I mean,

[00:26:24] paul_breitbarth: And with that in mind, I think it's very interesting to see that this Friday, the 17th of March, the new data protection law in Saudi Arabia comes into effect.

[00:26:35] K Royal: Yes.

[00:26:36] paul_breitbarth: And I'm still not sure what to make of that.

[00:26:38] K Royal: I'm not either.

Maybe we should talk about that. Maybe not on this show right now, because I think we're running out of time here, but maybe we really do need to dive into the Saudi Arabia law and compare it to some of the others. I don't know how many of our listeners the Saudi Arabia law impacts or not, but it's an interesting one to go through because, yeah.

[00:26:58] paul_breitbarth: Yeah. In any case, I can tell you that we do not have a lot of listeners in the Middle East yet, so.

[00:27:04] K Royal: Yeah, we can see some of the stats and that's not one of our big markets.

[00:27:08] paul_breitbarth: No, it's not. Let me just do a quick check.

[00:27:12] K Royal: I was surprised to see that most of our podcast listens are not on Apple Podcasts. Yeah,

[00:27:19] paul_breitbarth: In the beginning they were,

[00:27:21] K Royal: very much so.

[00:27:22] paul_breitbarth: It's 40% Apple Podcasts. It's 71% iPhone.

[00:27:26] K Royal: Spotify, I think, is second, isn't it?

[00:27:29] paul_breitbarth: Spotify is second at 8%.

[00:27:32] K Royal: Ooh. Okay. 

[00:27:33] paul_breitbarth: And 85%, obviously, listen on mobile, but 71% of our listeners listen on iPhone.

[00:27:40] K Royal: I reluctantly have an iPhone, only because Apple went through and started putting in more privacy protections, 

[00:27:47] paul_breitbarth: So over time, for all of our episodes to date, almost 150 of them, we have 530 downloads in Saudi Arabia.

[00:27:57] K Royal: There we go. So that means we might have 15 people that are listening across all the episodes.

[00:28:05] paul_breitbarth: So if you are based in Saudi Arabia and you want to talk about the Saudi Data Protection Act, please reach out to us, because we want to hear from you.

[00:28:15] K Royal: Oh, that would be fabulous. We have at least one listener there.

[00:28:18] paul_breitbarth: We want to hear from you and discuss the Saudi Arabia data protection law.

[00:28:23] K Royal: Open invitation. 

[00:28:24] paul_breitbarth: That's something to come later this year. We'll find a speaker, no worries. But for now, thank you for listening to yet another episode of Serious Privacy.

[00:28:32] K Royal: We will be at the IAPP Global Privacy Summit in DC, so please do find us there.

[00:28:38] paul_breitbarth: Absolutely. And we will be carrying recording materials to collect quotes from the IAPP Summit. And K is wearing a flag, so you'll find her easily. So yes, join us at the IAPP Global Privacy Summit. Talk to us. We'd love to do an extended episode on the summit. And yes, I am excited to see K for the first time since 2020.

So on that note, thank you for listening to yet another episode of Serious Privacy. If you like us, please do review us and leave a comment in your favorite podcast app on your favorite podcast platform. Join the conversation on LinkedIn. Look for us under Serious Privacy. You'll find K on social media as @heartofprivacy, and myself as @EuroPaulB.

Until next week, goodbye.

[00:29:24] K Royal: Bye y’all.