Serious Privacy
For those who are interested in the hottest field in the technology world. Whether you are a professional who wants to learn more about privacy, data protection, or cyber law, or someone who just finds this fascinating, we have topics for you, from data management to cybersecurity to social justice to data ethics and AI. In-depth information on serious privacy topics.
This podcast, hosted by Dr. K Royal, Paul Breitbarth and Ralph O'Brien, features open, unscripted discussions with global privacy professionals (those kitchen table or back porch conversations) where you hear the opinions and thoughts of those who are on the front lines working on the newest issues in handling personal data. Real information on your schedule - because the world needs serious privacy.
Follow us on BlueSky (@seriousprivacy.eu) or LinkedIn
Serious Privacy
We require order! Order, we say (a week in privacy)
On this week of Serious Privacy, Paul Breitbarth of Catawiki, Ralph O’Brien of Reinbo Consulting, and Dr. K Royal launch the first week in privacy for 2025. Topics include the EDPS election, AI, and US state laws and enforcement - among others!
Powered by TrustArc
Seamlessly manage your privacy program, assess risks, and stay up to date on laws across the globe.
With TrustArc’s Privacy Studio and Governance Suite, you can automate cookie compliance, streamline data subject rights, and centralize your privacy tasks—all while reducing compliance costs. Visit TrustArc.com/serious-privacy.
If you have comments or questions, find us on LinkedIn and Instagram @seriousprivacy, and on BlueSky under @seriousprivacy.eu, @europaulb.seriousprivacy.eu, @heartofprivacy.bsky.app and @igrobrien.seriousprivacy.eu, and email podcast@seriousprivacy.eu. Rate and Review us!
From Season 6, our episodes are edited by Fey O'Brien. Our intro and exit music is Channel Intro 24 by Sascha Ende, licensed under CC BY 4.0, with voiceover by Tim Foley.
Please note this is largely an automated transcript. For accuracy, listen to the podcast.
S6E4 - Week in Privacy
[00:00:00] Paul: My name is Paul Breitbarth.
[00:00:20] Ralph: We haven't decided on the proper order, have we? I'm so
[00:00:23] K: I go, I go in alphabetical order, Paul Breitbarth, Ralph O'Brien, K Royal.
[00:00:28] Ralph: I will always go second. Fine. Okay.
[00:00:30] Paul: Her majesty always comes last.
[00:00:32] K: Okay.
[00:00:33] Paul: my name is Paul Breitbarth.
[00:00:35] Ralph: my name is Ralph O'Brien
[00:00:37] K: And I'm K Royal and welcome to Serious Privacy, which is the only reason why I want to go last so I can say welcome to Serious Privacy. But anyway.
[00:00:45] Paul: So you want the podcast to remain serious privacy instead of it becoming serious privacy.
[00:00:50] K: Exactly,
[00:00:52] Ralph: and not serious data protection at all.
[00:00:54] K: and not serious. We're not serious about data protection whatsoever, but okay. Unexpected question. Unexpected question.
What's your favorite way to unwind after a long day at work?
[00:01:06] Ralph: Okay well I'll jump in. I made the mistake during COVID of, of buying a hot tub. Would you believe?
[00:01:13] Paul: Ooh, now I'm jealous.
[00:01:15] K: I'm coming to Ralph's house
[00:01:16] Ralph: So yeah, in the back of my garden I have a full-on jacuzzi. With a glass of whiskey in the jacuzzi, that is, you can't ask for fairer than that, to just kind of let your cares float away in the evening.
[00:01:29] K: with a good, with a good glass of bourbon or scotch too, right?
[00:01:33] Ralph: Just for me it's Scotch, but yes.
[00:01:34] K: Yeah. I like it.
[00:01:36] Paul: Some Northern lights above and
[00:01:38] K: Oh.
[00:01:41] Paul: Yeah, that sounds really good. For me, it's two options. If the weather is decent, then biking to the beach is always a good one. Just watching the sea, and sometimes having a drink at a beach bar in the summer, but also just walking along the beach a little bit. Screaming into the wind if it's been a difficult day.
Or just cooking. Cooking is also really good, and onions are very forgiving if you have a lot of aggression to get rid of. And just the process of doing something that is free of screens, and just very relaxing. Put on some music, put on a podcast, or the news, or something. And then just,
[00:02:22] Ralph: I've yet to contribute to the Serious Privacy cookbook as yet, so,
[00:02:25] K: Oh, right. We need to pull that out. The house building or remodeling is next up too, just so you know. But for me, as y'all were talking, I'm sitting here going, I don't know that I have a way that I unwind, because I have five dogs. I have three cats. I have a husband.
I live next door to the grandchildren and their animals and everything. And the house is still not back together from the hurricane. So I'm sitting there going, I don't think I do unwind, but I do.
[00:02:52] Paul: Unwind? Do you, do you recognize that word?
[00:02:54] K: Right. Do I recognize that word? Yeah. So for me, I was going to say when I'm sick of work and I'm like, I'm done, it's either going to be reading.
And I have several series that I like to follow. I do not read nonfiction for fun. That is one thing that blows people's minds. Did you read this biography? No, and I'm not going to. I read fiction for fun. I like vampires, werewolves, space fantasies, you know, time travel, all that good stuff. So I would say it's reading or it's watching a movie.
It's vegging out on a movie that I've seen a hundred times in the past and I still don't quite remember what happens when cause, you know, whatever.
[00:03:31] Paul: Because you fall asleep halfway through the movie.
[00:03:33] K: No, I don't do that, believe it or not. I can't even take a nap during the day. My body's just not built that way, but it's vegging out on probably a DC or a Marvel movie or something along that line and typically Tim's with me to unwind.
Now cooking, I don't like cooking. I like baking for the holidays, so I can't say I even do that. But I agree with you, just put on something that's, that you can listen to and then just veg out.
And I need a hot tub.
[00:04:02] Paul: Sounds good.
[00:04:03] Ralph: Well, you know, you're welcome.
[00:04:05] K: I need a hot tub to come over. I need the Northern Lights. Y'all, y'all are making a list here for me.
I appreciate it. So I think we already did mention some of the recognitions. I pulled up some of the others that we've gotten in the past, because I was disappointed when I was on one of the ranking sites, it was Goodpods or Listen Notes or something, and I looked up privacy podcasts and we weren't even on the list.
I'm like, what the hell?
[00:04:29] Paul: Well, I think to some extent that is because not all of the podcast aggregators, like Apple Podcasts and Google Podcasts and all of those, have a specific category for privacy. So you don't automatically get recognized as a privacy podcast. Which is strange, of course, if you consider that we have privacy in the name of the podcast, and that probably some of these categories will be AI generated, but.
[00:04:54] K: Right. That is true. But what was funny, though, is after I saw that, I'm like, well, then what the heck is our podcast classified as? So I went and clicked on our podcast, and we were not on the privacy list, but we're on the data privacy list, or vice versa: we're not on the data privacy list, but on the privacy list we're something like number two. So it was really interesting to see it breaks it down that way. 'Cause I think we're classified as technology, business and government or something. We might be under education, I'm not sure. So however these metrics run, it was really interesting to see that. And of course, if we look at our stats on Buzzsprout, we're listened to in something like 160, 170, maybe 190 countries.
[00:05:36] Ralph: And of course, that's all due to our listeners. So really, we should say a huge thank you to our
[00:05:41] K: yeah.
[00:05:42] Ralph: privets.
[00:05:44] K: Absolutely.
[00:05:46] Ralph: the privets.
That's it
[00:05:47] K: We're going to call y'all the privets now. And now that makes me think about the stickers that we need to do for IAPP DC. So Paul and I will be there. Ralph cannot make it this time, but I think Ralph is going to be at the one in London. So the good thing is we should have representation somewhere you're going. But we will be at IAPP DC, and I'm in the process of designing a wicked sticker.
I have three of them in mind. We haven't decided which is the final one. So y'all let me know. Seriously, if y'all listen to the podcast and you're there, drop us a note, say what you want, what else would you want to see? We'll always have the one with the logo. I got to figure out how to put Ralph in there.
We have Paul rocks, K rolls her eyes. My husband suggested Ralph Ralphs. And I'm like, probably not that one.
[00:06:33] Ralph: Like I've never heard that before.
[00:06:36] K: Ralph Ralphs, right. Yeah, never, exactly. But I've got to think of a way with the logo, because we like the regular logo sticker. We're going to build in the QR code so we can do that as well. Paul and I keep going back to one of the ones we originally had, which was the cup of cappuccino that says good to the last mic drop.
So I think going back to the classic one for that is probably good.
[00:06:59] Ralph: Good to the last mic drop. Very nice.
[00:07:01] Paul: And let's not spoil everything, people will see when they find us in DC what we have in store for
[00:07:05] K: Well, that's what I was going to say. We're not going to tell you what we decide on, but feel free to give us your feedback and then you'll, you have to come find us in DC to get your stickers.
[00:07:14] Paul: Exactly. Do you also know who hasn't decided yet, who hasn't made up their mind?
[00:07:19] K: Who?
[00:07:20] Paul: The European Union.
[00:07:21] Ralph: Yeah, very good. What a
[00:07:23] Paul: they are selecting a new European Data Protection Supervisor. And there are four candidates,
[00:07:29] K: One's a woman.
[00:07:31] Paul: There is indeed a woman among them, Anna Pouliou, who used to be at Chanel and at Mars, and who is now also the chair of the Independent Data Protection Board at CERN.
So she is a candidate. Then we have François Pellegrini, who used to be a commissioner at the French Data Protection Authority, the CNIL. We have Wojciech Wiewiórowski, the current European Data Protection Supervisor. And we have Bruno Gencarelli, who was the head of unit first for data protection, then for international data protection, so for all the transfers, within the European Commission.
So, four candidates. A couple of weeks ago, mid-January, the candidates had to present themselves to the European Parliament and to the EU member states, and then there were votes,
[00:08:18] K: I thought we would see the results.
[00:08:20] Paul: Well, the EU member states voted for Wojciech Wiewiórowski in majority. It's like Eurovision, right? You hand out points.
[00:08:28] K: Okay.
[00:08:29] Paul: And Wojciech Wiewiórowski got the most points from the EU member states, with Gencarelli in second place, then Pellegrini, then Pouliou. In the European Parliament, in the Committee on Civil Liberties, the tally was different: there, Gencarelli got the most votes, then Pellegrini, then Wiewiórowski, and then Pouliou.
So I think it's safe to assume that Anna Pouliou won't make it as the new European Data Protection Supervisor.
[00:08:55] K: This time,
[00:08:56] Paul: but there is no clarity because the institutions do not agree.
The European Commission doesn't have a vote here. So they need to negotiate with each other to see whether it will be Wiewiórowski, whether it will be Gencarelli, or if it's the third dog walking away with the bone and
[00:09:18] K: We've seen that happen in selection processes where they can't decide on number one or number two, so they toss them out and they go with number three. So, but how do they resolve this, and do they have a time frame?
[00:09:31] Paul: Well, the time frame is, I think, the first of November last year.
[00:09:35] K: That's what I was thinking. We were already running behind, right?
[00:09:38] Ralph: yeah, definitely.
[00:09:40] Paul: We're already running behind. No, they need to negotiate and see. And in this case, it could be that the full parliament, or the presidents of the parliamentary groups, have a different opinion than the Civil Liberties Committee. That could be, so it would be moved up to more senior members of parliament.
It can be that the member states will escalate to the ministers for them to make a decision. But this is a closed-door process. This is all behind the scenes. Nothing has really leaked. It seems that
[00:10:16] K: leadership over here.
[00:10:18] Paul: Yeah, well, I mean, they just issue press release after executive order after non-story and fairy tales and whatsoever; everything is published straight away.
This is really closed-door. So we don't know. We also don't know when to expect the white
[00:10:37] K: or white smoke or anything? No.
Okay.
[00:10:40] Ralph: It's not the Vatican. No.
[00:10:41] Paul: no,
[00:10:44] Ralph: But I mean, it's fair to say that a couple of the appointments might attract a little bit of controversy. You know, I'm not going to say too much, but there are certainly a couple of objections going around, you know, about a couple of the candidates.
And so you know, whichever way they go, I think there'll be, um, there'll be conversations.
[00:11:02] K: I just say, I like how you said controversy, controversy,
[00:11:06] Paul: Yeah, I think we can say a little bit more without showing our own colors here too. Because whatever happens, Ralph and I may need to work with these people as well, right? So it's not like we are picking sides. But according to what I have seen in the news reports, at least some political groups more on the right side of the spectrum in the European Parliament have concerns about Wiewiórowski because of his very strict interpretation of data protection rules.
Especially relating to CSAM, the child sexual abuse material. So that is an issue on that side. For Gencarelli, some of the concern is that he is coming from the European Commission. So he's been working on all of the policies on data protection and on data transfers for the past couple of years, and now he needs to be the one implementing and overseeing those.
Which some say, well, how objective can you be?
[00:12:01] K: Yeah.
[00:12:03] Paul: so that's, that's the two sides, I think, of, of that discussion.
[00:12:07] K: And no idea of when they're going to negotiate or what, who, oh wow,
[00:12:14] Paul: Sooner rather than later, I would say,
[00:12:16] K: I would hope.
[00:12:17] Ralph: there's one thing we can put a date to, and that date is the first provisions of the EU AI Act.
[00:12:23] K: I love how y'all were doing these segues here. Y'all are brilliant.
[00:12:27] Paul: The 2nd of February 2025.
[00:12:29] Ralph: Indeed, and we are recording this on the 5th of February 2025.
So three days ago the first of several deadlines went live under the Act: the list of prohibited AI practices. That includes facial recognition databases built by scraping biometric data, techniques to manipulate behavior, social scoring, and emotion detection. And this is
even though we still don't really have the complementary guidance from the European Commission, which was due to accompany this first wave of AI Act provisions. So we are somewhat in the dark, even though the law has gone live. I think that's fair to say.
[00:13:21] K: Right. And I know that one of the provisions of the EU AI Act that we're looking at, or that concerns me, is the sentiment one, the detecting of emotions, because it is very popular to have that in call centers. Whether it's detecting the emotion of the person calling in, which I say no to, or detecting the emotions in the translated language of the call center person sending things out, which I see more often than not is the focus: making sure that your call center, which may be located in another country, India, the Philippines, Chile, Costa Rica, wherever.
I think it's Costa Rica where a lot of the call centers that specialize in that for U.S. companies are located. And they use it to detect the translated cultural emotion of their call center people sending out emails or phone calls or whatever, just to make sure that what they're saying comes across right.
So if they're outlawing this, is that going to impact all of these companies that have call centers in some of these, you know, offshore
[00:14:32] Paul: Yeah, I would say it does. Because that is a form of emotion recognition. Some of them call it sentiment analysis and say, well, that's not the same as emotion recognition. I beg to differ.
[00:14:46] K: Right. It seems like that is an emotion, although the call centers don't usually use it for that. But still, it's the same thing, right? You're telling them, oh, well, you sound angry because you're using short, blunt sentences with monosyllabic words.
[00:15:01] Paul: exactly, yeah.
[00:15:02] Ralph: I'm already seeing a little bit of pushback. I mean, just from a post I've read on LinkedIn, there's already a number of professionals, let's just call them true believers in AI, who are saying that the AI Act will stifle innovation and will leave the EU millions and millions of pounds in production behind other countries who are more aggressively adopting it.
And it seems to be this race for AI implementation. But, you know, I think the thing that always worries me, and I'm actually a big supporter of the AI Act, is that sometimes all technology really does is change the scale of things, you know, for great benefit and great harm.
And quite often when we jump into early adoption, we often ask can we rather than should we, and cause the great harms as well as the great benefits. So I do think the AI Act from the EU, taking a phased approach, concentrating on risk, has probably taken a good third way in regulating, leaving space and time. But I guess, like all these things, time will tell, right?
[00:16:08] Paul: And let's not forget the other important provision, right, that also went into effect, which is AI literacy, which is also mandatory as of this week. Which means that we all need to start training our teams: everybody who gets in touch with AI, who is working with AI, whether it is a Microsoft Copilot or a Google Gemini in the office software,
or a chatbot in customer service, or AI-enhanced imaging in the marketing department. Everybody needs to be trained. Everybody needs to be made aware of how it works, what is happening behind the scenes, and what the requirements are that are coming, also in terms of transparency and fairness and bias.
[00:16:52] Ralph: I'll tell you who needs training in Copilot literacy. Copilot.
[00:16:59] K: Copilot needs training in Copilot AI. Yeah,
[00:17:02] Paul: Yeah, Copilot also needs training, and the same for Gemini and OpenAI. I mean,
[00:17:07] Ralph: It just appeared in all of my Microsoft Office suite this week. And, you know, me being slightly cautious, I decided to try and find out how to turn it off. I couldn't find the setting, so I actually asked Copilot. I said, Copilot, how would you turn off Copilot? And it said, go to settings, go down to the options, click turn off AI. Which I promptly did, following its instructions.
And guess what? There was no setting to turn it off where it said there was.
[00:17:29] K: Copilot is lying to you.
[00:17:32] Paul: A hallucinated setting.
[00:17:33] Ralph: It was a hallucinated setting. So, yeah, I found that really amusing: if you ask Copilot how to turn it off, it takes you to an imaginary setting that isn't there. It just shows you one of those confirmation biases that we see in AI.
You know, it never disagrees with you, right? It just tries to answer the question and assumes that you're right. So, you know, it's really incredible just to see the limitations of AI right there in front of you, even with something as simple as how to turn it off.
[00:18:03] K: Right. Well, when it comes to AI in general, in the law school class that I teach, it's interesting: the law school and the university have approved using AI for students writing papers. Anything they get from generative AI, they have to quote, they have to cite, and they should do this, that, and the other thing.
However, the judges in the legal community in Phoenix, and I was just there when they launched the class, the judges are not happy that law students are using AI. And, you know, the school is saying, well, 70% of lawyers are using AI now; it's going to be a hundred percent by the time you graduate.
And the judges are going, yeah, but we don't allow it. We don't allow them to submit briefs using AI or anything. And I'm sitting there going, this
[00:18:49] Ralph: But haven't we seen in the U.S. the copyright laws saying you can't copyright something produced exclusively by AI, but you can copyright something that AI has contributed to? And that's really interesting, because, as we all know, I'm a little bit of a geek and a gamer anyway, and there's an awards ceremony called the ENnies, which is for people who produce games and that kind of stuff. And the ENnies have just come out and said, no AI whatsoever.
Like, if your publication has got AI-generated artwork in it, AI-generated text, anything like that, it won't be able to be submitted for an award, because we want to support creators.
[00:19:32] K: Right. Well, that's interesting.
[00:19:35] Paul: It's an interesting approach, but I'm not sure it's a sensible one,
[00:19:41] K: Right.
[00:19:42] Paul: because I have my concerns about AI. I'm not an AI evangelist. I'm not the one that says, oh yeah, use it no matter what, go crazy. But it does have its benefits as well. It does have its added value as well.
So we need to talk about how to use it, and with what boundaries to use it. And in that respect, I actually applaud your law faculty for saying students should use it, so that we can train people to use it in a proper manner, instead of just being the three monkeys, see no evil, hear no evil, speak no evil, and ignoring the fact that this actually exists.
[00:20:22] Ralph: And that's human nature, right? Human nature is, we often look at stuff and we'll take simple and wrong over complex and right. The problem with AI is it looks like magic, so people will accept it. And actually my version of AI is what I call absent intellect. You know, what we have to do is realize that this is a tool, and that we have to look at the results with skepticism and critical thinking.
And actually what people do, because we're human beings, is see the results and accept them at face value, because we want to reduce our amount of thinking. As humans, we want to reduce our amount of conscious thought, and we see AI as a tool to do that, rather than something that requires extra thought.
[00:21:03] Paul: And for me, it really is a tool. I mean, I'm using it to support contract reviews, because they take up a lot of time. And if I ask AI to do a first check for me on my top 10 points of what to look out for in a contract, and have it scan that, then I already know: oh, is this one where I need to set aside half a day, or is this one I can do in 30 minutes?
Even for planning purposes, that is a benefit of having AI compare the vendor contract against my template or my wish list. At the same time, you do need to be careful. I checked an NDA earlier in the week, and one of the things it needs to check is jurisdiction and applicable law.
And the AI concluded that the vendor NDA had no references to applicable law and to jurisdiction, and that those needed to be the Netherlands and Dutch law. So I thought, okay, that's a point that I need to feed back to the business. Then I went to read the NDA, and it was perfectly there: we submit to the courts of Amsterdam under Dutch law. So it was there
[00:22:10] K: Right. But it didn't pick it up in the context.
[00:22:13] Paul: No, it didn't pick it up. So it is certainly not perfect. Certainly not yet.
[00:22:17] K: Well, and I think one of the things that people need to be aware of is you can't just go to your company and say, no use of generative AI, because people are going to use it. And we have an AI policy that the privacy office owns, but of course the business has to implement it, security, IT, whatever.
So we've put in processes with AI councils and technology governance and all this stuff. Still, telling people you can't use generative AI means nothing. They're going to use it. You have to give them direction. You have to make sure they understand what's allowed, what's not allowed, and the why behind it. You know, you can't use an external generative AI for writing your emails, because what happens if you're making an offer for a new executive, or it's your business support plans? You don't want to accidentally share that out there. There's a solution: get a generative AI that is an enterprise version, stays within your company, and does not go back to the mothership.
There are solutions; they probably cost more, but they are there. Privacy is a rich person's luxury, right? So that is one of the things to take into mind. And I see companies taking very broad approaches to this: you can use it for this, you can use it for that, you can't use it for this, you can't use it for that.
It's all over the board.
[00:23:35] Ralph: That's a very
[00:23:36] K: There's no consistent approach.
[00:23:37] Ralph: a very U.S. approach, that privacy is a rich person's luxury. Over here in the EU and in the UK, it should be a fundamental human right.
[00:23:46] K: It should be. But if, if y'all choose not to submit to Meta's advertising, y'all have to pay for it.
[00:23:53] Paul: Well,
[00:23:54] K: Which is now being looked at by the courts, right?
[00:23:57] Paul: That won't stand in the long run.
[00:23:59] Ralph: Well, that's interesting, because again, over here in the UK, we've actually had the ICO release their own guidance on the pay or okay model. And theirs was a lot more positive than the EU's, it's got to be said, right?
[00:24:11] Paul: They're what's called more
[00:24:14] Ralph: Business friendly. Yeah.
[00:24:15] K: They usually are and they're very pragmatic when it comes to the US. Let's just be
[00:24:19] Ralph: Yeah,
[00:24:20] Paul: Well, the question is for how long that will remain this way, right? I mean, it looks like the US is doing everything it can to antagonize everybody in the world including its own citizens. So maybe in the not too distant future also the UK will not be that US friendly anymore.
[00:24:38] Ralph: We will see, won't we? And I'm really, really pleased that K brought up her class, actually, because I wanted to point out a big, big thank you to her class. I had the privilege of speaking to them late at night last week, actually, you know, just highlighting the top 10 differences between U.S. and E.U. data protection law.
[00:25:03] K: Fabulous, fabulous presentation, and they loved it,
[00:25:07] Ralph: Yeah, well, I was gonna say how much I appreciated them, the fact that they reached out, and the insightful commentary they had. You know, it's like working with the apprentices here: it really instilled in me this sort of hope from the grasp of the issues that young people have. And, you know, now that we're dinosaurs, we can actually
focus our time on, you know, making sure that the new data protection professionals of tomorrow don't make the same mistakes we did, right?
[00:25:35] Paul: So Ralph, what's the one thing that you learned? You said it was a conversation, right? It was not just you talking to them.
[00:25:44] Ralph: Yeah, it was two ways. They were very insightful, actually, in sort of asking again about balance, you know, once I sort of lifted the scales from their eyes about what we see as the difference between privacy and data protection. Certainly the one thing I don't think they'd really considered is what they would consider public information,
which we would consider still subject to the law over here in the EU. And so they were talking about, well, how practically do you control it? If someone just wants to take it and use it, what can you really do to stop them? If somebody acts out of jurisdiction, what can you really do?
And I think that sort of pragmatic point of view really came across quite strongly, and not really with a sense of hopelessness, but with a sense of, well, what can we do? You know, how can we change what we do to make things better? And that was the sort of hopeful note that I found was really positive.
[00:26:43] K: And what they all hate is the transparency, the privacy notices, the principles that we go by. That is something that has come up, I'd say, in almost every class. We're on class three, but it's come up in all of them, because, you know, they hate them. They don't want to read them
when downloading an app. But on the other hand, we want to tell them what we're doing. And how do you find out what a company's doing with your data? So it's interesting seeing them bounce back and forth trying to figure this out. And I'm like, yeah, welcome to our world.
[00:27:15] Ralph: I'm sure we've done it in the past, but it would be interesting to do a podcast episode on tools and techniques to improve transparency notices and fair processing statements. I mean, your icons, videos, layered approaches, you know, it's just an area where we could do so much better in as a profession.
[00:27:37] K: Absolutely. And we do have a habit of doing that, as our listeners know. We don't try to do a guest every week. We don't try to do a week in privacy every week. Although we can't forget the request that when it's not a week in privacy, we spend five minutes giving the top news items, in case they missed them, so they can go catch up.
Which, by the way, we tend to pull from the TrustArc Nymity privacy updates that they put out every week, as well as the IAPP digest for the different regions. So there are resources out there you can go to, but we try to at least cover those. But we also try to move in these topical episodes.
We've done vendors, we've done children, different things like that. So I think technical tips and tricks for notices would be a great one.
[00:28:25] Paul: Okay, apart from the chaos in the, in the White House, what else is going on in the U. S.? Any new state laws that are on the horizon suddenly? Any enforcement action that we may have missed?
[00:28:38] K: Well, not really anything sexy but one thing that we have not talked about are the different laws that went live here in 2025.
[00:28:48] Paul: we briefly mentioned them, but that was about it.
[00:28:50] K: Yep, and that was it. So they went live, so companies need to pay attention to that. I think it's the Oregon one we mentioned last year, but you still need to pay attention, because the Oregon regulators, when they're presenting at conferences and such, have stated that they deliberately put specific things in their state law that were different from other states, and that's what they intend to enforce on. So you need to go look at Oregon's in particular and pay attention. Although I am starting to see where companies, as always, are interpreting the laws differently. Bermuda actually came up in a professional discussion I saw, where companies are interpreting what Bermuda requires differently.
And I almost wanted to go to the Bermuda regulator and say, what are you seeing, David? What has come out, you know? But that's one that I'm seeing a lot of different interpretations on, Bermuda, in the chats with other privacy professionals. But I would say enforcement-wise for the U.S., nothing really sexy. I think most people are responding to what we're seeing on a federal basis. In case y'all didn't catch it, COPPA 2.0 is out. They did not raise the age, but they're tightening that, and HIPAA is strengthening the security rules. But we're not seeing a lot of enforcement, of course, coming out from the FTC, the FCC, OCR or HHS.
[00:30:22] Paul: I think many of them are still wondering whether they have staff available to do the enforcement. What I did see is that New York is working on a
[00:30:30] K: yes,
[00:30:31] Paul: new Health Information Privacy Act that's currently on the Governor's desk.
[00:30:38] K: Yeah, HIPA, which is another way of saying HIPAA. So this is just going to make our lives a lot easier with the acronyms, as we have the New York HIPA and a HIPAA.
[00:30:50] Paul: Always nice.
[00:30:52] K: I got nothing, people. I got nothing. The acronyms that people choose are very interesting for how they want to present that. But yeah, that is one to pay attention to, because New York is being active in regulation. There's the New York Department of Financial Services that tightened their cybersecurity requirements.
That deadline is coming up for companies to turn in their certifications, signed by their CISO and their CEO, of the protections that they have in place. And of course, anytime you make a CEO sign something, certify something is in place, companies are going to pay attention. And of course there's the SEC cybersecurity rule, which has already been in place, but we're starting to see a lot more news stories around the cybersecurity incidents that are being reported publicly under the SEC requirements.
So that's an interesting one too. But otherwise, I'm ready for people to dig in and do some things, right? I mean, we're seeing Texas be active, but that's really about it: Texas and California, Oregon a little bit, but you're not really seeing anything sexy. Sorry.
[00:31:52] Paul: Okay. Well, if you're doing business in Africa, then you may want to keep an eye on Cameroon,
because they have recently adopted their first data protection law that will apply as of June 24, 2026, so you have about 18 months to be compliant with that law. It's very GDPR-like in purpose and scope, including all the individual rights, with maximum fines up to just over 150,000 U.S. dollars.
[00:32:22] K: I was looking to see
[00:32:23] Paul: look out for.
[00:32:24] K: back on the Nairobi lawsuit yet, but that might take a little while to work through. Yep.
[00:32:31] Ralph: I'll take one.
[00:32:32] Paul: there yet.
[00:32:34] Ralph: From the UK, not too much to report at the moment. Like I said, we had the ICO opinion on pay-or-okay, which is interesting. The Data Use and Access Bill is still working its way through. It's gone through the House of Lords now, and is moving on to the House of Commons.
Not much has changed, to be honest. They're still planning on abolishing the Information Commissioner and having an Information Commission. One thing that has slid in, though, is an extension to the soft opt-in rule under ePrivacy. So as you might be aware, ePrivacy currently states that everything business-to-consumer is opt-in, unless obtained in a sale or negotiations for a sale, which means commercial organizations can turn the opt-in into an opt-out if the person has bought something or made an inquiry about something.
Now that's going to be extended, according to our Data Use and Access Bill at the moment, to charities as well, because at the moment, because charities aren't commercial, they don't have the ability to say they've collected the data in the context of a sale. So extending that to charities will mean charities being able to take advantage of opt-out rather than opt-in.
My personal opinion is that charities are aggressive enough with their sales techniques as it is. But they appear to see them as a special case and want to lessen the rules for them. So that should be an interesting development if it goes through into law.
[00:34:03] K: Well, here in the States, some of the newer state laws took after Colorado, and nonprofits are not exempt from the privacy laws. And most people say, well, they're exempt from California. Not technically; the language is around "for commercial purposes."
They're exempt if there's no commercial benefit to the members of the nonprofit. But think of things like the California Medical Board or an attorneys' bar association: they're nonprofits, but the members get a commercial benefit out of it. So not all of them are exempt, but you don't see any enforcement, so I don't think that's high on the radar. But some of these other states, Colorado specifically, did not exempt nonprofits, and they said, well, of course, the mom-and-pops that are nonprofits would be exempt.
We're not going to go after them. We're going after the big United Ways and Red Crosses that have a ton of data and a ton of money, and we're going to make them follow the privacy laws. But the other thing, and this is not a smooth segue at all, I need to learn from y'all: Canada. If no one had Law 25, which is Quebec's Bill 64, on their radar, y'all do know they passed that, right?
And there are laws there. And then the one news item I wanted to share was Brazil. They halted the collection of iris scans for the cryptocurrency rewards. And so that was pretty cool.
[00:35:25] Paul: The Worldcoin issue, right? The Sam Altman coin that was also blocked by other data protection authorities,
[00:35:31] K: Yep.
[00:35:32] Paul: including in Kenya.
[00:35:33] K: Yep. So it's really cool to see that Brazil is a lot more active than people give it credit for. It doesn't make enforcement headlines a lot, but it has some pretty cool things going on down there. I think last year they had WhatsApp blocked, which is really big for Latin America because they heavily use WhatsApp, but they had it blocked because of the violation of their privacy law.
So do pay attention to some of these. That's
[00:35:56] Paul: Yeah. Maybe to wrap up, also something that most people may have missed: the data protection authority in Sweden published a report to the government on integrity, new technology and data protection, all of that as part of their Data Protection Day efforts. But in the press release, they also state that, according to the Swedish data protection authority,
it is now time to start renegotiating the GDPR. The English translation actually says it needs to be reviewed. But if you look at what they actually say: technology has advanced so much, there have been so many new developments, and the GDPR is hard to apply for a lot of, especially smaller, companies. So we need to at least adjust the law, update the law.
And that is something that most countries and most data protection authorities so far have pushed back upon.
[00:36:51] K: Yeah,
[00:36:52] Paul: because it's like opening up a can of worms, obviously.
[00:36:55] K: it is. And I think we're going to add a piece into last week's recording. So if we do that, we'll take this part out. But one of the things we didn't mention with Wayne Unger is that we have a privacy casebook coming out on privacy law. And by West Publishing, I think it's going to come out on March 15th.
It is intended to be used in the academic world. One of the interesting things is, you know, everybody's like, well, you're going to be outdated as soon as you're published. Well, yeah, because technology and enforcement and everything changes. But the thing is, the decisions in these cases are the final decisions.
They're either the Supreme Court or a high court or whatever; they're the final cases. Some of the considerations we talk about may actually be a little outdated or have some opinions on specific countries. But if you're an academic or looking for that, do pay attention to that as well. Wayne is absolutely brilliant. Very hungry. Going to watch his privacy professor career take off there. I'm just me. I'm not taking off from anything. I mean,
[00:37:56] Paul: well,
[00:37:56] K: yeah,
[00:37:57] Paul: like a good conclusion.
[00:37:59] K: yeah.