Serious Privacy
The PICCASO award-winning podcast for those who are interested in the hottest field of human rights and law on the digital frontier. Whether you are a professional who wants to learn more about privacy, data protection, or cyber law, or someone who just finds this fascinating, we have topics for you, from data management to cybersecurity, from social justice to data ethics and AI. In-depth information on serious privacy topics.
This podcast, hosted by Dr. K Royal, Paul Breitbarth and Ralph O'Brien, features open, unscripted discussions with global privacy professionals (those kitchen table or back porch conversations) where you hear the opinions and thoughts of those who are on the front lines working on the newest issues in handling personal data. Real information on your schedule - because the world needs serious privacy.
Follow us on BlueSky (@seriousprivacy.eu) or LinkedIn
Serious Privacy
If it ain't California, it's Texas
In this episode of Serious Privacy, Ralph O'Brien and Dr. K Royal discuss the weekly news, including the Google settlement in Texas, Clearview AI, and much more.
If you have comments or questions, find us on LinkedIn and Instagram @seriousprivacy, and on BlueSky under @seriousprivacy.eu, @europaulb.seriousprivacy.eu, @heartofprivacy.bsky.app and @igrobrien.seriousprivacy.eu, and email podcast@seriousprivacy.eu. Rate and Review us!
From Season 6, our episodes are edited by Fey O'Brien. Our intro and exit music is Channel Intro 24 by Sascha Ende, licensed under CC BY 4.0, with the voiceover by Tim Foley.
You're listening to Serious Privacy, powered by TrustArc. Here are your hosts, Paul Breitbarth, Ralph O'Brien, and Dr. K Royal.
SPEAKER_03:Well, welcome to another week in privacy, and this one's probably going to be a little bit of a quick one, because not only do we like to keep these short to make sure that you're available to walk the dog or do whatever it is while you listen to our podcast, but also because you catch us packing our bags, ready to move on to a busy week next week. So we should be able to get plenty of content out on what's going on next week. But in the meantime, I'm afraid that our good friend Paul Breitbarth is not with us this week. So it's K Royal and myself as we go into a week in privacy. My name is Ralph O'Brien.
SPEAKER_01:And I'm K Royal, and welcome to Serious Privacy. I think Ralph's a bit optimistic in saying this is going to be a shorter than usual episode. I doubt we've ever hit less than 30 minutes, but we do try to hit right at 30 minutes. Let's see. Unexpected question. I'm being careful on this one because I'm trying to make sure I haven't asked these questions before. Let's go with this one. What do you buy way more of than most people do?
SPEAKER_03:Wow. Now there's there's an honest answer and there's a not honest answer.
SPEAKER_01:Honest answers only. Honest answers only. Although, wait a minute. If you're going to talk about your personal sex life, I don't want to know what you spend money on.
SPEAKER_03:Okay. Well, what do they say in the US? I plead the Fifth? No, no. Well, probably my hobbies. Probably my hobbies, to be fair. There's a saying in the gaming community that I belong to: they call the little plastic miniatures "plastic crack" because it's fairly addictive.
SPEAKER_01:I have a friend that does a Discord channel on painting those.
SPEAKER_03:Yes. Yeah. I probably was, too.
SPEAKER_01:Yeah, probably have. Oh, that is hilarious. I'm gonna say that most people here could probably guess what I spend more money on, because I almost got out the ones that I'm customizing to show you. Yes, shoes. I spend way more money on shoes. But to be fair, and I will defend this to my dying day, it's not that I go out and buy Christian Louboutins or whatever the heck those shoes are. I don't like spending a lot of money on any particular shoe because, one, I've never noticed that there is any noticeable difference in comfort or lasting, because with over 200 pairs of shoes, not counting boots and sandals and flip-flops, I don't wear most of them more than once a year. So I don't need them to last for everyday wear for 10 years, right? But I will say there are some that I'm starting to like a little better than the others, some particular designers, but only because of their quirky designs, like Betsey Johnson. I love her little quirky shoes. I love Charles by Charles David. They had a wonderful pair of shoes I loved before. And then a new one I've discovered is Azalea Wang. I just love the quirky style. Of course, as you can imagine, they do lots of flash and beading and sparkling and weird stuff. So yeah, that's what I like.
SPEAKER_03:I mean, I love shopping for unusual shoes too. I quite often go down to Camden Town, and I found a shoe supplier there that I believe I sent across to you guys, because they were just unusual. Yes, and I love those.
SPEAKER_01:I love them, love them, love them. Okay, so week in privacy. Let's start with perhaps some of the biggest news we've heard that most people have tuned in on, which is the Google fine by Texas. I guess it's not a fine if it's a settlement, right?
SPEAKER_03:Don't mess with Texas, right?
SPEAKER_01:Yes. And the largest one before this was by Arizona, for a single state. I know that a coalition of states came together to go after Google, and there was a big settlement there, but Texas is the biggest settlement, I believe, against Google by a single state. Now, I didn't actually go read the full settlement document, which I should have, but one of the points that really gets me is that Arizona had a specific provision in their settlement where Google had to pay, I think it was $5 million, to an Arizona law school in order to set up privacy and data protection education for the attorney general's office as well as judges. And I'm part of the Center for Law, Science and Innovation there. I teach privacy law there. So we were very excited that I would be able to spin up a fabulous program that we'd been looking forward to. And the governor at the time gave the money, I don't know, to some law school out in Virginia. It might have been the old attorney general who allocated it to a law school in Virginia. And the attorney general after that went, nope, nope, we're pulling that money back to Arizona. You're not allowed to do that. And then the new governor took the money. It was a bunch of settlements and fines like this, across a wide variety of activities. It went to the Arizona Supreme Court, and they said the governor was allowed to do that because essentially the money would be used for the purpose for which it was dictated. Not true. Maybe on the one particular big one they were discussing, but the privacy education money was very specifically to go to an Arizona law school, and the governor did not give the money to an Arizona law school. So to me, the terms of the settlement were broken, but the only one that was going to fight it would be Google, and they don't give a damn, right?
SPEAKER_03:That's a real shame. I mean, we were talking with Paul last week about one of the settlements over here that was to allocate a certain amount of funds into data protection education and awareness, from the Dutch Lawn Tennis Association. So I'm a huge fan of creative enforcement and funneling that sort of penalty into data protection education. But yeah, it's got to get there, as you say, right? It's got to get there. I think Attorney General Paxton has got the bit between his teeth. I mean, it's not just the 1.375 billion from Google; that follows 700 million and 8 million for anti-competitive practices, and 1.4 billion from Meta for biometric data collection. It's a huge amount of money added up, isn't it?
SPEAKER_01:Oh, yeah, absolutely. I mean, Texas is a big state. I guess they need big money. So, you know, that's one thing to take into account. But I just love seeing them continuing to push forward with these penalties and these fines under state law, even though it's a variety of state laws that they're doing it under. I love seeing their activity with this. I almost reached out to Ken Paxton, like, can I carry your briefcase? Let's just say, dude, you're quickly becoming one of my heroes. Although, frankly, I don't know how big of a personal hand he has in this. But he is the attorney general, so he has to approve all activities that his staff is undertaking. And I would think something like this would definitely have his personal handiwork on it, wouldn't you?
SPEAKER_03:Yeah, I mean, unusually for someone in Europe, you know, we can look down our noses at the US occasionally and go, oh, we've got the GDPR, we've got all these data protection laws. But wow, when an attorney general really goes after somebody. I mean, you've got some powers to settle. I mean, 1.375 billion, 1.4 billion, these are the sorts of fines and penalties that were almost unheard of. These are big. And I saw that Tom Kemp this week launched a CCPA data subject access request tool on his website.
SPEAKER_01:I was going to tell him I think it's time for him to come back on the podcast. Let's talk a little bit about, you know, what he's doing at the agency and the information that they're pushing through. But also under the CCPA, the California AG got a $530,000 settlement with Sling TV. I mean, California and Texas, right? If it ain't Texas, it's California.
SPEAKER_03:Yeah, well, they do seem to be the two biggest names in state privacy enforcement as far as the U.S. is concerned. I mean, I know they're the two biggest states, I guess, in terms of income, but I'm surprised we don't see more out of places like New York, actually. But they've struggled to get their law through.
SPEAKER_01:Oh, they're in the news a lot anyway for what they're doing. Their focus ain't privacy. It might be another P word, right? But it ain't privacy. There's a lot going on there. Well, since we're over in the U.S. and talking about New York, there is a voice actors' lawsuit against Lovo, after the court ruled that AI-generated voice replicas can violate state personality rights. Voice cloning is not just an ethical frontier; they're saying it's now entering the legal realm. But we all knew that while they were arguing about it ethically, somebody was going to have to find a legal underpinning to base it on. So I thought that was really interesting, because I have a lot of friends who are personalities in the media. And they were posting that they were being contacted by AI companies saying, can we use your voice, record these 30 minutes' worth of reading for us, and then we'll pay you a few pennies for doing this? They're like, no, this is my living. This isn't just my voice. This is my living. And any of you out there who are not voice actors or personalities with recognizable voices, it's true for you too. Why would you want to sign up to be an AI voice model when they're really just capitalizing on your voice and not paying you anything for it?
SPEAKER_03:No, I mean, we've seen this happen, especially with deceased actors recently, with people in the world of Star Wars and things like that. But as I understand it, James Earl Jones, instantly recognizable as the voice of Darth Vader, wasn't paid one cent for the last time the Darth Vader voice was used in Star Wars, because they were able to generate his voice from previous samples.
SPEAKER_01:And that's just horrible, right? You know he's from Mississippi, just saying. I have to throw that in. Most people don't think Mississippi has anything good going for it. It absolutely does. Very strong in the arts. Oprah's from there, the king of rock's from there, the king of country's from there. I mean, you know, just keep going. But okay, I'm getting off topic. Squirrel! What do you have going on? Those are some of the bigger ones here in the U.S. I'm sure there are some others to go through, but you've got a couple of notable ones coming out of Europe.
SPEAKER_03:Yeah, we do. Before we do that, I just want to follow up on the AI story, because we actually had a court case come through today involving Getty Images and a company called Stability AI, in the world of AI. Now, as you know, Seattle-based Getty Images has accused Stability AI of infringing copyright and trademark by scraping millions of images from its website to train Stable Diffusion, the popular image generator. This went all the way to the British High Court, and it's really the first of a wave of lawsuits involving generative AI, with movie studios, authors and artists, exactly as you were saying, suing over the use of their work to train AI models. And what really struck me about this case is it didn't go the way I would have expected, because both sides kind of claimed victory. Yeah, of course.
SPEAKER_01:That's the way they spin it, right?
SPEAKER_03:Yeah, I mean, Getty won the argument that Stability had infringed its trademark, but lost the rest of the case, because the courts, when you read into the judgment, started talking about what was actually in the AI model and stored in the AI model. And they ruled that Stable Diffusion's AI doesn't infringe copyright because it doesn't store or reproduce the copyrighted works inside the model. It kind of scans them, builds a mathematical representation of them, and then uses those mathematical representations to produce new materials. What was really interesting about the court case is it said, well, they haven't stolen it, because they're only using these mathematical models, which are their own creation. They haven't sucked the actual data into their databases or into their platform. It's just a mathematical representation.
SPEAKER_01:Okay, that's just wrong. I'm sorry, it may be right, but it's wrong.
SPEAKER_03:That's kind of where I am. And you know, it's a really difficult one to talk about, because I think that a mathematical representation of the data is the data, right? You know? Yeah. But that's not how the court saw it. That's not how the court saw it. So it'll be interesting to see whether that's used as a precedent for all of these other court cases going forward. I mean, the decision sort of leaves the UK without a meaningful verdict on the lawfulness of AI models. So again, it's one of these weird wishy-washy court cases that doesn't really give us a direction to go.
SPEAKER_01:Damn those judges, what is wrong with them?
SPEAKER_03:Yeah, why can't they just say it's illegal copyright theft and be done with it? Right.
SPEAKER_01:I mean, why can't we get some clear guidance on something that is absolutely clear, and not so limited by very specific details that you can't extrapolate it to any other decision you need to make as a company to know whether what you're doing is legal or not?
SPEAKER_03:Or am I just saying sour grapes because it didn't go the way I wanted it to? I don't know.
SPEAKER_01:Well, it could be that too, but still.
SPEAKER_03:The bigger news, however, is that we've got an opinion from the European Data Protection Board today on the draft adequacy decision for Brazil. So you remember a while back that the European Commission brought out an adequacy draft to say that they wanted Brazil to be adequate, looked at the law, and said they kind of think it is. Well, now we've got the European Data Protection Board's analysis and opinion on it. Broadly, they agree. Broadly, yeah. I mean, broadly they say it's adequate, but they do have some questions, mainly around international transfers, public bodies, redress and sanctions. So I think they're saying that on paper the law looks good, but obviously you still need an enforcement body that is actually taking action to enforce the law. And on that, and on international transfers, especially the potential for onward transfers from Brazil, they still have some questions. I mean, it's very politely worded, but it essentially says, you know, these are areas that require a closer look, and we're not quite happy with them. So whilst they broadly agree, there are still some concerns around onward transfers, international transfers in general, and of course the ability of Brazil to carry out enforcement action. I don't think that's any surprise, but I don't think the European Data Protection Board's opinion will in any way stop the adequacy decision going through, is my understanding. Right. That being said, we are recording this in November, and it's worth people remembering that the UK's own adequacy expires in December as well. So we should expect to see some news on that pretty shortly too.
SPEAKER_01:Yeah.
SPEAKER_03:Other big news from Europe: you remember a couple of weeks ago the Latombe judgment, which was basically touted as Schrems III, trying to get the EU-US data transfer arrangement declared illegal. Well, Latombe lost, as we know, but he's going to seek to overturn the court's assessment. So there's basically an appeal there. That's got to be interesting. That's not dead yet. That's going to go up another level.
SPEAKER_01:Never will be. Never will be. Always.
SPEAKER_03:I mean, if it was operating okay, I probably wouldn't have an issue. But, you know, given some of the issues with the Privacy and Civil Liberties Oversight Board and its makeup, and what Trump has done since coming to power with the people who are on the board, there are some questions about whether it's even functioning at the moment.
SPEAKER_01:Well, and it's not whether or not it's even functioning, it's whether or not it's even fixable. I mean, with the current administration, I don't see them trying to make the thingy legit. If Europe points out several issues, saying you need to fix this, I don't really see the current administration prioritizing that. So the only thing they could do would be kill it and then hope that they can fix it in three years. And that's not a way out.
SPEAKER_03:They're too worried about ballrooms and bathrooms, right?
SPEAKER_01:Right. Exactly. I'm not getting into that. I'm talking about the administration in general. And everybody knows I was a little disappointed that we didn't see a lot more data privacy and protection movement under Kamala. But we didn't. I mean, let's be honest. We thought it was going to come in and be a rock and roll show. Looking back on it, there were some significant things done, absolutely. But I would be shocked if we saw significant movement in the direction we would like to see under the current administration. So yeah, I'm with you on that one. So, in other words, why bother even fixing it? I mean, if you kill it, there's no putting something additional in place that's going to bridge the gap this time. Under all the others, there was always the possibility that we would work on it and put in some, you know, revisions that would make everything a little bit more palatable. And yes, I'm hedging myself with lots of adjectives here, but there was always that hope. Now, if Schrems III happens and it kills the thingy, you've got nowhere to go, nowhere to run to, nowhere to hide.
SPEAKER_03:Yeah, and I think that's, you know, one of the reasons why Latombe was unsuccessful. You know, they have tried their best, but what's the alternative? And maybe there isn't one. Maybe all data transfer ceases between the EU and the US because the laws are just too different. I don't see that happening, by the way.
SPEAKER_01:I mean, that's a shutdown of pretty much global commerce, right? But on the other hand, a lot fewer companies signed on to the thingy than had historically been under the Safe Harbor or, what was the second one? Shield, yeah, that's right. The Safe Harbor and the Privacy Shield. And incrementally you're seeing fewer and fewer companies signing up for those.
SPEAKER_03:Yeah, yeah, I think it's fair to say that everybody is now going back to SCCs and TIAs and things like that as well.
SPEAKER_01:Which, the new SCCs aren't as horrendous as the old SCCs were. So there's that.
SPEAKER_03:Yeah, there is that. But I mean, considering the concerns with the US, I think, you know, if I was doing a transfer impact assessment on transfers to the US, given all of the media coverage and the concerns there have been, you'd have to fudge your TIA to make it in any way look like you can transfer the data.
SPEAKER_01:Well, but here's the thing. If you're transferring data to the US and you're doing a TIA, it depends on what kind of data you're working with. Just put some standard contractual clauses in place and stop that nonsense.
SPEAKER_03:Well, I've always had a problem with the idea that two lawyers sign a document and then walk away whistling. What actual practical protection does it give to the data subject? I've always had a problem with that.
SPEAKER_01:Well, but the thingy doesn't either. Let's just be honest.
SPEAKER_03:Yeah. I mean, paper-based... Yeah.
SPEAKER_01:Interesting. Very interesting. You know, this could be a whole episode all on its own as to what actually needs to be in place. And I know what Paul's gonna say: rights guaranteed to people who are not US citizens.
SPEAKER_03:Yeah, I'm doing something similar, actually, in that I'm doing a bit of a roadshow next year. One of the places I'm going is the UAE, doing a couple of days of a privacy program management course, where I'm sitting down with senior managers and saying, right, you've got a choice. You've got tick-box, cover-your-bum, paperwork-based compliance that will cost you money and not do anything, what I'm gonna call the paperwork route. Or there's what I'm gonna call the more technical route, which is where we actually put in place proper data protection by design, embedded in apps and applications and software, that allows individuals to claim their rights because they're a global customer, let alone whether they're an EU or a US one. Put simply, who do you want to be and how do you want to operate? You know? Yeah. So I think that little roadshow is going to be fun. Going back to more news from Europe, talking of Schrems, our good friend Max Schrems. I mean, Latombe aside, noyb has filed a criminal complaint against Clearview AI and its managers. So this isn't to a regulator. We've already seen Clearview fined by the UK, the Dutch, the Italians, the Greeks, the French. Clearview, as we know, aren't fond of even accepting that they're under jurisdiction, so most of them they don't even reply to. Right. But put all together, that's about 100 million euros in penalties on Clearview. And all of those authorities have said that Clearview AI has acted illegally, including issuing bans. And Clearview is just ignoring EU authorities. Only in the UK did they even appeal the decision.
SPEAKER_01:A note to Ken Paxton.
SPEAKER_03:Yeah, exactly. But you know, what this really hinges on is the fact that EU data protection authorities haven't really come up with an effective way of enforcing territorial extent. They haven't really come up with a way of saying, hey, if you're covered by Article 3(2)(b), which is, you know, you're outside the EU but you're targeting products and services at, or monitoring the behaviour of, people inside it, then you fall under the law, you should follow the GDPR, we can hit you with a fine, and you should have an EU representative set up within the jurisdiction. Right. And actually Clearview AI is just saying, we're not gonna do that. We're not gonna have an EU representative, we don't believe we're covered by EU law, and if you send us letters from the EU saying you should follow our laws, we're just gonna laugh at you, because we're sitting in America covered by American law, and how dare you try and hit us with fines and penalties when we're not in your country and we have no legal entity established in your country. So this is a different approach. Max Schrems is basically saying, you know, you can run a cross-border criminal procedure for a stolen bike, so they're hoping the public prosecutor takes action when the personal data of billions of people is stolen, as has been confirmed by multiple authorities. So instead of looking for a regulatory penalty from a supervisory authority, they're actually filing a criminal complaint with public prosecutors in Austria. They're trying to get Clearview AI and its executives to face jail time and be personally liable if they ever travel to Europe at all. That would be under section 63 of the national Austrian Data Protection Act, which allows criminal sanctions. So this isn't an EU GDPR thing.
This is actually a national data protection act thing that does allow criminal sanctions for the unauthorised obtaining of personal data. We've got something very similar in the UK as well. So even though the GDPR doesn't provide for personal liability, in our own local Data Protection Act we've got criminal offences such as the unlawful obtaining of personal data without the consent of the controller. The sorts of people who normally get caught by that are the call centre worker who unlawfully accesses someone's record and looks up their family, or the police officer who looks up the daughter's boyfriend, or the car salesman who takes the database to their next employer. These people who are done for unlawful obtaining or unlawful selling of personal data can actually be held criminally liable by the regulator, and the ICO has taken a number of criminal actions against such individuals. So there is a very similar provision in Austria under section 63 of its national Data Protection Act for the unlawful obtaining of personal data. And noyb is going to use it to try to go after Clearview AI. Instead of a fine or a penalty on the organization itself, this is deliberately aimed at its managers.
SPEAKER_01:That's a bold move. I like it.
SPEAKER_03:Yeah, I mean, I quite like his quote: if we can do cross-border collaboration on stolen bikes, why can't we do it on stolen personal data? Yeah.
SPEAKER_01:Absolutely. Absolutely. So there's a few other things. I think we're coming to a close once we add in Ralph's items, but there are a couple of things in the October roundup. And by the way, y'all know we get these from newsletters from law firms, online stories, people's LinkedIn postings, TrustArc's Privacy Pulse, their monthly roundup and weekly posts, as well as the IAPP et al. So if you want to know where we get these from, there are our sources right there. You are our source. We pay attention to what you post too. So anyway, we had a few more things happen in October that I don't think we've actually mentioned, and we're not going to dive into any great detail here. Bangladesh has a Personal Data Protection Ordinance 2025; Gambia has a Personal Data Protection and Privacy Bill 2025, which is awaiting presidential approval, and I don't think I've heard that he's signed it yet; Algeria has amendments to its current data protection act; and Vietnam has completed the public consultation phase on its draft decree to enforce its personal data protection law. So there's movement there. Y'all may have heard, and I think we may have mentioned, that the Colorado Privacy Act was actually amended to protect minors' data online. So that one's old news. But here are some movements in California that we've all seen, and I think these all passed: the Digital Age Assurance Act, the privacy for health data, location and research, the California Opt Me Out Act, the account cancellation on social media platforms, the amendments regarding data brokers, data collection and deletion, the health and care facilities information sharing, and the data breach customer notification one. And the other thing I heard is that Tom Kemp is rolling out the phrase CalPrivacy rather than CCPA or CPPA or any other acronym. He says, we are now going to be known as CalPrivacy. Well, you know, someone's gonna shorten that to CalPri or something.
SPEAKER_03:CALPRI, yeah.
SPEAKER_01:Yeah, it's gonna be something. But they did, let's see, they got a Tractor Supply Company penalty. I think we talked about that. And then AI developments. I think we've mentioned a lot of these in passing, but maybe not in detail. Vietnam released a draft law on artificial intelligence. Namibia is currently drafting its AI law. Tanzania is preparing its national AI guidelines. In the United States, we've had, you know, California with the frontier AI and the companion chatbot interactions. And despite all that, I hate the chatbot litigation with the wiretapping. Oh yeah.
SPEAKER_03:Well, we're seeing a lot of older wiretapping laws being applied to AI, aren't we? And to be honest, it freaks me out. Wherever I go on the internet at the moment, I'm being offered some sort of AI companion or other.
SPEAKER_01:And you probably can't opt out of anything on your phone when you open up a website and it tells you, we do so-and-so. The only thing you can do is close it or accept it or leave. There's no closing, there's no modifying permissions. I mean, all of that just smacks of dark patterns. There is a consultation. I think the EU consultation will be closing by the time we post this episode; the consultation period on high-risk AI system providers will close November 7th. So if you didn't get it in, too late now. In other AI news, ChatGPT may be the first AI model regulated under the EU Digital Services Act. We talked about OpenAI there. And let's see, what else do we have? There were some other AI and child privacy acts passed over here on the East Coast that we did talk about, but I can't think of anything else large that is looming, unless something breaks today and I just had no idea it was coming.
SPEAKER_03:The only ones that I've spotted that I didn't hear on your list were Chile, which has an AI law coming in and is bringing it into their constitution as well, which is interesting. And I spotted that Sri Lanka has passed new data protection amendment laws as well.
SPEAKER_01:Oh, we love that, don't we? We love anything that's Sri Lanka. I named a cat Sri Lanka before. I have a habit of naming animals after where I found them. I had a cat named Waffles and a dog named 8th Street. I named a cat Sri Lanka, but I've never been there.
SPEAKER_03:Oh, I love Sri Lanka.
SPEAKER_01:That means we need to go to Sri Lanka.
SPEAKER_03:Sri Lanka was amazing. And I suppose the last thing we then need to talk about is your travels, K.
SPEAKER_01:Yeah. Next week I will be in London, and there's a lot happening in London at the time. There's the, what is it, the compliance risk symposium. I'm not going, but I've got friends that are going to be there. So Martin is going to be there; we're going to meet him for lunch. We have the PICCASO Awards, I'm so excited, on Tuesday night. That is a black tie affair. We've got custom bow ties. I'm hoping I get the custom shoes done.
SPEAKER_03:You're not wearing black tie. We're wearing Serious Privacy bow ties.
SPEAKER_01:That's right. Serious Privacy ties. Look at me saying privacy like a true Brit. And I'm customizing a pair of shoes. I'm not going all out with bling. I'm just doing some glitter and stuff. But yeah, anyway, I'm customizing. I'll probably be a little overdressed, because when I think black tie, I think pageant dress on stage, right? I mean, that's what I've got. But I'll go with the podcast colors. Whether we win or not, we're going to be a good-looking group. I'm excited about that. And then Privacy Space is happening on Thursday up in Leamington Spa. I'm very excited to be there. The podcast is going to be speaking, and I'll also be speaking on the conflicts of being the AI governance officer as well as the DPO, and how do those line up together? It's going to be fascinating, utterly fascinating. Got to figure out what I'm wearing for that.
SPEAKER_03:I look forward to hosting you. I look forward to having you here in the country, and I look forward to buying you a drink and introducing you to a few professionals who will no doubt worship the ground you walk on after all of that.
SPEAKER_01:Until they meet me and they're like, oh my God, this is really her. But my husband's coming with me. He's trying to get us to, you know, mark down an agenda. I'm like, we're just there. They're gonna have Christmas stuff up. We're gonna go see Christmas lights. We're gonna go to Hyde Park. We're gonna go back to Buckingham Palace, which I'm assuming is gonna be decorated for Christmas, and Kensington Palace. I don't know if I'm gonna make my way to Windsor or Oxford, but we're looking at, you know, what are we doing?
SPEAKER_03:So we're actually recording this, we don't often give out recording dates, but we're actually recording this on November the 5th, which is Guy Fawkes Night. Remember, remember, the 5th of November, gunpowder, treason and plot; I see no reason why gunpowder treason should ever be forgot. So we are recording this a good two months out from Christmas, the holiday season itself. And I went shopping today and I heard my first piece of Christmas music in the shops.
SPEAKER_01:Nice, very nice. So I am bringing, well, I'm probably, I'm saying I am, but I think I am, bringing some Halloween candy over because Paul loves the little Halloween packets of candy because they're small and they're not U.S. size. So I'm bringing some Halloween candy, but it won't be a whole suitcase's worth like it was last time. I do believe that we might still have a seat or two open at our Privacy Space table that we're looking to fill. I'm sure Ralph is on top of that, so we'll have a full table there. I think there's 10 people. So it'll be a full table, and everybody is under orders to show up in pink and blue.
SPEAKER_03:Pink and blue.
SPEAKER_01:Pink and blue. That's what we're doing. But I'm excited for that. So we'll have a wonderful episode, well, it doesn't come out next week, it'll be recorded next week, to talk all about, you know, when the American privacy officer went to Britain.
SPEAKER_03:And then the week after that is IAPP Brussels. And I think myself and possibly Paul are going to be there, really at the fringe events. So come and find us in the BrewDog across the road, or at TrustArc's party in the Hard Rock Cafe on the Grand Place in Brussels. That'll be fun.
SPEAKER_01:I couldn't justify staying for two weeks, though. I tried.
SPEAKER_03:Yeah. So if you want tickets to the TrustArc party in Brussels, come and seek me out on the socials as usual. So it'll be a packed couple of weeks, and we'll be bringing it all to you on the Serious Privacy podcast.
SPEAKER_01:Yes, and I'll bring stickers. Free stickers. I'll bring my favorites, which is the "Why so serious... privacy?" one with the Joker on it that Paul absolutely hates. So I found an unopened pack of them from our first year. They're very large. They're like three inches. I'm bringing them with me. These will never be printed again. Paul would stomp on my toes if I ever tried to get these again. So these are heirlooms, I'm telling you.
SPEAKER_03:And as ever, my concept for short episode failed utterly. That was serious privacy.
SPEAKER_02:This wraps up our podcast for this week. Please do share the episode with your friends and colleagues, because we love to get more listeners. And join the conversation on LinkedIn or on BlueSky. You'll find us under Serious Privacy on both platforms. You'll find K as @heartofprivacy, Ralph as @igrobrien, and myself as @europaulb. Until next week, goodbye. Goodbye.
SPEAKER_01:Bye y'all.