Serious Privacy
For those who are interested in the hottest field in the technology world. Whether you are a professional who wants to learn more about privacy, data protection, or cyber law, or someone who just finds this fascinating, we have topics for you, from data management to cybersecurity to social justice and data ethics and AI. In-depth information on serious privacy topics.
This podcast, hosted by Dr. K Royal and Paul Breitbarth, features open, unscripted discussions with global privacy professionals (those kitchen table or back porch conversations) where you hear the opinions and thoughts of those who are on the front lines working on the newest issues in handling personal data. Real information on your schedule - because the world needs serious privacy.
Follow us on Twitter: @PodcastPrivacy or LinkedIn
Serious Privacy
A week in Privacy
On this week of Serious Privacy, Paul Breitbarth of Catawiki and Dr. K Royal bring you a week in privacy, covering women in privacy like Liz Denham, Annelies Moens, Ludmila Morozova-Boss and others, OpenAI, Meta and noyb, holiday scams, new legislation, and even book reviews: Ready Player One and Your Face Belongs to Us by Kashmir Hill. Tune in for some fun.
If you have comments or questions, find us on LinkedIn and IG @seriousprivacy, and on Blue Sky under @seriousprivacy.eu, @europaulb.seriousprivacy.eu, @heartofprivacy.bsky.app and @igrobrien.seriousprivacy.eu, and email podcast@seriousprivacy.eu. Rate and Review us!
Proudly sponsored by TrustArc. Learn more about NymityAI at https://trustarc.com/nymityai-beta/
#heartofprivacy #europaulb #seriousprivacy #privacy #dataprotection #cybersecuritylaw #CPO #DPO #CISO
Please note this is largely an automated transcript. For accuracy, please listen to the audio.
[00:00:00] Paul: Quite a few weeks have passed since we last brought you up to date on the global developments in the data protection and privacy community. So there we go: another week in privacy episode. And this week, you'll hear all about AI-generated female conference speakers, LLM training data, the role of women in privacy more in general, and you'll get some book tips.
We talk about FISA recertification, adequacy decisions and new legislation. So fasten your seatbelts and get ready for takeoff. My name is Paul Breitbarth.
[00:00:42] K: And I'm K Royal, and welcome to Serious Privacy. All right, Paul, I've got an unexpected question for you, because I'm just going to dive right in, because this is kind of a week in privacy for
me and you to catch up.
[00:00:52] Paul: Oh yeah, we've got a lot of catching up to do.
[00:00:54] K: Despite the messy week we had, which frankly came out pretty well, I think.
One, what skill do you wish you had learned that would make your life easier right now? Professional or personal, what skill do you wish you'd learned that would be really helpful?
[00:01:11] Paul: travel?
[00:01:12] K: Ooh, I'm not sure that's a skill, that's a superpower.
[00:01:17] Paul: No, is that also a superpower? Hmm. For a long time I would have said stenography. Shorthand?
[00:01:24] K: Mine is shorthand because I take handwritten notes.
[00:01:28] Paul: but I don't take as many notes anymore
[00:01:30] K: Ah.
[00:01:32] Paul: that has become slightly, yes, slightly less useful.
[00:01:36] K: I am back to taking notes because I bought a Remarkable 2.
And I do not have it connected. Although I have heard that if you connect it to your own drive, it does not go through the Remarkable system; it goes straight from the device to your own drive. But I don't know if that's true or not. If anybody from Remarkable is listening, let me know.
So at this point it is not connected. I use it like I used my erasable notebook before. I just take lots of notes on it and go from there. But...
[00:02:05] Paul: So yeah, shorthand for you, sometimes, sometimes I think about speed reading and I've tried speed reading in the past, but I didn't really like it.
No, no, no, no particular skills come to mind that I really feel that I'm lacking, but maybe some interpersonal skills
[00:02:23] K: Okay. Let us know what skills Paul needs to have, because apparently he's already got ADD wrangling down pat. I think after we did the, the Nordic Privacy Arena, someone came up after, and it was like, you know, I've always wondered, listening to the podcast. Now, after seeing y'all in person: Paul, how do you manage this?
And he gestured to all of me. I'm like, Paul manages this just... but, but...
[00:02:48] Paul: You don't need to be managed, you manage yourself.
[00:02:51] K: Well, there are times... I literally remember when I worked at Align Technology, I must've had a bad day, I had nothing but chocolate and caffeine. I was talking to someone, I'm like, you know what? I really want to slap myself right now.
So I'm going to shut up and leave your office, because I am bouncing off the walls. It was crazy. But let's catch up on the week. There's a couple of things that are top of mind for me that are really hot, and I'm going to bring one of them up to you: fake AI-generated women speaking at a conference. Now, I came across it on LinkedIn when people were posting, hey, you don't need fake AI-generated women, there's a lot of women out here, by Heidi Saas, a lot of women out here that are real.
We're real, we're smart, we're talented, you can... and they started tagging people in it. So that's how I first heard about it. So I had to go look it up, and here's the story behind it, and I'll, I'll post a link to it as we normally do. So this was DevTernity, which, I have no idea what DevTernity is, but it must be a conference, like, I don't know, development university or something.
I don't know. Eduards Sizovs admitted on social media that one of the speakers that he featured at the conference was an auto-generated woman with a fake title. Now, apparently this came apart, and he admitted this, because he was being challenged about some of the suspicious profiles on his conference website that appeared to be generated by AI.
He denied that the fake profile was intended to mask a worse-than-expected level of diversity among speakers, and he refused to apologize. This was all on X, or formerly Twitter. I don't know, I don't even log on anymore, it's not even on my phone. But this created a storm. It's been picked up by news everywhere.
It's all over LinkedIn. You don't need to create fake smart women to have a woman speaker at your conference. Even a woman speaker in tech, in cyber, in privacy, in whatever. There are women out there.
[00:04:53] Paul: Oh, absolutely. And usually you are the one rolling your eyes. Now I, now I am doing that, because this is, this is just not even... this is just plain stupid. And this has, has nothing to do with, with being right, with, with...
[00:05:09] K: Right?
[00:05:09] Paul: it's plain stupid.
[00:05:11] K: Yeah. Plain stupid. I like that. Just plain stupid. That's a great summary of that. I mean, women... we all know that privacy for so long has been a woman-friendly field. You could even say at some point it was a woman-dominated field, and we've had phenomenal women on the podcast talking about coming up in privacy and being in a women-dominated field.
Now, I have my own theory on this, and it's because, in the U.S. in particular, probably not in other countries, but in the U.S. in particular, because privacy came up in the healthcare field and it came up in the education field, the people who handled the records that privacy applied to were women.
They were the medical records clerks or they were the student file clerks, and so they rose up through privacy in that field as a specialty. I could be wrong, I don't know.
[00:06:05] Paul: Well, I hate, I hate to say it... no, I think, I don't think you are wrong. It was probably also a bunch of senior male managers deciding that somebody junior should do this because they didn't want it.
And at the time women generally were more junior than men. That's...
[00:06:26] K: They did more hands-on work with actual files and records, because back then they weren't electronic, largely.
So they would...
[00:06:34] Paul: to it, even though it doesn't really make sense.
[00:06:36] K: Yeah. But now, so many job descriptions... and I do watch job descriptions, as I said before, because it tells me what's going on in the field. I can watch the jobs that are open, like the one at CERN.
I can watch the job openings, see what required skills they're asking for, what kind of experience they want. Because we know, for a long time there, we desperately needed people in privacy and there just weren't enough qualified people. So very junior people were getting very senior roles. And now we have all different kinds of roles, but what I'm seeing more and more often now, especially for the very senior roles, the chief privacy officer role, is that they want someone with legal experience, and y'all know my opinion.
I don't think you have to be a lawyer to be a CPO or to be senior in privacy, and not all lawyers are cut out to do it. But they're asking for it now, because the laws are getting so prevalent and so complicated; they want someone who knows how to navigate that. Now, the bad thing about that is that law is a male-dominated field.
So if you're going to make privacy a law-dominated field, you're going to start getting a lot more men in it. And I think we've started to see that, that we're seeing a lot more senior males transition over into privacy, and they're not cut out for it.
[00:07:57] Paul: No, that's true. And at the same time, I also think that it's not just a legal profession. It's also becoming more and more an engineering profession again. Look at the work
that Dennedy is doing, but also that Jelena Kljujic is doing at Cisco, and, and some of the others. And that's also a lot of women doing that work.
So let's hope this, this will remain a gender balanced profession.
[00:08:23] K: Yes. It doesn't need to be dominated by one or the other.
[00:08:26] Paul: No, because in the end that makes it stronger. And, and you mentioned just now that there are a lot of really great women in this profession. And as evidence of that, while I was in Brussels a couple of weeks ago, I attended the 2023 Women at Privacy Award Ceremony.
And four awards were handed out this year: to Andrea Jelinek, as a sort of Lifetime Achievement Award; she has now stepped down from all of her public roles. Laura Linkomies from Privacy Laws & Business, who's been writing about privacy updates since forever and does a great job at that.
And then also Nataliia Bielova and Kari Laumann, who received awards, also really good at what they do. And then a whole range of others being, being nominated, including representatives from DPAs, like Marit Hansen from one of the northern German DPAs and Marie-Laure Denis from the French DPA, but also our podcast co-host Rie Walle, for all the work that she's doing to make privacy news publicly available for everybody and understandable.
[00:09:31] K: Yeah. Absolutely. And we can call out others. Annelies Moens in Australia just won the Privacy Award for IAPP Australia. Pretty amazing. The efforts by Carmen Marsh to highlight women in cybersecurity, including privacy lawyers or cybersecurity law professionals as well. And then Ludmila Morozova-Boss. She does Top Cyber News Magazine. Fabulous effort.
[00:10:00] K: She does the Magnificent Seven, the women in cyber, highlighting women in this very technical field. But you hit it right on the head there, Paul. It's also the technical aspect of it, because you and I were bemoaning a year or so ago how companies were rolling the data privacy roles, or the data privacy functionality,
under the information security or the IT roles, because they were already handling data, so they should be the ones over privacy, right? I don't necessarily agree with that. But that's also one of the reasons why we're seeing a lot more men in it: privacy is starting to roll up under a lot more of those information security or IT roles too, and we're still seeing that. Thankfully, not as much as we did, but the ones that are already established that way seem to be continuing that way.
So I just think it's, it's interesting that, that we have this... we didn't intend the podcast episode to be about women in privacy, but there's a lot of phenomenal efforts out there to highlight women in privacy, in cyber law. Join the movement. Don't be creating AI women.
[00:11:08] Paul: No,
[00:11:09] K: they're probably half naked and look like fairies anyway. Or
[00:11:12] Paul: So,
[00:11:13] K: goddess, I've seen that too. Lord help us.
[00:11:16] Paul: yeah, it, it almost, no, it does remind me of a book I've just been reading Ready Player One. I'm sure
[00:11:24] K: yeah,
[00:11:24] Paul: many, many moons ago,
was
[00:11:28] K: the movie.
[00:11:29] Paul: but, yeah, that's still on the list somewhere for the Christmas vacation,
[00:11:32] K: The movie's not as good.
[00:11:34] Paul: No, that's usually the case with books. But the book is really nice, and also a bit dystopian.
Living our lives completely online in an avatarized world, and, well, I'm not sure that that's the world that I want to end up in.
[00:11:48] K: No, and you know, that is something that we have to start looking at for the newer generation who really are digital natives. They are born working in the digital world and virtual reality and everything that's taking over. And I think some of the things that people talk about, and we've talked about this before, the newest generation doesn't care about privacy.
Yes, they do.
[00:12:13] Paul: Oh
[00:12:13] K: They,
[00:12:13] Paul: absolutely.
[00:12:14] K: They just deal with it differently than we do, but they do care. But speaking of that, that's some of the news headlines we've seen this week: companies getting in trouble for what they're doing with children's data, or child users as they have it. And so we're starting to see some more news headlines around that, which is very interesting.
I'm sure you have one that's popping to mind. The one I'm thinking of is Meta.
[00:12:39] Paul: Mm, obviously. Well, no, I was actually still, in my mind, with the, the great women in privacy and the
lack of women in certain roles. I mean, we cannot not talk about the OpenAI saga.
[00:12:56] K: Yeah. Oh, oh what? I'm sorry. That's really a drama now. I mean, it's like, that's going to be one that goes down in the history books.
[00:13:05] Paul: It will be, there will be a movie at some point, but it
will be a very male-dominated movie, because obviously Sam Altman is a man, all of his directors' team is male, the Microsoft team is male, and the new board of OpenAI also seems to be male, at least to a very large extent. So, yeah...
[00:13:26] K: That is one thing that they do need to fix: women on boards.
[00:13:30] Paul: Yeah, so that's one of the reasons why I brought it up. It, it all seems like the old boys' network bringing each other back together, giving each other roles. And, whatever happened there at, at OpenAI, I, I still don't know. And, and to some extent, do not care, because whatever they are going to do, I don't trust their software
and...
[00:13:53] K: Either.
[00:13:54] Paul: ...their ethics.
[00:13:54] K: I don't either.
People laugh at me. I still don't use it. And I hear that last week, late last week or over the weekend, Amazon announced their AI platform, which is supposed to be more directed towards enterprise, more focused on privacy and security. But I also have heard criticism that it just cannot measure up.
[00:14:15] Paul: Yeah, that, that's probably true. I think Google is doing a fairly good job in finding the balance between privacy on the one hand and quality on the other hand. No, then maybe not. But I've said it before on this podcast, in the Brussels episode: you know, the, the big problem with the foundational models remains that the training data was just taken
from here, there, and everywhere, without asking anybody's permission or authorization. And that's a concern. And that's also one of the final points still on the table for the new European AI Act: what to do with these foundational models and whether they
[00:14:57] K: Yeah.
[00:14:57] Paul: should be regulated or not.
[00:14:59] K: They'd already ingested the data before, years later, being told they shouldn't be doing it. And it never occurred to them whether they could or not. The data was publicly available. It just wasn't legal to grab, but it was available. LinkedIn is the perfect example. The data on LinkedIn is publicly available.
If you're a member of LinkedIn, you can pretty much see it. The thing is you're not supposed to violate their terms and scrape the data and use it.
[00:15:27] Paul: No, well, there are a few things there, because to some extent, the fair use exemption under the U. S. copyright legislation, and also some scraping limitation provisions in the European copyright legislation seem to imply that under certain circumstances it might actually be legal to collect all this data to train a model. yeah, I mean.
[00:15:50] K: I wouldn't argue that fair use goes that far.
[00:15:53] Paul: No, maybe not, but that's for the courts to decide, and
[00:15:55] K: Yeah, but people hear there's a fair use exemption, and they don't go read the details or understand them, to understand that the fair use exemption doesn't apply to them.
[00:16:05] Paul: Well, it's the same as people hearing there is such a thing as a legitimate interest and then
saying, oh, but we've got that. And
when we...
[00:16:13] K: interested in having that. So yeah, I qualify.
[00:16:16] Paul: Yeah. The other book I've recently read, and you've probably read that too by now, is Your Face Belongs to Us from Kashmir Hill.
[00:16:23] K: Yeah.
[00:16:25] Paul: Also, very good and very scary stuff.
[00:16:29] K: Yes. Very scary, the more you know. She puts it in terms that the average reader can understand, but it still requires a little bit of, of expertise in the field to understand the nuances behind it. Which is, which is natural. I mean, how many times have you watched a movie or a show in your profession and been like, oh, they don't stick IV needles in that way?
So this, this is no different. There's nothing false in it, that's not what I'm saying. I'm just saying it, it does take a little bit of expertise to understand the nuances behind some of what you see.
But I think even the average person with no experience in privacy or technology would understand that this is scary.
[00:17:10] Paul: Oh yeah, absolutely. And it's written really like a novel and not
like a non fiction
[00:17:17] K: Easily ingestible.
[00:17:19] Paul: Yes, I agree.
[00:17:21] K: Easily ingestible.
[00:17:22] Paul: What other big topics do you...
[00:17:24] K: What other big topics do we have? Well, one of the things I will say is we still are coming out with the state... I know, I've been promising y'all that all year long. I, I know it's coming.
It's going to come before the end of the year. We'll do it before Christmas. And then don't forget that Paul and I typically take January off, take a little bit of a break, and then we start the next season back on Global Privacy Day.
[00:17:51] Paul: Data protection day.
[00:17:52] K: Oh! Something that's not privacy, but I still found interesting that hit my news this week, is the United States Supreme Court has adopted a code of ethics.
[00:18:03] Paul: Yes, I read that. How many
years after they should have done that?
[00:18:08] K: just thought it was pretty, pretty interesting. Definitely thought it was pretty interesting to watch.
[00:18:16] Paul: And basically everything they've been doing for these past decades is suddenly also well within their code of ethics.
[00:18:23] K: So for our... hey! For our side, looking at it, one of the, the, the ones that hit, and this was today that I just saw this, was that the U.S. Senate Select Committee on Intelligence introduced legislation. This was the standard renewal of the Foreign Intelligence Surveillance Act, Section 702, which y'all may know is the one that's really under fire.
But apparently there are some competing proposals for what the red lines would be, which I'm sure happens all the time. But it's really interesting how this one is coming together. And I think we've had Caroline Lynch on our show, and she was one of the lawyers that actually worked on this bill.
So I've been dying to reach out to her and go, ooh, give me your thoughts on this. Usually it's not much of a, of a newsmaker. So I thought
that one was really interesting.
[00:19:19] Paul: yeah.
[00:19:20] K: Right. Pretty much. And so I think this is where you can say that this is actually making a difference, and I like that. Some other things that I've seen happening: Google is still saying that in 2024 they're really going to attack the cookie problem. They've been saying it for a while. It's been up, it's been down, and then it's back and forth, but they're saying they're really going to look at it.
[00:19:41] Paul: Yeah, the, the first percent will be switched off in January across all Chrome browsers, and then likely from September onwards they'll retire all the third-party cookies in Chrome.
[00:19:55] K: I'm not going to hold my breath. I'll believe it when it happens.
[00:19:59] Paul: That's true. At the same time, with the Privacy Sandbox now being widely available also in Chrome, I think they are confident that they have a good alternative.
[00:20:10] K: Yeah,
[00:20:10] Paul: Those are their words, not necessarily something I agree with.
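A quick note for anyone wondering what that "good alternative" looks like in practice: the centerpiece of the Privacy Sandbox for ad targeting is the Topics API, where the browser itself hands a site a few coarse interest categories instead of letting it follow you around with third-party cookies. Below is a minimal sketch of how a page might call it, assuming a recent Chrome with the API enabled; it is illustrative only, not Google's reference code.

```typescript
// Minimal sketch of querying the Topics API (Privacy Sandbox), the mechanism
// Chrome proposes in place of third-party-cookie interest targeting.
// Assumptions: Chrome 115+ with the Topics API enabled, an https page, and a
// caller allowed by the "browsing-topics" permissions policy.
async function getAdTopics(): Promise<number[]> {
  const doc = document as any; // browsingTopics() is not yet in the standard DOM typings
  if (typeof doc.browsingTopics !== "function") {
    return []; // unsupported browser, or the user has disabled the API
  }
  try {
    // The browser, not a cross-site cookie, decides which coarse interest
    // topics this caller is allowed to observe for the current user.
    const topics: Array<{ topic: number }> = await doc.browsingTopics();
    return topics.map((t) => t.topic); // numeric IDs from the public Topics taxonomy
  } catch (err) {
    console.warn("Topics API call failed:", err);
    return [];
  }
}

// Hypothetical usage: pass topic IDs to an ad request instead of a cross-site identifier.
getAdTopics().then((ids) => console.log("Topics shared with this site:", ids));
```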
[00:20:14] K: Yeah. Some of the others: the, the Meta one that I mentioned, 33 U.S. state attorneys general filed privacy violation claims against Meta. This was an October lawsuit brought by these attorneys general. And what they're talking about is Instagram: Instagram accounts belonging to those under 13 were not properly shut down, and they're still collecting the data from that.
And then, interestingly enough, this actually made the news: United Airlines is looking at serving targeted ads based on passenger data. Right? I don't know. It says on the mobile app or the plane's entertainment system, so yeah, I guess on board the plane, but I think the mobile apps are, you know, just as you use their app.
So it's saying that information such as flight history or their rewards points could be used to target ads, but you can opt out of data tracking.
[00:21:09] Paul: How
about opting in for a...
[00:21:11] K: Right? Why not opting in, saying, hey, maybe I want this? This does remind me, there are some changes to Apple coming. I believe iOS 17 is supposed to have the phone-to-phone connection where you, you automatically share the data.
You can turn that off as well. I don't even know why that one hit the news as a story. Maybe people just like to talk about Apple, I don't know. But then also, Uruguay, I thought this one was interesting: they are recognizing data protection adequacy with South Korea.
[00:21:42] Paul: Yes. Well, Uruguay has been known to follow the list from the European
[00:21:47] K: Yeah.
[00:21:48] Paul: And with South Korea being recognized, of course, some time ago, that would now be followed by Uruguay.
[00:21:54] K: Okay. That would make sense then, but it still has to be formally adopted for them to do it. So I just thought that was interesting as well. I am still wondering where we're going to see the cross-border privacy rules grow. Starting a few years ago, there was a lot more talk about rolling this out, I guess this was before COVID shut everything down, looking at making it a more global program.
Haven't really heard a lot more about that since then.
[00:22:22] Paul: Well, there has been some progress made... yeah,
[00:22:25] K: the big splash
[00:22:27] Paul: the global CBPR framework was adopted earlier this year in 2023.
so there is now the, the framework to do so. Obviously it's still very much a, a US-based program, although the UK has now joined as an associate country. And other countries, Australia, Canada, Japan, Korea, Mexico, the Philippines, Singapore, Chinese Taipei, and the US,
all of those are now part of the Global CBPR Forum, which is basically the equivalent of the binding corporate rules.
So it's a certified, authorized in-company data transfer mechanism that can be used, and that would then ideally also be recognized by the European Union if you do a combined BCR-CBPR approach.
[00:23:24] K: Still not hearing a lot about it,
[00:23:26] Paul: No, and I'm still not convinced that this is the way
[00:23:29] K: right?
[00:23:29] Paul: It's a self-certification mechanism, and you know I'm not a big fan of those. But we'll see.
[00:23:35] K: Yeah,
[00:23:35] Paul: Time will tell.
[00:23:37] K: I'm not, I'm not sure that self-regulation is, is really the way to go,
right?
[00:23:45] Paul: So, in other news, it seems that another gap on the map is closing soon, because Ethiopia has approved its first data protection bill for submission to Parliament. So the government has approved it, and it will now be subject to parliamentary scrutiny. It looks like a fairly standard GDPR-based data protection bill.
At first glance, no real specifics. Obviously, Ethiopia is not the biggest economy in Africa, but still, it's nice to see that the grey spots on the map are slowly disappearing. Also, also quite relevant is that the UK has yet again proposed amendments to their post-Brexit Data Protection and Digital Information Bill, which are now being scrutinized in Parliament. As the UK government calls it, these are a lot of common-sense changes to the legislation that was already tabled earlier this year.
Oh, yeah, common sense, because it would allow the government to access, for example, your bank data to see if you are not defrauding, if you, if you get a pension from the government or some other sort of social benefit. Other measures are about reassurance and support to families as they grieve the loss of a child, so that there is a data preservation process for social media and things like that, because social media now is not forced to maintain or retain that data, and that data could also be important in certain coroner investigations. There will be more use cases for biometrics to strengthen national security, including international exchanges.
So yes, you can imagine that all of this also, in my view, is very much common sense.
[00:25:35] K: I don't know what Paul's on about, bringing common sense into it. I'm, I'm just saying, y'all, y'all need, y'all need to question this like I do. But I will say also, and I know we're coming close to the end of the episode as well, but I wanted to make sure of one thing: y'all, please be strong advocates, share out your tips for online shopping with your loved ones, because this is the time of the year where people are targeting, bad actors are targeting, vulnerable populations. And it is something that weighs really heavy on me, because it doesn't even have to be a vulnerable population. So many people are, are sucked in by scams, by things, by getting emails saying, your order has a problem, please click here. And they're so frantic in the holiday season and busy with so many things.
They're not thinking about it. So please be an advocate. I know bringing it up at Thanksgiving dinner was probably not an option, but you know, the more visibility we bring to this, the more that we can warn people. I mean, I think one of the, the best scams out there is when someone will call you and say, hey, you've got a problem with your order that you placed.
But I know that you need to make sure that I'm the right person, so I'm going to send you a code so you know, I know who you are and that I'm legitimate. Just tell me what that code is. That's trying to bypass your two-factor authentication. They're not sending you a code. They're trying your password.
And I will say it works, because probably two or three times a day now I get "your code is blankety-blank-blank-blank" on my two-factor. So it works. Make sure you use it. Make sure you use it. But I, I have to throw that out there, because I know we all have our jobs and what we do, but we are the people that are paying attention to this in the open world as well.
So be an advocate from that, from that perspective. Sorry, I had to throw the public service announcement in there, Paul. As I'm also looking at what's coming up for the next year, Paul and I will both be at the IAPP in DC, the Global Privacy, what do they
call it, Summit, the Global Privacy Summit. Paul and I will both be there, so make sure... and we're going to look at putting on a coffee or something as well for fans, so we'll be looking at what we can do there.
We probably need to start looking at scheduling that. I'm thinking something low-key, you know, whatever we can do might be really cool.
[00:27:59] Paul: We have January to do all the planning for next year.
[00:28:02] K: There we go. So we'll have that. We'll have stickers as well. We'll definitely bring some stickers. If you've got ideas for the stickers you'd like to see, let us know.
Last year, the ones that we did were Good to the Last Mic Drop; Why So Serious Privacy, which had the, the Joker on it; what were the others... oh, the one that we love with the logo, where Paul rocks and K rolls her eyes; and the last one was, Ah, Are You Ready for the Unexpected Question?
[00:28:30] Paul: You never are.
[00:28:32] K: You never are.
[00:28:34] Paul: Absolutely. There is, there is one more shout-out, and that, that's also bringing our conversation back to where we started today, about strong women in privacy. And that is that Carly Kind was appointed as the new Privacy Commissioner for Australia. She is currently based in London as the director of the Ada Lovelace Institute,
working on data and AI for the benefit of people and society. And she used to be at Privacy International before that. But this is a, a really strong nomination for Australian Privacy Commissioner. Really happy to see that, and she still has a bit of influence over the new Australian data protection legislation that should be going through Parliament in 2024.
So congratulations to Carly Kind.
[00:29:20] K: Very, very cool. Love seeing that. So, with that, Paul, have we finished another week in privacy?
[00:29:27] Paul: We have, unless you still want to talk about Meta's pay or okay plan. Obviously, there are some complaints being filed already.
[00:29:35] K: Yes.
[00:29:36] Paul: noyb brought their first complaint and has already indicated that there will be multiple complaints being filed, in court, against the pay or okay model.
[00:29:45] K: I haven't heard much about what it's like. I don't know people that are participating in it.
[00:29:50] Paul: I haven't heard anybody who's paying. I've heard quite a few people who've just accepted the ads because they've been seeing the ads since forever. I still have not, I still have not made my choice. I'm still waiting for a response from the Meta DPO.
[00:30:05] K: To your question?
[00:30:06] Paul: To answer my questions. I've tried four times now to suggest that I really want an answer to my specific questions and not some,
template answer and not some reference to a blog and so
[00:30:18] K: I'm shocked and appalled.
[00:30:20] Paul: Yes, so am I. I
had not expected this at all. But I'll keep you posted. As soon as I get a real answer, you'll hear.
[00:30:26] K: All I...
[00:30:27] Paul: Well,
[00:30:28] K: ...all I want for Christmas is a real answer to a substantive question.
[00:30:31] Paul: all I want for Christmas is a real answer to my questions. Exactly. And on that happy note, we'll wrap
up another episode of Serious Privacy. Like and review us in your favorite podcast app or on your favorite podcast platform.
Join the conversation on LinkedIn. You'll find us under Serious Privacy.
[00:30:49] K: Not always serious.
[00:30:51] Paul: You will find the podcast on social media as @PodcastPrivacy. You will find K as @HeartofPrivacy and myself as @EuropaulB. Until next week, goodbye.
[00:31:01] K: Bye, y'all.