Serious Privacy

A long week in Privacy with Paul and Dr. K

September 13, 2023 | Dr. K Royal and Paul Breitbarth | Season 4, Episode 33

On this week of Serious Privacy, Paul Breitbarth of Catawiki and Dr. K Royal discuss a lot of events in a long week of privacy as they lead up to attending the Nordic Privacy Arena, including the Norwegian Data Protection Authority ordering Meta to stop personalized advertising, Fairplay and the Center for Digital Democracy calling for an investigation related to Google ads and COPPA, gatekeepers, China, South Korea, smart cars, Mozilla's study on connected cars, OpenAI's enterprise offering, and more.

If you have comments or questions, find us on LinkedIn and IG @seriousprivacy @podcastprivacy @euroPaulB @heartofprivacy, and email us. Rate and review us!

Proudly sponsored by TrustArc. Learn more about NymityAI at

#heartofprivacy #europaulb #seriousprivacy #privacy #dataprotection #cybersecuritylaw #CPO #DPO #CISO

Please note this is largely an auto-transcript. For accuracy, listen to the audio.

[00:00:00] Paul: Meta, Google, Tesla, ByteDance. This is the week of the gatekeepers. The first part of the European Digital Markets Act went into effect, limiting the power of big tech players and demanding more transparency and interoperability between services. And many of these companies also hit the news for other reasons, like sex in cars and web scraping, that you will hear about in today's episode. My name is Paul Breitbarth.

[00:00:38] K: And I'm K Royal and welcome to Serious Privacy. It is quite the week this week, Paul. I think you and I are lamenting that we don't have huge privacy news, but everything else seems to be discombobulated this week.

[00:00:55] Paul: Well, I mean, you are, you are moving house. I am moving house. Uh, although my move isn't until the spring, but you are building a house. I'm buying and rebuilding a house. So, can we, can we say that our minds were elsewhere?

[00:01:09] K: Absolutely, 

So, unexpected question. I so want to ask, if you could live anywhere in the world, where would you live? But only because I'd 

[00:01:19] Paul: heh heh. Anywhere with a roof over your head? Heh heh heh heh heh.

[00:01:22] K: Anywhere but here. Um, let's go with that. If you could live anywhere in the world, where do you, for a year, let's do that. If you could live anywhere in the world for a year, where would you want to live?

[00:01:33] Paul: No considerations on, on budget and work and whatsoever.

[00:01:39] Paul: New York.

K: Wow. That's unexpected. I wasn't expecting that one.

[00:01:44] Paul: Yeah, I mean, for a year, that's still a bit of a dream to live somewhere in Manhattan or in Brooklyn and then experience the New York lifestyle from within. I think, yeah, I think that would be cool.

[00:01:58] K: That sounds pretty fascinating. I'm still going to go with the same place I'd like to visit. I don't know if I'd like to live there a year, but I'd like to try New Zealand. New Zealand.

[00:02:08] Paul: Might be a bit complex for the podcast, though, with all the time zone differences.

Even more than we currently have.

[00:02:16] K: I was going to say, but we are going to manage to be in the same time zone at the same time in a few weeks.

[00:02:23] Paul: a half weeks.

[00:02:24] K: Two and a half weeks. We're going to the Nordic Privacy Arena. Paul and I are absolutely thrilled and honored to be invited. Uh, I've got my flights finalized again. I had them finalized and then I moved and I don't want to go back to Phoenix to fly there.

So we're flying from Charlotte here, uh, which is a much shorter flight. So, uh, they are now rebooked, uh, so we can get there. But, uh, I'm excited to be there. We've never been to Sweden. We haven't done a lot of international travel, so we're excited to be there. And for anyone that's going to be there in person, please make sure, give us a note, give us a heads up that you're going to be there, uh, so we can make sure that we connect, we'll be on stage.

I think the end of the first day, 

[00:03:04] Paul: The beginning of the second day.

We'll be on stage the beginning of the second day, but we're always looking for corridor interviews. 

[00:03:11] K: So we'll be saying, what were the best things we learned yesterday? And then, what are we looking forward to today? So we might call a few of y'all up on stage. You never know, but give us a heads up if you're going to be there. And my husband, who does the introduction to our podcast and the, what do you call it?

The outro, the, the outroduction, the ending, the exit, uh, he will be there as well. So we've asked permission that he can come in and do his introduction live. He has no interest in attending a privacy conference whatsoever. But he would enjoy that

[00:03:41] Paul: Then it's good for him that it's just a morning session, right? He can do the intro and... listen to the podcast, do the outro, and then be gone.

[00:03:48] K: and be out of there. Exactly. Exactly. That's what he'd love to do. So that's wonderful. But for news then, um, I say that it's kind of been slow because there doesn't seem to be anything earth-shattering. But this isn't quite as bad as the popcorn week where everything was just everywhere. We have got a few significant things that have happened.

Uh, that if you have not been paying attention to, you really do need to pay attention to. Some of them are calls for comments or for input. So make sure you're paying attention, but nothing that, you know, would get us all excited and go, oh my goodness, I can't manage to, okay, I'm going to take that part out.

[00:04:25] Paul: Meh, meh. I think, I think actually there is one thing I'm still really excited about. Um, that is, that is, uh, probably about, or possibly about to happen. And that is the continuation of the case of Meta versus the Norwegian Data Protection Authority. Um, as you may recall from a couple of weeks ago, um, the Norwegian Data Protection Authority, in light of the Court of Justice judgment in the case involving the Bundeskartellamt, the competition office in Germany, ordered Meta to stop the personalized advertising at their, uh, granular level, so building all the profiles and using those profiles, um, for online advertising without any specific granular user consent in place.

Um, Meta said, no, we're not going to do that. Uh, we are subject to the Irish DPC. So, um, we are going to appeal this decision from the Norwegian DPA. They went to court in Oslo, and the court of first instance has ruled today, uh, September 6th, um, that Meta, um, has lost the case and that the Norwegians are in their right to impose the processing ban, um, under the urgency procedure.

Um, so the processing ban is still in place. Meta will have to comply, and given that they haven't complied for quite a few weeks now, they will have to pay up, um, because they have been fined, uh, I believe it was 1 million Kroner, so about 89,000 euros or dollars, um, a week, um, for non-compliance.

Um, so that is something that they will have to pay, um, at least for the foreseeable future, um, until they start complying. Um, the European Data Protection Board will convene on the 19th and 20th of September, um, so also just in under two weeks. Um, and this will be one of the topics on their agenda. So they will also be discussing what to do

with the Meta case, um, and it could very well be that a majority of DPAs wants to follow the Norwegian decision, thereby overruling the Irish DPC.

[00:06:41] K: Oh, absolutely. They're probably watching it with eyes wide

[00:06:45] Paul: So yeah, this is, this is going to be interesting because this could be the first serious wrecking ball to the online advertising, behavioral advertising, uh, industry.

Um, and there are more decisions slated for this fall, including on Google Ads and, and some others. Um, but this will be the first one to be decided upon. Um, and I'm very curious to see what that debate will bring.

[00:07:10] K: That sounds like one to, uh, wait for with bated breath. I'm looking forward to seeing what this is gonna be. Funny enough, it seems to be moving quickly. I know it doesn't seem to be moving quickly, but it really does. Maybe it, maybe it moved slowly up until now, and now it's gonna move quickly,

[00:07:26] Paul: Well, I mean, for... um, European regulators, it's moving quickly. Um, I think this discussion has been ongoing for so many years that it somehow also doesn't feel as quick. Um, and then, of course, we also still have the FTC and U.S. Congress looking into the Google advertising side. Um, where they don't rule out that Google Ads needs to be split up from the rest of Google.

So also there, it, it, it might still become very interesting in the next couple of months, to see what will happen.

[00:07:58] K: Yeah. And I mean, it was children's groups that were also calling for the FTC to probe into Google's ad targeting as well. So I think that had a lot to do with it as well. The, um, it was a request for investigation. We spoke about this, I think, uh, a few months ago as well, but Fairplay and the Center for Digital Democracy, um, actually called for an investigation because of the kids' data that was being impacted with that.

And as y'all know, that's one of the things I like to pay attention to.

They say that it really impacted, um, what they'd done. There was separate research that Fairplay and the ad buyers, uh, backed up. It ran test ad campaigns on YouTube, uh, which selected users on the basis of various attributes and instructed Google to only run the ads on the made-for-kids channels.

Uh, it should have resulted in zero placements, and instead there were thousands of ad placements. And Google says that these were contextual, not targeted. But regardless, and I don't know if most people listening understand the difference between the contextual and the targeted. You and I had spoken about that before, but the contextual means that you're placing an ad based on the page or the site or the information the person's looking at. It is contextual. So if you're looking up electronics, it's going to show you an ad for an electronics store. If you're looking up renting apartments, it's going to perhaps show you an ad to buy a house. Uh, different things like that. So contextual advertising.

Um, and there's been a big debate on whether or not that would be a lot more private doing contextual advertising because then it wouldn't be tracking users across sites to profile and learn their behaviors. But apparently they were one of the ones that helped call for investigations into Google Ads and the FTC is listening to this.

They weren't the only ones by far. But the FTC is listening, and they seem to be racking up more check marks on the bad side of the report card than they are on the good side of the report card, so that will be exciting.

[00:10:04] Paul: Yeah, no, I fully agree. Um, and while we are on the topic of ads, um, one of the things that will be regulated more strictly in the European Union is also online advertising. Because also today, uh, again, September 6th, uh, the European Commission has announced the designations for the gatekeepers under the Digital Markets Act that will enter into force, uh, in a couple of months' time.

Um, no surprises as to who the gatekeepers are, so the companies with real digital market power. So we've got Alphabet, Google's parent company. We've got Amazon. We've got Apple. We've got ByteDance, we've got Meta, and we've got Microsoft.

[00:10:49] K: Aw, surprise! 

[00:10:51] Paul: Very big surprise, obviously. Um, but you see that, for example, for social networks, we've got TikTok, Facebook, Instagram, and LinkedIn, uh, for these companies.

For instant messaging, we have WhatsApp Messenger. For video sharing, it's YouTube. For search, it's Google Search. Browsers, Chrome and Safari. Operating systems, Android, iOS, and the Windows PC operating system. For intermediation, Google Maps, Google Play, Google Shopping, the Amazon Marketplace, the Apple App Store, and the Meta Marketplace, um, which I've never seen anybody use, but apparently they still meet the threshold

um, of large use cases in, in Europe. Um, and for advertising it's Google, Amazon, and Meta. Um, still under investigation, um, as to whether they meet the thresholds are Microsoft Bing, Microsoft Edge, and Microsoft Advertising, and also Apple's iMessage, to see whether the use cases for those are also meeting all the large-scale standards. And that should be concluded, um, somewhere between now and the end of the year,

and also Apple's iPadOS could still be, uh, part of a further investigation. Gmail, Outlook, and the Samsung Internet Browser have not been identified, um, as core platform services. And that means that all of these companies now have the next six months for a full list of, um, activities that they need to implement.

Um, so they need to start providing interoperability. Um, they need to allow their business users to provide access to the data that they generate as part of their services. Um, there needs to be more independent verification on advertising. Um, also the contract, the contracting, um, should be made easier.

Um, and they can't keep virtual monopolies in place anymore, or, for example, prevent consumers from linking up to businesses outside of their platform. So this will have a consequence, uh, for example, for Apple's App Store. Um, but it might also mean that from WhatsApp Messenger you need to be able to also reach people who are not using WhatsApp.

So it could be that there needs to be, um, somehow, um, uh, a communication possible between somebody who's using WhatsApp and somebody who's using Signal or iMessage.

[00:13:25] K: Interesting.

Speaking of Signal, you reminded me of something that made my little privacy heart go pit-a-pat, uh, as we were taking my youngest daughter shopping a few weeks ago, as we were leaving the state. And she's transitioning over to doing her medical... oh, I don't know what you call them.

The short-term things before they do residency here on the East Coast. She needed East Coast clothes. As they were arriving at the store, they started saying, why are we using Facebook Messenger? Why aren't we using something more private? Why are we using something where they can use our data? I mean, Signal's private.

I mean, privacy's kind of like a right that we have, right? Why are we letting companies use our data? Why aren't we, like, more protective over our privacy? And I was like, oh my goodness, y'all are actually talking about privacy. So they were

[00:14:16] Paul: So they are listening, in the end, to what you are saying. Mm hmm.

[00:14:19] K: They started with the question of why are we using Facebook, uh, Messenger rather than something more private like Signal. I said, uh, convenience and user preference, because I tried to get them to use something more private years ago and they refused.

So it made my little heart go pit-a-pat. They're like, I'm, I'm fairly sure that they're just scraping all this data and using it for things we don't imagine, like advertising. I'm like, you're fairly sure? I know they're doing that. You doofus. What's the plural of doofus? Doofi. So these are adults, young adults in their, I would say, right at 30, 29 to 32 range.

[00:15:02] Paul: But still, I mean, that they start, it's better that they start realizing now than never, right? I

[00:15:07] K: Right. I was very proud of it. It was like, wow. The fact that they're like, it's like a right we have. And I'm like, yes, it's exactly like a right you have.

[00:15:16] Paul: So, very good.

[00:15:18] K: And I told them I was going to use their, their conversation the next time I had the opportunity to brag on young adults being privacy conscious and aware. So that was really good. Right. Exactly. I said, I wouldn't use their names, but you know, whatever. Uh, I didn't use their names, but it is my youngest daughter.

Actually it was both my daughters. Yeah. That's not, that's not, yeah, divulging anything at all. It was both my daughters. I was very proud of them. We do have some other news since we're over in Europe right now. So, um, didn't, um, Commissioner Reynders, uh, just take an interim role? That's what I heard.

I was just looking for the story.

[00:15:56] Paul: Well, it's more complicated than that. 

[00:15:59] K: Uh oh. Okay, it's, it's him and Jourová, right?

[00:16:04] Paul: Yes, yes and yes. Um, it's actually the competition commissioner, um, Margrethe Vestager. The Danish commissioner was also executive vice president of the Commission, in charge of competition. She has taken a temporary leave of absence because she is in the running to become the president of the European Investment Bank, and you are not allowed to run for another office while you are in the European Commission.

There are very strict rules, um, about that, so she had to step down temporarily. And because she is only stepping down temporarily, Denmark is not able to nominate another commissioner yet; only if she would fully resign, like the Dutch commissioner did a couple of weeks ago, because he is running for national parliament and hopes to become the next Dutch prime minister.

He resigned, so he can be replaced. She only steps down temporarily, so that means that her post needs to be filled. And that means that her portfolio, um, will be covered by other commissioners. So the Justice Commissioner, Commissioner Reynders from Belgium, he takes responsibility for the competition portfolio, uh, and also for, um, for the Digital Markets Act.

And Commissioner Jourová, who is responsible for fundamental rights and was responsible for privacy in the previous Commission, um, she takes some of the responsibility, um, uh, on the digital single market.

[00:17:31] K: Okay.

Thank you. Thank you for explaining all that. I just saw there were some, uh, interim appointments. I'm like, I need to dig more into that. Absolutely. We have that. So, uh, before we jump over here to the U.S., there was a small blurb on the South Korean government approving the enforcement decree for the Personal Information Protection Act.

Um, this happened yesterday, I think it was. So, they announced the new enforcement ordinances will enter into force next week, so about the time that this, uh, episode comes out, and it includes unifying the standards for processing personal information. Um, so more developments there. I think I also saw some enforcement in China, for something, or some guidance that came out in China.

They found that there was an academic database that was illegally collecting, uh, personal information. If you haven't paid attention, China has been going through quite a few enforcement actions lately. Um, in addition to what Paul and I had shared a few months ago, that we learned that they were just showing up at companies' doors and going, we're here to advise you on what you're doing for personal data. And so do make sure that you're paying attention to some of these enforcement actions as well.

I know a lot of companies are still looking at whether or not they're going to stay in China, whether they're going to outsource to a local resource in China, or how they're going to approach that now that we're seeing more, um, information come out. Anything else before I jump here to the U.S.?

[00:19:01] Paul: Yeah. There is actually a global thing happening and that is.

[00:19:04] K: we got?

[00:19:05] Paul: Well, there is, there is a group of data protection authorities, led by the Information Commissioner's Office out of the UK, um, that is, um, starting to pay more attention to web scraping. You know, that, that a lot of organizations say, oh yeah, but information that is publicly available on the internet, we can just use it.

As has been the custom in the U.S. for, for many years, um, less so since CCPA was first introduced. But still, these data protection authorities, which include Australia, Canada, the UK, Hong Kong, Switzerland, Norway, New Zealand, Colombia, Jersey, Morocco, Argentina, and Mexico. So all within the framework of the Global Privacy Assembly.

Those 12 have now issued a joint statement on web scraping and data protection, and basically have said, um, that websites and especially social media sites should be much more careful in what information is shown publicly, um, and how that also can be scraped automatically. Um, it is, I think, um, a first step towards, um, more specific guidelines out of the, uh, Global Privacy Assembly, but also an initial warning that more enforcement might be coming

on these issues. Um, we've already seen in the past quite a few cases, especially for Clearview AI, um, where images were scraped from the internet. Um, we have, of course, quite a few cases ongoing, especially in the U.S. against OpenAI, for web scraping to, uh, train, uh, the generative artificial intelligence, the large language models.

And now there is also a big warning, especially towards social media companies, um, and other websites like, uh, marketplaces, um, where, uh, individuals' personal information would be available, and that they should also come up with technologies that would prevent, um, uh, the automated scraping of all the information from those sites.

[00:21:09] K: Which, for all we know, OpenAI may be the ones that come up with the technology to prevent it.

[00:21:15] Paul: it might be with the backdoor for themselves, I say cynically, um, But for example, also you you recall the cases against LinkedIn. I think they were settled in the end in the US but there were companies who scraped all of the data from from

LinkedIn to 

[00:21:29] K: for marketing purposes and 

[00:21:31] Paul: For marketing and advertising purposes and all of that 

[00:21:35] K: Which went against LinkedIn's terms.

[00:21:37] Paul: hmm. It breaches

[00:21:38] K: It may have been publicly available, but their terms that you have to agree with to use their service indicates that it's not publicly available just because it's publicly Viewable or accessible does not make it publicly available. You do have to agree to a site's terms before you join it Hopefully those terms don't say that you automatically consent to their privacy practices, but they do tell you Of limitations of what you can do.

So there is that. Um, so I'm glad to hear that. Um, I think, I think that should be the case, just because, again, something's publicly accessible doesn't mean it's publicly available and can be used for additional purposes other than what people gave that information to that website to use. And I'm not surprised that it's bubbling up, given the, um, controversy around whether or not, uh, works developed through AI are copyrightable.

Which our court here decided they were not, uh, which I think is the right way to go. I faced a lot of flak for that. A lot of people think they should be copyrightable. I'm sorry. Maybe in this regard, I'm a legal purist. I think it needs to be created by a human to be copyrightable.

[00:22:49] Paul: Oh, I'm going to be the lawyer here. I'm going to be the lawyer here and say it depends. Because I can imagine you write a prompt that is so specific. Um, and where you, or where you use the AI that is integrated in Photoshop or Illustrator to enhance images, but the basic design is still yours.

[00:23:07] K: I think that's a little different. That's not created totally by it. But wasn't there a case in... oh, you're gonna make, you're gonna make me look this up. Animals creating paintings and everything. Are those copyrighted?

[00:23:19] Paul: Yeah,

[00:23:20] K: Monkeys, elephants, dogs. There's all kinds of ones that create paintings and different things. And yeah, we might have to go back there and

[00:23:27] Paul: But they have been doing that for decades. I mean, I remember the videos from the 1980s where they gave a donkey a paintbrush, or, um, an elephant a paintbrush, uh, to start making works of art, and also monkeys, and those were sold for a lot of money.

[00:23:45] K: But were they copyrighted?

[00:23:46] Paul: Probably not because an animal doesn't have human legal rights.

[00:23:51] K: So there you go. But while we're on AI, um, I'm sure all of y'all, uh, heard this already, but OpenAI unveiled ChatGPT Enterprise, which is a version that allows companies to use it. Because this has been coming up in a lot of companies: how can we use AI? They legitimately want to be able to do it. They say it's going to have increased security and privacy,

using a lot of security, encryption at rest and in transit,

opened its bandwidth to connect to... no, no, no. I'm just saying they announced this is coming out. And as I've said before, it's not just OpenAI. AI is a solution in search of a problem. Companies want so bad to use an AI solution because it does save time, is efficient, and can broaden the scope of what it can look across.

Hospitals and the medical field have been using this for a long time, um, to be able to pick up on early signs of certain types of diseases and markers, because they can analyze thousands upon thousands upon thousands and find similarities that a human simply would not be able to find. Uh, so, I don't see a problem with that.

Uh, so they, they can learn. The large learning, what do they call it, the large language model, um, actually can learn so much faster than humans can. So, um, there are good things, but it is a solution in search of a problem, and everybody wants to find a problem for it to solve, and so OpenAI is coming out with one, built for companies to actually use.

So, yay, go forth and conquer.

[00:25:31] Paul: And do be careful with your, with your terms and conditions and copyright claims that you may get.

[00:25:38] K: Yes, make sure you run it past your privacy office, please, and run it past your legal office, and run it past your compliance and your ethics office, and run it past your security office. Make sure it's right.

[00:25:51] Paul: an agreement.

[00:25:52] K: Yes, please, please negotiate an agreement. And here's the thing. If you're not stopping your employees from going to OpenAI or ChatGPT or any of these available tools, they're doing it.

They're doing it without you. Um, if you try to block it, they're just going to find other ways of doing it. So the only way to be able to beat, defeat, work with these types of new emerging technologies as they come out is to work with your personnel and your colleagues, educate, educate, educate, find a way for it to work.

Um, or if you decide it doesn't work for your company, make sure people know that and make sure they understand why. Understanding is key. Uh, once you understand, that's half the battle. So understanding is key. If you just tell them they can't do it because you don't like it, they're going to do it. If you tell them you don't like it because it offers X number of risks and puts this in jeopardy and your clients are going to cancel your contracts and all of that, that might actually impress on them that maybe they shouldn't do it.

So just do all that. Okay. What else do we have? Um, anything else from you? I don't have a whole lot on my side that we hadn't touched on yet.

[00:26:58] Paul: So I have, well, I have one more, and one that you actually mentioned before we started the call. No, the, um, the Fitbit complaint that was raised by noyb. So another complaint from Max Schrems's team. Um, and this case, uh, is against Fitbit, or basically against Google, um, because Google now owns Fitbit.

And basically, if you have a Fitbit, a fitness tracking device, smartwatch thingy, you, if you are in the European Union, you have to provide consent for data usage. Um, it can, to some extent, classify also as sensitive personal data, because it gives a lot of insights about your health. And all of that data is sent to the United States.

Um, and you are not able to, uh, consent to that, and you are also not able to opt out of that. They filed a complaint, and I'm just wondering whether this might be, um, the backdoor request that will see the, uh, Data Privacy Framework come before, uh, the European Court of Justice.

[00:28:05] K: We might. There have been some scandals over the years involving Fitbit. One was, uh, in the military, with a secret base overseas. Uh, someone hacked into someone's Fitbit and tracked their movement, and was able to identify where the base was. Yep.

[00:28:20] Paul: And that was, that was a hacking attack. Um, the similar watches from Polar were actually just publicly accessible. You didn't even need to hack them. You could just log on to a website and get all the data from users and then just find out where they were located, and not just around military bases in Afghanistan, uh, but also from, uh, security personnel, people working for the NSA, the CIA, the FBI, um, US senators and, and all of that.

Um, they really dug deep.

[00:28:52] K: Yeah.

[00:28:53] Paul: up that, uh, that research piece

[00:28:55] K: We'll put that, we'll put that story on there for it. So, um, the other one, uh, the one that I just mentioned, the ICO one. So the ICO has a call out for comments. They're producing guidance on biometric data and technologies and the first phase they've already published.

The second phase, which is biometric classification and data protection, will include a call for evidence early next year. So if you've been looking at that, the consultation that is open right now will run through October the 20th. So you have got time; we'll put you a link for that. So it's an ICO consultation on the draft biometric data guidance.

So feel free to go, not feel free, do, please do go pay attention to that and look at that if you have an interest in that whatsoever. Uh, that's really good. The other things that came up this week: the California Privacy Protection Agency released draft cybersecurity audit regulations.

Um, pretty fascinating read if you haven't looked at those. And if you haven't heard, and I know you have heard because you're a fan of the podcast and we've talked about it, the SEC has released their cybersecurity rules. If you haven't gone and looked at that, please go do so.

Um, so you have that. And there was another one that I have, yes. So this was with NIST, uh, the National Institute of Standards and Technology here in the U.S. They have a final version of their draft cybersecurity guidance related to HIPAA, the Health Insurance Portability and Accountability Act. It is spelled with one P and two A's.

It is not a female hippo. Um, they will publish it later this year. Um, it includes specific resources, of course, as they usually always do, for small entities, and they also clarify certain terms. And so make sure that you're looking at that. They're going to adjust their appendices.

So they have their security rules and specifications crosswalk, which has been almost impossible to understand unless you sat down and analyzed it carefully and cross-mapped it yourself and drew lines and squiggles. They're going to make it a little bit easier to understand. Uh, and they're also going to adjust the HIPAA Security Rule resources.

Y'all all know I'm a big fan of HIPAA. Uh, so make sure that you're looking at those as well.

So, recently I was speaking to Jason Cronk. We had him on the show. We were actually talking about how dense the NIST privacy framework is and how it's not really user friendly whatsoever. Um, I've always been an ISO girl because they're easy to use.

They're easy to understand. I'd like to become a NIST girl, so I'm trying to wade through this, but I have to tell y'all, it's dense. And I know NIST has working groups; I should probably look at joining one of the working groups. But they're not user friendly. Have you ever tried to understand them? Other than the cybersecurity framework, which is actually pretty friendly.


[00:31:48] Paul: Well, I mean, I've read both the NIST and the ISO frameworks, but, um, for me, the problem with ISO is that they are not accessible unless you pay over a thousand euro to get access to a document with guidelines. NIST is publicly available, um, and both of them, um, are not the most useful if you already have a clear law to work with.

[00:32:12] K: Yeah, now that is true. So I'm hoping — yeah, you're right, being accessible is one thing. Um, the other is just the way it's presented. So where NIST, with their cybersecurity guidance and the crosswalk in the HIPAA resources, is redesigning how it's presented so people can absorb it more easily. I think originally they were created by very, very technical people, uh, for government,

and they were not user friendly for the average person working in any kind of technology or with any kind of data. Which is, you know — if you're working with personal data, you could be anywhere from frontline customer service to HR to marketing, to whatever. It's people of all different walks. And so it needs to be user friendly so people can absorb and understand what the standards are. So maybe I will dip my toe in that pond, who knows.

[00:33:08] Paul: Well, user friendliness is always important, that's for sure. I have one more, and that is a study done by the Mozilla Foundation

[00:33:18] K: Yes! Yes! 

[00:33:20] Paul: on electric vehicles, and they have been looking into how privacy friendly, um, electric vehicles actually are, um,

[00:33:33] K: And they're not.

[00:33:34] Paul: they're not,

[00:33:36] K: They needed to study that, though, to prove to people that they're not, because we kept telling them they weren't, and nobody listens. They're not.

[00:33:43] Paul: exactly. So, um, they actually looked at electric vehicles on a range of issues, um, and they have looked at 25 different car brands, um, and all 25 failed the test. Um, it may or may not surprise you that, um, Tesla scored the worst. Um, they really failed on all levels. Um, French manufacturer Renault actually scored best, but still failed.

Um, uh, on the test, they looked at, uh, data usage. They also looked at, uh, user control, at, uh, tracking, and also the data security and also the usage of AI. Um, and also at all the various terms and conditions, um, that you would need to accept when driving a smart car. And basically you would do that on the small screen of the car or in an app.

[00:34:45] K: How many smart things are there in the average car?


[00:34:49] Paul: Well, too many. I mean, there are cameras, there are sensors, there are microphones, uh, there's everything. And there are some quite serious, um, serious things. Um, for example, Hyundai says in their privacy policy that they will comply with law enforcement requests, whether formal or informal, um,

[00:35:12] K: Nice.

[00:35:13] Paul: which, of course, is not something that you would want.

[00:35:16] K: But here's the thing: all the others comply with law enforcement requests

[00:35:20] Paul: oh yeah,

[00:35:21] K: without telling you that

[00:35:23] Paul: probably. Yeah, well, at least they're honest about it, that's true. Um, it can be worse, though. Um, Nissan, um, actually says, um, that they also collect data about your sexual activity. Um, also Kia mentions that they can collect information about your sex life. And six other companies say they can collect your genetic information or genetic characteristics.

[00:35:51] K: So in other words, no more backseat sex.

[00:35:54] Paul: uh, apparently, unless you want it to be in the hands of the likes of Elon

[00:35:58] K: My god, now the cars have cameras. So please, people, no more backseat sex. No more front seat sex. No sex while you're driving.

[00:36:04] Paul: Well, I mean... That is the problem that they've been raising in San Francisco, right? With the self-driving cars that people are actually using to have sex. So that was a story in the San Francisco Chronicle a few weeks ago. Um, and that is of course even more documented, because then there are also cameras inside the car for security reasons.

[00:36:27] K: There's cameras everywhere for every position. They're probably collecting the DNA from the seats. People — you may, I almost think I may pause for you, is water right there — but yes.

[00:36:37] Paul: Well, I mean, you can, um, you can imagine that it's only a matter of time before we see the first video leaked of somebody having sex in a self-driving car.

[00:36:48] K: Oh, absolutely. Whether it's leaked by the car people, the rental car people, the people who own the car or rent the car, or the

[00:36:56] Paul: Or somebody who just hacks the camera because of lousy security.

[00:37:00] K: Right. Exactly. It could be any and all of the above.

[00:37:03] Paul: So on that happy note, we'll wrap up another episode of Serious Privacy.

[00:37:08] K: Not so serious today,


[00:37:10] Paul: I'm always going to — no, I'm almost wanting to say, people, go have sex, but do it in your bedroom. But with the listening devices there, the Alexas and all those kinds of things, that might also not be the best idea,

[00:37:23] K: Stop sex altogether.

[00:37:25] Paul: or build a Faraday cage in your house.

Just saying. Okay, all kidding aside, um, thank you all for listening to yet another episode of Serious Privacy. Join the conversation on LinkedIn — you'll find us under Serious Privacy. You'll find K on social media as @heartofprivacy and myself as @euroPaulB. Like and review us in your favorite podcast app, and until next week, goodbye.

[00:37:50] K: Bye, y'all.