Serious Privacy
For those interested in the hottest field in the technology world. Whether you are a professional who wants to learn more about privacy, data protection, or cybersecurity law, or someone who just finds this fascinating, we have topics for you, from data management to cybersecurity to social justice, data ethics, and AI. In-depth information on serious privacy topics.
This podcast, hosted by Dr. K Royal and Paul Breitbarth, features open, unscripted discussions with global privacy professionals (those kitchen table or back porch conversations) where you hear the opinions and thoughts of those who are on the front lines working on the newest issues in handling personal data. Real information on your schedule - because the world needs serious privacy.
Follow us on Twitter: @PodcastPrivacy or LinkedIn
Serious Privacy
That week when SpongeBob made Privacy News
On this week of Serious Privacy, Paul Breitbarth of Catawiki and Dr. K Royal discuss APRA resuscitation, #elections and #AI and #sensitive political data, NY Child’s Code, the CA and French MOU, the Norwegian DPA being forbidden to impose daily sanctions on Meta by the Privacy Board, government surveillance drama (criticized by Signal), and even a little SpongeBob. And congrats to ZoomInfo for the first #TrustArc #AI certification.
Tune in for some living, learning, and laughing.
If you have comments or questions, find us on LinkedIn and IG @seriousprivacy @podcastprivacy @euroPaulB @heartofprivacy and email podcast@seriousprivacy.eu. Rate and Review us!
Proudly sponsored by TrustArc. Learn more about NymityAI at https://trustarc.com/nymityai-beta/
#heartofprivacy #europaulb #seriousprivacy #privacy #dataprotection #cybersecuritylaw #CPO #DPO #CISO
Please note this is largely an automated transcript. Please listen to the audio for accuracy.
[00:00:00] Paul: Hi everyone! Summer is finally here in the Netherlands after what seemed like an endless fall, winter, and back-to-fall season with rain, rain, rain, and more rain. Suddenly, it's 28 degrees Celsius. I don't know what it is in Fahrenheit, but it's high. It's hot, it's sunny, it's blue, and probably by next week, we'll be back to fall.
In the meantime, the world is looking at the UK, at France, and the US for elections. So we'll talk a bit about election data, but of course we'll also catch you up on what's been happening in the world of privacy and data protection. And although we said during prep that this was a quiet week, of course it wasn't a quiet week.
As always, my name is Paul Breitbarth.
[00:00:52] K: And I'm K Royal and welcome to Serious Privacy. So today, the unexpected question. If your pet could talk, what would they say? And the reason why I went with that one is because my pet would say, please God, get us in our own house. I am so ready to be out of this one.
[00:01:12] Paul: I want my peace and quiet, please.
[00:01:14] K: My big cat sleeps under the bed now because there are two little kittens here that he doesn't want to be around. So it's like, Oh, I'm sorry, baby. We're trying.
[00:01:25] Paul: Well, I guess Higgins would tell me today: can you make sure that it's a bit cooler, because I think it's too warm.
[00:01:33] K: It's too warm.
[00:01:34] Paul: Just imagine a brown Labrador with a thick, thick coat, and then these very warm temperatures that he is not used to. And otherwise he probably would say, please feed me, because he's a brown Lab.
[00:01:47] K: Please. Yeah, exactly. Please feed me. Okay. So we've had a few things happen in the past week.
One of the things that Paul and I have debated talking about is, do we talk about the revisions to the American Privacy Rights Act? So it's not an act
[00:02:03] Paul: Well, apparently it's still alive because they revised it.
[00:02:05] K: Yeah, exactly. So it's not a bill. It's not a law. It's proposed and there are some amendments made to it and y'all have heard our opinion on whether we actually think it's going to get passed or not this year, which don't listen to me for anything in an election year.
Apparently it could get passed and
[00:02:21] Paul: What's the deadline? What does the potential time schedule look like? Do you have any idea? Because, I mean, US legislatures are somehow only part-time parliaments, maybe at the federal level they have a bit more, but at some point they will go into summer recess, and then will they come back before the elections, or will they immediately go into election recess?
[00:02:45] K: They'll probably immediately go into election recess. This is crazy. I've never seen an election year like this, and it's an election year in several countries. Paul and I will talk about that in a little bit as well; we have a few updates on that. But let's talk about the APRA.
APRA. I don't know what we're going to call it. If it ever gets passed, we're going to call it a miracle. But the first draft was released in April; if you need any insight on what the first draft was, you can go read the one from April. The biggest criticisms that APRA got were the same things the criticisms have always been: it's the preemption, it's the private right of action. It actually wasn't a horrible bill. It actually had some meat to it, which I was afraid that anything that succeeded would not have.
This actually did. I was pleased to see that, but of course now it's been revised. And so what do you think about some of the revisions? What are some of the ones that stand out to you?
[00:03:43] Paul: So for me, what mainly stands out is that they are going even more strongly after online advertising, which seems to be following the trend that we've also seen here in Europe, not so much in terms of legislation, of course, but mainly on enforcement. There are expanded obligations introduced for data brokers.
There are new definitions for contextual advertising, first party advertising, and also redefined definitions for targeted advertising. So that seems to be a big part of this update. The other one mainly relates to algorithms and automated decision making.
[00:04:24] Paul: So there are some changes to the impact assessments that need to be done when you deal with algorithms. Also what is a covered algorithm will be changed and we'll hold off on discussing all these definitions until the text is final, because they keep changing for the foreseeable future. Highly likely at least,
[00:04:44] K: Well, and they were also looking at taking out the expanded COPPA protections, right?
[00:04:48] Paul: Yes. And what I thought interesting is that there will now be an opt-out offered for so-called consequential decisions following automated decision making or algorithmic decision making. Basically what under the GDPR would be known as a decision with a legal or otherwise significant effect.
So it's not as a consequence of the, well, it is also as a consequence of the algorithmic decision making, but it needs to be a significant effect. And you can opt out of that and also allow for a human review.
[00:05:22] K: Yeah. Also, you can request that the decision to deny or accept your opt-out not be made by automated means, but has to be made by a human, which is really interesting in terms of how a company would
[00:05:36] Paul: you listening Meta?
[00:05:37] K: Right. How would you put that into play? It's very interesting. So we'll see how far that gets down the road as we go.
But since we mentioned it, let's talk about the elections. Yes, we have elections coming up. Believe it or not,
[00:05:53] Paul: Well, this weekend, the first round of the snap parliamentary elections in France. Next week, Thursday the 4th of July, U.S. Independence Day, but also the elections in the United Kingdom. And then somewhere around the first Tuesday in November, I believe the U.S. has a few elections.
[00:06:11] K: Yeah, we have a few coming up. Which, by the way, we get a lot of questions. This is interesting: I went to a state bar conference a couple of weeks ago, when Paul and I were both doing our traveling, and the clerk checking me in said, okay, can I ask you a question? The question they wanted to know was, can Trump pardon himself if he wins the election?
And he said, we're not supposed to talk politics with guests. I said, well, you're not actually talking politics. You're asking a legal question. Can the president
[00:06:41] Paul: irrespective of who he is, pardon himself of convicted crimes, or herself? Let's pretend at some point we get a woman. Can a president pardon him or herself of convicted crimes? Or family members, for that matter?
[00:06:56] K: But here's the thing. The convictions in question, and I think there are 34 of them, are state convictions. So a federal president cannot pardon a state crime. It would have to be the executive officer of that state.
In this case, New York, where the governor is a Democrat, not likely to pardon.
But does that mean if the next governor is Republican that they could? Yeah. So the president can only pardon federal crimes, and only governors can pardon state crimes and convictions. So there's your little update there, a question that I think has been on top of a lot of people's minds: well, if he's elected, he'll just pardon himself.
No, it doesn't work that way. And there is a lot of interesting conversation going around in America about, can we have a president that is convicted of a felony? It's not forbidden. There are some other crimes mentioned in the Constitution, but apparently the constitutional experts say it does not address being convicted of a state felony.
So, okay, there you go.
[00:08:01] Paul: Somehow the founding fathers didn't look at this scenario when they wrote the constitution in 1776.
[00:08:07] K: Yeah. And you would have thought that they, considering that they addressed it in other areas, maybe they thought good behavior, or whatever the terminology in the Constitution is (constitutional experts, don't hold me on this), covered it. Whatever that terminology is in the Constitution for good behavior or other behavioral things, it didn't contemplate felonies of this sort.
So, there you go. That's how American politics are sitting right now. Other things that we have had happen in the news lately.
[00:08:37] Paul: Well, no, let's talk a bit more. Let's look a bit more at the elections, because both the CNIL for France and also the ICO for the UK have given out very specific guidance to the political parties on the use of personal data as part of the elections. If we look at the CNIL, they made their announcement because of course the elections in France were announced on the night after the European election results were announced.
So they immediately put out their guidance, but also gave a summary report of what happened during the European elections. They registered 167 notifications and two complaints during the European election campaign, and they have also sent out four formal notices to political parties to remind them of their legal obligations.
And they are also inspecting, or investigating, the campaign of one particular candidate, probably for misuse of personal data. They also talk a lot about the use of artificial intelligence, that
[00:09:45] K: Yeah.
[00:09:46] Paul: if artificial intelligence is used during the election campaign, it should be recognizable as such, and it should be honest
[00:09:53] K: Now we have that in the U.S. as well.
[00:09:55] Paul: Yeah, so that there is no misinformation to the voters, which is of course also very, very important. And they have also put out specific guidance for the voters on the various rights and what they should know. So, in any case, if
[00:10:12] K: And are the voters paying attention?
[00:10:13] Paul: I don't know, because I'm not currently in France, but they make clear, for example, that if there is any communication from a candidate, it should be clear who that candidate is, who is responsible for that communication.
And also that, if candidates have collected their information, they should also provide data subject rights, such as the right of access and an understanding of where the data is coming from, and so on.
[00:10:38] K: And what you're doing with it and who you're sharing it with and all of that good stuff. The same data rights apply. And this is special categories of data because these are political
[00:10:48] Paul: Yes,
[00:10:48] K: data.
[00:10:49] Paul: Exactly. And similar information is provided in the UK by the ICO. They also say to the voters: you should expect clear privacy information if a political party is using your personal data, which should be easy to understand, and political parties are entitled to receive a copy of the full electoral register. That is something that exists in the UK; like in the U.S., you register to vote, you register if you want a certain affiliation.
And all of that information can be used by parties. Parties must be transparent if they are using profiling techniques or social media advertising, explaining how information from a petition or a survey will be used, especially if candidates go canvassing and also collect information at the door on your positions.
All of that information should be documented and made transparent. And also, for the UK specifically, there is an obligation now to show a voter ID, a photo ID, before you are allowed to vote. That has been the case in many European countries for many years, so that you prove your identity when you go to the polling station. But in the UK, that is still something fairly new. In Northern Ireland, that has been the case for many years, but this is the first general election where voters across the UK will need to show a formal photo ID. So that will probably lead to a lot of discussions next week for people who have forgotten. In the Netherlands, by the way, one of the party leaders forgot their ID when they came to vote, so they were sent back home before they could
[00:12:26] K: Well, I mean, is it not natural to think that you would have to prove you are who you say you are to go vote? I mean, if you mail in an absentee ballot, and I don't know how big absentee voting is over in Europe, but here, if you mail in an absentee ballot, you have to sign the outside across the signature line.
And they supposedly compare your signature to the signature on file for your identification. So someone could fake that. But when you go to vote in person, yes, you have to show you are who you are in order to vote.
[00:13:01] Paul: Yeah, but I think the novelty here is a photo ID. So you could show any kind of identification before, and now it needs to be one with a photograph on it. So, a national identity card, a
[00:13:13] K: I think when I was younger, that might've been the case, yeah. That you had to show something that shows who you are; you could bring your utility bill or something to prove you're a resident, but it doesn't prove your
[00:13:24] Paul: Exactly. But that also means, with this very specific guidance that was put out by both the CNIL and the ICO, that political parties and candidates have explicitly been put on notice.
Because this was not just a statement on the website; there were also accompanying letters to the parties and to the candidates, meaning that we will likely also see some enforcement. As mentioned, for the European election the CNIL already announced that they have started several investigations.
And we may see more of that, which is, of course, interesting because, as you rightfully said, this is sensitive personal data.
And I'm actually curious how that would work in the U.S. under the various state laws for voter registration data. Would that be covered, or is it only companies, private entities, and not so much political parties?
[00:14:14] K: Well, I mean, that is interesting because some states exclude or exempt government entities and nonprofit entities from the state laws. Now, Colorado is an example: they don't exempt nonprofits. Most of them do. Most political organizations would consider themselves nonprofits; they would be under some sort of nonprofit structure.
So they would most likely be exempt from most of the state laws. Colorado is the only one I can bring to mind that does not exempt nonprofits. And here's the thing with the exemption, and this was the Colorado AG when he was talking about it: would we take that into consideration for small nonprofit organizations that are state based, that, you know, just rescue dogs?
Yes, they would be exempt. Would we exempt United Way? Probably not; we would expect them to have the resources to adhere to the law. But California's is phrased interestingly, and I think a lot of people have forgotten. They don't exempt nonprofits across the board; they exempt nonprofits that do not offer a business or commercial advantage or purpose to their members. And one of the examples is the bar association: is that established for a commercial benefit for its members? So a political organization might very well fall outside that exemption.
But I don't know that anybody's looking at that level of detail or if the regulations they promulgated since have just reinforced the language of nonprofits being exempt. But originally it was not all nonprofits exempt. It was whether or not they offered a commercial benefit to its members. So that was interesting.
But there are some states that do adopt basically the same language as the special categories of data, the political and philosophical beliefs and leanings; they do roll that into the definition of sensitive data. So how that's going to impact elections in those states, I haven't heard anything about that yet.
So that might be something I need to go do a little digging on.
[00:16:26] Paul: Yeah, and there is only one real enforcement authority apart from the Attorneys General at state level, and that is the California Privacy Protection Agency which by the way signed an MOU, a Memorandum of Understanding, earlier this week with the French Data Protection Authority, to help each other with investigations and to reinforce their understanding of novel technologies,
[00:16:50] K: Isn't that interesting?
[00:16:52] Paul: It is. It is the first memorandum of understanding, apparently, that Ashkan Soltani has signed on behalf of the CPPA with a foreign regulator.
[00:17:04] K: I think California is preparing for its secession from the union,
[00:17:09] Paul: Well, that might be their only hope when it comes to data protection. Because right now, I don't know how, during an investigation, the French Data Protection Authority would be able to share personal data with the CPPA, given it is highly unlikely that the CPPA is signing up to the EU-US Data Privacy Framework.
And I don't believe that standard contractual clauses are part of the Memorandum of Understanding. So what legal basis would there be to share information during an investigation outside of the European Union?
[00:17:46] K: Yeah. Outside of larger political frameworks.
[00:17:50] Paul: Could that maybe be a risk based approach that suddenly the CNIL is embracing? I don't know.
[00:17:56] K: Very interesting. Inquiring minds want to know more about this one, the whole circumstances surrounding it. For all we know, it happened over a glass of whiskey at IAPP. Who knows, right?
[00:18:10] Paul: It's very possible. I know that the memorandum was signed in Paris, so in any case Soltani was in Paris for that. It could indeed have come up during the IAPP in DC or in other conversations. Also, it doesn't bode well for California Big Tech if the French and the Californians are going to partner on this, and if the Californians are going to have similar MOUs signed with other authorities. So, also here,
[00:18:40] K: new technologies.
[00:18:41] Paul: You may want to keep an eye on the Global Privacy Assembly in Jersey, because I expect more memoranda of understanding to be signed there between multiple DPAs. It's typically the occasion for such things.
[00:18:56] K: Nice. And it would have to be over a glass of wine, given Napa Valley in California and the French wine country. So
[00:19:04] Paul: Yes, although I'm not sure about Californian whiskey. There is French whiskey, which is pretty decent. But I have never heard of California whiskey.
[00:19:12] K: I'm sure there are whiskey connoisseurs out there. I am not one of them. Y'all know my favorite wine is whiskey, but I'm still not a big drinker. Okay, let's just put that on the table: one every other year or so, and I'm good. But we do have a few other things that have happened this week.
Some of them really interesting. Well, let's stay over in Europe for a little bit. So the Norwegian Data Protection Authority was barred from imposing daily fines on Meta and other international companies, as ruled by the Privacy Board. So
[00:19:52] Paul: And now you want to know, what is the privacy board?
[00:19:55] K: right,
[00:19:56] Paul: So Norway has an interesting model, because they have a two-tiered system. They have the Data Protection Authority that does the actual investigations and also proposes the sanction, but then there is the Privacy Board, which serves as a sort of appeals committee and is an independent authority as well, with different people in it.
And Meta appealed the daily sanctions. So basically this is a compliance order under financial penalty. For every day that Meta would not comply with the compliance order to stop behavioral advertising, they would forfeit a penalty, and that would grow day by day by day.
[00:20:37] K: We talked, when they first put that in place, about how that was an unusual tactic to take; it was more like an American tactic to impose a penalty per day, and it was unusual. So apparently it is so unusual they're not going to be allowed to do it. Although the ban is still in effect, they've essentially been handicapped from enforcing their ban, because there's no additional penalty for the length of time the noncompliance goes on.
[00:21:01] Paul: Yeah, the basic decision stands. So the ban on behavior-based advertising on Facebook and Instagram remains in place. But also the chair of the Norwegian DPA, Line Coll, has said this surprises her, because although Norwegian law says that daily fines can be imposed when businesses breach data protection laws or the GDPR, the board has now decided that exceptions must be made to this when it comes to international companies. How that exactly works, I'm not completely sure, and apparently the Norwegian DPA is not completely sure either, because they claim this takes away from us an important instrument in dealing with large international companies, and it also means that we can impose daily fines on Norwegian businesses, but not on international players.
And that is an unfortunate discrimination. So they are calling on the legislator to actually help clarify this issue.
[00:22:03] K: Well, that will be an interesting one to watch, there's no doubt about that. Let's see what else we have. We talked about the European elections, which was good. Let's see what else there is; there's quite a few things going on, and I'm just trying to comb through which ones we want to talk about. We talked about the APRA. Let's talk about this one here.
So, New York, because we talked about New York earlier: there is now a New York Child Data Protection Act. It's being called groundbreaking legislation that protects kids online. The SAFE for Kids Act prohibits collecting personal data from minors without consent. It makes New York a leader.
As you may know, Vermont actually vetoed a kids' online code; New York is actually picking it up. But what's really interesting about this, in states wanting to pass these extra protections to address youth mental health, is that the U.S. Surgeon General, Dr. Vivek Murthy, pushed for warning labels on social media platforms to alert parents to potential mental health risks to teens.
He emphasized congressional action, comparing it to past public health campaigns such as those against smoking, tobacco, and alcohol, different things like that. So he's saying that this is a public health concern, a mental health risk to teens, that tech companies need to improve on safety features, and that they need to share internal data on health impacts where they have it.
So that one took me by surprise. I know there's been a lot of conversation about how badly social media impacts youth, and this is not just in the U.S.; Japan limited online time for youth. Of course, parents are always able to override these, but they are trying to restrict the amount of time that youth spend online, given the overwhelming concern, not necessarily just about child sexual abuse online, but about their mental health in reading bullying comments or participating in bullying others.
Or the hateful things that could be said online, or the types of actions that go viral. Remember the whole Tide Pod thing? Please don't eat the Tide Pods; they can kill you. Well, that was a TikTok challenge, and there are a lot of these challenges asking teens to do things. There was the one teen who was a significant driver in convincing her boyfriend to commit suicide, encouraging him on social media to do so.
So there's a lot of concern about the mental health of kids online. Heck, let's just say there's a lot of concern about adults online too. However, they seem to think that we have the capacity to make such judgments for ourselves. I'm not convinced that we do. But.
[00:24:49] Paul: Let alone for our kids.
[00:24:50] K: Persons are smart. People are stupid. You put us in that pack mentality and anything can happen.
[00:24:58] Paul: So while we're on children's data, can you tell me a bit about SpongeBob and privacy?
[00:25:04] K: Oh my goodness, this SpongeBob thing. So yes, we're back in California again. The California Attorney General, Rob Bonta, is making headlines left and right, but this was in conjunction with the Los Angeles City Attorney, Hydee Feldstein Soto. They announced a settlement with Tilting Point Media over the mobile app game SpongeBob: Krusty Cook-Off.
It says that they violated the CCPA, they violated COPPA. There are strict injunctive terms to ensure compliance with data protection laws, including obtaining parental consent and proper configuration of third-party software. So it's interesting: a $500,000, a half a million dollar, settlement with Tilting Point Media.
I have no idea how big a company Tilting Point Media is, but they were collecting and sharing children's data through the mobile app game SpongeBob: Krusty Cook-Off. I have no idea. I know what SpongeBob is; I have no idea what the game is. But that would have had to come to someone's attention through a complaint or something.
Or, the Attorney General's office did say last year they were kicking off a focus on mobile applications and their privacy practices and everything like that, so it could have come to their attention through that. I'd have to dig more into the story, but there you go. Even SpongeBob is not safe from violating data privacy laws and being held accountable.
So take that to the bottom of the ocean, you big yellow square-pants-wearing thing. You don't know what
[00:26:38] Paul: Okay, I have absolutely no idea what you just said, but I guess, well, I mean, I've, I've seen it. I've seen the images, of course, but I've never watched it because,
[00:26:49] K: I haven't watched the show either, unless my children forced me to sit through it, and even that's just in bits and pieces. I'm not a SpongeBob fan. I'm glad my grandchildren are not SpongeBob fans. I don't know, I think it's stupid. But I think there's a lot of things out there that are stupid, which would probably get me in trouble with some of our fans.
Like Ren and Stimpy. And Dumb and Dumber. And some of these other things that I just think are, well, no.
[00:27:17] Paul: Okay, well for me, when I look back at my youth, it's things like The Fraggles and
[00:27:24] K: Fraggle Rock.
[00:27:25] Paul: Fraggle Rock, exactly,
[00:27:26] K: Mine was Wile E. Coyote and Bugs Bunny and Roadrunner and Elmer Fudd and everything. So I got the classics on my
[00:27:33] Paul: Yeah, I mean, those as well. So yeah, all these modern cartoons, they just don't mean anything to me. But that's just me. I mean, I have no kids, so I'm not forced to watch any of that.
[00:27:47] K: Well, there you go. So there's another one. Meredith Whittaker, the president of Signal, has been criticizing governments for exploiting public distrust of big tech to justify encroachments on user privacy under the guise of addressing legitimate concerns. Now, y'all know what Signal is, right?
She's arguing that the efforts to undermine encryption and expand surveillance capabilities only exacerbate the core issues of data collection and manipulation by tech giants. So here's the thing. This is where government wants to have a backdoor into messaging. Paul and I talked about this a few weeks ago on the show; I'll make sure I pull up the link for y'all on that one. But this is where the government is pushing that things shouldn't be encrypted end to end, because then there would be no visibility for any type of surveillance.
And sometimes that surveillance is deemed necessary by government to prevent terrorism or human trafficking or something along those lines. So she's saying that they're exploiting this fear to encroach on individuals' privacy. And Paul's comment back then, and I'm sure it's the same right now, is,
[00:29:01] Paul: that if you create a backdoor, you create a backdoor for everybody and you can't rely upon the good intentions. It's a very slippery slope and that is a continuous discussion because, as you know, we spoke about CSAM earlier this year, the European legislation that's forthcoming. It is still forthcoming.
The European member states were supposed to vote on it last week and to approve the legislation. But it was pulled from the agenda at the 11th hour because apparently there was no majority to support the legislation. So it has been postponed. It will come back probably later this year, but more negotiations are needed.
Which actually, in my view, is a good sign, because if it had been voted on now, it would have been adopted, and that doesn't mean that's the end of it, because parliament also still needs to approve it. But yeah, for now, it was pulled in the European Union.
[00:29:56] K: And government doesn't need entry into all of our communications,
[00:30:01] Paul: No, I mean, our smartphones have really become our most intimate device, the most intimate thing that a human being can own. It knows more about ourselves than we do ourselves, probably. So to provide access to law enforcement or a government, and have phones scanned and content that may be on your phone, and may or may not be breaching the law, automatically reported to law enforcement, that is a dangerous precedent.
And of course, when you are talking about child pornography, yes, it should be fought. But it is the slippery slope, and I'm not gonna repeat the whole discussion that we had that day. But what about certain countries fighting LGBT content, or maybe a right-wing politician who wants to go after climate or Palestinian activists? So,
[00:30:54] K: Where do you draw the line? And where can you enforce the drawing of that line? That is an even more difficult question. So it's not that we're saying that government shouldn't have access. We're saying that it shouldn't be unfettered access. There should be some
[00:31:09] Paul: checks and balances.
[00:31:11] K: in place, checks and balances and they should be real.
Like I said, if you go pull the transparency reports for mobile phone carriers and different social media platforms and look at how many requests they get and how many they fulfill. They can't do it in real time; they have to do it on a delayed basis. But how many they get, how many from the National Security Agency?
I mean, there are a lot of requests, people. And sometimes, I know law enforcement do the best job they can, but sometimes it's: we think maybe these 10 people, but we don't know; we want to look at their social media so we can get a better idea. So they don't even have a definite, it is this person, 90 to one
[00:31:55] Paul: Yeah, absolutely true. So one final thing, and that's a small shout-out to ZoomInfo for obtaining the very first TrustArc Responsible AI certification. You've probably heard us refer to these in the ads around these episodes, thanks to our sponsor, TrustArc, but the first company has actually done it. So well done, ZoomInfo, for that.
[00:32:18] K: Yep. Congratulations. And with that, we bring our episode to an end.
[00:32:23] Paul: And if you like these episodes, do share them with your friends and family and of course your colleagues. Join the conversation on LinkedIn; you'll find us under Serious Privacy, you will find K on social media as Heart of Privacy, and myself as EuroPaulB. The next two weeks we have some guests in store for you, so look out for those.
We won't reveal yet who we have, but they are both exciting guests. Until next week, goodbye. Bye.
[00:32:48] K: Bye y'all.