Serious Privacy

If it's broken, fix it (UK breaches with Ralph O'Brien)

August 24, 2023 Dr. K Royal and Paul Breitbarth, with Ralph O'Brien Season 4 Episode 31

In this episode of Serious Privacy, Paul Breitbarth of Catawiki and Dr. K Royal connect with Ralph O'Brien (together again for the first time in a long time) to discuss some of the data breaches happening in the UK, such as the breach of the electoral register and of police officer data in Northern Ireland, historical breach statistics, and more.


If you have comments or questions, find us on LinkedIn and IG @seriousprivacy @podcastprivacy @euroPaulB @heartofprivacy and email podcast@seriousprivacy.eu. Rate and Review us!

Proudly sponsored by TrustArc. Learn more about NymityAI at https://trustarc.com/nymityai-beta/

#heartofprivacy #europaulb #seriousprivacy #privacy #dataprotection #cybersecuritylaw #CPO #DPO #CISO

Please note that transcripts are largely automated. For accuracy, listen to audio.

[00:00:00] Paul: By now we know that data breaches occur on a daily basis. Just look in this morning's or yesterday's newspapers and you are sure to come across one reported breach at least. And when you read the reports, oftentimes they are reported to be the result of complex or sophisticated cyber attacks. But more likely it's just human error.

In recent weeks, a couple of data breaches in the UK gave rise to more serious headlines, also because of the consequences to the people involved: police officers, and the general public, the voters. We would like to know more about these breaches and the fallout in the UK, so who better to invite than our very own British correspondent, Ralph O'Brien.

My name is Paul Breitbarth.

[00:00:54] K: And I'm K Royal, and welcome to Serious Privacy. Great to see you again, Ralph. I can't keep the smile from my face when I know you're going to be on the show, 'cause I know it's going to be fabulous. So, all right, unexpected question. I literally just flipped to a random page. Here we go: what do you lie about?

[00:01:12] Ralph: What a fabulous question. 

[00:01:16] K: And you have to tell the truth.

[00:01:18] Ralph: Yeah, I lie about lying. No, that's a very good question. 

You're obviously trying not to lie, I guess. But you work in data protection and privacy, and there are some things you want to keep private and some things you are happy to have in the public domain.

I think data protection is actually sometimes about managing boundaries, right? And so for me. I've led a very exciting past, none of which I feel comfortable talking about on the podcast, but you know, there's a reason why we're data protection and privacy practitioners, right?

[00:01:52] K: Right,

[00:01:53] Paul: Yeah, I fully agree, so... no obvious lies, and certainly no obvious lies that I'm gonna share out loud, because then what's the purpose of having lied about them in the first place? But there are differences in what you share with whom and why. And I think, even though you always try to be true to yourself, everybody has different personas that they use depending on where they are and who they meet.

You probably have a different persona in the office than you have with your friends, than you have with your family, than you have with a stranger that you meet at the airport.

[00:02:27] K: right.

[00:02:29] Ralph: I think it's managing boundaries, you know.

[00:02:31] Paul: But what about you, K? What are you lying about?

[00:02:33] K: This comes from a book with five years' worth of questions, so I was looking over what I normally lie about, and I have to tell you, I only did it for three years. So I'm filling in the last two years as we go.

2016, I lie about my name. 2017, my work turnaround time.

2018, my name. The only other thing, because it depends on how much chocolate you give me: I'll tell you 50 different stories about how my name came to be just one letter with no middle name. There's a bunch of different stories around that. But I also lie about my age, I tell people, or used to tell people.

And I look damn good for my age. I've had to up it by a year or two now because people believe it. And they're like, oh my God, you do look wonderful, what do you use for skincare? Okay, I'm 85 and I look damn good for my age.

[00:03:27] Ralph: There's a beautiful quote from the Batman franchise where the Joker says, "I prefer my past to be multiple choice," which I think is a beautiful statement.

[00:03:39] K: I love it. I absolutely love it.

[00:03:41] Paul: For sure. But K, what's wrong with claiming that you're 39?

[00:03:45] K: 'Cause nobody believes it. I mean, what is it people say, they're 29 and holding or whatever? Okay, nobody really believes that. But if you're 75 and you look damn good for your age, people are like, girl, yes, you look good.

[00:03:59] Paul: Oh, well, I'm happy to claim that I'm 29 for the 13th consecutive year.

[00:04:04] K: But it's only 13 years now. Wait till it's 30 years.

[00:04:08] Ralph: 21

[00:04:08] Paul: No, no, I was claiming 29, not 39.

[00:04:11] Ralph: Do you know, this is lovely.

I'm always asked to come on the podcast when one of you isn't here. So can I just say, it's so lovely to be on the podcast with you both?

[00:04:23] K: didn't think about that, Ralph. You're absolutely right. And of course, we asked you to come on for a topic that is near and dear to your heart. All these flippin breaches that the UK's been having. Which, you know, I was pulling up stats on UK breaches to see, you know, how many of y'all been having in the past.

And how does this one stack up against the ones that you've had? And frankly, according to the stats I'm pulling, up until this year your last big breach was in like 2020, and there were only like two of them.

[00:04:58] Ralph: Well, I don't think the ICO would say that.

[00:05:00] K: Well, probably not, but I mean major large breaches that really make the news, you know. And most of them seem to be credit card breaches and things like that.

Like the British Airways one back in 2018, or the 900,000 people impacted by Virgin Media, or the 9 million by EasyJet. What is it with travel companies there? Do y'all just throw your data at the travel companies and challenge them to do something with it?

[00:05:26] Ralph: Or is it that data breaches are now so common they fail to make major news, you know, because, you know, with the volume and scale of data in society at the moment, you know, data breaches are a daily event. Do we just tune them out in the public? Do we kind of forget there's a risk and a consequence?

That's an interesting question, perhaps.

[00:05:46] Paul: Well, certainly for these big breaches, and let's be clear, we're talking about the Northern Ireland police and also the UK electoral roll, both of which have been breached, and quite a few other bigger breaches in the wake of that that were also reported. But these two, of course, became headline news for a reason, because they are so serious.

[00:06:09] K: Yeah.

[00:06:10] Ralph: Yeah, 

[00:06:11] K: And I mean, to be frank, the electoral rolls, and Paul and I mentioned this briefly because we knew you were coming on, the electoral rolls to us wouldn't be big news unless they included how you voted. Having the person's name and what party they're registered to and everything isn't big news here in the U.S. Nobody would care.

[00:06:30] Ralph: Yeah, different breaches, very different consequences. In fact, I was telling a story the other day about my father. My father had uncovered some social services files down the back of a filing cabinet that he'd bought. He phoned me up, and I told him to contact the local DPO.

Apparently they were at his door in like 30 minutes. But I thought, for every person like my father who would just report it and get the data back, there might be someone else who would go to the press, right? And funnily enough...

[00:07:01] K: There's probably a million others that would go to the press.

[00:07:04] Ralph: Exactly. And I was talking with my colleague Rowena Fielding, who I'm sure you know, and who would be an excellent candidate for the podcast, by the way.

She was talking about data hazards, and I thought that was an exceptional thought process, because you can have the same breach but a vastly different consequence. With both of these breaches, the electoral register and the Police Service of Northern Ireland one, the data was name and address.

Now you might say name and address, not that great a consequence. But in context, the name and address of a serving police officer in Northern Ireland, given the political situation, that's actually a real risk of physical harm, right? So for what could be considered "oh, it's just name and address," the consequences could be vastly different.

[00:07:53] K: Absolutely. You put it in the context of what it relates to. Absolutely. It could be a very significant difference.

[00:08:00] Ralph: Yeah, and when you look at the two as well, they have vastly different causes. With the electoral register, no one's really come out and said it yet; people are talking about a technical attack. And what really interested me about the electoral register one is the lead time between the breach itself, then the lead time before they noticed the breach, the lead time before they reported the breach to the ICO, and the lead time before it became public.

Now, as we all know, there are legal limits.

[00:08:38] K: Yeah, explain those lead times for our listeners, because some listeners may not be familiar with them.

[00:08:44] Ralph: Yes, well, legally there are supposed to be statutory time limits, you know, "without undue delay" or within 72 hours. But with the electoral register data breach, what I found interesting was the lapse of time between the breach being found and notified.

So just to be clear, this covers all voters between 2014 and 2022. This is eight years of voters open to hostile actors as far back as August 2021. So August 2021 was when the breach occurred. That's well over two years between that and the public being notified, which...

[00:09:33] K: Which they supposedly notified to the ICO within 72 hours. But if there is a risk of harm to the privacy rights or the reputation of individuals, they're supposed to notify the individuals immediately.
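A quick aside for readers of the transcript: the two notification clocks being discussed here, report to the regulator without undue delay and, where feasible, within 72 hours of becoming aware (UK GDPR Article 33), and tell affected individuals without undue delay when the risk to them is high (Article 34), are easy to express as a deadline check. A minimal sketch; the dates below are illustrative, loosely based on press reporting that the suspicious activity was identified in October 2022:

```python
from datetime import datetime, timedelta, timezone

# UK GDPR Art. 33: notify the regulator without undue delay and,
# where feasible, within 72 hours of becoming aware of the breach.
REPORT_WINDOW = timedelta(hours=72)

def regulator_deadline(aware_at: datetime) -> datetime:
    """Latest time a regulator notification can land without explanation."""
    return aware_at + REPORT_WINDOW

# Illustrative timeline: breach occurs August 2021, but the controller
# only becomes aware in October 2022. The 72-hour clock runs from
# awareness, not from the breach itself.
occurred = datetime(2021, 8, 1, tzinfo=timezone.utc)
aware = datetime(2022, 10, 28, 9, 0, tzinfo=timezone.utc)

deadline = regulator_deadline(aware)
print(deadline.isoformat())     # 2022-10-31T09:00:00+00:00
print((aware - occurred).days)  # 453 days of exposure before anyone noticed
```

Note what the sketch makes obvious: the statutory 72 hours says nothing about the long detection gap before the clock even starts, which is exactly the lead time Ralph is pointing at.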

[00:09:47] Ralph: Exactly. And what's really interesting about this as well is that the list of top suspects, I'll put it that way, is supposed to be a foreign nation state. So this is going to be a hostile state, or a criminal cyber gang, acting with a significant amount of capability here, which is very different from the other breaches we see in Northern Ireland, which have more to do with human error.

So with this electoral register one, they were able to access full copies of the electoral register, held by the Commission for research purposes and to enable permissibility checks on political donations. So it's the name and address of anyone in the UK registered to vote between 2014 and 2022, and the Electoral Commission's email system was also accessible during the attack.

Now, you could argue this is of low consequence, because the electoral register can be inspected by the public, and the register can be given locally to political groups and the electoral registration offices. But it's not supposed to be used for any other purposes, such as commercial marketing purposes.

I mean, years ago in the UK, we actually had an ECHR case, a European Court of Human Rights case, where a man called, I want to say, Robert, I can't remember the name of the gentleman, but he essentially said, well, the fact that marketeers can get hold of the electoral register and use it for marketing, that's a breach of my rights to a free and fair vote, for example.

And so the UK instigated, as a result of that ECHR decision, an opt-in to the saleable register. You're not in the saleable register by default, not even opt-out: you're not in it unless you opt in. But now essentially that data is out there for anyone to use for a number of secondary purposes, let alone the reputational damage to the Electoral Commission and the faith people might even have in the

[00:11:41] K: right.

[00:11:42] Ralph: democratic system we have here, right?

[00:11:43] K: Right, right.

Which, all the news reports say there's no indication they were trying to influence or have any type of impact on the democratic system and the election processes, but they don't know. I mean, they're guessing.

[00:11:59] Ralph: Yeah.

[00:12:00] Paul: They're guessing, and also it could be that the effects will only be seen in the next election, which is due to be held in late 2024.

[00:12:09] Ralph: This is what's fascinating about data breaches: the impacts are not always immediately obvious. There could be long-term, long-tail damage and distress over a larger period, because you never know what that data is going to be used for, or correlated against, in the future, let alone the fragility of the democratic process as well.

So who knows what impact? They can't be certain of the motives; they can't be certain of what the attackers learned or what they were truly seeking. But the very fact that they had that compromise is certainly going to be concerning, with the chilling impacts on democracy in the coming future when we come to our elections. So that is perhaps more impactful than people think when they think, oh, it's just names and addresses.

Should we turn to the next one?

[00:13:05] Paul: Well, I mean, I'm still fascinated by the fact that this breach was already taking place such a long time ago, in 2021. We're in 2023 now, so that's a long time. It supposedly was reported straight away, but apparently it wasn't serious enough to report to the public immediately, yet it was done now. I don't understand that notification strategy from the Electoral Commission, but I also don't understand it from the Information Commissioner's Office either.

[00:13:37] Ralph: Yeah, interestingly enough, I've got some quite strong opinions on the current Information Commissioner setup, which I probably won't repeat here. But yeah, it is an odd decision. You are right: why keep it quiet for so long, unless they were advised by the National Crime Agency or for national security reasons to keep it quiet until the investigation was completed?

[00:14:00] K: Nothing in the news reports says that they were asked to keep it quiet for any particular reason.

[00:14:06] Ralph: Yeah, which does beg the question of why the notification came at this point rather than any other point within that two-year period. So, yeah, just really fascinating. And I think sometimes the breach itself is fascinating, the response is fascinating, but so too is that sort of notification or regulatory response side of it.

[00:14:24] K: Well, I mean, let's speculate here. John Edwards has been at the ICO for how long now?

[00:14:30] Paul: Two years.

[00:14:31] Ralph: yeah,

[00:14:32] K: Was he there when this occurred? Because I'm kind of picturing, you know, they're all sitting around a table, they're discussing the updates, and somehow someone let slip, well, you know, it's kind of like that breach with the electoral system way back in 2021.

We didn't notify people because there wasn't any risk of harm and someone sits up and goes, what? What do you mean the electoral system had a breach? We need to tell people. I mean,

[00:14:56] Ralph: I mean, I don't really want to speculate myself, but what was quite interesting was watching the television interviews. There were a number of television interviews, with the likes of the BBC and Channel 4, and a number of radio interviews, around this whole data breach scenario.

And, you know, the ICO has got some interesting and consistent messaging around it: yes, it's important that people have the right to expect their personal data is kept safe and not disclosed when it shouldn't be; it can show how small errors have major consequences. But then they go into this holding-pattern language of, well, we can't really comment because we're investigating the matter.

It is of serious concern, but we don't really know the full extent yet; we've got to work with them to establish the risk and mitigations. And, you know, the ICO has got enforcement powers it can use if it so chooses. Now, whether or not it chooses to use them is a matter of some debate, and we all know that John Edwards has a policy of, let's just say, light-touch enforcement.

So whether there will be any enforcement or penalty... especially as these are public sector data breaches, John Edwards has been very clear that he doesn't like taking money out of the public purse. So in terms of a financial penalty, there probably won't be one. We can probably expect some sort of strongly worded reprimand, perhaps at worst some enforcement activity in terms of an enforcement notice, but I'd be very surprised if we see either anytime soon.

[00:16:27] Paul: Hmm

[00:16:28] K: Yeah,

[00:16:29] Ralph: And that manifests itself in different ways. I was listening to the podcast, and you both covered Meta's decision to rely on consent for its more direct marketing and profiling activities in the EU.

Now you have to remember, politically, the UK has left the EU, and Meta is not going to do that in the UK. I think that's a very interesting situation, because they've obviously taken a risk-based decision: the EU is going to punish us if we don't do something, but the UK won't, even though the legal situation is essentially equivalent.

Our law is essentially the same. So why not do it here in the UK? And the only answer I can come up with is: we don't think we're going to be penalized here in the UK. So my worry is that the ICO's enforcement position may well lead to individual harms by companies who aren't going to take the same measures as they would in the EU.

That would be my concern.

[00:17:32] K: right.

[00:17:33] Paul: So let's take a quick look at Northern Ireland, because this meant publishing a list of all the names and addresses of police officers. And this was admittedly just a human mistake, as part of a freedom of information request where the information was not double-checked before it was published.

I think that's happened a lot in government organizations, and probably there are dozens and dozens of documents on freedom of information request websites that contain similar information. Don't go looking for it, but I guess it will be there. And they found it probably within half an hour, I believe, but the damage was already done.

[00:18:13] Ralph: Yeah, we've seen quite a lot of these sort of accidental publishing breaches. A couple of years ago in the UK, the Cabinet Office issued the New Year's Honours list, but they didn't just issue the list: they issued a separate document that had far more substantial information in it about the individuals considered for honours, and why and why not, right?

And that was only up for a short period of time, but long enough for people to grab it and use it. The problem with data breaches is that they're somewhat of a Pandora's box. What's known cannot be unknown. Once something is out there, it can't really be brought back in, and the damage, or the hazard, is done.

And by the way, this is not the only police force that has come out and said the same thing. Less than a week later, Norfolk and Suffolk Constabularies, on the mainland UK, they too said that they'd had a very similar data breach. So obviously they'd looked at the Northern Ireland decision, looked at their own FOI responses, and realized they'd done the same thing.

So, I mean, FOI is brilliant, right? Public sector transparency law: isn't it great that we can speak to our public sector, understand how our taxpayer dollars are spent, and get information about how the public sector is processing our data? I think those transparency laws and FOI are brilliant and beautiful things.

But what I think has happened in these cases is, you know, that classic Excel spreadsheet error, where someone used some sort of pivot table or some sort of calculation in an Excel spreadsheet to get to the statistical, anonymized numbers, released the Excel spreadsheet, but obviously the underlying data is still there, even though they thought they were releasing anonymized statistics.

[00:19:48] Paul: In a tab, or somewhere hidden, or...

[00:19:50] Ralph: Yeah, exactly. So this is human error; this is due diligence. There's not a lot you can do about human error and due diligence apart from training and reinforcement. And I'm sure now that the police, after this, are going to be hotter on it than most people.
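A technical aside for readers: the failure mode Ralph describes, a published workbook whose visible tab shows aggregated statistics while the source rows survive elsewhere in the file, can be screened for mechanically before release. An .xlsx file is just a zip archive, and the list of sheets, including each sheet's visibility state, lives in xl/workbook.xml. A minimal sketch of such a pre-publication check; the sheet names are invented, and a real check would also need to cover hidden columns, pivot caches, and document metadata:

```python
import io
import re
import zipfile

def hidden_sheets(xlsx_bytes: bytes) -> list:
    """Return names of worksheets marked 'hidden' or 'veryHidden'.

    An .xlsx file is a zip archive; the sheet list and each sheet's
    visibility state live in xl/workbook.xml. A hit here means the
    file carries data the publisher probably did not mean to release.
    """
    with zipfile.ZipFile(io.BytesIO(xlsx_bytes)) as zf:
        workbook_xml = zf.read("xl/workbook.xml").decode("utf-8")
    flagged = []
    for sheet in re.finditer(r"<sheet\b[^>]*>", workbook_xml):
        tag = sheet.group(0)
        name = re.search(r'name="([^"]+)"', tag)
        state = re.search(r'state="([^"]+)"', tag)
        if name and state and state.group(1) in ("hidden", "veryHidden"):
            flagged.append(name.group(1))
    return flagged

# Build a minimal stand-in workbook.xml inside a zip to demonstrate:
# one visible tab of statistics, one hidden tab of raw records.
WORKBOOK_XML = (
    '<?xml version="1.0"?><workbook><sheets>'
    '<sheet name="Published stats" sheetId="1"/>'
    '<sheet name="Raw officer data" sheetId="2" state="hidden"/>'
    '</sheets></workbook>'
)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("xl/workbook.xml", WORKBOOK_XML)

print(hidden_sheets(buf.getvalue()))  # ['Raw officer data']
```

The same zip-inspection idea extends to flagging pivot cache parts (under xl/pivotCache/), which can embed a full copy of the source data behind an innocuous-looking summary table.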

But what really highlighted this case is the potential consequence. As you know, in Northern Ireland, the political situation is such that a serving police officer is at risk of their life, and has to check underneath their car every morning in case there's a device there, due to the history of what's happened.

So the consequence for these people of just having their name and address associated with the fact that they are a serving police officer, someone who may be going to great lengths to hide that in their community, is a great risk of physical harm to that individual.

Thankfully, we haven't heard of anything resulting in physical harm, but we've certainly heard of a great deal of concern and distress.

[00:20:46] Paul: Yet. 

[00:20:47] K: when it comes to political feelings, people aren't always rational. So, if they're going to go after a cop because of the political situation, they may go after the cop's family too.

[00:21:00] Ralph: Exactly. So we've had.

[00:21:01] K: do this aren't rational.

[00:21:03] Ralph: Yeah, whether or not they're actually unsafe, it certainly will contribute to a great deal of anxiety, stress, a feeling of not being safe. And in many cases, people are looking at potentially moving, or moving into other accommodation, as quickly as possible.

So even though you can say, on the surface of it, yeah, it's just names and addresses, the costs to individuals, I think, and the harms to individuals here go beyond "oh, my name and address is exposed" into something much more existential, much more concerning in terms of the safeguards and the changes they'd have to make to their life as a result of that fear, whether that actualizes

[00:21:46] K: Right,

[00:21:46] Ralph: or not, you know, yeah.

[00:21:48] K: Right. That reminds me of back when Paul and I were assessing the PIPL when it first passed, and they had a translated term for special categories of people, or people who were, whatever, viewed in a special light. And we kept trying to push: what does the term actually mean? Speaking to people in China, trying to ask: what is the actual translation?

And the best example we got is that it's people who, based on their context, could suffer a special harm, such as police officers,

[00:22:22] Ralph: Yeah.

[00:22:22] K: such that if you release their data in context, it could cause a lot of stress or harm to those individuals.

[00:22:29] Ralph: And there are some police officers who might be doing some very specialized work, such as undercover work, which exposes them to significant risk.

And then, just to add insult to injury, a day later we had Stormont, which is the local parliament in Northern Ireland, managing to send an email the following day including the addresses of 77 people, including birth mothers and adoptees, as part of a report that exposed harsh conditions at mother and baby homes.

So, as you know, adoption records are an area that is very legally protected, for good reason. And it was a genuine error, it was human error. But obviously, again, there are significant consequences for people who were working in those laundries and workhouses in Northern Ireland between 1922 and 1990.

And that also included 77 email addresses and information on adoption and adoptees, survivors of historical institutional abuse. So there are a number of data breaches here that followed on. And then another one, from the same police force in Northern Ireland...

[00:23:44] Ralph: Yeah, where a police officer put his laptop and notebook on top of his car and drove off. They then said, yeah, it's a classic; I've left my laptop on a train before, and thankfully managed to recover it. But this police officer, obviously they wiped the laptop pretty swiftly, remotely, but for the information in the physical notebook, there's no chance of recovery.

So it just seems to be data breach after data breach after data breach over the past couple of weeks, mostly in the police sector, but you've also got the Electoral Commission, and you've got the Stormont one as well in Northern Ireland. Yes, it is a flurry of data breaches over here in the UK, and it does make me wonder what's driving it.

[00:24:30] K: Sounds like y'all need to look at, like, some privacy laws or data protection laws or something.

[00:24:34] Ralph: Or we could weaken the ones we've got, I mean, who knows? Ha ha ha ha

[00:24:38] Paul: Well, I mean, okay, that's not completely fair, because the US also had their fair share of data breaches in recent...

[00:24:45] K: Oh, we have more than our fair share. Let's be honest here. We're just the wild, wild west when it comes to...

[00:24:50] Paul: Let's just take a small look at the MOVEit data breach: a service provider which to date has over 600 organizations worldwide impacted, including the education sector, some high schools, some universities, two state healthcare systems run by IBM, also pension...

[00:25:09] K: Oh, there's lots of government entities in there: education, health institutions across the board, finance institutions. I mean, go look at the list of entities that are known to be impacted. To give it credit, some of those are double dipping: the entity might be a client of one of the financial institutions, and that's how they were impacted.

So some of it may be double dipping, but it's a little bit different of a situation, because this wasn't something where someone left a notebook on top of their car or released something in response to a FOIA request.

This was a vulnerability in their system that, yes, they absolutely should have known about, but we all know that humans are the ones that program, so there's always going to be a weakness.

[00:25:51] Paul: It was the highly sophisticated cyber attack.

[00:25:54] K: It was a vulnerability. But, the thing is, the ransomware group took complete advantage of that vulnerability.

And people are falling victim to ransomware because of the Russian ransomware group grabbing that data from the vulnerability.

[00:26:09] Paul: Yeah, and we had a similar one in the Netherlands last year, where a market research company, either themselves or their underlying software, also caused a massive data breach: for the national rail service, for television and internet providers, for Heineken, for insurance companies, and also the Dutch Golf Federation.

So there, just by using a market research company, all those people that were surveyed on how satisfied they were with those services, all that data was breached as well.

[00:26:39] K: Well, and here's the thing, and I'll use MOVEit because it's a perfect example of it: it comes down to how we capture vendor data. So vendors, service providers, processors, whatever you want to call them. Companies that are really on top of it might actually catalog fourth-party data, your vendors' vendors. But what if it's your vendors' vendors' vendors' vendors?

So fifth party, sixth party, seventh party, eighth party...

[00:27:05] Paul: It's impossible to check.

[00:27:07] K: No one keeps that kind of genealogy, and technically, to keep that kind of genealogy, you kind of have to follow the data too, which means companies really need to know where their data is going and who's touching it. But I have a feeling we haven't seen the last of the MOVEit breach for the next three to six months.

We're going to see, eventually, some company determine that they're a victim. It's going to take them a month to comb through their records and figure out who their clients are; then their clients have to comb through their records for two or three weeks and figure out who their clients are; then their clients have to do the same. And by the time you get there, we're six months up the road, nine months up the road, before you figure out your 16th-party data had a breach that impacted you.
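To make K's point concrete for readers: the nth-party "genealogy" problem is, at bottom, graph reachability. If an organization kept even a crude map of who passes data to whom, then working out who is exposed when some deep sub-processor is breached is a breadth-first search rather than a months-long paper chase. A minimal sketch; the organization names and the supply-chain map are entirely invented:

```python
from collections import deque

# Hypothetical supply-chain map: organization -> parties it sends data to
# (its processors / sub-processors). In practice this would be assembled
# from contracts, DPAs, and records of processing activities.
VENDORS = {
    "Acme Corp": ["PayrollCo", "CRM Inc"],
    "PayrollCo": ["CloudHost"],
    "CRM Inc": ["FileTransferCo"],
    "CloudHost": [],
    "FileTransferCo": ["CloudHost"],
}

def downstream(org, graph):
    """Every party the org's data can reach, however many hops away (BFS)."""
    seen, queue = set(), deque([org])
    while queue:
        for nxt in graph.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def exposed_by(breached, graph):
    """Invert the question: whose data could have been at the breached party?"""
    return {org for org in graph if breached in downstream(org, graph)}

# If the (fictional) file transfer provider is breached, both the company
# that used it directly and the company two hops upstream are exposed.
print(sorted(exposed_by("FileTransferCo", VENDORS)))  # ['Acme Corp', 'CRM Inc']
```

Run against a real vendor inventory, the same walk answers both directions: which parties can end up holding my data, and, when a supplier is breached, which of my records were in the blast radius.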

[00:27:49] Ralph: It's a huge problem. I mean, I was dealing with a client today, speaking to their legal counsel, and their legal counsel sort of threw their hands up in the air and said, look at all these third parties. We've got third parties in HR and benefits providers, and we've got Google for mail, and Salesforce for this, and, I mean, how possibly...

can you gain assurance on that number of providers when you're relying on them for your products and services? And my argument was: great, you've got a contract. Now what? You know, there's this problem I've got in data protection that I call legally papering over the cracks, like you say.

The lawyers have redlined the contract, and everybody signs it and walks away whistling, but what are they going to do differently as a result? How does that protect people? How are you getting assurance of that? Great, you've done a DPIA, but what actions are you taking as a result of that DPIA?

You know, great, you've got an 

[00:28:47] K: right. 

[00:28:48] Ralph: transfer agreement, but how does that actually protect people? 

[00:28:52] K: Yeah. Form over function, right?

[00:28:54] Ralph: and it does worry me that we are a little bit overwhelmed. 

[00:28:57] K: Well, and let's not forget, Ralph, that lawyers are sometimes the worst offenders, because how often have we asked a lawyer to fill out a vendor questionnaire, or to put in a contract that they're going to follow the privacy and data protection laws, and they're like, oh, I don't need to do that.

Privilege applies to this. Really? Because privilege has nothing to do with privacy and data protection laws.

[00:29:19] Paul: Plus, in most European countries it doesn't apply to in-house counsel.

[00:29:23] Ralph: True.

[00:29:24] K: It's like, yeah. And so you have to get to the point where, like, okay, you can argue all you want, but my contract with my client says that I have to push these requirements down to every single person who touches the data. So sign the darn thing and shut up. And they seem to understand that contractual requirement more than they understand that they're actually subject to privacy and data protection laws.

[00:29:46] Paul: Mm hmm

[00:29:47] K: Lawyers effectuate 

[00:29:49] Ralph: well, if I've got one mission, it's to try and pull my clients away from paper-based legal compliance and into actually thinking about protecting individuals from harm, and whether their paper-based, tick-box compliance version of data protection really actually achieves that.

That's why I'm doing a lot of work in the Privacy by Design field at the moment, because I think, you know, adding features and functionality into products... if technology is the problem, perhaps technology is also your answer. And I say this, I know, ironically, on a podcast, but we're talking about human errors, so.

There we go.

[00:30:21] Paul: So, yeah, how can we avoid human error?

[00:30:25] K: Kill all the humans.

[00:30:26] Paul: Okay, easy enough.

[00:30:28] Ralph: Yeah. Yeah. You know, I mean, it's interesting, because one solution could be automation, but then automation introduces a whole load of other problems with scale and volume of harm, and taking the human out of the equation and their capacity for empathy. And 

[00:30:45] K: Right.

[00:30:46] Ralph: so Hopefully, 

[00:30:47] K: it really is training. It's training, training, training. But I think before the training, or maybe alongside it, you have to find something that resonates with the person so they can actually internalize what it really means. Take the phrase, treat my data like you would treat your own: some of these people don't give a crap about their own data, and they'd throw their Social Security number on a check like we used to do 30 years ago.

So they really don't, and that phrase is immaterial. And I say, you have to give the same message 50 different times in a hundred different ways before finally something clicks and the person goes, oh, now I get it. You have to find that aha moment with the person. And you have to stop overworking people on a salary.

[00:31:33] Ralph: overworking people is a classic, yeah. People trying to do things, rushing at full tilt, people who are overwhelmed, definitely. But funny enough, I was dealing with a company yesterday, and they showed me their data protection training. And they said, everyone's done the mandatory data protection training.

I said, okay, what's that then? And it was a four-minute video on the GDPR, and they'd filled in a Google form to prove that they had watched it. And, 

[00:32:00] K: And we all know how that works. They've got the video playing and they're doing their work over here, and click to the next slide, they're doing their work over here, and click to the next. Yeah, we know how that

[00:32:08] Ralph: I thought, yeah. And I find a lot of these computer-based training courses are actually very, very good at telling you what the law says, but not what you have to do inside your business.

You know, exactly: does the person in the call center really need to know about the GDPR, or do they need to know how to manage a certain type of request, or validate an individual, or where to put the documents? It's a different sort of training they need, right? They don't need to be a

[00:32:37] K: Well, it's,

[00:32:38] Ralph: expert, like,

[00:32:39] K: it's visibility to your privacy person too. Because I know, I was the first privacy person at a major global company one time, and I asked permission to go to their other facilities: Mexico, the Philippines, Europe. And they asked why. I said, I need to see what they're doing. They're like, how does that help you?

I'm like, because what we're telling them to do and what they actually do in practice are probably two different things. At a call center, are they writing down that credit card number on a post-it note? Yes, they are. Are they keeping a written log of everybody and their phone number so if they get disconnected they can call them back?

Yes, they are.

[00:33:15] Paul: hmm.

[00:33:16] K: You know, what exactly are they doing in practice on site, so that you can look at it and say, you know what, skip all the stuff in the KnowBe4 or the NAVEX or whatever training. Let's talk about: don't write down credit card numbers.

[00:33:32] Paul: Yeah, it can be as simple as that. And for these kinds of freedom of information requests that caused the breach at the Northern Ireland police, I think you should have a four-eyes principle on the final document. Maybe even prescribe that the document being published is not a spreadsheet file or a CSV, but can only be a locked PDF, so that you cannot access anything that's behind it, because you want to avoid human mistakes.

And that is not just for public organizations that get an FOIA or FOI request; that is also for private sector organizations sharing information about their users with an advertising company or whatsoever. And you cannot always lock down all of the data, but you need to be careful what you share with whom. Even when you share data sets like that within an organization, you should check, especially when they are spreadsheets, whether there are hidden columns, whether there are additional tabs that may

[00:34:32] K: Right.

[00:34:33] Paul: contain additional information that should not be shared with everybody in the organization.

[00:34:37] K: Or that it doesn't incorporate files or data sets from other documents that are linked to it.

[00:34:42] Paul: Yeah.

[00:34:43] Ralph: yeah, it's,

[00:34:44] K: There's a lot that can be done, but who has time to do it? I mean, let's be honest,

[00:34:50] Paul: Well, I mean, for most companies, data is their most valuable asset. 

[00:34:54] Ralph: Yeah. Yeah. 

[00:34:56] K: Saying you didn't have time to take someone's privacy into account is no excuse. No excuse

[00:35:03] Paul: Not in 2023.

[00:35:05] K: Nope. 

[00:35:06] Paul: So on that happy note, we'll wrap up another episode of Serious Privacy. Thank you, everybody, for listening. Thank you, Ralph, for joining us again. As always, a pleasure. If you want, join the conversation on LinkedIn; find us under Serious Privacy. You'll find K on social media as @HeartofPrivacy, Ralph as @IGROBrien, and myself as @EuropaulB. Until next week, goodbye.

[00:35:32] K: Bye

[00:35:32] Ralph: Bye. Bye all.