Serious Privacy

Keeping a [low] Profile: A look at Profiling

Dr. K Royal and Paul Breitbarth Season 5 Episode 3


On this week of Serious Privacy, Paul Breitbarth of Catawiki and Dr. K Royal of Crawford & Company examine profiling in all its contexts and glory. Join us for a lively conversation.


If you have comments or questions, find us on LinkedIn and IG @seriousprivacy, and on Blue Sky under @seriousprivacy.eu, @europaulb.seriousprivacy.eu, @heartofprivacy.bsky.app and @igrobrien.seriousprivacy.eu, and email podcast@seriousprivacy.eu. Rate and Review us!

Proudly sponsored by TrustArc. Learn more about NymityAI at https://trustarc.com/nymityai-beta/

#heartofprivacy #europaulb #seriousprivacy #privacy #dataprotection #cybersecuritylaw #CPO #DPO #CISO

Please be aware this is a largely automated transcript. For accuracy, listen to the audio.

[00:00:00] Paul Breitbarth: A special offer just for me, or a mortgage denied. Profiling and automated decision-making have quickly become very common, whether data protection law allows it or not. And of course not all profiling is bad, and not all automated decision-making has a detrimental effect. But we should talk about it more, especially now that the Court of Justice of the European Union has provided more detail on the responsibility for, and explainability of, automated decisions in the so-called Schufa case. So that, and more, in this week's episode. My name is Paul Breitbarth. 

[00:00:49] K Royal: And I'm K Royal, and welcome to Serious Privacy. Today, Paul and I are going to cover a particular topic. And today's topic is going to be online behavioral advertising. No, not online behavioral advertising: profiling, behavioral 

[00:01:04] Paul Breitbarth: and automated decision making.

[00:01:06] K Royal: Exactly, profiling and automated decision making. But you know, they get it from targeted ads, but hey, whatever. Anyway, so we're going to talk about that, but first, the unexpected question: Ooh. What is your current cell phone wallpaper, and why?

[00:01:25] Paul Breitbarth: I can show it to you. I think you can see it. This is a tree from the savanna in Kenya.

[00:01:33] K Royal: Ooh. 

Oh, 

[00:01:35] Paul Breitbarth: With a blue and purple shade over it. And why? Because this is my default screen setting for Do Not Disturb. While we record, my phone is set to Do Not Disturb, and I like something nice and quiet, and iPhone lets you do an automatic by-color mode. So it's purplish and bluish.

So that's, that's the default while we record.

Typically during the workday I have an image of the Catawiki building, just to make sure this is work. And in my spare time I have random pictures show up. I have about three dozen pictures and they rotate throughout the day.

[00:02:15] K Royal: Nice. I am nowhere near that sophisticated. I have two phones, only because I really love Androids, and so I keep it even though it's not my phone for calls anymore. I keep it for reading and playing games and social media and all that good stuff. The wallpaper on that is the picture of the costume from How to Train Your Dragon 2, where they had dragon armor made, and I have been attempting to make that as a cosplay.

I have to take the time to, to get in and make it. And I have a lot of it made, and I've posted it to social media where I've made the, the helmet and some of the pieces of the armor. I just have to pull 'em all together and finish 'em. So that's one to keep me motivated to do that. On my work phone, and it's funny, my personal phone is now my work phone because of some complications with the, the other. So, you know, they had to impose all the work restrictions on my phone, and I just realized my wallpaper's not there. I had set my wallpaper to be the QR code from LinkedIn. That way I don't have business cards. That way, when I'm somewhere and someone wants to see it, I don't even have to share anything. I just show them the phone with the QR code and they can scan it, and boom, we can be connected.

It's not there anymore. I didn't realize that. So when I downloaded the corporate protocols to manage the phone, I lost the wallpaper. So I've got to figure out how to get that restored on there. So there we go. Anyway, let's talk about online behavioral profiling, and what that means for privacy, what it means for people, what people can do to impact this.

Because it's becoming a bigger and bigger issue. We had a conversation on LinkedIn that Jeff Jockisch started, talking about, and he does a lot of studies on, data brokers and the data they have. And we're not going to get into the topic of whether people should own their data or whether privacy rights should apply despite ownership.

That, that's a whole different topic, but we are going to talk about how do people collect the data to make a profile of you and all the ramifications and the laws and the issues around that. Not all of them. We're only going to meet for 31 minutes. So there we go.

[00:04:24] Paul Breitbarth: So, how do you feel about being scored? 

[00:04:27] K Royal: I don't know. I mean, if I take it from just a general individual perspective, what do I think if a, if a company ranks me really high, or I get some sort of social score like they do in the People's Republic of China? I would feel competitive. If the score was known to me, like the social scoring, if it was known to me, I would be very competitive and try to do the things to get the high score.

'Cause apparently the high score, the low score, impacts what you can do in the country. And it impacts job interviews and resources and whether or not you can travel, and different things like that. So I would be competitive and want the high score. Otherwise, for other companies, not for a country's social scoring, if a particular company was scoring me based on behavior, I don't know how I'd feel.

Would I care? Would I care if I'm scoring high on Amazon or scored low? I don't know if Amazon does scoring. I don't know. How should I feel?

[00:05:30] Paul Breitbarth: I don't know. I mean, do you, do you like to be judged? Do you like to be graded? I mean, I never liked exams to start with, and getting all these scores feels a little bit like taking a continuous exam. I mean,

[00:05:46] K Royal: That you don't know what the questions are and you don't,

you can't 

[00:05:49] Paul Breitbarth: but I mean, for every single thing that you do nowadays, somebody asks you, Oh, can you please review this?

And that starts to get annoying, to be honest. And I mean, it's not so much me being the one being reviewed, but every time you are in contact with customer service for anything,

[00:06:08] K Royal: Or even if you're on an app, you're using it, it's not even customer service, it's just using the app or the site 

[00:06:14] Paul Breitbarth: So how would you rate us? How 

[00:06:16] K Royal: feral chat boxes that 

[00:06:17] Paul Breitbarth: exactly, how did we do some, some radio DJs here in the Netherlands were also a little fed up with that. So earlier this week during the morning show, they actually asked the questions to each other's partners. How would you rate your partner? And it was pretty funny.

So they had this whole review form. Why did you purchase this partner? What are the implementation issues? And if possible, would you recommend purchasing this partner again? But then about your love relationship. It was actually, it was actually funny.

[00:06:49] K Royal: It is. And I'll say that this also builds into something in the privacy class I was discussing last night. We discussed this a little bit as well, is phenotyping. They are now using your online behavior and activities. How long are you online? What sites do you go to? What activities do you engage in?

Are you gaming for 10 hours a day? They're using it as phenotyping, which is, you know, the official definition: a set of observable characteristics or traits of an organism. It covers their morphology, developmental processes, biochemical and physiological properties, behavior, and products of behavior.

So now they're adding your online behavior into the phenotyping. And so that, that can be whether it's health related or other, but now it's, it's more formal. You know, this is, this is an official type of research on organisms: your online activity and behavior. So that is a little different from the profiling, but it essentially is, it's studying this.
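[Editor's note: as a rough illustration, the digital phenotyping described here boils down to feature extraction over activity logs, e.g. total screen time and the share spent on one category. The session format, categories, and feature names below are invented for illustration, not any research protocol.]

```python
from datetime import datetime

def phenotype_features(sessions):
    """Derive crude "digital phenotype" features from (start, end, category)
    session tuples: total screen hours and the share of time spent gaming.
    Categories and feature names are illustrative, not a research protocol."""
    total_hours = 0.0
    gaming_hours = 0.0
    for start, end, category in sessions:
        hours = (end - start).total_seconds() / 3600
        total_hours += hours
        if category == "gaming":
            gaming_hours += hours
    return {
        "screen_hours": round(total_hours, 2),
        "gaming_share": round(gaming_hours / total_hours, 2) if total_hours else 0.0,
    }

# Two sessions on one day: two hours gaming, two hours social media.
day = [
    (datetime(2024, 1, 1, 9, 0), datetime(2024, 1, 1, 11, 0), "gaming"),
    (datetime(2024, 1, 1, 12, 0), datetime(2024, 1, 1, 14, 0), "social"),
]
print(phenotype_features(day))  # {'screen_hours': 4.0, 'gaming_share': 0.5}
```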

This is a major part of our lives now. So how do you approach it, and how do you treat the behavioral profiling, and who's profiling you? 

[00:08:05] Paul Breitbarth: I mean, let's, let's first make the distinction between profiling and profiling, 

because there are, there are different kinds. And for me, the default is of course to go back to the GDPR, and profiling occurs twice. You have it in, in Article 4, in the definitions, and you have it in Article 22, when it's profiling in the light of automated decision making with a significant effect.

When you look at the, the, the profiling fair and square, that is mainly the slicing and dicing of data, which typically should be fine. When you just look at a dataset and try to draw analytics from it to, I dunno, improve your product, or to improve your customer service, or to prioritize complaints or whatsoever, that is the profiling, in a way. Or even creating marketing segmentation.

That is all fine. That's profiling. It is a data processing operation, obviously, but this is not data processing with, I would say, a significant effect.

[00:09:10] K Royal: Yes. Yes. Agreed. And all of the U.S. states that have passed the omnibus privacy laws, which I think there are 15 of now: they apply to the companies that are subject to the laws, and you have the right to opt out of, one, automated decision making, and yes, decisions do come from the profiling. And you have the right to opt out of profiling. Not just being tracked: you have the right to opt out of profiling and targeted advertising.

Companies aren't very good at this now. They're trying to bring back things like the Global Privacy Control, which, if y'all remember Do Not Track from more than a few years ago, was not successful. So they're trying to say that companies have to honor any browser setting that consumers set. Now here's the thing: you browse on your phone, you browse on your, your tablets, you browse on your laptops.

Do you have it syncing in the browser? So it's a browser setting, not necessarily a computer setting. They have to be able to honor that. But here's the thing. What if your setting is directly opposite something you've opted into, like loyalty cards or loyalty points? That's the biggest concern that seems to come up.

In some states, the default is that the company has the right to override it; in others, the default is that the setting overrides, and the company has the ability to reach out to you to see which one you want to do. How does that work in practice? You know, it's really interesting. You're, you're opting out of being profiled, or your behavior tracked, while you're shopping on a site.

And I can tell you, here's an example of one. I get points on my American card for shopping, and if they have an arrangement with the company, every website I go to pops up and says, ooh, you can have five times points shopping on Amazon, activate now. What if I had a Global Privacy Control setting that I didn't want to be tracked?

Then that wouldn't pop up. Maybe I want my American extra five times shopping points, 

[00:11:15] Paul Breitbarth: Well, then you should make an effort and go into the settings of the website and switch them on 

[00:11:19] K Royal: Right. And the average consumer isn't usually aware of these things, much less how to do them. How do you go into your browser? How do you set a Global Privacy Control? But then you have the problem on the company side of, oh goodness, the laws say we have to do this by X date. And some of those dates are coming up pretty quick.

How do I do it? And we've already seen enforcement actions out of California about companies not honoring that, even though they know that it's a nascent technology that's not really active yet. So we're running into a complication of the laws conflicting with technological capability. And not just the technological capability of the companies to honor the GPC, because that's one issue, but the technological capability of the GPC actually working.
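[Editor's note: the Global Privacy Control signal itself is just an HTTP request header, `Sec-GPC: 1` in the W3C proposal (browsers also expose `navigator.globalPrivacyControl`). A minimal server-side sketch of detecting it, including the loyalty-program conflict discussed above, might look like the following; the conflict-resolution policy shown is an illustrative assumption, not legal guidance.]

```python
def honors_gpc(headers: dict) -> bool:
    """True if the request carries a Global Privacy Control opt-out.
    Per the GPC proposal, user agents send `Sec-GPC: 1` when the user
    has turned the preference on; absence means no signal was expressed."""
    return headers.get("Sec-GPC", "").strip() == "1"


def may_personalize(headers: dict, explicit_loyalty_opt_in: bool) -> bool:
    """Sketch of the loyalty-card conflict: a GPC opt-out versus an explicit
    program opt-in. Here the explicit opt-in wins over the browser signal,
    which is one possible policy choice, not legal guidance."""
    if honors_gpc(headers):
        return explicit_loyalty_opt_in
    return True

# A GPC-enabled browser with no loyalty opt-in: no personalization.
print(may_personalize({"Sec-GPC": "1"}, explicit_loyalty_opt_in=False))  # False
```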

[00:12:10] Paul Breitbarth: Yeah, and then you also get to, to data hoarding, data collection. You see more and more supermarkets, for example, that say, well, you only get our promotions if you also have our loyalty card, where we 

[00:12:22] K Royal: Right. 

[00:12:23] Paul Breitbarth: your data, 

[00:12:24] K Royal: Loyalty cards are evil. I'm just saying

[00:12:26] Paul Breitbarth: Well, I mean, is, is that fair processing? Because there is a lot of data that you then need to provide in order to get your discount.

Again, it's 'pay or okay'. And I'm curious to see whether those kinds of examples will actually also be included in the opinion that we'll get from the EDPB. So not just social media, but also these kinds of payments.

[00:12:50] K Royal: Right. Well, and then you have to take into consideration: what if the profiling has the wrong information? So a lot of the data brokers that, you know, help the companies create the profiles, they, they go into that. They're also collecting data from things like the Mother of All Breaches, which has 26 billion records in it.

So they're trying to connect the identifiers in the records to get more information from you. And the information may simply be that there was a breach that compromised your username and password at LinkedIn. Maybe it doesn't have all your LinkedIn information, but now they know that you're a LinkedIn customer and there is information they can publicly scrape off LinkedIn, whether they should or not they can.

And you don't know if they've done that or not, because these are data brokers that don't tell you what they have, even though by law they should. They don't. So what if 

[00:13:46] Paul Breitbarth: No, because they tell you that they cannot find you, because the 15,000 attributes that they know about you are not specific enough to single you out of that database

[00:13:58] K Royal: but they can sell it to a

[00:13:59] Paul Breitbarth: And they can't put your name and address on it, so it's fine, then they don't know who you are. Well, I'm sorry, if you know 15,000 things about me, that's probably at least 7,000 more than I know about 

[00:14:11] K Royal: Exactly. I doubt if I could list 2,000 things. We should probably have a competition one day for people, for a

[00:14:16] Paul Breitbarth: Oh, I'm pretty sure that 2,000, I'm sure I can, if you, if you take the time. But 15,000? Hell no. But if you know 15,000 things about me, then I'm pretty sure that you are also able to find my name.

[00:14:30] K Royal: Yeah. and could pick you out of a crowd. 

[00:14:32] Paul Breitbarth: and in any case, tell me what information you have about me. And where it's coming from.

[00:14:38] K Royal: And again, the data might be commingled. 

[00:14:41] Paul Breitbarth: Yeah, I mean, you are K Royal from North Carolina. That's fairly easy. It's pretty sure that you are the only K Royal in North Carolina. Just like I am the only Paul Breitbarth in The Hague, The Netherlands. 

but if you are called John Johnson and you live in New York 

Who says that all the information about John Johnson in New York relates to you?

It could be the five or six other John Johnsons in New York.

[00:15:08] K Royal: Well, and I'll share that one of the reasons that we moved here, and it's South Carolina, but that's okay, I'd rather be in North Carolina. One of the

reasons we moved 

[00:15:16] Paul Breitbarth: in the South.

[00:15:17] K Royal: No, I don't want to be in the South. I don't want to be anywhere where they say y'all naturally. That was my one rule. And look at me, I'm in South Carolina.

Anyway, it is a beautiful area here. Don't get me wrong, people. I just don't want to live in the South. I fought for 30 years to get out. But one of the reasons my daughter moved here is because a lot of her childhood was spent here, because her father lived here with his second wife. They're divorced now, so I, I don't know how we do this ex-stepmother thing, but she's part of the family.

Her name, her middle name, is Kay. Her last name is Royal. There is a big possibility that records for, and let me make up a name, Margaret: Margaret Kay Royal is confused with K Royal. And this is why I actually dropped my middle name, because now, with all the prevalent online records, people were confusing K Marie Royal as Marie Royal. And I tried to correct the record: it's really me, it's K Royal. Well, no, it says Marie Royal. So I dropped the middle name deliberately, and yes, it's been decades and I've never told my mom. But that's one thing: I have records in various names, and when I go to buy a house, they print out all those names and I have to go through and say, no, I was never this person.

I was never this person. I was never this person. It's an error. It's me, but they had my name wrong, because of my name. But what if it's commingled with, what, what if Margaret Kay Royal's records are commingled with my records, 'cause someone tied us together for something in South Carolina? Now I have all of her shopping traits and online behavior associated with my profile. You can't, one, know that it's there, or two, unseparate them. It's an absolutely horrible effort to even try to go through on a very simple thing, like a medical record that gets commingled because someone used your name and your insurance card and went and got treatment for something. And now it's on your medical record and your profile, and maybe it's available to potential employers, because it's in a profile they have.

And one, you don't know; and then two, to be able to prove: how do you prove to a facility that that wasn't you, when they have documentation that it was you? 
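[Editor's note: the commingling failure described here is, at bottom, sloppy record linkage. A deliberately naive sketch of how two different people end up in one merged profile; the records and the matching rule are invented for illustration, and real brokers use many more attributes, but the failure mode is the same.]

```python
def naive_link(a: dict, b: dict) -> bool:
    """A crude matching rule that commingles records: same last name plus
    same state is treated as "same person". Invented for illustration."""
    return a["last"].lower() == b["last"].lower() and a["state"] == b["state"]

k_royal = {"first": "K", "last": "Royal", "state": "SC"}
margaret = {"first": "Margaret", "middle": "Kay", "last": "Royal", "state": "SC"}

# False positive: two different people, one merged profile.
print(naive_link(k_royal, margaret))  # True
# Even adding a first-initial check would not help here, since
# "Margaret Kay Royal" can appear abbreviated as "K. Royal".
```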

[00:17:37] Paul Breitbarth: Data hygiene. Yes. 

It's very, 

[00:17:39] K Royal: it's 

very difficult. So your profiles could be wrong. Now, my co-professor, Professor Gary Marchant, always says he would rather be shown advertisements and directed to sites that are tailored for him based on his online profile.

No, I don't want that. 

[00:17:57] Paul Breitbarth: no, me neither. 

[00:17:58] K Royal: I'm happy being shown generic ads.

[00:18:00] Paul Breitbarth: Well, I'm also not happy being shown generic ads. They are still annoying as hell, but I take them any day over the personalized ones.

[00:18:08] K Royal: Well, at one company I worked at, we got a complaint from a gentleman who said he kept opting out of all of our advertisements and cookies and trackers, and then he would write me and say he was still being shown the ad on social media. I'm like, you're opted out of targeted advertising.

We're not targeting you. These are ads we're buying, like a billboard. You can't opt out of those; they're just there. So if we're buying ads on social media to show to the general public, you're one of the general public. So you have to understand that there's, there's that issue. Well, maybe you're not being shown the ad based on profiling.

Maybe it's just general advertising, and the company's buying a ton of ads.

[00:18:49] Paul Breitbarth: That is, that is certainly a possibility. But I mean, indeed, it is important that the profile that companies have about you is correct, especially if consequences are attached to that profile, 

and 

[00:19:04] K Royal: significant legal 

[00:19:06] Paul Breitbarth: exactly, yeah, the significant effect or legal effect in Article 22 of the GDPR, similar legislation around the world, 

[00:19:15] K Royal: And 

in the U. S. states, by the way, they're defining 

[00:19:18] Paul Breitbarth: And in the U.S. states, absolutely. And in December, on December 7th, the Court of Justice of the European Union, for the first time, actually ruled on this very specific topic. It's the so-called Schufa ruling. I'm sure a lot of our listeners have heard about it, have read it, maybe. But this is about credit rating.

And this was actually a guy from Germany who was denied a loan. Schufa is the leading German credit rating agency; it holds information about some 70 million people. So this is, this is a big player. And the question, well, one of the questions before the court was whether it was the lending organization

or the credit rating agency that was doing the automated decision making, the profiling, that led to the credit being denied. And the court here said, well, it was actually Schufa, so the credit rating agency, that is responsible for the automated decision making. So they are the ones that also need to meet the GDPR requirements under Article 22, to actually make sure

that proper safeguards are in place, and that there is a proper legal basis. And for loans in general, there is a legal basis like that. But Schufa played the determining role in this automated decision making. Also, 'decision' was then interpreted quite broadly, because just the output from the system, 'computer says no', is already seen as part of a decision.

Even though the final decision on whether or not to, to give the loan was for the, for the bank, it was based solely, in this case, on the recommendation from Schufa. And also, Schufa, the credit rating agency, knows much better what information they hold and why they came to that decision, because they have the data on the basis of which that decision is made.

So also for things like the right of access, and the right of correction, and the right of deletion, you need to be with the intermediary and not with, with the bank. 

And 

[00:21:19] K Royal: Exactly. And, and to reinforce that, this is a topic Paul and I have talked about before: a lot of the privacy decisions are being put in the hands of the consumers, and the consumers are the ones that know the least about what companies are doing with their data. And so the burden really needs to be on the entity that has the 

[00:21:39] Paul Breitbarth: Yeah, so there needs to be a legal basis. And in this case that could have been, for example, the contract, where the, the individual said, hey, I want the loan, and as part of the contract to get the loan, or the pre-contractual phase, as part of all of that, you would go through the credit scoring, I think.

That is still fairly acceptable, but then that decision needs to be transparent, and ideally just have some human intervention, human explanation, to also make sure that as an individual, you can understand why you are not suitable for a loan, what is the reason that you will not get it, and why you have a low, a low credit score, for whatever reason.

[00:22:22] K Royal: And again, the significant legal impact seems to fall disproportionately on poor people, on those who are not in the upper

[00:22:30] Paul Breitbarth: Yeah, I think that that's easier to say from the U.S. perspective, because you have much more experience with credit rating agencies and credit statuses and things like that, at least more than I have here in the Netherlands.

[00:22:44] K Royal: Well, and did you know that here in the U.S., pretty much every service you have, whether it's a credit card or a bank account or whatever, will now do free monitoring of your credit score? Now, it's not the exact same thing that the credit agencies do, but it's supposed to be similar. Some of them are, I don't know, a 400 to 800 scale.

Some of them are a 300 to 700 scale; different scales are used. They tell you what they base it on, but I can go to all of my credit cards and click on 'what's my score today'. Not to mention companies like Credit Karma or NerdWallet also offer tracking apps that will track your credit score. So I, again, as we talked about in the beginning, I'm competitive.

I look at my score and I'm like, Ooh, what can I do to improve my score? And it tells me, you need to pay down this balance or you need to do this or you need to do that and everything like
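[Editor's note: since the apps report scores on different ranges, comparing them is just a rescaling exercise. A tiny sketch; the ranges are the approximate ones mentioned on the show, not any provider's official scale.]

```python
def normalize_score(score: float, lo: float, hi: float) -> float:
    """Map a provider-specific credit score onto 0-100 so scores reported
    on different ranges can be loosely compared. Purely arithmetic; the
    ranges used here are approximate, not official scales."""
    return round(100 * (score - lo) / (hi - lo), 1)

# The midpoint of a 400-800 range and of a 300-700 range land in the same place.
print(normalize_score(600, 400, 800))  # 50.0
print(normalize_score(500, 300, 700))  # 50.0
```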

[00:23:36] Paul Breitbarth: And I'm just sitting here with a blank face, and I have no idea what this would look like, and what kind of scores and how high or how low they should be, because this is not something that exists in the Netherlands, and I don't believe it exists anywhere in the European Union 

[00:23:50] K Royal: Yeah, 

[00:23:50] Paul Breitbarth: that level of granularity.

[00:23:52] K Royal: No, nobody has it at the level that we have here. So it's not something that people are accustomed to. But I will also say the factors that go into it are also things that certain employers will look at and take into account when offering you a job. For example, if you want a job with the government, especially handling certain information, FBI, CIA, they're going to do a deep dig into your background.

And one of the things that they look for is: are you late on your income tax reports? What is your credit score? Have you missed payments? Do you have enormous debt for something? Do you like to gamble? Do you have lots of cars? Because any negative factors, any burdens for you or your family, can be 

used against you to convince you to turn against the state.

They would 

start with something small. Oh, 

[00:24:39] Paul Breitbarth: It can corrupt you. 

[00:24:40] K Royal: Yeah, your mother needs a particular type of treatment and you can't afford it. We'll give you $5,000 to go get your mother that treatment, if you just give us this little, very little innocuous file that doesn't really mean anything. They just want something little to get you started with.

And then they'll build up that little bit until they've got you so far under their thumb, you're basically a

[00:25:00] Paul Breitbarth: No, I mean, that, that part I'm familiar with, because while I was still with the Dutch data protection authority, I was also part of the supervisory teams for the Terrorist Finance Tracking Program, which is all classified information. So that meant that I had to be screened by our security services, to be approved to access state secrets, because that's basically what it was. 

[00:25:20] K Royal: You were approved? 

[00:25:21] Paul Breitbarth: Well, I was approved, but it took, it was a serious screening, because it needed to be at a certain level, because of you friendly Americans requiring that. But that meant, I think, I filled out a 27-page questionnaire, where 

[00:25:36] K Royal: Oh, God. 

[00:25:37] Paul Breitbarth: to list every single country in the world where I had ever been, ever traveled. 

[00:25:42] K Royal: ever traveled.

I had no idea. 

[00:25:45] Paul Breitbarth: Well, indeed, that, that took a long time, to come up with that list. But also: tell about all of your closest friends, indeed, your financial situation, many other things. Then you need to include a few references, and also identify everybody who has a key to your house, because they may interview those people as well.

and 

then, they came to my house for a three and a half hour interview, three and a half 

[00:26:11] K Royal: Wow. 

Wow. Now, 

[00:26:14] Paul Breitbarth: they know every single thing about you.

[00:26:16] K Royal: Well, and to pass the bar exam, or to be admitted to the bar here in the United States, each state does a character and fitness exam, and Arizona's is one of the most stringent character and fitness reviews that you can go through. They ask you to turn in the filings for every court case you have ever been involved in.

They mean including family court. So if you've gone through a nasty divorce, or a nasty child custody battle, or child support payments, or anything, you have to turn in everything. My entire filing was about six 

inches deep 

[00:26:52] Paul Breitbarth: Wow. And then that's assuming that you have retained everything 

[00:26:56] K Royal: right, right. , but when you've been through two nasty divorces based on domestic violence and order of protections, you get profiled. I'm pretty sure, yeah, you'll get profiled and I used to think it would stop me from running from politics because someone could bring my background up. Now

nobody gives a 

crap. 

[00:27:13] Paul Breitbarth: if that's the worst thing that happened to you 

[00:27:15] K Royal: Right? Nobody cares. Nobody cares. 

[00:27:22] Paul Breitbarth: You are about 40 years too young to become president, so 

[00:27:25] K Royal: Exactly. I need to be a little bit older. But hey, if anybody knows how I can get involved in politics and get some money and run for something here, please let me know. Actually, one of the more prominent politicians in the U.S. right now is Jamie Raskin, because he was leading the congressional hearings.

And I actually worked with him in setting up programs for law students to go into schools, the Marshall-Brennan Constitutional Literacy Project, to help teach civil law to high school students. It was fantastic, and I worked with him for years in setting that up at the school that I worked at, the Arizona State University law school.

So it was cool regardless. Side note, squirrel, but we're coming to the end of the presentation anyway. We're trying very hard to stick within the 30-minute mark for y'all. This is a very complicated topic. There are a lot of laws that apply in the data protection and privacy realm, but there are a lot of nuances: certain companies aren't regulated, and there is the enormous amount of data that they can compile about someone online, as Paul was saying, 15,000 data elements, 50,000 data elements as well.

I've heard that too, especially given social media. So it is something that can have serious consequences for someone, depending on the context in which it's used. It's something that the average consumer has zero insight into, as to where to go to even find this information. Jeff Jockisch keeps a, a database of data brokers and different things like that.

You can't get information out of them. It's near impossible. And you have to give them more information about you than what they might have, in order to go through an identity verification process. So they know who you are, and then they don't give you the data because 

[00:29:16] Paul Breitbarth: Yeah, 

[00:29:16] K Royal: have it connected. I call BS.

[00:29:19] Paul Breitbarth: No, I agree. I agree there. And, and maybe just before we, we wrap up: as a company, if you are the DPO of a company that is involved in credit rating, or fraud assessments that are automated, or any of those

[00:29:35] K Royal: that has significant legal 

[00:29:36] Paul Breitbarth: Yeah. Make sure you are as transparent about it as possible. And obviously you will not describe your full fraud algorithm on your website because then you also tell people, Hey, this is how you can avoid it or circumvent it.

But you do need to be transparent about it and you do need to give people the 

possibility to understand why a certain decision was made.

[00:29:57] K Royal: And put a human in the mix. 

[00:29:59] Paul Breitbarth: Exactly. Put a human in the mix that also helps you to avoid being fully subject to article 22 of the GDPR, because then it's not solely based on automated decision making, but then also the human 

[00:30:13] K Royal: Right. So, yeah. 

[00:30:15] Paul Breitbarth: in the, in the whole decision.

[00:30:17] K Royal: And, and one of the most common ones that happens is applying for a job. Someone goes in, you have the application set up: you want 10 years of experience in privacy, and someone says no, and you automatically kick them out of the process because they said no to 10 years. They've only got nine years. They were being abundantly, you know, honest, and you kick them out.

Anybody that you automatically kick out of a process, you should have a human in the mix. Now, I understand maybe you're a company that processes 100,000 applications and you can't, but have a way of adding some protections to that, or doing a privacy impact assessment on why you would rely on the automated decision to kick them out.

So as long as you do your assessments, you're transparent about it, you do your best to put a person in the mix or mitigate the damage that it might do, then, then you're trying to
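[Editor's note: the screening approach described above, routing near-misses to a human reviewer instead of auto-rejecting them, could be sketched roughly like this. The thresholds, field names, and review band are invented for illustration, not any employer's actual policy.]

```python
def screen_application(years_of_privacy_experience: int,
                       required_years: int = 10,
                       review_band: int = 2) -> str:
    """A hard automated reject only for clear misses; near-misses (such as
    an honest "nine years") are routed to a human reviewer instead.
    Thresholds and names are illustrative."""
    if years_of_privacy_experience >= required_years:
        return "advance"
    if years_of_privacy_experience >= required_years - review_band:
        return "human_review"  # a person, not a bot, sees the borderline case
    return "reject"

print(screen_application(9))  # human_review
```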

[00:31:07] Paul Breitbarth: Absolutely. And on that note, we'll wrap up another episode of Serious Privacy. If you like our episodes, please share with your friends and family and colleagues. Join the conversation on LinkedIn; you'll find us under Serious Privacy. You will find K on social media as Heart of Privacy, and myself as EuropaulB.

And K is trying to make me laugh and it just doesn't work and she just cracks up herself. Until next week, goodbye.

 
