Serious Privacy

Nomadic Privacy - A World full of topics with R. Jason Cronk

Dr. K Royal and Paul Breitbarth with R. Jason Cronk | Season 4, Episode 21


In this episode of Serious Privacy, Dr. K Royal connects with R. Jason Cronk, infamous privacy professional extraordinaire. Topics include a brief weekly update on privacy events; The Rise of Privacy Tech (TROPT) and its summit, led by Lourdes Turrecha; privacy frameworks such as the NIST Privacy Framework, NIST 800-53, ISO 27001 Annex A, and ISO 27701; his new podcast with the BBB, Privacy Abbreviated, with Donna Fraser; his book with the IAPP, Strategic Privacy by Design; his nonprofit, the Institute of Operational Privacy Design; and much more. In fact, no surprise, our friend Ralph O'Brien's name came up in conversation, as did Deirdre Mulligan, new Principal Deputy US Chief Technology Officer for Policy, and privacy engineering a la Michelle Dennedy. We also touched on the Privacy Law Scholars Conference along with one of our favorite young law scholars, Wayne Unger. In addition, Jason covers many of his activities, such as speaking at CERN.

As always, if you have comments or questions, let us know on LinkedIn, on Twitter @podcastprivacy, @euroPaulB, and @heartofprivacy, or by email at podcast@seriousprivacy.eu. Please do like and write comments on your favorite podcast app so other professionals can find us more easily.


From Season 6, our episodes are edited by Fey O'Brien. Our intro and exit music is Channel Intro 24 by Sascha Ende, licensed under CC BY 4.0, with the voiceover by Tim Foley.

Please note this is an automated transcript and there may be errors. For the actual statements, please listen to the podcast.

[00:00:00] k: Hello and welcome to Serious Privacy. Today I have the delight of being here with R. Jason Cronk. If you don't know his name, you actually should. Paul is off doing whatever Paul does that's much more important than sitting here and chit-chatting with me, so I get to do an extra episode this week, and I'm so glad to have you on, Jason.

It's an absolute delight.

[00:00:32] Jason: Thank you, K. I'm happy to be here. Longtime listener, first time caller. I use your podcast most when I'm driving long distances.

[00:00:42] k: Nice. Thank you. 

[00:00:43] Jason: And when I'm exercising, not necessarily gym exercise, but on bike rides, I'll listen to your podcast, and the next thing I know I'm an hour down the road or an hour into my bike ride, and, you know, all is

[00:00:55] k: I was gonna say, are you one of those that wanted us to cut back to 30 minutes so you weren't riding a bike for an hour?

[00:01:01] Jason: No, I think I actually submitted and said you should do longer.

[00:01:04] k: So, okay. Unexpected question. What is your favorite thing to do on a Saturday morning?

[00:01:11] Jason: Oh, geez. You know, it really varies. Because I travel so much, it really depends on where I am, what I'm doing, who I'm with, so I don't have a standard answer. Probably my most likely thing to do on a weekend is go out and do a hike or a bike ride, to get out and enjoy nature, again depending on weather. Also, sometimes I do that midweek, because I'll be somewhere, and then Saturday I'll work. So it really varies.

[00:01:45] k: Right. Mine's easy. I like to sleep

[00:01:50] Jason: Yeah, I know a lot of people whose answer is that, and certainly if I've had a late Saturday or late Friday night, then definitely sleep in Saturday. But my definition of sleeping in is like maybe waking up at 7, 7:30.

[00:02:02] k: Oh, I was gonna say, my sleeping in now is only till about eight. I have to force myself to stay in bed, because with this job my colleagues are mainly on the East Coast or in Europe, and I'm starting my days at 5:00 AM, which anybody who knows me knows is not a good thing. I'm definitely still sucking down the coffee on that first meeting, and they know I'm not gonna be on camera till at least six, trust me.

[00:02:27] Jason: Yeah, I feel ya.

[00:02:29] k: Okay. So let's chat, 'cause we have a ton of things we could actually cover. And one of the things I wanted to bring out is that you were able to go to the TROPT Summit and I was not.

So The Rise of Privacy Tech summit, held in... it wasn't in San Jose, was it? Where was it? It was in

[00:02:46] Jason: Santa...

[00:02:47] k: Cruz.

[00:02:47] Jason: Santa Cruz, yeah. So it's about, I don't know, a 30-to-45-minute drive. I flew into San Jose on Sunday and took the 17 bus down. I'm here for a couple of days.

[00:02:59] k: Nice. 

[00:02:59] Jason: It was a great, great event yesterday. Actually, right after this call, Lourdes, who is the head of The Rise of Privacy Tech, is picking me up and we're going for a hike. So again, taking advantage of this. And then Saturday I'll probably be working.

[00:03:11] k: Oh, give Lourdes a big hug and a hello for me, because I really hate that I couldn't make it. My schedule is all up in the air with the new job and the kids' vacation and everything like that. Oh, I wish I could have made it over.

[00:03:24] Jason: You know, it's such a great event, and I think she really did it even better this year. One, it's very small and intimate, so it's good to meet people and talk to people. But two, she broke it up and kind of adopted some of the techniques from the Privacy Law Scholars Conference, which we've both been to, where they do kind of a round-table discussion.

So they had a panelist that would talk about a certain topic area of interest to The Rise of Privacy Tech people, which is usually startup founders, privacy tech people, investors, or buyers of privacy tech. We'd spend 45 minutes listening to the talk, and then we did 45-minute breakout sessions in smaller groups.

I think there were three groups total, about 30 people or so, and again, each had a moderator who was kind of prompting us and asking us questions. And you'll like this: they actually started out with an introduction, you know, go around the room, introduce yourself, and they asked a, I don't wanna say non-privacy-related question, but certainly something that was,

[00:04:30] k: An unexpected question. Huh? 

[00:04:32] Jason: Yeah, yeah, yeah. You know, tell us something about yourself that nobody knows, or something. Everyone did different things. So it was interesting.

[00:04:41] k: Oh, I love it. I absolutely love it. Now, I'm not sure what your role is with them. I'm a tech visionary. I think you have a different role. Is that right?

[00:04:52] Jason: Yeah, I'm not officially affiliated, but I love Lourdes and everything she's doing, and again, I love coming to this kind of more intimate environment, as opposed to some of the events that are like thousands of people

[00:05:05] k: Oh, I know 

[00:05:06] Jason: like a cattle call,

[00:05:07] k: where you learn that people you really wanted to see were there, after you get back.

[00:05:12] Jason: Exactly.

[00:05:13] k: Right, right. I've, I've seen that way too often, especially with the DC events where people come from so many different countries and you've never met them in person in your life, and then you find out they were there and you didn't have an opportunity to see them.

[00:05:26] Jason: Yeah. And even trying to schedule with people, you're like, oh, can you meet during this time? No, no, I'm doing... it's like, yeah, you can't. But I do remember seeing you and Paul across the room at the summit.

[00:05:39] k: like yeah, yeah, yeah. There he goes right there. Yeah, I know.

And I saw... I was supposed to meet with both Lourdes and Deb Farber while I was there, at the DC one, and wasn't able to make that happen. I mean, literally, it's not even two ships passing in the night, it's two rockets passing, because there's so much to do and so many people to see. But Paul and I are actually looking at this: we've started a proposal for sponsors. One of the reasons is to be able to host events at these conferences. Rather than doing a happy hour, 'cause everybody does a happy hour, maybe we do a morning coffee hour so the fans of Serious Privacy can come, and that includes people that you would wanna see when you're at these big events.

[00:06:24] Jason: Yeah, no, I have been to some sponsored breakfast events. Obviously, you know, they have breakfast at the one we're talking about, but I've been to some coffee... oh yeah, you're right, not a happy hour would probably be good. One of the problems, obviously, is they try to cram so much into two and a half days that you get invited to 12 different things and you can only go to one.

Similarly, I went to a workshop that conflicted with something else. Lourdes had a little gathering and I went to that instead of something else. So you gotta pick and choose,

[00:06:57] k: You do. You do. And a lot of people are extroverts. I am not. So when I go to these events, I know it shocks everybody, but they exhaust me. I'm excited to be there, but they exhaust me on multiple levels.

[00:07:15] Jason: Yeah, I had to force myself. I was an introvert for years, but then getting into privacy, and even prior to that, I forced myself to start introducing myself. Now, I'm not as forward as I used to be. Years ago, when I was going to some of the privacy events, I would literally just walk up to tables of people and be like, hey, I'm Jason, and start talking with them.

Now I get enough people walking up to me, just because of my infamy, I guess.

[00:07:48] k: go with infamy. 

[00:07:49] Jason: Yeah, enough people recognize me, and so I'm not quite as outgoing as I used to be, but I still had to force myself to be more extroverted than probably my natural tendency. If I hear your voice, I just gotta listen.

[00:08:02] k: they recognize my laugh 

[00:08:04] Jason: like, oh wait, that's k 

[00:08:06] k: And it is true, and I feel really bad, because I see people and I am absolutely, totally horrible with names, and people will run up to me at these things and I'm like, who are you?

[00:08:18] Jason: Well, okay, so I've been telling people this recently, 'cause I did the exact same thing at TROPT. I sat down at a table and I was like, oh, hi, I'm Jason. And they said, oh yeah, we know each other. And I'm like, oh. Okay. So the thing is, there's a name for how many people you can know, I can't remember it, and it's like there's 150 people that you can know intimately,

[00:08:41] k: outside of any context. Yeah.

[00:08:44] Jason: Yeah, yeah, that you just know them. And then there's a few thousand, maybe a thousand to two thousand, that you can be like, I know that person, that's K, that's Paul, and you recognize them. But you may only know one thing, like where they're from, what country, or something like that.

Some, you know, relatively easy fact. The problem is, you and I have probably gotten to a point... I have like 4,000, 5,000 LinkedIn connections, something like that. So there are people I'm connected with on LinkedIn and I have no clue who they are. I mean, I knew at the time I connected, but it was six years ago. And so they see my posts, because I'm very active, but I,

[00:09:22] k: You may hardly see anything from them at all. Yeah, that's true. That's very true. But let's go ahead and talk about some of the fascinating things we said we'd cover.

So, since we started with The Rise of Privacy Tech and you told me a little bit about that, was there any particular conversation or topic that stood out to you? You can basically touch on any topic that someone throws at you and just wing it, essentially.

So was there anything that stood out to you?

[00:09:47] Jason: Well, I did like... I think the first year there were more people I talked to who were in the compliance tech space, and I think it was oversold. And so most of the companies that were there doing privacy tech were not in the compliance space. That's not to say the stuff they were doing wouldn't help with compliance, but they weren't that program-management compliance aspect.

They were more focused, more narrow. So that stood out a little bit. And just the variety of people, like you said. I mean, there were conversations all over the place, in all different aspects, because there were people from startups and people from big companies.

There were certainly some investors. But there was somebody from Consumer Reports, and there were a couple of lawyers there. So it was a very good variety of people and a very good variety of conversations as well. I met somebody who had been a federal prosecutor for years and was now in a cybersecurity department but trying to learn more about privacy. I can't remember what company she was with, but it was, again, an interesting conversation at lunch.

[00:11:06] k: Absolutely. Absolutely. And before we go too far, one of the things that was suggested was that we try to cover some of the most recent news. Not that we necessarily talk about it in depth, but just cover a couple of things for people who may not be paying attention to what's actually happening.

So, just some high-level news that we're looking at: the US intelligence community is addressing FISA again, the Section 702 reauthorization. If you were looking at making

comments on the HHS call for comments on the post-Dobbs decision, with more protections for reproductive rights, make sure that you get your comments in for that. Louisiana is kind of following Utah in requiring parental permission for minors' access to social media. By the way, there is a new Principal Deputy US Chief Technology Officer, and this should be a name that's near and dear to most people in privacy: it's actually Deirdre Mulligan. So, yay, so excited for that. Maybe we could actually get Deirdre on our call; that would be phenomenal. So there's a call-out to Deirdre; I think she might listen, ha ha. There is a new Connecticut bill on AI privacy and AI policy, so look for that as well. I think there's been a couple of breaches lately, especially one over in the UK involving vaccines, I think. Jason, you may know. There was a fine from the FCC on non-consensual robocalls, so if you never thought there was an issue with robocalls and actually holding companies accountable, there's one there. Nevada passed a health data privacy bill; I have not gone and looked at it in detail. I think that was just the end part of last week. And anything else? Current news... I think Florida passed something.

I don't think it was a privacy bill. Was it?

A digital bill of rights?

[00:13:04] Jason: Yeah, it was. I mean, I haven't looked at the bill actually, but it does have a lot of privacy implications. A lot of people are not calling it a privacy bill

[00:13:15] k: Right, because it's like a bill of rights, not an actual privacy act, but Connecticut

[00:13:20] Jason: Yeah. 

[00:13:20] k: ahead and passed additional privacy provisions. Right.

[00:13:24] Jason: That I, that I don't know. 

[00:13:25] k: I think they did. 

[00:13:26] Jason: The Florida bill was targeted at big tech companies

[00:13:30] k: Got it. 

[00:13:31] Jason: and I think there's a limitation of, like, they have to have a billion dollars in sales a year. And we think about that as consumers: that's a lot of money, there's only a couple. But every Fortune 500 company is over a billion dollars a year in sales.

So there are still a lot of large companies that this will affect,

[00:13:51] k: Right, right. So internationally, to cover for Paul here: let me see, I think there was a fine to Spotify out of Sweden for 58 million, I believe it was, for GDPR. Then closer to home, Microsoft got a COPPA fine for $20 million over Xbox, following the same gaming fines that we've been seeing. The European Commission is actually gonna seek a Google breakup over ad tech, so good luck with that one; if anybody could do it, it would probably be Europe. And then, something that speaks to me: there's a lot of looking at unlawful data processing from the insurance perspective, so that's something I'll be paying close attention to. So now, back to our regularly scheduled programming. I wanted to ask you in particular... I know there's a lot of things you can talk about, we just got through saying that, but there were a couple of things we talked about beforehand, and I wanna touch on one: you co-host a podcast with the Better Business Bureau, Privacy Abbreviated.

So we'll make sure to give everyone a call-out for that, with Donna Fraser. But two in particular I wanted to talk about: the nonprofit that you're chairing, the Institute of Operational Privacy Design, and your design process standard. And then NIST, 'cause

I'm looking more at doing NIST now. The company I'm with is a NIST house, and I've never been a NIST house; I've always been an ISO house, so

[00:15:18] Jason: Yeah, so, well, I, 

[00:15:19] k: those. Which ones would you like to jump on first?

[00:15:22] Jason: Yeah, let me start with the last one, which you just talked about, because that is true: I see a lot of companies where you're either ISO or you're NIST. And there's certainly overlap, and there are differences

in approaches, but there are reasons that companies choose ISO or choose NIST. And both of them have very strong cybersecurity-related standards, like the Cybersecurity Framework. And that's typically what happens: because cybersecurity is kind of ahead of privacy, if you're an ISO or a NIST shop from a cybersecurity perspective, you will probably gravitate the same way for privacy. A few years back, I was consulting with a company that was an early adopter of it, and so it was really lucky on my part, because they essentially paid me to

[00:16:11] k: To learn it, 

Nice. I will say it's not easy to learn, Jason. The NIST Privacy Framework

[00:16:18] Jason: well. 

[00:16:18] k: is something that just makes you scratch your head and go, huh?

[00:16:22] Jason: Yeah, so one of the things with NIST is you have a lot of very specific jargon that is specific to it. And unfortunately, people will skim through it, and they will make assumptions about what they've read, and they'll go try to apply it. But in my view, if you actually sit down with it and take the time to understand it, there's a lot of value in doing it correctly.

And I've done that with clients as well. Now, to your point about learning it, and this difficulty: there's a lot there, because it's very... I don't wanna say dense, but in terms of, again, the jargon, it's like, how do I

[00:17:02] k: Yes. 

[00:17:03] Jason: do this one very important thing?

[00:17:04] k: Yeah, this one little thing it says to do, what are they asking me to do? But I will say that, in general, the ISO privacy standard, 27701, basically at its very heart just says, oh, take 27001, where it says information systems, and include privacy systems, and it's like... it doesn't. So I will say I'm used to working with that. NIST actually does more than just say, integrate privacy into your information systems and replace those words with this phrase. It actually is a framework for privacy, I will say that. So I'm gonna force myself

[00:17:43] Jason: Yeah. 

[00:17:43] k: to dig my way all the way through it.

[00:17:46] Jason: I am the author of the IAPP's textbook Strategic Privacy by Design, but I am also now working with a co-author on a new book about implementing your privacy program using the NIST Privacy Framework. So look forward to that, but not to say wait for it, because you may be holding your breath a little too long.

So you want to dive in now. But there's certainly some benefit. Like I said, I'm a very big proponent of it, and I just think it needs a little bit more handholding, you know, for people to kind of dive in. And so one of the things I wanna say, and we're gonna try to emphasize this in the book, is you don't have to do everything. You can take it at a level that is reasonable for your company.

I mean, in examples I've used in training, I've gone through it at the very highest level, what's called the function level of the core, for a florist shop. Imagine a florist shop of like, you know, 20 people. And the example I use is they're operating in Washington, DC, and they're delivering to, like, embassies, or to people who may not want

information about their flowers exposed, you know, things like that. So they have a little bit more heightened concern about flower delivery than, say, the typical florist shop. But you can do it at a level... you don't have to hire, you know, a huge expert.

[00:19:07] k: Kinda like the maturity framework: you can pick your level of maturity.

[00:19:11] Jason: Right? Basically, you know, you've done this before, I'm sure, as well. It's just at a very high level: What data do you have? What kind of policies do you have about it? What kind of control are you giving individuals? How are you communicating that? And how are you protecting the data? That's like the five functions.

And it's like, you know, even for a florist shop you can answer those questions. You don't have to do the hundred outcomes. So there are different levels, and in fact, I don't know any company that is doing the complete thing. They may even be attempting to, but they're not doing the complete thing.

They're kind of condensing and picking and choosing. And that's perfectly appropriate, because it is a framework. It says you do not have to do everything. Pick what's appropriate for your company at your level of risk sophistication and at your maturity level.
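[Editor's note: the high-level, five-question assessment Jason describes can be sketched as a simple checklist. This is a rough illustration, not part of the NIST Privacy Framework itself; the function names follow NIST's published core (Identify-P, Govern-P, Control-P, Communicate-P, Protect-P), and the florist-shop answers are purely hypothetical.]

```python
# Minimal sketch: a function-level self-assessment against the five
# top-level functions of the NIST Privacy Framework core. A small
# organization answers one high-level question per function rather
# than working through all ~100 outcomes.

FUNCTIONS = {
    "Identify-P": "What data do we have, and where does it flow?",
    "Govern-P": "What policies govern that data?",
    "Control-P": "What control do we give individuals over it?",
    "Communicate-P": "How do we tell people what we do with their data?",
    "Protect-P": "How do we safeguard the data?",
}

def assess(answers: dict) -> list:
    """Return the functions that still lack an answer."""
    return [f for f in FUNCTIONS if not answers.get(f)]

# Illustrative answers for the hypothetical 20-person florist shop:
florist = {
    "Identify-P": "Customer names, addresses, delivery notes in one CRM.",
    "Govern-P": "Written retention policy; orders deleted after 2 years.",
    "Control-P": "Customers can request deletion by phone or email.",
    "Communicate-P": "Short privacy notice on the website and receipts.",
}

print(assess(florist))  # → ['Protect-P']
```

The point of the sketch is the scoping choice Jason describes: the framework lets you pick the level of depth appropriate to your risk and maturity, so "complete" coverage at the function level can be five honest answers.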

[00:20:00] k: Nice. Well, I will say I cheated a little bit on it. I'm using TrustArc's PrivacyCentral, and I'm mapping all my GDPR activities, which then tells you what you have fulfilled in any of the other frameworks that you choose. So I'm answering it according to GDPR. And by the way, I've worked with the Cybersecurity Framework for years, and aside from ISO and NIST altogether, the Cybersecurity Framework is simply phenomenal.

[00:20:32] Jason: Yeah, absolutely. So, there's been a number of things that I've been working on over the years that have kind of culminated in the creation of this nonprofit called the Institute of Operational Privacy Design. I've clearly been in privacy by design for a while, and there were two things that used to bug me for years.

One was a lot of companies using privacy by design as kind of a sales point, but to answer the question, what are you really doing? And they may be doing something really phenomenally well, but they're not doing everything phenomenally well. Like, you know, maybe one little privacy aspect is perfect.

So, you know, I felt there was a need for something more. Now, I'm very good friends with Ann Cavoukian, who came up with the seven principles behind Privacy by Design, and I've been a Privacy by Design ambassador for most of my privacy career, so over 10 years. But the problem was, you give the principles to engineers and they're like, what do you want me to do with this?

Right? So there was a missing step, and this is why I wrote my book, and kind of what I've been working on the last five, six, seven years. And so that was one. The second thing is, I did a lot of vendor negotiations at the last job I had, and we'd ask, you know, what are you doing for security?

They'd be like, oh, here's our ISO certification, here's our SOC 2, here's whatever it is. What are you doing for privacy? And there would be crickets,

you know. So there wasn't anything out there. Now, I started getting involved in the ISO standard, the project committee for the one that came out as ISO 31700, Privacy by Design for Consumer Goods and Services.

My personal take is, I found it, one, very bureaucratic, and also, just because of the sheer number of people involved... even just in the United States, we had 150 companies on the technical advisory group. So it tends to be, what can we get that's acceptable to this large group of people?

And I wanted something that was more of a gold standard rather than a baseline, you know, something that companies could strive to achieve and say, look, we've achieved this. So about two, two and a half years ago, myself and a few others started this Institute of Operational Privacy Design.

Again, one of the reasons was, you know, I could come out with, like, Jason's certification, but who's gonna... I mean, I have somewhat of a name, but still, you know, people want some

[00:22:54] k: Oh, you know, we have to levy the Royal certification. I'm just saying, I've been

[00:23:02] Jason: Right. Okay. 

[00:23:02] k: for years. We need a royal

[00:23:04] Jason: Okay, we're returning to the monarchy. I'm sure Paul could support that. So we came up with this Institute of Operational Privacy Design, and we worked for about two years on our design process standard. The standard here is around your product design, your service design, whatever, your design process, from

you know, ideation and thinking about it, to actually sitting down and constructing it and designing it and developing it, and then rolling it out and deploying it. And so, what are the minimum things, the components, that you have to do? Not to say that you're gonna ultimately end up with a privacy-friendly product, but you definitely won't if you don't have this. This is kind of like ISO's 9001 quality standard: it doesn't say that your end products are gonna be quality, but it says you have the management components to build in quality. So we published that in January.

The second standard, which we're currently working on, is like a seal of approval for the actual product or service, to show that it has been built with privacy forethought and meets requirements. And the goal for this, I'll just tell you, and I know I'm trying to cram in a lot here, but one of the things I was hoping to achieve eventually is to get approval by the European Data Protection Board under Article 42, which is the certification scheme, for Article 25, Data Protection by Design and by Default, so that companies could use this as a showcase to say, hey, we are following the requirements of Article 25. Now, it's not a get-out-of-jail-free card, but it at least is something in your pocket to show that you're trying to achieve

[00:24:44] k: Well,

[00:24:44] Jason: compliance.

[00:24:45] k: And there are companies out there who think there is a get-out-of-jail-free card. There is not. There is no such thing as a get-out-of-jail-free card. There is: document, document, document. Do what you say, say what you do. Be transparent.

[00:24:59] Jason: Right. 

[00:25:00] k: Be as comprehensive as possible. Be thoughtful. Be deliberate. Be considerate.

[00:25:05] Jason: Absolutely. So, yeah, we're working on that standard. We're looking at a concept called assurance cases right now.

And the idea... so, this has been used in the safety industry for decades now, in, like, airline safety, where they will make a safety case, which is a specific form of assurance case.

Basically, it's not a proof, but it is a structured argument that makes a claim, provides an argument supporting that claim, and evidence supporting those arguments. And it is a very structured, formalized approach to saying that you are not guaranteeing that this thing is safe, but you are assuring that everything that can be done, as far as you know, has been done.

And this has been genericized in some cases in the security world, for security cases. And now we are taking it, and we're not the first; there's a couple of papers out there about making a privacy case. So we're using the same thing: basically, you're claiming that privacy has been built in, you're making a structured argument, and you're supporting that

with evidence. So this is what we're looking at right now. I think it's an interesting way to go, and we'll see where we end up. And by the way, just a nod going back to NIST: you're probably familiar with NIST 800-53, the control set of privacy and security controls. Well, there's another one that you may or may not be familiar with; I wasn't. That is NIST 800-53A, which is the assessment criteria for the controls. And in that, there is a section on making an effective assurance case for a control: basically using this concept of an assurance case, claiming the control is effective, and using a structured argument and a set of evidence to show that the control actually is effective.
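[Editor's note: the claim / argument / evidence structure Jason describes can be sketched as a small data model. This is a generic illustration of the assurance-case pattern, not an official IOPD or NIST 800-53A schema; all class names and the example content are hypothetical.]

```python
# Minimal sketch of an assurance case: a top-level claim, supported by
# one or more arguments, each of which must be backed by evidence.
from dataclasses import dataclass, field

@dataclass
class Evidence:
    description: str  # e.g. a test report, design review, or audit finding

@dataclass
class Argument:
    reasoning: str
    evidence: list = field(default_factory=list)

    def supported(self) -> bool:
        # An argument counts only if at least one piece of evidence backs it.
        return len(self.evidence) > 0

@dataclass
class Claim:
    statement: str
    arguments: list = field(default_factory=list)

    def assured(self) -> bool:
        # The case holds only if every argument is evidence-backed.
        return bool(self.arguments) and all(a.supported() for a in self.arguments)

# Hypothetical privacy case:
claim = Claim("Privacy has been built into the checkout service.")
arg = Argument("Data minimization was applied to the order flow.")
arg.evidence.append(Evidence("Design review notes, April sprint."))
claim.arguments.append(arg)

print(claim.assured())  # → True
```

As in the safety-case tradition, the structure does not prove the claim; it makes explicit which arguments carry it and which evidence backs each argument, so gaps are visible.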

[00:27:04] k: I think at one time, when I was working heavily in this, I was familiar with it, but I will say it's not something I would've called up to the top of my mind. It's kind of like, you know, in the 27000 series, it's 27001 Annex A that you wanna go look at to actually get your standards to judge by.

[00:27:26] Jason: right. 

Oh, I was just saying, yeah, I mean, it's the kind of thing... I was actually out searching for assurance cases in the privacy world, and that's where it came up. I mean, I'd skimmed through 800-53A when they came out with revision five last year, but it didn't absorb, right? It wasn't until I was actually looking for evidence and articles and things about assurance cases in the privacy world that this came up, and I'm like, oh. So again, we're not the first people to do this, but it is still kind of cutting edge.

[00:28:01] k: Yeah, we'll make sure that we get links to that up there as well. And yes, if I neglect to put the links in when I do the description, please, y'all, call me to account for it. I'll make sure I get those up. 

[00:28:13] Jason: Oh, I forgot to mention, your frequent co-host, Mr. Ralph O'Brien, is on the standards committee of the Institute of Operational Privacy Design at the moment.

[00:28:24] k: Yes. Nice. 

[00:28:26] Jason: So we have another tie-in there.

[00:28:28] k: There we go. I love me some Ralph. I'm just saying.

[00:28:31] Jason: Yeah. No, he's a great guy. 

[00:28:34] k: he really is. 

[00:28:35] Jason: So yeah, we've got some great people on our standards committee. Great advisors, you know, very well known. 

So this actually, if you'll permit me, this actually leads into... one of our advisors and committee members, David Taylor, is one of the data protection commissioners at CERN, which is the international organization. And interestingly enough, they're in Switzerland, but they are not governed by Swiss law. They are not governed by GDPR, because they are an international organization, and so they have their own data protection commission, of which David is one of the commissioners. All the commissioners are external, just as a check and balance, right? So, yeah. And I am going to be speaking at CERN next month.

I got invited to do a talk on privacy. I'm attending the Privacy Enhancing Technologies Symposium in Lausanne, Switzerland, but I'm going to Geneva immediately before that to speak at CERN.

[00:29:38] k: Oh, that's cool. I am officially jealous now. Officially. That's really cool. I like that. I like that a lot.

[00:29:47] Jason: Yeah, they can't take me in where the actual collider is, because it's off limits, even to the researchers, because it's in use right now. But I'm gonna get a tour of the facility and stuff. Yeah, that's what they said. They didn't want, you know, some boson going through my brain and scrambling it or something.

[00:30:11] k: The only thing I can think of is Sheldon Cooper would love it.

[00:30:15] Jason: yeah. Yeah, yeah, yeah. 

So, and then earlier that week, I'll be, actually, right up Paul's alley. I'm not sure how to pronounce it; I think it's Delft, and it's right next to The Hague in the Netherlands. And I'm keynoting the International Workshop on Privacy Engineering.

[00:30:34] k: Ooh, that's cool. Do you plan to focus on anything in particular, or have you started?

[00:30:41] Jason: Yes, yes. So I don't have my whole talk yet, so I can't give it to you right now, but I can at least get some points out. In 2013, I wrote a blog that was published on the IAPP website, and I asked, is 2013 the year of the privacy engineer? So the title of my talk is, Is 2023 the Year of the Privacy Engineer?

Because clearly I was being a little optimistic in 2013 that we were gonna move toward a more tech-based application of privacy. And I'm gonna be talking about the different domains of privacy engineering, because a lot of people only think about, like, data science and maybe IT infrastructure, but privacy engineering can cover physical infrastructure, like hospital design.

It can cover HCI, or user interface design, the interactions between users and interfaces. It can cover business process design. So there's a whole bunch of areas. And the other thing I'm going to focus on is for people to watch out for the dilution of the term, because I run into a lot of lawyers who, unfortunately, see privacy engineering as anything technical they don't understand about privacy.

Whereas in engineering, if you look to, like, the IEEE, it's the application of math and science to a problem, or embedding a system with some quality attribute, of which privacy is one.

And so just because somebody is coding something, you know, doing some coding or something like that, that doesn't make them a privacy engineer. And I'll give you a little hint. There were a couple of companies, at least one in particular that will remain unnamed, whose engineering department needed, like, non-engineers.

In privacy, they needed analysts, and they needed people to do privacy compliance and stuff. But because they were the engineering department, the HR department said you have to hire engineers. So they would slap the title of privacy engineer on the job, even though the job wasn't engineering.

So there's been this dilution of the term, and I'm trying to rein it back in and say, you've gotta be doing something scientific or mathematical, applying some type of scientific principle to solving this problem, not just sitting there coding. And software engineering has the same problem, right? Just because you code,

[00:33:09] k: Right, 

[00:33:09] Jason: doesn't make you a software engineer,

[00:33:11] k: right. 

[00:33:12] Jason: Per se.

[00:33:12] k: And on the privacy engineering front, was Michelle Dennedy there at the TROPT meeting? She's

[00:33:18] Jason: No. 

[00:33:18] k: with them.

[00:33:20] Jason: She was flying off somewhere in a couple of days, and one of the co-founders of her business, it's, like, PrivacyCode or something like that, they were there. But she was leaving, like, in two days for international travel, so she couldn't make it. 

[00:33:38] k: You'd have had a lot to talk about if you did.

[00:33:40] Jason: Yeah. No, she's great.

I've known her for years, and her co-author of The Privacy Engineer's Manifesto, Jonathan Fox, as well. I've been on a couple of panels with Michelle, but with Jonathan also.

[00:33:53] k: Very cool. Well, when I asked you to come on, was there anything in particular that you wanted to make sure you could share with our audience that we haven't touched on? I mean, you're just a man of many knowledge items and experiences, and

[00:34:08] Jason: Yeah, no, I mean, we've touched on a lot of the stuff that I'm involved in: this privacy framework, the talks I'm gonna be doing, the podcast, the institute, which is my passion right now and a lot of my focus. You know, you and I saw each other in Phoenix for the BSI talk.

The last couple of months have been travel, travel, travel. So I am currently nomadic; I have not had a permanent location since last August. 

Like I said, right before this TROPT event here in San Jose, I was in Denver at the Rocky Mountain Information Security Conference giving a talk. The week before that I was in Boulder. So I was in the same

[00:34:47] k: Nice. 

[00:34:47] Jason: state at least, at the Privacy Law Scholars Conference. But it's starting to become a little bit taxing, and I need to get back to actually doing work instead of just talking about it.

[00:34:57] k: I probably

had a few friends at the Privacy Law Scholars Conference. One is someone we've had on the podcast before. He's a young privacy scholar that I respect hugely: Wayne Unger. He came straight outta law school and got a visiting professor role. I think he had three papers published while he was in law school, and he graduated a semester early. So he's very much aimed for, and suited to, the academic life. He wrote me something about being at the Privacy Law Scholars Conference, and he said he definitely had imposter syndrome. I said, oh, if anyone was not an imposter, it would be you. 

[00:35:37] Jason: But the only other thing I do want to briefly touch upon, because it's one of my passions now, is talking about privacy risk. I won't go into a whole spiel, but the NIST framework is also called a tool for managing enterprise privacy risk.

And part of my travels, especially when I've been talking to security groups, because I've talked to some local chapters of ISACA and ISSA and CSA, is I talk about privacy risk versus security risk: how they're related, how they're different. And also some of the problems with the way a lot of people address risk, or do risk assessments, with these low-medium-high qualitative measures, or ordinal measures where they're doing, like, one to five.

There are all sorts of structural problems, and there are human bias problems. And in fact, at RMISC it was interesting. During one of the breaks from the privacy thing I was at, I went over to the security side, and somebody was talking, and they had a longer time to talk about this, but they were talking about the different human biases.

Things like, if you spill your coffee in the morning, you're more likely in a risk assessment to rate a risk higher than you otherwise would. There are just all sorts of things like that. We need to move in the privacy profession toward a more objective, numbers-based risk assessment, and not the subjective, qualitative kind. Think about your insurance company, right?

Your auto insurance company. Imagine if they said, okay, you're high risk, we're not gonna insure you; or, you're medium risk, that's a thousand dollars a month, because that's the only bucket we have. You know, they have actuaries and statisticians, and they get feedback. So they give a price for their premium.

And then the next year they go back and say, well, were these people in this category really risky? And they redo their assessments using, like, regression analysis. Eventually we, in the privacy community and the security community, need to get to that point. 
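The gap between the ordinal buckets Jason criticizes and the quantitative approach he advocates can be shown with a toy calculation. A minimal sketch, with made-up probabilities and dollar figures; nothing here comes from the episode or any real assessment:

```python
# Contrast: ordinal "low/medium/high" bucketing versus a simple
# quantitative estimate (annualized loss expectancy, frequency x magnitude).
# All probabilities and dollar figures below are invented for illustration.

def ordinal_bucket(score: int) -> str:
    """A typical 1-5 ordinal rating collapsed into three buckets."""
    return "low" if score <= 2 else "medium" if score <= 4 else "high"

def annualized_loss(prob_per_year: float, loss_if_occurs: float) -> float:
    """Expected annual loss: probability of the event times its cost."""
    return prob_per_year * loss_if_occurs

# Two scenarios a 1-5 scale might well rate identically ("medium"),
# yet whose expected losses differ by an order of magnitude:
scenario_a = annualized_loss(prob_per_year=0.10, loss_if_occurs=50_000)
scenario_b = annualized_loss(prob_per_year=0.10, loss_if_occurs=500_000)

print(ordinal_bucket(3), ordinal_bucket(3))  # same bucket for both
print(scenario_a, scenario_b)                # 5000.0 versus 50000.0
```

The buckets hide exactly the information the actuarial feedback loop needs: a number you can compare against observed losses next year and then re-fit, which an ordinal label can't support.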

[00:37:53] k: More exacting. 

[00:37:54] Jason: Much more sophisticated, not just throwing something at the wall and saying, oh, that's high risk.

[00:38:00] k: Right. Got it. I like that. I like that very much. So with that, Jason, thank you so much for joining me. I truly, truly appreciate it. For all the listeners out there, thank y'all for joining us as well. You can find us online at any of your podcast apps; we're also on LinkedIn as Serious Privacy. You can find Paul online as @euroPaulB. You can find me at @heartofprivacy. You can find Jason at... 

[00:38:26] Jason: So, I'm Privacy Maverick on Twitter, but I haven't posted in, like, six months. So, yeah. 

[00:38:34] k: We understand it. We'll give you his LinkedIn link so you can find him anyway. But with that, thank y'all so much for joining us again. Paul usually says something like, goodbye for now, or, until next time. And I say, bye y'all. 

So 

bye y'all.