Serious Privacy

Mega Meta Privacy (Romain Robert and Gabriela Zanfir-Fortuna)

July 18, 2023 Dr. K Royal and Paul Breitbarth, Gabriela Zanfir-Fortuna, Romain Robert Season 4 Episode 26

On this week of Serious Privacy, Paul Breitbarth of Catawiki and Dr. K Royal connect with Gabriela Zanfir-Fortuna, vice president for Global Privacy at the Future of Privacy Forum, and Romain Robert, Program Director and lawyer at NOYB to discuss the new landmark decision of the Court of Justice of the European Union: Meta v Bundeskartellamt, where the Court not only decided that competition authorities can use data protection when deciding on the potential abuse of a dominant market position, but also made some important calls on online advertising. Plus, Paul adds in some breaking news. 

Should you have any questions or suggestions, please reach out to us via email or via Twitter at @podcastprivacy. You can find us on LinkedIn as well - just look for Serious Privacy. K is on Twitter and Threads as @heartofprivacy and Paul as @EuroPaulB.

If you have comments or questions, find us on LinkedIn and IG @seriousprivacy @podcastprivacy @euroPaulB @heartofprivacy, and email us. Rate and review us!

Proudly sponsored by TrustArc. Learn more about NymityAI at

#heartofprivacy #europaulb #seriousprivacy #privacy #dataprotection #cybersecuritylaw #CPO #DPO #CISO

Please note this is an automated transcript. For accuracy, please listen to the audio.

[00:00:00] Paul Breitbarth: Last week, we already briefly discussed the new landmark decision of the Court of Justice of the European Union, Meta v. Bundeskartellamt, the German competition office, where the court not only decided that competition authorities can use data protection when deciding on the potential abuse of a dominant market position.

But also made some important calls on online advertising. Reason enough to dive a little deeper into this case, especially now that the initial dust has settled. My guests today are Gabriela Zanfir-Fortuna, Vice President for Global Privacy at the Future of Privacy Forum, and Romain Robert, Program Director at NOYB.

This is for me a little bit of getting the old band back together from our discussions in the Working Party 29 days, so I'm looking forward to geeking out a little. My name is Paul Breitbarth.

[00:01:01] K Royal: And I'm K Royal, and welcome to Serious Privacy.

[00:01:01] Paul Breitbarth: So welcome both! Good to see you, to hear you again after such a long time.

[00:01:06] Romain Robert: Good to see you. Hello. Hello.

Thank you for having us.

So for the unexpected question: if you had to open a business other than privacy, what business would you open?

[00:01:18] Gabriela Zanfir: Wow. I, I actually have an immediate answer to that because I don't remember in what context, but I was thinking about it last year. And the thing I would open would be a library, library bookstore, a bookstore that also has a wine shop inside. So a wine shop bookstore. Oh, that, that's my, my go-to.

[00:01:44] Paul Breitbarth: Would we then also be able to read a book while having a glass of wine in the store?

[00:01:50] Gabriela Zanfir: E. Exactly. That's the point. You know how usually you have like a tea house in the bookstore or a coffee shop in the bookstore, a wine shop in the bookstore, you know,

[00:01:59] Paul Breitbarth: Much better idea.

So, well now, what about you?

[00:02:03] Romain Robert: I would open a standup comedy theater, I think.

[00:02:06] Paul Breitbarth: Nice.

[00:02:08] Romain Robert: With wine as well. Of course.

Cause otherwise it's not that funny. Yeah. Something with wine and standup comedy. Yeah. And with books, maybe we can, maybe a joint venture, you know, like standup comedy, books and theater

[00:02:21] Gabriela Zanfir: Yes.

[00:02:21] Romain Robert: and wine.

[00:02:22] Gabriela Zanfir: And wine?

[00:02:24] Romain Robert: It would be called "Wine Not"?

[00:02:26] Gabriela Zanfir: Yes, I love it. How about you, Paul?

[00:02:30] Paul Breitbarth: Sounds like a great idea. I think a bookstore was somewhere on my list as well, but probably right now I would say I would open a bakery, although I'm not sure whether I would look forward to the early hours.

[00:02:45] K Royal:  That's perfect and goes right along with this serious cooking show. 

[00:02:46] Gabriela Zanfir: We'll bring you into our cooperative, right, Romain? Sounds perfect.

[00:02:50] Paul Breitbarth: Well, probably I'll bake the, I'll bake the bread for the snack platters that you can serve in the wine bar, something

[00:02:56] Romain Robert: For the cheese and wine and everything, you see, we get a plan.

[00:03:00] Paul Breitbarth: So, great plans for when we retire from data protection, right?

[00:03:03] Gabriela Zanfir: Yes.

[00:03:04] Romain Robert: Yeah. It would be one day.

[00:03:05] Paul Breitbarth: Okay, breaking news. This episode is all about the decision of the EU Court of Justice in the case Meta v. Bundeskartellamt. We recorded our conversation with Gabriela and Romain on Friday, 14 July. However, on Monday morning, 17 July, the Norwegian DPA suddenly issued a temporary processing ban on Meta for most personalized advertising on Facebook and Instagram.

So only personalized ads that are based on valid and granular user consent, or those that only use generic data shared by the user, such as their age, gender and location, only those data can still be used for ads. All other data, not so much. And if Meta does not respect the processing ban, they can receive a fine of up to 1 million Norwegian kroner, or around 89,000 euros, a day until they are indeed compliant. So according to the Norwegian DPA, Meta's commercial interests are not, and I quote, "in themselves of a highly compelling nature", and there are a number of alternative advertising models that Meta can rely on in pursuit of its commercial interests.

They also explain that the processing of personal data for behavioral advertising is complex and opaque, their words, and that most data subjects may not fully understand those processing operations. We've talked about that before many times because it is so difficult to understand also what inferences can be made.

So in the view of the DPA, it is Meta's responsibility to design a business model that is both lawful and viable, and it is not for the law to adapt to the business model. Finally, the Norwegian DPA makes very clear that things really need to change, and that just updating underlying documentation and, for example, replacing references to contract with legitimate interest, or now with consent, is not good enough.

In my view, this is a very big decision by the Norwegian DPA, and one that seems to preempt the responsibilities of the Irish DPC. However, in the compliance order, the Norwegian DPA also explains that they had already requested that the Irish DPC issue a compliance order and a processing ban in May of this year.

That request was denied. So with the decision of the Court of Justice in hand, the Norwegian DPA considered that taking action had become too urgent, and that waiting any longer was not in the interest of the Norwegians. So an urgency decision by the European Data Protection Board will also be requested, to validate the Norwegian decision. And if the Board indeed agrees, that may have a longer and a broader effect than the Norwegian order.

It could mean that the processing ban now put in place for both Facebook and Instagram will apply across the whole European Economic Area and for an indefinite duration of time, and not just for the three months that the order is for now. So, as we already say in the episode, and you will hear that in a moment, this discussion is far from over.

So now, without further ado, on with the conversation with Gabriela and Romain. So, oh man, this whole idea for the deep dive podcast started when you started inviting comments on LinkedIn on this case, Meta v. Bundeskartellamt.

What was your first response to the case when you read it?

[00:06:28] Romain Robert: The reaction, I would say, was like: what a non-surprise, to be honest. Like, okay, that we already knew. And as you said, I was inviting comments because I was more surprised about the surprise of other people on LinkedIn. I was not really understanding why people were so surprised about something that seemed to be a little bit obvious.

I don't know if you agree with that, and I wanted to know why we all had different readings of the court decision, and especially with you, Paul. We also had some, you know, discussion about very specific provisions, and I thought, oh, I really would love to have that discussion with, you know, my favorite privacy experts, and here you are. So I'm so happy that you organized this, you know, because it's so nice to exchange, you know, different readings or the same readings of the same decision. Because as you know, and especially for NOYB, it's also linked to a lot of cases that we have, and it's really, really important for other aspects as well of enforcement of digital rights.

I think it's also important because it's not only competition, but there are also a lot of other aspects. So yeah, that was my first reaction: I don't understand why people are surprised.

[00:07:42] Paul Breitbarth: So, Gabriela, what about you?

[00:07:44] Gabriela Zanfir: I actually read it with a lot of excitement when I opened the document, and I remember reading it very, very soon after it was published on the website. And this is because I've been following quite closely this interaction, at the theoretical level at least, between data protection rules and how they impact collection and sharing of personal data in the data economy, and the impact of that on antitrust and competition assessments.

And because I was working on following all of this, this particular judgment of the Court of Justice was a landmark that the community was waiting for, because we wanted to see clarity from the most authoritative voice in EU law on how these two fields of law are supposed to work together.

I think there was a sense that there is something there. It has a logic to it, that they should be interacting with each other and maybe put some pressure in specific points. But we did not have this clarity from the Court of Justice yet. And this also comes on top of the fact that this is not really the first time ever that such questions arise, questions that relate to data protection and privacy rules impacting the conduct of companies in the data economy. The Court of Justice had the opportunity to look at another couple of cases in the past, but what it usually did, and very famously in a case from the early two thousands, the Asnef case, is that it usually said that data protection and privacy considerations are absolutely separate from antitrust and competition assessments.

So it always considered them in their own little world in the past. But this case completely overturns that line of reasoning, and that's why it's absolutely significant.

[00:10:02] Paul Breitbarth: So, a little surprising still, given the history of the court. I think on first reading I agreed with Romain's position that maybe most of what the court said was not as surprising in that they said it, but I was surprised how they said it: that they were so explicit and basically so damning for Meta, in their arguments and also in how they formulated certain points.

And maybe just as a quick summary, because maybe not all of our listeners have read the case or reread the case this week: I think there were more or less four main points that the court makes. First of all, that competition authorities indeed can look at data protection as part of competition cases when they're looking at abuse of dominant market positions.

But they need to involve the data protection authorities and follow their case law when making their decisions. Second, when you look at personalized online advertising, it is not generally seen as essential or objectively indispensable for the delivery of online services. Third, valid user consent also comes up, where users should have a real but also a granular choice, and the court reconfirms that.

And that is also, for example, the case about combining on-platform and off-platform data. So what you do on a social medium cannot just be combined, for advertising purposes, with everything else you do on the internet. And the final point is about sensitive personal data, and the fact that if you do something online, that doesn't mean automatically that you have made it manifestly public,

so that it can just be reused for any kind of purpose. And then there are all kinds of smaller tidbits and nuggets in this court case that are, at least in my view, equally exciting to take a look at: things like paid alternatives, and using and retaining data for law enforcement request purposes. And even, in paragraphs 91 and 92, something about the order in which to use legal bases for data processing.

Which also surprised me at least a little. So, where do you want to start?

[00:12:15] Gabriela Zanfir: Let me also add that one thing that surprised me is how systematic the analysis of the court was on lawful grounds for processing. I think we have the most comprehensive judgment of the court interpreting Article 6 of the GDPR, which says something. And as you know, usually the court doesn't like to just make additional findings that are not essential to answer the referring court, but in this case, they really went and analyzed every single letter in Article 6 and gave an answer to all issues that had been raised within the proceedings, even if not all of them were necessarily essential to answer the question that they received. And I think that was really special for the community, to get that level of clarity about all of the lawful grounds.

And it turned out that this case, which started from a competition authority and went through the judicial system in Germany as an antitrust case, ended up converting into one of the most significant judgments in data protection law from the Court of Justice, explaining some fundamental issues from the GDPR.

[00:13:41] Paul Breitbarth: Yeah, I agree. There will be lots of papers written about this case for a long time to come. Let's bite the bullet immediately. One of the points that you raised in our previous discussion on LinkedIn was about offering paid alternatives. And I've seen quite a few people stating: yes, here the court has opened up the gates for paid alternatives to consent.

Basically making data protection into something for the rich. Others, including myself, have argued that that is not the case. Where do you stand, Romain?

[00:14:14] Romain Robert: I'm surprised. Honestly, I was surprised by this paragraph. For me, the most important paragraph for business models like Facebook and social media is paragraph 150 of the decision. Why? Because, as you know, when I was working at the EDPS, I was already quite obsessed with, you know, data as a counterpart, or data monetization. And I think this question is quite crucial for this kind of business model. And I read, and I still read, this paragraph 150 as the court saying that it would be okay to offer an alternative, if necessary for an appropriate fee, if you don't consent to the processing of your data. And I think it is again opening the Pandora's box of: is it okay to pay? This is the typical "pay or okay" question, you know, which is also related to different complaints that we filed against newspapers in Germany and in Austria. If you refuse the cookies, then you have to pay.

But you have to pay an amount which is totally beyond what would be considered reasonable. So I think this is really important also in relation to the Digital Content Directive, because you may know that the Digital Content Directive kind of implicitly recognizes that you can pay with something other than money, including data as a counterpart, which I really don't like as an idea. Because it seems to accept that you can use data as a counterpart, and therefore use data as a counterpart for a contract, and therefore use Article 6(1)(b) of the GDPR as a legal basis, if you accept that you pay with the data. I don't know if I make myself clear. So it kind of circumvents Article 6(1)(a), requiring consent, if you can basically force someone to pay with their data.

You don't rely on consent, you rely on contract. And you may know as well, it is exactly what Meta, Facebook, did before the DPC. We at NOYB filed a complaint five years ago against Meta for the use of what we call forced consent. And it appears that Meta argued that it was not based on consent, but on contract.

Sorry. And I think this shift of legal basis is a never-ending story. And it was also tiring, for us and for the DPC, because basically Meta seems to shift from one legal basis to another one. They were kind of relying on consent, and then they said that they were relying on contract. And as you know, a couple of months ago, they shifted to a new legal basis, which is legitimate interest.

And I agree with Gabriela. I think it's quite interesting to see what the court said here, because basically it's like the court saying: don't even come back to me, because I don't want to see you again. I will go through all the legal bases. I don't want to be like the DPC. Don't come back, Facebook: you cannot use this one, you cannot use this one, and don't even try to use this one, because I'm tired. So I have that impression, I dunno what you think.

[00:17:14] Paul Breitbarth: Would be nice if they do the same for the Data Privacy Framework in a…

[00:17:17] Romain Robert: Exactly. Yes, exactly. Voilà. Just to answer your question, I think it was the most important question for me: what about this alternative?

And if we accept the alternative, what does it mean when you read the court decision: an appropriate fee? Because as you know, I'm also working at a DPA, the Belgian DPA. I don't know how to interpret, you know, "appropriate fee". Even if I had to accept that it would be okay to pay an appropriate fee, what would be the acceptable amount?

Are DPAs equipped to determine this amount? I have doubts about it, but happy to hear your views, because this is really what I was looking forward to having a discussion about. And I would like to know how to, you know, connect all the dots of all these cases, all these decisions, and the legal texts as well.

[00:18:05] Gabriela Zanfir: Well, the way I read it is that the language is quite clear, in the sense that the court does refer to an alternative that must be offered to users, if necessary in exchange for an appropriate fee. That "if necessary" opens up a bit of a possibility of interpretation, you know: when is it necessary to offer the services for an appropriate fee?

What is clear, though, from the paragraph is that there needs to be an alternative offered to users who do not consent for their personal data that falls outside of the contract to be processed in specific ways. So I think the paragraph concerns two types of processing of personal data.

The first type is the processing that's necessary for the performance of the contract, and I think that once you accept the contract, for the data that's strictly necessary for the contract, that processing in any case takes place on the basis of Article 6(1)(b). Now, we've seen that a big part of what's being processed by the social media services, and Meta in this case, was not considered by the court as being necessary for the contract,

in terms of personalized content, for example.

And also, very interestingly, the court doesn't specifically refer to personalized advertising all the time. It refers to personalized advertising, but it also refers to personalized content, which is a bit of a different thing. And I think it's a concept that we also encounter in the Digital Services Act,

like as a parallel. So going back to this paragraph: it refers first to the processing that's necessary for the contract. You enter the contract as a user, and your data will be processed in that sense. What is necessary for the contract? I don't know: username, password, your network of friends.

So there is a minimum level that's necessary because you accepted the contract, but everything else must be offered as an alternative if you refuse, if you do not consent, for everything else to be processed in a specific way. And that alternative, if necessary, can be paid for, if you want to access specific things, perhaps like personalized content, without a trove of other personal data being processed.

So I'm not sure how much clarity this paragraph brings, but I do see that the court refers to this potential appropriate fee as an alternative to giving access to more of your personal data, perhaps across platforms or across the internet. So, no definitive answer here, but…

[00:21:10] Paul Breitbarth: No, that's very clear, and let's hope there will be follow-up cases that will give that clarity. At the same time, I'm just trying to figure out in my mind what a Facebook timeline would look like in the various scenarios. So you have, on the one hand, your personalized content: you have your friends.

I mean, that is the main intent of the social medium, that you can follow people, that you can connect with people, exchange messages. I think that is also how Facebook was originally designed. So that's probably what the court would also refer to as necessary. Then you have your personalized content that can be created based on things that you like.

So then I like a Facebook post of the Future of Privacy Forum, or of NOYB, or of a restaurant that I like, and those will show up in my timeline as well. That is something that can be done, I think, on the basis of consent, because you choose to like those pages. But then the further personalization, hey, if you like this, then you must also like that, there the court has been quite clear in my view: that is not objectively necessary for the product to function.

So that cannot be done just on the basis of those terms and conditions. It could be something you could consent to if you know everything, but then it needs to be granular. But then the only paid alternative that I can imagine is that they would continue doing all that they do, and that you basically pay for the basic version, so that you pay for the version that only has your friends and family and the things that you like.

Which would still require a legal basis, which then would either be consent or a contract, and then you come back to the objectively necessary. So I'm not sure whether the court actually thought their own logic through in saying that a paid alternative could be used, because there is no Article 6(1)(g) saying "a paid alternative" in the GDPR.

You still need one of those six legal bases, and bar consent, all of them require some form of necessity. And that necessity cannot be created by having a payment.

[00:23:28] Romain Robert: Yeah, I agree. And interestingly enough, because we had the discussion with the DPC, you know, with the case that we filed five years ago, which led to the EDPB decision of January of this year, it was about the same thing, right? About: can you use contract as a legal basis, and for what?

And that's, I think, an interesting question that Gabriela raised, because what are we talking about here? It's not quite clear, even in the EDPB decision. Just to give you an example: we filed a complaint against personalized ads five years ago, and we got the decision of the EDPB on OBA, online behavioral advertising.

Here, the court switches, I'm sorry, I don't know if you have the same impression, but the court is changing from personalized content to personalized ads and this kind of processing, and I'm like: yeah, but which processing are we talking about? Now, do I have to pay for not having personalized ads at all, or no personalized content, or no use of my data by Facebook?

Sometimes I'm a little bit confused by the decision, but maybe I'm too tired, I don't know, but I don't know what you think about it. It's not really clear, in my opinion, because there is such a huge range of processing operations, especially for Facebook, that can potentially be done, right? And if I were Facebook, I would just be lost as well, because I'm not sure I understand

where the line is that the court is trying to draw, you know what I mean? And I know that I may seem more in favor of Facebook for once, which is probably not the case. But I understand that we wanted more legal certainty and clarity, and I'm not sure we get that reading the decision. But maybe it's because we don't really know all the facts behind the decision.

So maybe we don't really understand what the specific processing operations behind the decision were. I'm not sure that I really understand where the cursor is, you know, where the line is that I'm not supposed to cross as a social medium. Am I supposed to not do personalized content at all, or can I do it, but not using WhatsApp data? And you know all the examples, and I'm sure you've been through them: if you add someone as a friend on WhatsApp, you may get this friend request or friend suggestion on Facebook. This kind of processing operation apparently would not be legal according to the court.

[00:25:46] Paul Breitbarth: Yeah.

[00:25:47] Romain Robert: Using a WhatsApp friend's metadata to deliver advertising on Facebook might not be possible.

But when you read the decision, is it still possible for Facebook to provide personalized content? Meaning: I like this city, I like Vienna. Am I gonna see content related to Vienna, or is that also prohibited? I'm not sure that I'm not lost in the range of processing operations that are allowed and not allowed.

I dunno what you think about it.

[00:26:18] Paul Breitbarth: So yeah, Romain, I think you make a fair point there. I think the court is very clear on one point: you cannot just take data from every single source, put it together, start analyzing it, and do whatever you damn please with it. There they've been very clear. You cannot just take data from WhatsApp and Instagram and use that for Facebook, and certainly not use the advertising data from all across the rest of the internet for improvement of your timeline or whatever it is that people want to do on Facebook.

I think there the court has been crystal clear, but about the rest of the questions that you raise, certainly a lot less so.

[00:26:57] Romain Robert: Okay, I'm happy to hear it, because I thought that maybe I was stupid or too tired, I don't know. Or maybe, Gabriela, is it clearer for you? I'm still struggling, maybe you are too, I don't know. I'm still struggling to understand where the line is, and how you may filter all these processing operations.

It's still not clear to me. And it's also, as you say, interesting to see that the court went through all the legal bases, whereas apparently the original case was not about all the legal bases. And maybe we can talk about it later, but as you know, and I think it's another obsession of mine: you cannot throw all legal bases at a regulator, at a DPA.

You have to choose one and stick to one. And it seems that here the court, by going from one legal basis to another, implicitly accepts that, you know? You know what I mean, Gabriela?

[00:27:43] Gabriela Zanfir: Yeah, I know what you mean, but I'm doubting a bit here, because I'm not sure that the court says you can throw them all…

[00:27:53] Romain Robert: No, no, no. You can't, of course. But why, then? Why analyze all these legal bases? You know, what I wanted to say is that the EDPB in February did not do that, and I would've loved the EDPB to just, you know, go further and say: oh, and Facebook, don't even come back to me with legitimate interest, because it's not gonna work.

The EDPB didn't do that in February, and I'm happy that the court did it here, you know,

[00:28:18] Gabriela Zanfir: What.

[00:28:19] Romain Robert: but why? That's what I don't…

[00:28:20] Gabriela Zanfir: My sense is that, well, practically speaking, this may have been raised during the proceedings. You know, as lawyers usually do: I'm submitting that this is all based on consent; if you think this is not lawful, then I'm submitting this is based on legitimate interest.

So maybe it was raised during the proceedings, but it also might be that the court just wanted to address them all in one shot, so that, as you were saying, it wouldn't be called again to interpret each of them…

[00:29:01] K Royal: Line them up, knock 'em down, Gabriela.

[00:29:04] Gabriela Zanfir: Yes, in separate proceedings, or it might be a combination of the two. So it's very possible that they were raised during the proceedings in front of the court, and it then decided to actually address them all.

And I would say what's even more remarkable on the legitimate interest ground, which the court spends a lot of time analyzing: it actually goes interest by interest and makes findings, also with regard to product improvement, which I know is one of the very debated topics in compliance, you know: can you base product improvement on legitimate interest, and so forth. So that was also, I would say, interesting. But to go back to

[00:29:50] K Royal: One I liked.

[00:29:52] Gabriela Zanfir: Yeah. To go back to the lack of clarity within what we first said, that there's a bit of clarity from the court: I think you are both right that that line is quite clear on specific points, and the court I think is particularly clear with regards to what can be bundled within a contract for social media services

and what cannot. Or at least it's clear about the fact that not everything can be bundled under a contract. So then of course the issue is what exactly cannot be bundled, because it keeps switching between personalized content and online targeted advertising,

which I have to agree indeed leaves a bit more of a debate open. And I recall, let me quickly check in my document here, I recall also noting that the court refers to an equivalent alternative that must be provided to users who do not consent to specific personalized content,

[00:31:06] Paul Breitbarth: Mhm.

[00:31:07] Gabriela Zanfir: In paragraph 102.

And in that paragraph, as opposed to what the court says in 150,

in the last sentence, the court says that those services may, where appropriate, be provided to the user in the form of an equivalent alternative which does not involve such personalization, such that the latter is not objectively indispensable for a purpose that is integral to those services. And here the court refers strictly to personalized content. And to me, I made a connection here with some of the obligations in the DSA, right? Where we now have a law that actually requires, I would say, very large online platforms, but I don't know exactly if it's all online platforms or just very large online platforms, to offer

An alternative to personalized content like a ti, like a timeline sort of content. You know, how you have a wall on your social media service in chronological order rather than content pushed by an algorithm based on your interaction with the platform in the past. see if we read 102 and 150, like we have to ultimately perhaps get a bit more clarity.

But I'm not very convinced of that. And I notice the paradox of having started the podcast by saying, oh my gosh, we have so much clarity from the court, and now, you know, 15 minutes later: oh, okay, hmm, maybe we don't have that clarity in its entirety.

[00:32:48] Paul Breitbarth: But isn't that always the case?

[00:32:49] Romain Robert: Yes. Not always that clear, but I agree, Gabriela. I was just playing naive, I think, because of course, I think the result of all this assessment of the legal bases by the court is the result of maybe Meta throwing in all the legal bases before the hearing and everything. But that's a trick that you see even before a DPA, and I'm not sure I like it, because you know, the principle is that you have to advance a legal basis beforehand.

And waiting for a DPA to open an investigation against you and then throwing in a legal basis during the investigation is too late. As you know, you have to connect a legal basis before. I also want to link to the decisions of the DPC and the EDPB of January, when there was one thing at least

about which all DPAs, including the DPC, agreed: that the transparency obligation and the privacy policy of Meta were not clear at all. So no one understood what they were doing with the data. There was no link between the legal basis and the purpose, you know what I mean? It was really a mess. And I think it's also quite important to remind us of that in the context of this decision of the Court of Justice. Because again, I agree, and I don't criticize the Court of Justice, but I would have loved to see in the decision of the court something like: and by the way, Meta, I'm doing this exercise of going through all the legal bases, but you were supposed to come up with only one, or at least only one legal basis per purpose.

And I'm not even sure that the court was provided with sufficient information about what is at stake here. You know, we don't even know whether, in the file, Meta was really explaining to the court the entire range of processing activities that were at stake in this decision. It's maybe another hypothesis, you know what I mean?

It's maybe the explanation why the court went through all the legal bases: because it was impossible for the court to know what corresponds to what. What is the purpose? What is the processing operation? It's not really clear. And still today, even after the DPC decision, you can read the data policy and the privacy policy of Meta. Last time I checked, it hasn't been changed. So Meta is still not linking the purposes. It's still a mess for the regulator, but also for the court.

[00:35:03] Paul Breitbarth: Maybe because they first need to find out themselves what data they process.

[00:35:07] Romain Robert: Yeah.

But that's a little bit the point. Voilà, that's what I meant. So I agree with you, but I think it's a pity that you can't see a line like: okay guys, it's nice to throw all these legal bases in my face, but I would like to see at least some clarity myself.

So maybe, you know, the file was not clear for the court as well.

[00:35:23] Paul Breitbarth: It's okay.

[00:35:24] Romain Robert: You can't explain with more clarity than what was shared with the court.

[00:35:28] Paul Breitbarth: It's okay. For us Europeans, this is a landmark decision. But what do you make of it from a US perspective?

[00:35:37] K Royal: Okay. I have a lot

[00:35:40] Paul Breitbarth: Was it in your news?

[00:35:41] K Royal: It did not. So here's what stood out to me. One, we knew this was coming. Two, Facebook, because of its sheer size and reach and scope and popularity. So Facebook, I believe, is being used as an example, and I don't mean a scapegoat example, but it's a perfect example, because it's so big and it has so much reach and so many people use it and it's so public.

It's like Google and Facebook. You can't get away from them. But that's the problem as well, because of those things. So, to me, one part that stood out, Gabriela, was the exact paragraphs you read. It almost sounds like they're telling Facebook how to run their business, not just how to deal with data, or European data.

They're saying: oh no, you need to offer an equivalent alternative, you know, if people don't consent to these things. I know that's not what they did, but as I was sitting here going, you know, I'm gonna say this, but I know it's coming from my pure, rogue, rebellious American heart.

I know it is. I agree with the decision. Facebook has a lot to clean up.

[00:36:55] Romain Robert: We have to be cautious when you say "the Facebook case" now, because there are so many.

[00:37:09] K Royal: But you know, Facebook is gonna, well, they can't push back. What are they going to do? You made the comment earlier: maybe they don't know what they do with their data. What are they

[00:37:09] Paul Breitbarth: Yeah, and for now they have to wait, because this case goes back to the lower court in Germany that referred it to the Court of Justice, and they will have to make some final assessments. So we'll get that in the next couple of months.

[00:37:21] K Royal: Does it surprise Europeans to know that American companies don't know what they do with the data or why they have it?

[00:37:28] Romain Robert: I'm still surprised, honestly, personally. Yes. I'm surprised,

[00:37:31] Paul Breitbarth: No, I'm not surprised, because many European companies also do not know what they do with the data.

[00:37:37] Romain Robert: We've been through that. Yeah.

[00:37:39] Paul Breitbarth: They should learn by now that this is something they should start paying attention to.

[00:37:44] K Royal: It's serious.

[00:37:45] Paul Breitbarth: We've had data protection laws at this level since 1995, which is a few weeks ago.

[00:37:52] Romain Robert: I like the way you formulate it, like it's something they should start paying attention to, gently. Like, come on, it's a little bit more than time. They should start thinking about it, you know.

[00:38:03] Paul Breitbarth: Well, you know, I'm still friendly.

[00:38:05] K Royal: And let me say one thing that I picked up, and this came from IAPP just last year. I think it was one of the biggest conversations that I really took away, and I started implementing it in the privacy notices that I write: Europe wants to see not just a category of who we collect data on, a category of what data we collect, a category of why we use it, and where we share it.

They said we need to be able to tie why we collect it to which data subjects, to why we use it, you know, and to what data we collect. We don't do that. You're given a list of 10 types of data subjects, 20 million types of data, and three reasons for using it, and you don't know what goes with what. What the US wants is completely different.

[00:38:54] Paul Breitbarth: Earlier this week I reviewed a privacy notice for a company, and they said: oh yeah, we process personal data on the basis of six legal bases, and they just listed all six in the GDPR. Whereas they have no right to process personal data on the basis of a public interest in the first place. So they just did a copy-paste.

[00:39:11] Romain Robert: Yeah, I agree. 

[00:39:12] K Royal: And I will say that that's one big thing that I took away, and that's at the crux of this decision we're talking about: why do you collect that particular data on that particular data subject? US companies don't look at it that way. I'm sure there are companies all over the world that don't, but US companies don't.

And to be able to say: well, we're collecting this data on this type of person for this reason, and then to be able to make sure that that reason is legitimate under European law. There are a lot of reasons. But is that reason legitimate under European law? And if it's not legitimate under European law, do you have an equivalent alternative to what you offer?

And here in the US, they don't want privacy notices put that way. Actually, California says: you need to tell us these things, and they list a formula of things you need to say. And it has no relation whatsoever to how Europe would like to see it. Logic. So it's almost as if the US is form and Europe is function.

[00:40:12] Romain Robert: It's interesting, but I agree. I think it's quite important to remind that going to a DPA or to a court and changing the legal basis, because one is not working, and shifting to another one, is like changing your alibi in front of a court. You know, you can't. You have to pick and choose first.

"Okay, this one is not working, let's try this one." Okay guys, but it's too late.

[00:40:33] K Royal: Can we possibly spin it to meet that one?

[00:40:36] Romain Robert: No, it's really like, yeah, it's changing your alibi. And it's not even like a pro forma, it's really not a formalistic exercise. You have to know which legal basis you rely on so that the data subject can know which rights he has.

[00:40:50] K Royal: The legal basis should come first. 

[00:40:51] Romain Robert: Yeah, because if it's consent, you don't have the same rights as under legitimate interest, and not the same rights as under contract, because consent you can withdraw, and legitimate interest you can object to. So it's quite important, it's fundamental for the data subject and even for the DPA. If you just say, as you said Paul, that we rely on all possible legal bases for all possible data, for all possible purposes, it's just a nightmare for everyone, for the data subject and for the DPA itself.

So it's not flying. It's not a healthy mapping of the data, it's not good transparency, I think. And it's,

[00:41:24] K Royal: I

[00:41:25] Romain Robert: it's driving me insane not to see it. Yeah, it's not really healthy.

[00:41:28] Paul Breitbarth: Gabriela, any sensible words from you here?

[00:41:30] Gabriela Zanfir: Yes, actually I have two points to make, which sprang to mind as K and Romain were making their latest contributions. The first one is that the court, I think, is very, very aware that it is addressing a business model and not necessarily the conduct of one company, because the court actually says "business model" like two times in the judgment.

And I think this might also be part of the explanation of why they went to such lengths to actually make findings about all of the lawful grounds and expand on legitimate interest, product improvement, security of services, and so forth. And I think the court wanted to sort of send a clear signal that, look, this is about the social media, targeted advertising, online type of interactions business model. And the second point is on the lawful grounds.

One thing that I noted while reading the judgment was that the court was quite specific about the lawful grounds: besides consent, and here even consent is not validly given, everything else only works on a necessity basis. So it groups the other five available lawful grounds in Article 6 under the heading of necessity.

Which also made me think of our good old days in the Article 29 Working Party, when, you know, there was a lot of time spent discussing necessity and proportionality, but not in a law enforcement type of context, rather in our sort of day-to-day things that we had to assess. And there's even "The Necessity Blues" that

[00:43:36] Paul Breitbarth: David Kauke

[00:43:37] Gabriela Zanfir: David Taki wrote. So I really appreciated how the court put a spotlight on the idea of necessity, and on how the data needs to be really, objectively necessary for those specific lawful grounds to actually work and be valid.

[00:43:57] Paul Breitbarth: And I like that they also linked that very clearly, and that is a concept that Americans do understand, to the reasonable expectation of privacy. Because especially when it comes to legitimate interest, the court is very clear, for example in paragraph 112, that data subjects do not reasonably expect such processing when it comes to online advertising, for example. They do not expect it.

And thus it means that you cannot just do it. 

[00:44:24] Romain Robert: For sure. But maybe one thing that we can also mention is that the court already preempts the future obligations of the DMA. And I think it's also interesting, because we had Article 5 of the DMA saying that you cannot, you know, cross-use cross-platform data for personalized ads, and you know exactly what Facebook was doing.

But there we had a specific text negotiated for years before the Parliament to prohibit that. And guess what? We didn't need it. Apparently the GDPR was already prohibiting that. And it's crazy, because the Commission was waiting to enforce it, and we were like: we don't need you, Commission, thank you. This is already prohibited, and the court has confirmed it.

It's a kind of funny thing to see.

[00:45:03] Gabriela Zanfir: Let me just end on this note. When a couple of my colleagues here in the US first saw the decision and started to read through it, the first question I received from one of them was: but wait, isn't this like the DMA and the DSA? Then why... what's going on? So I was saying: yes, exactly.

Thank you. This is something that the data protection community has been trying to signal for a long while: that, you know, if we had had the interpretation and application of the GDPR as it was meant to be, perhaps we wouldn't need all of these extra double obligations, complications and overlapping systems that we will need to make sense of, you know, because they will all need to work together.

[00:45:55] Paul Breitbarth: Well, on that note, I will wrap up this episode of Serious Privacy. Thank you all for listening, and thank you especially Gabriela and Romain for joining us today and sharing your views. As said, the debate will continue, initially on LinkedIn, so if you have any follow-ups to this conversation, please do share them on the Serious Privacy page on LinkedIn, or join us on any of the other social media.

You will find K always as @heartofprivacy, myself as @EuroPaulB. Until next time, goodbye.

[00:46:26] K Royal: Bye,

[00:46:26] Romain Robert: Goodbye.

[00:46:27] Gabriela Zanfir: Bye all.