Making Data Better

EP11: The Internet Bends Toward Privacy: Michelle Finneran Dennedy, PrivacyCode

Lockstep Consulting Pty Ltd Season 1 Episode 11

Privacy has been a major casualty of the Internet. Twenty-five years ago, Silicon Valley leading light Scott McNealy, Sun's CEO, said we "have zero privacy. Get over it." And that has been the truth ever since, and the basis of profits for AdTech behemoths Google, Meta, and more.

The good news is that, in the quarter century since advertising became the Internet’s business model, few have gotten over it, including powerful national and regional regulators.

In this wide-ranging discussion with Steve and George, Michelle Finneran Dennedy, CEO of PrivacyCode, Inc. and Partner at strategy consulting firm Privatus.Online, speaks to this shift toward restoring online privacy and what her company is doing to streamline implementation of privacy-enhancing practices.

In this Making Data Better episode, Michelle addresses:

  • The immorality of Data Subject Access Requests
  • Ethics vs. zero trust
  • The impossibility of privacy regulation compliance
  • AdTech’s shifting model
  • The liability tornado about to strike data hoarding enterprises

This is an important and exciting conversation. Take a listen. 

Speaker 1:

Welcome to Making Data Better, a podcast about data quality and the impact it has on how we protect, manage and use the digital data critical to our lives. I'm George Peabody, partner at Lockstep Consulting, and thanks for joining us. With me is Lockstep founder, Steve Wilson. Hi, Steve. G'day, George, good to be with you. Glad you're here. So, Steve, let's get into it.

Speaker 1:

Today we're addressing what I consider to be the biggest negative effect of the internet. My list has more than one entry on it, but my biggest one is the loss of privacy. We've got big tech companies who have every incentive to know everything about me, because their business model is based entirely on advertising to me, totally targeted advertising. Along with that, we've got an entire ecosystem of data brokers who aggregate and resell every bit of my digital dust that they can hoover up. And then there's this enterprise belief that the best thing you can do is store every bit of data you've ever encountered, to help with your marketing and maybe risk management. And then, of course, there are the analytics you're going to apply to all of that someday to make the data useful to you. So we've got this pervasive habit and business model that is all about learning everything you can about people, without regard for the immediate need for the data and often without any concrete reason. In other words, as studies have shown, even with all that data, bad decisions and injustices are made.

Speaker 1:

Okay, Steve, I give up. I realize I'm complaining about the state of the world here. You've worked in digital privacy for years. Where do we start?

Speaker 2:

Well, that's interesting. I got into privacy almost by accident from many years working in identity. I often put that in air quotes. Security and privacy are different. Cyber security starts, I think, with important data and thinking about what makes data important to you, and then you have controls to protect the value of data. There's a well-known set of dimensions that you look at data through from the security perspective.

Speaker 2:

I think that privacy is sort of the opposite of that. I think that privacy is about restraint. I think that privacy is about what you don't do with data rather than what you do do, and therefore the technology is not as important as policy and governance, and I put policy in quotes as well. It's not just those big, thick 20-page policies, but policy like: what is your organization's culture? What do you want to do with data? How is it important to you? That old promise by businesses that we respect your privacy: if they really mean that, here's the paradox of privacy. They need to choose not to know things about you. They need to say, we're going to mind our own business, we don't want to know everything about you. That's respecting privacy. So that's how I come at this.

Speaker 1:

Wow. So, yeah, I kind of went in the other direction, didn't I? I was all about collecting every bit of data and whining about that. You're pointing out that it's about minimizing the data that's being captured in the first place. Well, to help us further unravel this privacy knot, it's our great pleasure to welcome Michelle Finneran Dennedy, who's the CEO of PrivacyCode, Inc. and partner at strategy consulting firm Privatus.Online. Did I get that right, Michelle? Privatus? Privatus. All right, I was going to ask. I was thinking earlier: is this a world where we use the word privacy or privacy?

Speaker 3:

This is going even back further. So there is publicus, which is your public persona or your cultural, ethical rules that you follow, and then there is privatus. So privatus is your private self or your individualized self.

Speaker 2:

Gorgeous.

Speaker 1:

Well, you were an attorney before becoming a CEO and a consultant. I want to ask, to begin with, what brought you to privacy? How did you get into this domain?

Speaker 3:

Yeah, I think, like so many people, Steve included, we kind of backed into it. So I started out life as an intellectual property lawyer. I was a patent litigator and then entered the world of high tech, which is interesting to me. I was doing a lot of medical devices and pharmaceutical work in patents and then ended up in high tech, and the internet was dawning and you could buy and trade and do business online, and so somehow, magically, that absolved you of doing what you needed to do. And as a patent litigator coming into this environment, and a businesswoman looking over the Sun Microsystems portfolio, I said, this is really...

Speaker 3:

The paradox for me was that the value is in sharing information. The value was, we had people like Whit Diffie who was talking about encryption, we had early versions of what we now call Kubernetes, and we were dealing with what we called the grid or the utility back then, which we all now call the cloud: distributed secure compute. And so when I looked at the portfolio and the business needs of this company, as well as the 165 countries in which we were doing business and the various cultures that those imply, everything rang a bell in my head saying, if you want to do business well, you better get privacy right, because you have customers and you have employees, and these are pretty critical characters in your business drama. And if you're a government entity, you have citizens and voters, and those are pretty important constituents for you. So wherever you find a person, you find a story that needs to be told, a connection that needs to be made, and so you have privacy and identity.

Speaker 2:

Love that: a story to be told. The theme of our podcast is, what is the story behind the data?

Speaker 3:

And the story, you know, it's amazing. I mean, data is quite benign until humans get their eyeballs on it, and then it can be very good and it can be very evil, and so I think that's the thing to remember: this is an ingredient. So data really telling the story, I think, is the right thing. And, exactly as you said, George, the old-fashioned way, for the old people who don't like making money, is they stockpile as much data as they can get their hands on, they scrape the internet and they steal all of our data and they put it into generative AI models.

Speaker 3:

That seems really attractive in this day and time, doesn't it? It seems like you'd give somebody a trillion dollars to do more of that. The reality is it's already setting a time bomb off. You can dig in a pile of pony dung as much as you want; you ain't going to find a pony in there. But to find something that's fresh, something that's new, something that's targeted, something that's individualized, that has value, that's current. So the old-fashioned way of "just because I can, I do, because mainframes are cool", that's changing, and it's changing very, very rapidly. The new value model is coming into focus.

Speaker 2:

Let me react to that really quickly. And we didn't prep for this, so I apologize for this story out of the blue. But you'll remember, 15 years ago, the Google Street View catastrophe, where the Street View cameras had a good job to do. They were photographing the streets of the world. But some bright engineer realized that the equipment on the back of the Google car had a Wi-Fi transponder and could also pick up Wi-Fi transmissions that were, quote, in the public domain.

Speaker 2:

And this chap innovated by collecting that data and just started mining it to see what was there. It was a process of discovery. It was a process of literally data mining. You know, there's a vein of data, let's mine it, let's see what's there. That was the Google culture. They reward innovation. They make a stupendous amount of money out of using data in innovative, surprising ways.

Speaker 2:

And when they all got sprung and they said, you were collecting that data without intent, you don't need it, you didn't tell people you were doing it, it doesn't matter that it was in the public Wi-Fi mesh, it was personal data, you had no right to just grab it, they were shocked. And they sort of gave the guy an attitude transplant and said, we won't do that again. But I thought it spoke to the culture of an innovative company, a purely digital company that feels as though it's got the rights of frontierspeople. You know, it's like the old oil rush: we know this valuable treasure is under the ground, or under the digital ground, so let's go and find out what's there and dig it up and use it. And I thought, that is anathema to privacy. And yet it's so counterintuitive that you should not do that.

Speaker 3:

Yeah, I think that's right.

Speaker 3:

I think that's where it's getting really interesting, because we now have more subtle controls than we had back in those kind of early oil days of get whatever data you can, because we didn't have fat pipes to every home, we didn't have the stability in the Wi-Fi connections that we have today, and we certainly didn't have the fidelity of the identity tools to layer whose persona is whose, where. And so there was sort of this feeling of, just because I can, I will. And the reality is, I think the more innovative view, ironically, is quite old-fashioned. That's a weird sentence to say, but innovation always starts, for me, with ethics.

Speaker 3:

What do people want? The things that people become really, really addicted to and they really, really like are things that bring out our human passions: hunger, thirst, lust, greed. We love being on social media as much as we hate being on social media. Why? Because we crave connection with people. It's a very basic human thing. So, really looking at: what do we really want, what are we serving, how much does it cost, what do people want to do with that information, and what if that information is false? Starting to have these questions again, because we have the fidelity today that we did not have 20 years ago, I think, opens up a whole new can of really valuable innovation. I've stunned George into silence.

Speaker 1:

Yeah, you have. I'm trying to take what you said and trying to search for the evidence that there are tech companies who actually get what you're saying and agree with it and are casting themselves as stewards. I'm still looking at the paradigm, the business paradigm of the net, which is basically advertising.

Speaker 3:

Still ad tech? Yeah, yeah, and I think that's right. I think what exists today, you still see remnants of that old-fashioned sort of ad tech world, right? It's like... remnants.

Speaker 1:

That's Google's. That's virtually 90% of Google's revenues, if not more.

Speaker 3:

Yeah, but it's changing. I mean, look at just... the Europeans are saying, oh, we don't like your Google cookies and things, and so that changed radically and overnight. And how much money are they spending to try to shift their models? How much money are they now investing in trying to find various ways of obfuscation and various privacy enhancing technologies, to either do anonymization or figure out how to do radical personalization? I think even those big, big companies are investing in what's next.

Speaker 1:

You put your finger on what I suspect is the major mover of the market these days, which is regulation. Is that right?

Speaker 3:

It always has been. So this nonsense, this nonsense that, oh, regulation is going to stop innovation: how was Silicon Valley formed? Silicon Valley was formed when we broke up the telephone monopolies and there was all this room for building. There was all this sort of regulation; after deregulation, they re-regulate other things, and people find innovative models to go around things, sometimes to find a loophole, but also sometimes to truly innovate. Regulation hasn't stopped the pace of innovation in human history yet. So, yes, the lobbyists will continue to lobby, and they're adorable and we love them. However, the facts are the facts. The cash is the cash. I don't see a slowdown, even though we have actually known rules. I would prefer to have a federal law in the United States rather than state-by-state patchworks. I think it protects more people. I think it gives us a way to show that we're driving on the right side of the digital highway, and we can signal to other drivers that we're on the move. This patchwork of little power fiefdoms, I think, is destructive.

Speaker 1:

Absolutely. Your point makes me think how glad I am, despite its occasional misses, that the FAA exists and does such a good job of making air travel as safe as it is. It's the safest way to move a person in the world.

Speaker 3:

Yeah, well, you put your finger on it. I've long said it. This was one of my first conversations with Steve. They had these great drawing people at this conference we were at, and we were talking about what we did when piracy was the biggest threat to commerce, when we were trading nutmeg and tea. Well, we had admiralty law, and so when a ship was in port, it was one thing; when a ship was in the shipping lanes, it was another thing. We followed each other, we had rules, we understood what it meant to help a distressed fellow sailor, and a lot of this was to prevent pirates, to preserve goods and services, to cling together with the technology that we had to do commerce and literally find our way in an emerging planet, regardless of which country of origin that vessel emanated from and where it was heading. I think that's where we really are. Despite a world that's starting to become more and more nationalized again, for this type of commerce, for privacy to thrive, for identity to thrive, we need to think about standards more than ever before.

Speaker 2:

I love that. I love talking to a real lawyer. There are so many of what we call in Australia bush lawyers, and your background, Michelle, I think is deep. It seems to me, as a non-lawyer, that we're probably on the cusp of new jurisprudence, just as shipping created that, and just as the oil rush in the 1850s and 1870s created new property law and regulated a whole lot of intangibles.

Speaker 2:

I talk to technologists and they say, you can't regulate data, it's ones and zeros, it's intangible. Well, you know what we do in a civilized world? We regulate telecommunication bandwidth. That's not even a thing, it's not even physics. Bandwidth is so intangible, and we regulate the hell out of that. So I think that we're on a cusp. I see data protection, which is sort of synonymous with privacy; GDPR stands for data protection. I see a bigger form of data protection where we look at what makes data valuable from every dimension, the human dimension, the social dimension, the business dimension, titrate out the properties of data that make it valuable, and attend to maximizing that value. And knowing that it's a multi-discipline sort of thing, I'm so optimistic that we're actually going to see some good things in the next 10 or 20 years.

Speaker 3:

I think we will too, and some of it, again, goes back to those human needs. Greed is a big motivator, and we saw that when Apple decided to flex its muscles on what it's calling privacy. You saw how it was able to basically punch Facebook in the nose hard enough that they did this weird head fake with Web3. Like, oh, we're doing Linden Lab again. I was like, you guys are so cute. That clearly was a head fake in my mind. It was right after the market fell over on its side; suddenly we're doing this weird thing that didn't work, doesn't have a business case. That's cute. Why? Because suddenly we've got this immense power with this, as Steve says, intangible right. And if you go back and you read it, it's actually a beautiful piece of prose, the Warren and Brandeis piece from 1890 (I always get that year wrong) in the Harvard Law Review, called The Right to Privacy. What was it about? It was actually about the dawn of new technologies. It was about Kodak-ing, or portable photography.

Speaker 3:

Although the portability was a giant heavy box. But you might see a woman's ankle. Oh my, think about it. It's like morality and ethics, and what they discuss in that article is really the evolution of, you know, when Fred got out of the cave and hit Ted over the head with a hammer, they understood that was something they should have rules about. And then, when you just threatened him with the club, it was a very long time before that was actually recognized as a criminal and a civil harm. The fear of that bat hitting you was an ephemeral right that we agreed on as a society, our ethics. Before the law happens, we have ethics. We decided for that to happen. So privacy, and this is why we fight so ferociously, I think, against this nonsense and nihilism of "privacy's dead because we have toys". No, privacy's alive because we have people, and people who have dignity and stories, and we have ethics. Cultures decide how they want to interact as groups with one another, so as long as we have people and culture, we will have privacy.

Speaker 1:

Let's take it right down to you. What's your work? Tell us about what you're CEO of, what the company is doing, because I'm hearing this and I go, this is hard.

Speaker 3:

Oh no, this is fun. Yes, it's hard because it's valuable, so you should pay us lots of money, but it's fun. So, basically, two lines of business. PrivacyCode is an AI-driven platform. I would have said NLP-driven a year and a half ago, but now we'll say AI. We use natural language processing, and we actually take written documents that are usually written in legalese, whether that's your RFPs or your notices or your policies or your privacy impact assessments, and we'll scan them in and read them against a requirements library that has three subject matters: responsible AI or new technology (it will be the same set of criteria for quantum); privacy, of course, or data protection; and data quality and data governance. So three different types of libraries, and they're captured in two types of tasks: processes and technologies. So if you make a promise and you say, we promise to respect your privacy, we list our vendors, well, you're going to find a whole category of requirements from places like NIST and GDPR and the Australian laws that say what to do with vendors. And so we've already broken them down, task by task, what could be a policy, what could be a technology control, and then we put them into projects.
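
To make the requirements-library idea concrete, here is a minimal, hypothetical sketch of matching a written promise against a small library and breaking it into tasks. The library entries, keywords, and citations are illustrative assumptions for this episode page, not PrivacyCode's actual schema or matching logic (which, per the conversation, uses NLP rather than keyword lookup).

```python
# Hypothetical sketch: scanning policy text against a small requirements library.
# Library names, keywords, and task wording are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Requirement:
    library: str       # "responsible_ai" | "privacy" | "data_governance"
    source: str        # the standard or law the task is cited against
    task_type: str     # "process" or "technology"
    keywords: tuple
    task: str

LIBRARY = [
    Requirement("privacy", "GDPR Art. 28", "process",
                ("vendor", "processor", "third party"),
                "Maintain a vendor register and data processing agreements."),
    Requirement("privacy", "GDPR Art. 7", "technology",
                ("consent", "opt in", "opt out"),
                "Implement and log an informed-consent mechanism."),
    Requirement("data_governance", "NIST Privacy Framework", "process",
                ("retention", "delete", "store"),
                "Define retention periods and schedule deletion."),
]

def derive_tasks(policy_text: str) -> list[Requirement]:
    """Return the requirements whose keywords appear in the policy text."""
    text = policy_text.lower()
    return [r for r in LIBRARY if any(k in text for k in r.keywords)]

if __name__ == "__main__":
    promise = "We respect your privacy, list our vendors, and ask for consent."
    for req in derive_tasks(promise):
        print(f"[{req.task_type}] {req.task}  (cited: {req.source})")
```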

Speaker 3:

So that's one line of business: helping to do that translation between and amongst technology and word people, so that you know what you need to get done and you can get your work done. And really, no one can be expected to be as geeky as Steve and I are. If you're a UX person, I'm going to tell you exactly what I need in your UX so that we can help our customers engage. Whether we're going to call it opting in or opting out, I don't care; it's going to be some sort of informed consent mechanism, and I'm going to tell you what to present. Now, you are not going to be the right person, probably, to store the proof of that consent. That may be someone who's doing my logging. That may be someone who's doing my security criteria or my consent management.

Speaker 3:

That person is going to get a task that says, every time someone pushes this beautiful button designed by my gorgeous UX, the result is going to be this kind of time and date stamp, et cetera. So how do we do these times and dates? So that experts are doing expert things, and then the person who's doing the data governance on top looks across the environment and says, how are we doing? How are we doing on, you know, when we do things that are truly informed consent, we're actually hitting 19, 20, maybe even 60 different countries' requirements at once. That's wonderful. Look at that. You can actually celebrate that stuff, and you can actually start to assign weights and values.
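
As an illustration of the consent-proof task described above, here is a minimal sketch of a time-and-date-stamped consent event written to an append-only log. The field names, log format, and file name are assumptions for illustration, not a real product API.

```python
# Hypothetical sketch: log a consent event every time the consent button is pushed.
import json
import uuid
from datetime import datetime, timezone

def record_consent(user_id: str, purpose: str, granted: bool, ui_version: str) -> dict:
    """Build a time-and-date-stamped consent event and append it to an audit log."""
    event = {
        "event_id": str(uuid.uuid4()),
        "user_id": user_id,
        "purpose": purpose,        # what the user was actually shown
        "granted": granted,        # opted in or opted out
        "ui_version": ui_version,  # which consent screen was presented
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    with open("consent_log.jsonl", "a", encoding="utf-8") as log:
        log.write(json.dumps(event) + "\n")
    return event

if __name__ == "__main__":
    record_consent("user-123", "marketing emails", granted=True, ui_version="2.1")
```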

Speaker 3:

So, pravadas, what do we do? We do strategic consulting. So we look at your business problem and we think about how do you connect that to your data issues? How do you connect that to dollars? How do you connect that to actually getting this implemented so that your culture is sustaining what you say that you're doing? It's so much more than writing the right policy. That's like writing the right pitch deck to get cash. There's no magic pitch deck. It's a combination of a lot of different things.

Speaker 1:

Do you create a dashboard of progress around each one of those three areas?

Speaker 3:

It's a progress board.

Speaker 3:

So it looks like a task board. It's tied in with our wonderful Atlassian, our favorite Australian... well, Canva is getting to be a pretty big deal too, but one of our favorite Australian phenoms. So it's already tied into Jira. If your technical teams are already using Jira, like the rest of the world, all they do is receive a ticket in their normal platform, and everyone else can go onto the Code and look at the Kanban or the project progression, and you can discuss right there, and everything is tracked, so that you can show your auditors at the end what I call a P-BOM, or privacy bill of materials: here's what it looks like from end to end. And if they need to look at it further, everything's tied back to the standards too, so everything's organized in the same kind of chapter and verse, and there are cited paragraphs, if you want to geek out on it, of, here's where it is in GDPR, here's where it is in the Secure Controls Framework, et cetera.

Speaker 3:

And then, if you really have to figure out, like, does George still work here? He was in charge of the UX last year. You can click on that and say, oh yeah, he assigned this to Steve, and Steve has now taken over. So you can audit your stack, you can mature your stack, you can just know where you're at with data rather than just words. We're actually assigning deeds, matching them to standards, and now we're speaking a common language, so that we're not just sort of slipping down this well of, oh my gosh, it's overwhelming, privacy is too hard for me.
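
To picture what a "privacy bill of materials" entry might carry, here is a small, hypothetical sketch in which each task records its standards citations and its assignment history, so a handover like the George-to-Steve example above stays auditable. The structure and the particular citations are illustrative assumptions, not the actual PrivacyCode or Jira schema.

```python
# Hypothetical sketch of a P-BOM entry: task, status, cited standards, owner history.
from dataclasses import dataclass, field

@dataclass
class PBOMEntry:
    task: str
    status: str                  # e.g. "in progress", "done"
    citations: list[str]         # standards this task maps back to (illustrative)
    owners: list[str] = field(default_factory=list)  # assignment history

    def reassign(self, new_owner: str) -> None:
        """Record a handover so the audit trail shows who owns the task now."""
        self.owners.append(new_owner)

pbom = [
    PBOMEntry("Informed-consent UX", "done", ["GDPR Art. 7"], ["George"]),
    PBOMEntry("Consent event logging", "in progress", ["GDPR Art. 7(1)"], ["George"]),
]

pbom[0].reassign("Steve")  # George moved on; Steve has taken over the UX task
for entry in pbom:
    print(f"{entry.task}: {entry.status}, owners={entry.owners}, cites={entry.citations}")
```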

Speaker 2:

And you're solving that wicked problem of privacy policy being a dual-use artifact. Yep. But it's dominated by legalese and it's become a defensive mechanism, or it's even become a bit of trickery, where businesses have got a covert mission and they get their lawyers to write something that supports the covert mission without saying what it is. It occurs to me that it's dual use: a privacy policy, if it's really a statement of what the company is going to do with data. Why do I want your data? What am I going to do with it? What do I owe you?

Speaker 2:

I like to see that as an embryonic document that lives with the project. You start by articulating what, how, when, why and where, and then it turns into guidance for designers, and it also turns into meeting your regulatory needs. Now, I don't underestimate the complexity of writing a legally compliant privacy document, and nobody should. You should get legal advice, as Michelle would do. But your process using NLP winds up being able to extract the dimensions of that important document that matter to different stakeholders, and I think that's very important.

Speaker 3:

Years ago, I did a graphic novel privacy policy when I was at McAfee, Intel. We were doing a B2C product; we were doing a lot of family protection, a lot of mobile phone stuff. And security is hard and it has its own special language, and everyone has their standards, and, you know, we're using DES, this and that. We do that to sound smart to ourselves, and that's adorable.

Speaker 3:

But when we're trying to communicate and be fiduciaries, it's a lot better when you can talk to them, and nothing translates faster than pictures. So we did these pictures of these nerdy little ninjas, and they weren't stylized; they had one ab, like the nice dome dad bod, and they demonstrated, you know, this is what a cookie is. We had a thing that said what cookies are, and, this is what we do when we say we're keeping some of your data for research, this is what this means. And so we explained in pictures as much as in words, and it went over really well, and I was really proud of that work. It's not easy to do, because you have to be very thoughtful that you're not over-promising or under-explaining things, and we did append the actual words to the policy.

Speaker 3:

But what I learned from that exercise is, if you're going to demonstrate visually what you're doing, you have to know what it is that you're doing. And I think that's the trick of this: once you have invested and you've done a really thorough job of understanding what your data assets are, A, you deserve to brag about them, because they are a commercial differentiator, and B, you should make it easy, because you should be proud of that. And you should not expect... no company is compliant. So let's just start right there, right now, forevermore: there is no such place as compliance-ville. That's like thin, young, perfect, brilliant, smart, rich. She's lying, whoever that is. She's lying to you. So compliance is something that you're going to try to go for in this complex area, but some of these things actually conflict with each other. So understanding someone's point of view on what they think readiness or compliance is, it's really important, so that you can interact with them with the right due care.

Speaker 2:

I love how you don't sugarcoat privacy, Michelle. There's a couple of things that happen. We've got the orthodox privacy-by-design manifesto that talks about a positive-sum game, as if there's just one dimension and everything's going to be good. And it's not. I mean, privacy involves tensions, and I think the act of restraint, the act of promising, I'm not going to know things about you when I do business with you, that comes with a cost and there's a tension there, and I think that that's good.

Speaker 2:

The other thing, of course, is that there's privacy perfectionism, because we think that there's compliance. Great, we give it to the lawyers, and once they issue some sort of decree that you are compliant, you think, job done, privacy, tick. But it's not ever perfect. The funny thing is that we know that security is not perfect. We know that it would come with infinite cost to have perfect security, so you don't even think about it anymore, and I wish that we thought about privacy similarly. And I don't mean look for compromises by any means, but I do mean set an expectation that privacy is a factor that can be optimized but never made perfect.

Speaker 3:

Yeah, I mean, I think that's going to be really important now that we're going to be sharing more and more training sets. So when I go to the supermarket in person, people can see me. I'm not invisible. So the fact that I'm there and the fact that I've gone down this aisle or that aisle, if you want to call that compromise, that's fine. If you want to just call it being interoperable with society, you can call it that too. But it's a different situation than if, say, I had a doctor's appointment and they made house calls. I don't know if anywhere in the world they still do that, but imagine if they did, and imagine the intimacy of that interaction, very different than the supermarket example. So being able to communicate, am I a supermarket or am I a psychiatrist, do my data sets reflect that, is going to be really critical, because the next frontier, this generation of generative AI, is not going away, nor should it.

Speaker 3:

I think there are leaps and bounds of things that we'll be able to do, and we'll get through some grunt work that we didn't want to do in the first place, but it's going to be more critical than ever that we focus on quality, and we're going to have to be critical about the kinds of answers we're getting back, because it's hard enough now.

Speaker 3:

If you have anything, or if you're a parent (and God help you), you Google some malady, you're either totally fine or you have cancer. Those are the two answers that are always true. It's like sex, drugs, rock and roll: you're always dying on Google. And yet, will it be true? But if you doubt everything, then you actually won't get the benefits of some of this stuff. So if I'm curating that grocery data well, it may be that for a community you can really have the kind of levels of freshness and wholesomeness, and maybe even healthcare woven in, that are very helpful and delightful things, but you'll also be able to understand when things are starting to go wrong. Data has a shelf life that's very short.

Speaker 2:

Mulling over, mulling over.

Speaker 3:

We're not going by your script at all, are we, George?

Speaker 1:

We're not, and it's perfect that we aren't. Sometimes people call me grumpy George, and about your comments on generative AI: I've spoken with AI professionals who are actually concerned with regulation and with being able to prove the results and point to why the AI made a particular decision, and most of them throw up their hands and say they can't do it. "Isn't that cool? That's the magic of it." Well, it's a black box, and we've trained this black box, I'm sorry, but as you sort of alluded to earlier, Michelle, we train these systems so much on the bullshit that's endemic on the net. I'm grumpy because I think human beings are often just lazy and don't have the ability to be critical about the results that come back. So it's really the responsibility of regulators and the producers and providers of data to do a really good job on this, because citizens, they don't have the tools or the time.

Speaker 3:

Yep, I think that's true, and I think critical skills are what we're going to have to build. Hopefully we can use some of this to do a better job educating our people as a whole. Definitely, here in the US we're having all sorts of issues with education and deciding what should be taught. I think we have an opportunity here. We have an opportunity with this brand new kind of whiz-bang thing to help communication by writing better, and we have beautiful examples: Grammarly is a beautiful example of, you know, a writing-assist tool and an engine. But we also will have the opportunity to force citations and look for, you know, should this issue have more varied citations, or are you going to the same source for the same thing? So there is an opportunity here. I think it's a little bit Pollyanna-ish to assume it's going to be baked in. I think you should assume it's not baked in unless it's worth a lot of money, but I also think that it shows the value of the data sets themselves. So right now we're trained on, like, what is it?

Speaker 3:

Reddit and cat videos, that's most of what these are trained on. You need to have... you know, I think one of the biggest issues is bias. So there are a couple of issues with bias. A lot of this stuff is trained on very biased data, and even the people who have access to this podcast or anything online, that's not the world. Only about 60% of the human beings on the planet right now even have access to clean water and the internet. So we're not representing the globe, so the sample is biased. And the other thing is, the answer, in funny quotes, might be the right answer, but the right answer might be that you should pay women less than you pay men. So the right answer that exists in society today might suck. So it might be that even though you're getting a correct answer back, you might be getting a morally corrupt answer back. It's the past, with no imagination.

Speaker 2:

Child labor was a thing when the coal mines were small and you needed small workers to fit through. Exactly. And if it's just about money, and that's how our society rewards, just cash, then it is, you know, back to the beginning.

Speaker 3:

It's the big oil rush times and you can bonk anybody you want over the head. I think that's not a very sustainable society, and so that's where regulation and, more importantly in this era, because it goes so much faster, ethics really comes in. People think that they can't know ethics because it's sort of like an opinion, but it turns out this is a 3,000-year-old discipline of really interrogating what the basic structures of ethical interrogation are and what our point of view is. And then again, being transparent: am I looking at a collective ethic? The ancestral societies, the oldest continuous societies on the planet, go for collective ethics, whether we like it or not as individualistic Westerners; it's true.

Speaker 3:

Or are we looking at, you know, something like zero trust, where security depends on trusting no one? That's a very individual, single-person narrative. So if I'm building on a network like that, I need to understand it's going to go to the mean, and I'm probably not going to find my outliers. My Picassos don't live there, but the guys who can fold jeans at the mall probably do. So, depending on what you want the outcome of your tool to be, it's really important to understand what their ethics are, as well as what their regulatory kind of compliance posture is, if you will.

Speaker 1:

And Michelle, is your PrivacyCode platform expressing ethics through its regulatory understandings?

Speaker 3:

To an extent. So we have the model cards that Timnit Gebru and some others have come up with to do ethical innovation. If you're using that process, it basically will suggest to you that you should have a process that says how to do critical design, how to recognize bias, and what to do. And this is from our Privatus framework. I have to look at it because it's a long acronym, but it's called BEARS TEACH. So the acronym is: B is bias and fairness; E is ethical compliance and values alignment; A is accountability and responsibility; R is robustness and reliability; S is security and privacy (we did put them together for you, George); T is transparency and explainability; E is environmental impact and sustainability, which is a big deal, these are heavy models; A is autonomy and control; C, we cheated here, it's scalability, so we'll use the C, scalability and performance; and then, finally, H, human impact and safety. So BEARS TEACH is the acronym, and this is the ethical framework, as well as the functional framework, for our risk assessment profile.

Speaker 2:

Let's pick that up and pop it in our show notes too, okay, as we edit this production. That'd be great. That'd be helpful.

Speaker 3:

Yeah. So we do a scorecard and kind of a risk heat map of where you are on any of these things. Because you might have something so critical, because you've got cellular research and so you are using protein fold management, maybe you are using quantum compute: heavy, heavy load on the environmental factor, maybe high load on the societal impact factor. So this might be something where you're saying, okay, let's go for that, because the benefits are here and we know about the tradeoffs here and it fits this sort of model. Versus, if you're using quantum computing for advertising models to sell middle-aged people face cream... whatever.
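
For readers who want to see the shape of such a scorecard, here is a minimal sketch that scores a project on the ten BEARS TEACH dimensions listed above and maps each score onto a heat-map band. The 0-10 scale, thresholds, and example numbers are assumptions; only the dimension names come from the conversation.

```python
# Hypothetical sketch of a BEARS TEACH scorecard / heat map.
BEARS_TEACH = [
    "Bias and fairness",
    "Ethical compliance and values alignment",
    "Accountability and responsibility",
    "Robustness and reliability",
    "Security and privacy",
    "Transparency and explainability",
    "Environmental impact and sustainability",
    "Autonomy and control",
    "Scalability and performance",
    "Human impact and safety",
]

def heat(score: int) -> str:
    """Map a 0-10 risk score onto a simple heat-map band (thresholds assumed)."""
    return "high" if score >= 7 else "medium" if score >= 4 else "low"

# Example: quantum compute for protein-fold research scores heavy on
# environmental impact, as in the conversation above (numbers invented).
project = dict.fromkeys(BEARS_TEACH, 3)
project["Environmental impact and sustainability"] = 9
project["Human impact and safety"] = 6

for dimension, score in project.items():
    print(f"{dimension:45s} {score:2d}  ({heat(score)})")
```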

Speaker 2:

Yeah, I think some of this stuff speaks to the dynamism of privacy. Privacy is normally framed around, what do I know about you? Personal information, is it identifiable, what do I have? The fascinating thing to me about AI, and even good old big data, is that there's stuff that I might have and I don't know yet, because the algorithm hasn't run, or I don't have the algorithm yet that does the protein folding to predict the probability of diabetes. But the funny conversation that we need to have around this dashboard that you're describing, Michelle, is, give people some settings where they can enter into the bargain with the data holder. The data holder says, this is what I might know about you. Are you cool with that? This is what I might be able to figure out; I might be able to synthesize the risk of heart conditions. Do you want me to do that? And have a meaningful dialogue around what the outcome of some of these enormously important algorithms could be.

Speaker 3:

Mostly upside, if we do it properly.

Speaker 3:

I think so too, and I think once we do that, we'll stay away from... and I've been saying this in public more and more, it's a spicy take, but I'll give you guys some spice: I think DSARs have become immoral, and I'll tell you why. The data subject access request: we've always had the right to correct and the right to accuracy, and we should continue to do that. But who is asking for the data subject access requests? Do you think it's hourly wage earners, or people who speak the native language as a second language? Do we think it's people who don't have the same kinds of education, using that tool on the edge? And our lawyers are running around like chickens because there's, like, 72 hours and you have to get back this data subject access request, and so we're breathlessly doing this and breathlessly, maybe, maybe, honoring those requests. So if they say, please forget me in your database, if they have that capability, if they have done the architectural work... readers, they have.

Speaker 2:

That is a hot take and it stimulates me. It reminds me that, you know, it's canonical in electronic health records that people have the right to know who has been accessing their records. Now, you know, in a hospital situation there are dozens of professionals who have every right, and they can't do their job unless they look at my records. Now, do I want to get pinged every three minutes that somebody's looked up my record? I mean, it's sort of a luxury of the worried well. And the people that care about that stuff are not... actually, that's the hot take of yours, isn't it?

Speaker 2:

The people that are making those requests aren't actually, by some measure, important.

Speaker 3:

They're not. I mean, I won't say they're not important, everyone's important. But I will say, like, A, they don't have the tools to do anything about it other than to say, naughty you, or, I'm going to take you to the commissioner.

Speaker 3:

And if we think that's the result, we should just give that staff directly to the commissions and have them do what they do. I think the more important part is that it has sucked resources away from the core, from actually deleting stuff at the end of the process. Exactly as you say, we've thought in our past that storage was cheap and we were going to need this data forever, and so people who have long left our organizations stored and stored and stored data. They don't own this stuff anymore. It's probably for campaigns that haven't been run for 10, 15, sometimes 30 years, as we've seen in some hotel cases, for example. It may have been accurate at the time, but we've got all of this sort of liability tornado sitting in our bellies, and shouldn't we be cleaning that up first before we start doing this sort of dance with fans and feathers? It's a grumpy old lady take on things, but I'd rather go to the core first than put lipstick on it.

Speaker 1:

I'm wishing every tech organization was embedding PrivacyCode into its process right away, right? So I have to say I loved your P-BOM, your privacy bill of materials, because that gives instructions: this is what you have to do to build a complete product, UX, whatever it is, and that's how we can talk to each other, rather than a DSAR, which will become valuable if you do these steps.

Speaker 3:

I mean, that's the ironic thing about my crowing about it: if we can talk to each other as vendors, if we could talk to each other and say, like, this is who I am, I'm somebody who does it. Right now it's all or nothing: do you sell data or not? And everyone's like, oh yeah, you have to say yeah, like, that's how we stay in business.

Speaker 3:

There's no middle answer. There's no, well, you know, we parse it and most of it is, we use this. And you can't tell all your secrets, because then you make things very insecure. So there's no real great answer, but there's a good answer here.

Speaker 1:

So, michelle, what's the good word to leave us with? Because we need to bring this in for a landing.

Speaker 3:

If you like cash, of any denomination or currency, you should fall in love with privacy, because we tell the stories about customers, about voters, about people, about students. So if you like having customers and employees, you should love privacy. We're about the money.

Speaker 1:

Well, thank you very much. It's been an absolute pleasure to have you on Making Data Better. Really appreciate your time. Thank you very much.

Speaker 3:

It's been a wild ride.

Speaker 2:

Thanks, Michelle. Great insights and thanks for the candor and the optimism.

Speaker 3:

Absolutely. You can't do what we do without being optimistic.

Speaker 2:

Indeed. Thanks, Michelle. We'll see you on the other side.

Speaker 3:

Yes, sir, thank you.
