Real Talk about Real Marketing

#42 - Real Identity: A Balanced View of Data Privacy

July 06, 2023 Acxiom Season 4 Episode 3

Considering a balanced view of consumer privacy that works for people and businesses, Sachiko Scheuing, Acxiom European Privacy Officer and four-term chairwoman of the Board of FEDMA, joins the Real Identity podcast to discuss what is being proposed as a more flexible approach to GDPR via the UK's new DPDI Bill. But that's not all! Federated IDs, AI, virtual reality, risks in a changing IT ecosystem and the “golden rule of data privacy” are all part of this don’t-miss conversation.  

 


Connect with Sachiko on LinkedIn

Thanks for listening! Follow us on Twitter and Instagram or find us on Facebook.


Kyle Hollaway:

Hello and welcome to Real Talk about Real Identity from Acxiom. This podcast is devoted to important identity trends and the convergence of ad tech and mar-tech. I'm Kyle Hollaway, your podcast host, and I'm joined by my co-host, Dustin Raney. Dustin, I was reading a post recently by Barry Scannell, a leading AI law expert, concerning the EU Parliament's AI Act. The Act seeks to regulate high-risk AI systems and foundation models, like those on which ChatGPT is based. While we aren't going to be discussing AI directly on today's show, it certainly reminded me of how the EU Parliament is once again at the forefront of major consumer-protection legislation.

Kyle Hollaway:

Similar to how the AI Act will have a material impact on the marketing and advertising industry, the General Data Protection Regulation, or GDPR, has had significant impacts over the past five years, which leads me to the introduction of today's guest. I am super excited to introduce Sachiko Scheuing, European Privacy Officer for Acxiom and four-term chairwoman of the Board of FEDMA, the Federation of European Data and Marketing, a Brussels-based trade association. She is also one of the founders of Women Lead, an Acxiom business resource group focused on gender equity in business. Sachiko is a dynamic speaker, a passionate advocate for and expert leader in privacy, and we are thrilled to have her on the show today. Sachiko, welcome to Real Talk.

Sachiko Scheuing:

Hi, Kyle. Thank you very much for the invitation. I do need to correct one thing, though: I am not one of the founders of Acxiom Women Lead, but just one of the coaches. I joined a little bit later, but that doesn't mean I'm any less interested in the topic. I'm therefore even more excited about it. But thank you for your invitation.

Dustin Raney:

Sachiko, I'm so excited to finally get you on our podcast. Over the past couple of years we've had a chance to really get to know each other at ICOM in Spain. But before we talk about some of that, would you mind giving our listeners some insight into your background and how you got to where you are today?

Sachiko Scheuing:

Sure. Well, a long time ago, in my previous life at Acxiom, if you want to call it that, I was actually the chief analyst of Acxiom's Dutch office. I was doing that for a number of years, and then my privacy boss came over to the Netherlands to say how excited she was about this privacy thing, and that it is really at the core of Acxiom's business, and so on. That really made me think: hey, this new thing sounds so interesting, I'm going to do that. So that's exactly what I did. I went to the CEO and said: hey, you know, that chief privacy officer came over; can you please get in touch with her and tell her that I want to do privacy here in Europe? That was 2005, and I've been doing what I'm doing today ever since.

Dustin Raney:

We're so glad that you did. One of the things that I've always admired about your position on privacy is that you've always held, I feel, a really good balance, because it really is about a value exchange. If you go too heavy on regulation, or on representing one side or another, whether it's the advertiser, the publisher, the vendors or the consumer, then that imbalance could potentially be catastrophic to the industry, right? And across all entities. So I've always loved your feedback and perspective on consumer privacy in general.

Sachiko Scheuing:

Yeah, actually, you raise a really, really good point, because, interestingly, like you said, it's not about going as heavy as possible on privacy. The GDPR, about which we talk a lot all over the world, not only in Europe, actually says the right to data privacy is not an absolute right, so you always need to balance it against other rights and freedoms, like the right to conduct business and so on. So how you are phrasing it is absolutely spot on. That is how we need to look at the law, in my point of view.

Kyle Hollaway:

Speaking of GDPR, we're coming up on its five-year anniversary, and your interest in privacy preceded that. Obviously, you got involved with privacy well before that time, and now you've seen that legislation come through and start to mature to a degree. Where do you see things heading with the regulation? Now that there have been some rulings against some of the major platforms, and other considerations, do you see any pulling back or refinement of it? Do you see it actually extending? What are the prevailing headwinds right now?

Sachiko Scheuing:

My first involvement with the GDPR, I would say, was in 2008, when the European Commission said: hey, we want to know if this law is still up to date enough to actually regulate all the digital evolution that is taking place. Well, we have certainly come a long way from there. What we can expect right now, in reviewing the GDPR, is that there are a couple of things we need to look at: what went well, but also what did not go particularly well. One of the things we really need to examine, and going back to that previous conversation, five years of GDPR really makes us think about this, is whether the law is working the way policymakers intended. For one thing, I did say that the GDPR, or the right to data protection, is not an absolute right. There's this thing called a recital, which is like a footnote of the law, and Recital 4 of the GDPR clearly states that the right to the protection of personal data is not an absolute right. So I'm not making this up; it actually says so there. Maybe it is being understood in a way that is too rigid, but in any case, this was not only my understanding. It seems this was also the British government's understanding, because, due to Brexit, they had a chance a little bit earlier than the European countries to re-evaluate the effects of the GDPR. As a result, they are now working on a new law called the DPDI Bill, and the improvement that I see, at least in the drafts today, is that they are starting to understand that GDPR may be placing a disproportionate burden on SMEs.

Sachiko Scheuing:

And, of course, we all know SMEs are really the backbone of our economy. So if we dampen them down, we can forget about our economy blooming. I think that's a really interesting thing, in particular for the marketing sector. What I feel is really important is this concept of legitimate interest, which means that you can use the data provided you are the one safeguarding it, putting all sorts of mechanisms in place so that the data can be used and is not outweighed by the other interests that consumers have, and so on. This legitimate interest as a legal ground, in the current GDPR in Europe, once again sits in this footnote, in this recital section. But the UK DPDI Bill is now bringing that into the main law, saying: hey, you can really use this data to do marketing and so on. So I think it is really interesting to see how the UK government reacted to the learnings from the GDPR.

Sachiko Scheuing:

The other thing is, I just want to talk a little bit more about this rigidness in interpreting the GDPR, because we all think GDPR is always so strict that we can't do anything, and that is so wrong. GDPR is actually based on the so-called risk-based approach, which means: say you want to use the data for a certain campaign, or you want to do a certain analysis or something like that, but then you think, hmm, maybe this is in the grey area. Well, GDPR doesn't say: no, grey is not white, stop it. Rather, GDPR says: are there ways you can think of to reduce the risk so that it's no longer grey, so the risk becomes more acceptable? And if you manage to do so, by putting extra security in place or pseudonymizing the data and so on, then it's okay to actually use the data. It's so much more flexible, and I hope that this learning will be reflected in the European laws as well.
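
To make the risk-based approach a little more concrete for technically minded listeners, here is a minimal sketch, not Acxiom's implementation, of the kind of keyed pseudonymization Sachiko mentions: direct identifiers are replaced with a keyed token before a record set is handed to an analytics or campaign environment. The field names and the HMAC-SHA-256 construction are illustrative assumptions.

import hashlib
import hmac

# Illustrative only: the key would be stored and rotated outside the analytics environment.
SECRET_KEY = b"example-key-kept-outside-the-analytics-environment"

def pseudonymize(record):
    """Swap raw PII for a stable keyed token and keep only non-identifying fields."""
    normalized = record["email"].strip().lower()
    token = hmac.new(SECRET_KEY, normalized.encode("utf-8"), hashlib.sha256).hexdigest()
    return {"pseudonym": token, "segment": record["segment"]}

records = [
    {"email": "Jane.Doe@example.com", "segment": "fountain-pen-enthusiasts"},
    {"email": "jane.doe@example.com", "segment": "fountain-pen-enthusiasts"},
]

# The same person maps to the same pseudonym, so analysis and frequency capping still
# work, while the raw email never reaches the campaign environment.
print([pseudonymize(r) for r in records])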

Sachiko Scheuing:

We are seeing so many laws emerging in the area of data use and digital data use. Kyle, at the very beginning you mentioned the EU AI Act, which is really timely, because yesterday the European Parliament adopted it. But you also have the DMA and the DSA, the Digital Markets Act and the Digital Services Act, the Data Act and all sorts of things, and it seems like this is going to go on for another couple of years. But what does that mean? It means the goalposts are going to change very frequently in Europe. And, by the way, what happens in Europe usually spills over to other countries: to the United States, Canada, Latin America, wherever. So my golden tip to everybody, and maybe I'm just doing myself a mega favor: work closely with your privacy and legal people. They will be able to navigate you through the labyrinth of law, the exponential legal growth, I would say.

Kyle Hollaway:

Yeah, I think that's a great callout. I know it's been a continued focus within our work here, ensuring that we're doing those PIAs, the privacy impact assessments, to take any kind of business process and make sure it aligns with those privacy components, but also having that dialogue internally, so as not to overreact or assume that something can't be done, but rather to say: this is what we would like to do. Yeah, exactly.

Sachiko Scheuing:

It's a dialogue. How do we fit that within the constraints of the law? That's why I love working with you guys. That's why I love working with the engineers, because you always come with challenges, and it's not like I'm going to say no, or my team is going to say no; we think: hey, what can we do with each other? You have this technical knowledge. Can we solve privacy problems with that technical knowledge? That's what I'm actually talking about when I talk about the risk-based approach. If something seems very difficult, then let's see what steps we can take to make it okay again. Well, sorry, I interrupted you, Kyle.

Kyle Hollaway:

No, that's great. I did have one question, because obviously most of our listeners are likely in the US, and we're not directly under GDPR. But speak a little bit to the extraterritorial aspect of GDPR and how it could impact US-based businesses.

Sachiko Scheuing:

Yeah. So I think one of the big things about not only the GDPR but other laws as well is this aspect of cross-border data transfer. As you know, there's a lot of movement in that area. Think about Schrems I and Schrems II, but also about the UK-US data bridge, which is being built, or is already built, I don't know. I think one does need a strategy for dealing with this challenge. I would really like to see a world that allows cross-border data transfer between different countries, provided that the same level of protection is given to the data in the receiving country. To say that cross-border data transfer is illegal is simply not taking reality into consideration. I mean, there is no national border on the World Wide Web, is there? So we need to come up with a framework, and also precautions, to make sure that these risks are dampened down as much as possible, ideally eliminated, so that we can also support global economic growth and innovation.

Dustin Raney:

Just hearing you talk, it seems that in some ways the market overreacts to legislation because people might not completely understand it. They don't know about these articles you're talking about, which actually have a little more balance than they might think. But regardless of that, the market reacts, and people are tying this whole cookieless thing directly to GDPR and CCPA. For instance, cookies were non-transparently tracking you across the ecosystem, and browsers started deprecating them. How did the market react? We started creating federated IDs. In the US those looked more like IDs based on a hashed email, for instance UID 2.0; UID 1.0 was built on a cookie, and UID 2.0 is built on a hashed email.

Dustin Raney:

In Europe, it seems as though the telcos are stepping in and building a federated ID based more on the phone number, because telcos have phone numbers. We heard about one of the companies coming out, called Utiq, which is basically a co-op of the major telcos in Europe, trying to build a federated ID off that phone-number spine for addressable media and advertising. On the other side, we heard people trying to mimic Google and build anonymized browser-based IDs. We saw the two camps starting to rise, and we're seeing that globally, honestly. Some say: federated deterministic ID built off your PII. Others say: cohort, browser-based ID. What are your thoughts on what's happening there? Do you feel like either solution is really a viable solution?
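
For listeners unfamiliar with the mechanics behind hashed-email identifiers, here is a rough sketch of the general idea, assuming simple lower-case normalization and an unsalted SHA-256 hash; real frameworks such as UID 2.0 define their own normalization, salting and rotation rules, so this is not a reference implementation.

import base64
import hashlib

def hashed_email_id(email):
    """Derive a deterministic identifier from an email address (illustrative only)."""
    # Assumed normalization: trim whitespace and lower-case. Production frameworks
    # apply stricter, domain-aware rules before hashing.
    normalized = email.strip().lower()
    digest = hashlib.sha256(normalized.encode("utf-8")).digest()
    return base64.b64encode(digest).decode("ascii")

# The same person authenticated on two different sites yields the same ID, which is what
# makes a deterministic identifier usable for addressable media without third-party cookies.
print(hashed_email_id("  Jane.Doe@Example.com "))
print(hashed_email_id("jane.doe@example.com"))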

Sachiko Scheuing:

Well, in principle, I think the more options there are, the more helpful it is for marketeers, because then there are different ways to solve the same problem; you're not stuck. Let me start by commenting on this telecom initiative. I think this is going to be really, really interesting, because the telecommunications sector, regardless of which region of the world you go to, is usually really tightly controlled. So I think, and I believe they are indeed doing this, they're going to deploy a lot of pseudonymization, and perhaps even anonymization, processes to make sure that it's super compliant and also acceptable under telecommunications regulation, and so on. Then I think about other solutions. Well, if I were a marketeer, I'd be sitting down and thinking: how am I going to build up my first-party identity? Or maybe even: how am I going to make sure that my campaign measurement is going to work seamlessly, and so on?

Sachiko Scheuing:

One of the things that marketeers are starting to ask, because their compliance departments are coming to them and telling them they need to take this into consideration, is: which channel, which solution will give me the highest level of trust and will protect our brand reputation the most? I think those are the questions that will determine which solutions they use. Maybe they'll have a portfolio of different solutions. We will see. But one thing I can say: I really like this evolution. So many clever people are coming up with really interesting solutions, and I think it is a hot area to keep your eyes on, particularly for privacy people.

Dustin Raney:

Right, I totally agree. So I really want to get your thoughts, Sachiko, on this: we're talking about Google, and what they're building out, first it was FLoC, now it's Topics, but it's basically storing an ID on your device, likely without you really knowing what's happening. It's contextual, and it's kind of anonymous, right? But regardless, it's an ID that's keeping up with you, basically recording what contextual information you've been interested in as you browse the internet; I think that's essentially what it's doing. Do you believe that falls within, from a consumer's point of view, my expectations as far as transparency and compliance, even from a regulator's perspective? Do you think explicit consent should be required to put any ID on a device?

Sachiko Scheuing:

Well, let me start by talking about the European market. There is this thing called the ePrivacy Directive, and it is not really GDPR, but this ePrivacy Directive says you need consent. So if this law says you need consent, then you need consent; you don't have any other choice. The thing is, though, exactly what you describe is being debated a lot in Europe at the moment. So let me first talk about the idea of why anybody would want to put the control of the consent, the control of the data, in the hands of the consumer. Maybe from the US point of view this is a little bit strange because, you know, if a company goes out and collects data, perhaps, I don't know, I'm just guessing, correct me if I'm wrong, perhaps the data that is collected would belong to that US company that actually put in the effort and invested in people to collect it.

Sachiko Scheuing:

But the thinking behind data protection, particularly modern, current data protection, actually goes back to this concept called informational self-determination. It sounds very heavy, right? That's because it comes from Germany, although, to be quite honest, it actually comes from a Greek-born professor, Professor Spiros Simitis, who is referred to as the international father of data protection. Anyway, he said that everybody should be able to control his or her own data. And you need to remember this was in the 70s, right? We didn't have the internet, we didn't have apps, we didn't have all these things that are normal to us today. The question is: is this concept still meaningful today? I mean, some of the things derived from his thinking are still in place today and are being adopted in the United States too. Under the CCPA, and in Europe we would use the term rights of the data subject, you have the right to know what information is being collected about you, the right to correct the information if it is wrong, and so on. So it is still there.

Sachiko Scheuing:

But here is the reason I asked whether that is still meaningful today: how can a non-technical, non-legal person understand exactly what is being done with the data, even after reading the privacy policy and so on? You say things like "we use your data to generate inferences to create a profile about a consumer, reflecting the consumer's preferences or characteristics." It's like: what? For most of us, I think one of the questions we need to raise is: is that fair? Also, privacy policies are not particularly concise. They're usually in legalese and really, really long. So the question we need to ask is whether it is fair to expect a consumer to digest that information and make a decision about it.

Sachiko Scheuing:

And my thought about it is this: I think we really need to empower and educate ourselves. We need to educate our children, but adults too. We need to make it easy for everyday people, who are not lawyers and not technical people, to understand what these sentences actually mean. But that is just my thinking; it's not that important. More important is what, for instance, the European Commissioners are thinking, and I think this is going to be really, really interesting, because DG Justice, the consumer-protection wing of the Commission, is now coming out to say: hey, look, I know we need to do everything on the basis of consent, but, frankly speaking, there's this consent fatigue. It's because it's not centralized; you need to consent on any and every website you visit or app you download, or whatever. So they have started this initiative called the Cookie Pledge. It is actually a voluntary framework, so it is not going to be law, something you have to do, but, as a company, you can try to make things easier for consumers.

Sachiko Scheuing:

Now, one of the ideas out there, to avoid consumers having to click so many okay buttons, is to put it in a browser setting and have the browser administer the consent. Well, of course, that's a good idea, but people also debate: hey, wouldn't that give a disproportionate amount of power to those browsers, which are already so powerful? So any and every idea that will improve the situation is welcome. You know, a good idea would be to have publishers unite and run a central opt-in mechanism, even something that says: hey, you know what, I actually want to see more advertising about fountain pens.

Sachiko Scheuing:

I like fountain pens. I don't collect them, because they are so expensive, so I don't have a big collection, but I like fountain pens. And then they know that this is what I'm interested in, instead of me spending hours and hours looking at different websites. If a solution like that, through a network of publishers, for instance, is able to provide me that information, I think I'll be really, really grateful. But having said all that, yes, indeed, that is something we are also discussing in FEDMA, which you mentioned, Kyle, in the introduction. Next week we are going to discuss five years of GDPR, and we have indeed invited the legal policy person from DG Justice and Consumers of the EU Commission. So, yeah, watch this space.

Kyle Hollaway:

So I've got a question, and I don't know if it's really a question or just a thought to get your reaction to. One thing that I've continued to wrestle with in this whole category, and you were talking about it, is giving the consumer the right to control. Part of the question is that this is kind of predicated on the assumption that the consumer actually knows what's best for themselves. But consumer knowledge is very limited in many ways. Like, you don't necessarily know.

Kyle Hollaway:

Like you said, it's a very complex ecosystem. What brands are doing, and how it all really functions, is hard to understand. So, A, does the consumer really have the capability to make an informed decision, or are we just putting it in somebody's hands and saying, hey, make this important decision, when they probably don't know? I mean, at some point, with my car, I eventually defer to the mechanic, because I don't really know how the car works, and so I'm like, hey, if you think that's best; I have to make kind of an assumption there.

Kyle Hollaway:

So there's some implied trust. Do you think we can get to a point where there's a balance between trusting the experts to do things, or do we always need to push towards the lowest common denominator, which is the individual? It's probably balancing individual rights against what's best for the population as a whole, because, put more efficiently, it's about enabling brands, especially those small to medium businesses, not to face an onerous outcome, or be precluded from something, just because somebody made an uninformed decision. I don't know if I'm making sense here. So what is your view on that? Is there a point where we can overcorrect towards individualism and individual rights, and we need to find some balance in between?

Sachiko Scheuing:

Well, Kyle, you are spot on. You are really spot on. That was one of the issues GDPR wanted to address, and it has done so by basing itself on this principle called the accountability principle. Basically, it is indeed so. Consent: I'm not really sure if consent is fair, because, like you say, what you're doing is shoving the responsibility onto the consumer to make that important decision, right? And then, if later on the consumer complains, well, you'll say: well, you clicked okay. It is, in a way, really, really unfair. That's something I heard back before the GDPR came into effect, when I was at one of those seminars by the European Data Protection Supervisor at the time.

Sachiko Scheuing:

It was Mr Giovanni Buttarelli, an Italian, and he said: you know, when I take a look at the different, we call them legal grounds, in Europe and in many other countries that adopt GDPR-like laws, you actually have five or six reasons why you can use personal data, like you carry out a contract, or it is in the interest of national security or the public interest or whatever, and then you have legitimate interest, and then you have consent. And then he said: well, take a look at all the legal bases upon which you can base your use of data, and if you can't find anything, then use consent, not the other way around. That approach also makes sense, because the company, or the organization, is the one using the data, so aren't they responsible for protecting the data in such a way that the right to data protection of consumers and everybody else is protected, both in the environment and in the way the data is used? So I really do want to see this accountability principle taking stronger root in the GDPR, but also in other countries. Singapore, for example; I think they have struck a great balance in how you can use techniques such as pseudonymization and anonymization, or let's say strong pseudonymization, to decrease the level of risk involved in using the data. So that's one side.

Sachiko Scheuing:

My strong feeling is that I really do want companies and organizations to be more accountable, as the GDPR says. The other side is, however, that those who are interested and who really want to exercise their rights should be encouraged and empowered to do so. That's why I think it is not just "do the accountability and skip the education"; I really do think people need to understand what's going on. I really think it is important that we provide continuous education. One of the things that I heard, and this is already some 10 years ago, from a regulator, I would perhaps even say a regulator from a certain country, is this person saying: well, you know, we try to familiarize people from our country with the concept of data protection.

Sachiko Scheuing:

So what we do is, for instance, in whatever that country's language, but let's say English because it's easy, in their English textbooks they have stories about data protection, so that unconsciously, very naturally, children learn that, hey, we have these rights, and we can take control of the data if we want to, and this is what it means. I would like to see more of these educational efforts coming out from all over the place, actually.

Kyle Hollaway:

Yeah, that's a really interesting thought, institutionalizing that effort to educate earlier in the life cycle so that, like you said, it becomes more of an innate understanding, versus a lot of us who have been around in a non-regulated world for a long time now trying to absorb and understand that regulation. I knew this would happen, that we would start talking and could talk forever, because there are so many interesting parts to this discussion. We're starting to run out of time, so we're going to shift the conversation for a second and be forward-looking. I know we've talked a lot about legislation, even some things that are about to happen, but let's touch on the topic of AI for a minute, because AI, I think, opens up a whole new area of privacy considerations, just because of its ability to massively process data, and data points that maybe aren't even ones we've traditionally thought about.

Kyle Hollaway:

I was reading an article just yesterday about how human locomotion, our motion, is enough that AI can determine individual identity just by looking at a video of somebody walking. It can process that across the hundreds of thousands of people who may be walking down the street, and it can start to identify who you are just from your gait, how we hold ourselves, because of our skeletal structures. That kind of blows my mind: it's not even biometrics as I would have understood biometrics, but observable metrics that suddenly become something I can be identified with. So give me some of your thoughts on AI. What are you thinking about it in terms of privacy, as a chief privacy person within a large enterprise?

Sachiko Scheuing:

Well, I think AI is really a double-edged sword, because some of the new uses of AI, I mean, your example was a little bit scary, but there are also really practical, mind-blowing examples. My sister is on the creative side of marketing, and she made a chatbot for a pharmaceutical company in the United States. Then they said: oh, we are also in the Japanese market. So she used AI, and it was so flawless; you could just use it one-to-one in Japanese. The other thing she was doing was building a complete marketing suite, with branding, a landing page, a chatbot, social media posts and so on. Those are the things that creative bureaus usually spend hours and days working on. In 25 minutes, can you imagine? I'm like: oh, my goodness, will you still have a job in a couple of years? And then she said: you know, get this, with another five minutes on top of that 25 minutes, so you have half an hour.

Sachiko Scheuing:

The entire campaign can be localized, you know, for Brazil, for China, for Japan, for the UK and so on. So the potential, I think, is really huge. But the other side of it, of course, is that AI makes things up. For instance, ChatGPT can; I think it is really, really dangerous to use ChatGPT's answers as a source. Or another dangerous use of AI: it can create so many scripts in no time, so if hackers get hold of it, and I bet they already have, it will really make the IT ecosystem a dangerous place.

Sachiko Scheuing:

So I think we all agree, or I hope we all agree, that banning it is not going to be the option, because imagine if we had banned data due to the uproar about the census and all that back in the 80s. We would not be enjoying the comfort and innovation that we have today. The same thing can be said about AI. I think we need to find a way to live with AI in a responsible manner, and only then can we reap the benefits while keeping the use of AI controlled and ethical.

Dustin Raney:

You know, as consumers, having gone through COVID, having gone through digital acceleration, all this stuff with our data and AI, there's a lot of change happening. There's a lot going on. I think there might be a need for, like, a consumer therapy sector to grow, to help us understand our value again, right? I think that might be one of the foundational things: yes, technology is going to be required to keep us all together and help us have more meaningful experiences, but oftentimes I don't know that the consumer understands how valuable they are, and how potentially abusive some of the players in the ecosystem have been with that value. So, at the end of the day, my hope is that this gets balanced out a little bit, right? Despite what's happening with AI, ChatGPT, even AR and VR, where Apple just released their new headset, which is going to be mind-blowing. More than $400 for that stuff.

Dustin Raney:

Yeah, I mean, some people will, right? But eventually that's going to be mass-adopted, and we're going to be walking down the street and, like Kyle said, I can look over and see someone's gait and be like: oh, there's Kyle walking down the street in New York City, amongst hundreds of thousands of people, if AI can really do that. So will the technology exist for you to individually say: hey, I don't care what everybody knows about me, or what specific people know about me? I think that's where technology may come in and give the consumer more control, because some people want to be popular, they want to be known, whereas other people want to be super private; they don't want anybody to know them. So I think there might be a balance. What are your thoughts there?

Sachiko Scheuing:

I think everybody is different. That is indeed true. But, you know, there are these people who say: I don't have anything to hide, everybody can know everything about me. Until one of these days you'll be like: oh, sugar.

Sachiko Scheuing:

I shouldn't have made that part transparent, and so on. It's always like that. Human beings grow up, we mature, and we look back at all the things we did during our college years and so on. I'm not really sure if a decision you made back in your college days would still be valid today.

Sachiko Scheuing:

So, once again, I think it comes down to people being aware of the different consequences, people being aware of their rights. And, speaking of rights, I think it is really interesting to read the United Nations Universal Declaration of Human Rights or, for people interested in the European side of the story, the derivative thereof, the European Charter of Human Rights. It's really interesting. And you know what? To sum it all up in my own words, it's about showing respect and being respected. It's no different. It's our basic, fundamental social etiquette, if you want, and it's just being carried over into the digital sector. Educating yourself, reading around, being respectful of others, regardless of what you do, in whichever form: I think these things are very important for ourselves.

Dustin Raney:

That's perfect. I would call that a mic-drop moment for Sachiko. I don't know that there needs to be anything else said in this episode. But hey, Sachiko, we are out of time, unfortunately. As we knew it would be, this has been an incredible, very informative conversation, taking tech and humanizing it, thinking about people, thinking about all the players, and bringing balance. We hope our listeners got as much out of this as we did. Thank you all for sticking with us; I know that when you're talking about privacy and compliance and regulation and technology it can get heady, but it's important, and thank you for bringing clarity there, Sachiko. We look forward to having you back again sometime in the future. So thank you, everyone. Thanks for joining us today, and we'll catch you next time on Real Talk. Thank you very much.

Chapter Markers
Privacy and GDPR Regulations
Federated IDs and Consumer Consent Evolution
Privacy Considerations of AI