DESIGN THINKER PODCAST

Ep#30: A Human Centered View of Privacy Law

May 12, 2024 | Episode 30
Dr. Dani Chesson and Designer Peter Allan

How might we use privacy law to build trust with customers? Often regulations are viewed as an obligation, something organizations need to comply with to stay out of legal trouble. However, what if regulations can be more than that? What if regulations can provide us a way to build trust and create better experiences for customers? In this episode, Dr Dani and Designer Peter are joined by Rali Andreeva, a thought leader who helps organizations demystify privacy to improve customer experience.

In this episode, you will:

• understand the intent of privacy law

• discover how to build customer trust with privacy

• learn simple ways to get started

About Our Guest 
Rali helps teams simplify privacy, mitigate risks, and build trust by embedding privacy in the customer experience. With 20+ years of marketing, loyalty, and data and insights experience across retail, FMCG, and banking, she brings a customer and business perspective to an area often dominated by legal and compliance expertise. 

She speaks at events, runs training sessions and facilitates workshops, as well as providing coaching for senior leaders and teams. You can connect with Rali on LinkedIn https://www.linkedin.com/in/ralica-andreeva/

Transcript

Dr Dani:

Welcome to the Design Thinker podcast, where we explore the theory and practice of design, hosted by me, Dani, and...

Designer Peter:

Peter!

Dr Dani:

Hi, Peter.

Designer Peter:

Hi, Dani. How are you today?

Dr Dani:

I am good. How about you?

Designer Peter:

I'm very well, thanks. Yes, I'm looking forward to our podcast today.

Dr Dani:

Yeah, we have a guest with us today. We're going to be talking about privacy law and what design thinking has to do with it, and to help us with that conversation we have Rali with us. Rali, would you like to introduce yourself?

Guest Rali:

Hello, Dani and Peter. Good morning. My name is Rali, and I'm very passionate about privacy and customers, and about how to make things easier for customers as they navigate this ever-changing and complex world. I have a lot of experience in marketing, loyalty, and data and insights across the retail, banking, and FMCG industries, and at the moment I work with teams and help organizations make things simpler. I embed privacy in the customer experience and, as you say, I often bring a business and customer perspective to an area where compliance and legal perspectives usually dominate. So I'm super excited to be here talking about customers and privacy.

Designer Peter:

Yeah, we're excited to have you, and this sort of topic is right up our street. Our listeners know that we like to get into subjects that on the surface seem really different, but if you scratch the surface you can see that they're really joined and aligned. And, yeah, we're definitely really interested in how to make things easier for customers, full stop. So, yeah, looking forward to our conversation, and I echo all of that.

Dr Dani:

So maybe we can start, Rali, with definitions. We usually like to start by defining things; Peter and I are kind of nerdy that way. So maybe we start with defining: what do we mean by privacy?

Guest Rali:

It's a very, very big question, and there are many, many definitions. From my point of view, every individual has the right to define how they want to be seen in the world. Some people will post their lives on Facebook and TikTok, sharing pictures of their families and stories. For others, that is probably a step too far, and they will stick to emails and not sharing personal information online. So personal information is no longer just about date of birth and name and address. It's a lot more than that, because digitally so many data points are created, and you can use literally three or four data points to identify an individual. So, yeah, it comes back to who you are, what your values are, and how you want to be seen in the world.

Dr Dani:

So does that mean the traditional definition of privacy was about things like making sure we keep our name and address and date of birth safe, but it sounds like what you're saying is that now it also covers things like our photos and images of ourselves?

Guest Rali:

Yeah, so you definitely need to keep your passport details and date of birth and address safe, but there's a lot more to it. What I'm hearing from customers is that it's about personal choice. Some customers are much more digitally savvy: they understand how data moves and what happens in the digital space. Some of them clear their cookies on a regular basis, for example, or use software so they are not tracked across the web. Others will change their settings on Facebook. But again, from what I'm hearing, a relatively small number of customers do that.

Guest Rali:

Most customers are really confused. They don't know how things work. They don't know how their data are used. And what concerns me a lot is that they don't know what value they get back when organizations use their data. So when I talk about customer experience, I often say: imagine privacy is a maze, a very, very complex maze, and the customer experience teams have the craft, the art, the power to put signposts in this maze, to help customers navigate it and also get value, so they can live more meaningful, more fulfilled lives.

Dr Dani:

I love that visual explanation of a maze. All of these accounts and things we set up have all of these settings, and honestly, half the time I look at them and I don't even know what they mean. There are the settings, and then there are the options, but it's never really clear: if I move it from this option to that option, what does it actually do?

Designer Peter:

So I love that description of it being a maze, and then having signposts, and it being in the power of the CX team or their organisation to actually put signposts in place. You know, as you were talking there, I definitely think the opposite of that is true at the moment.

Designer Peter:

In other words, you know, there's been deliberate hiding of signposts, or elimination of signposts. Maybe just to dwell on some definitions for a moment: when you say customers here, is this customers, any customer, in any situation? Is that where this applies and what you're looking at?

Guest Rali:

Yes. When I say customers, I mean customers in any situation. Privacy law applies to individuals and to any organisation, and again, the intent is to promote and protect individual privacy.

Dr Dani:

So we're talking about customers, individuals who interact with services and products, and those services can be provided by commercial organizations or government agencies. Across all of those, the privacy law would apply?

Guest Rali:

Yep, that's absolutely correct. Customers are also people who read the media, people who interact with social media platforms, people who go to the supermarket and buy bread and butter and snacks for their kids.

Designer Peter:

Another bit of defining, drawing the box that we're playing in and talking about. The three of us are in New Zealand, or I know I'm in New Zealand at the moment, and I imagine that our privacy laws are similar to, but slightly different from, some countries', and maybe very different from others'. So your point of view, your perspective on helping customers, is about more than just people in New Zealand?

Guest Rali:

Absolutely. You're right, legislation varies a lot between countries. GDPR is still probably the strictest legislation. However, there's a wave of change across the world: the US is catching up, and Australia is changing its privacy laws as well. New Zealand is slightly different. Our legislation is principle-based, and some argue that we are behind the rest of the world, so our legislation is much lighter. The fines for breaches here are, let's say, $10,000, whereas globally they can run to millions of dollars. So in New Zealand we still take the approach of "yeah, she'll be right" and believe that people will do the right thing. But we have a lot of work to do to catch up on legislation, technology, and customer experience, to make sure we actually manage this space in a much more appropriate way.

Designer Peter:

As you were speaking there, I had a bit of a face-palm moment, because of course we are in New Zealand, but even right now we're using technology that is based somewhere else in the world. This Zoom technology, I don't even know where Zoom is domiciled and what regulations apply to it. So once we're involved in any digital experience, we're essentially global citizens, and we could be at the whim of regulations from who knows where. So I guess that brings us back to your perspective, which is helping customers be aware of the maze they're in and helping them navigate it.

Guest Rali:

I love this. I talk a lot about the concept of local customers, global community. I live in my local community in Auckland, New Zealand, where I have my friends, but I'm part of a much bigger global community, because I interact with so many people around the world via digital technology. My individual footprint no longer really lives in New Zealand; it's a global footprint that literally touches millions of people across the world. And, yeah, many customers don't realize that.

Guest Rali:

And yes, there are privacy implications if your business promotes services or products overseas, or if overseas products are promoted in New Zealand, and organizations need to understand that from a legislation, governance, and compliance point of view. But I keep coming back to the customer: how do we make things easier for customers, so we respect their choices, provide good services, and help them navigate the maze?

Dr Dani:

So we've talked a lot about privacy law, and I think there's also a lot of confusion around the intent of privacy law. I say that because you get that view of people feeling like privacy law means we can't share any information about anybody with anyone else, right? What's actually the intent of privacy law? What's it intended to do?

Guest Rali:

So again, the intent, very broadly, is to promote and protect individual privacy. However, the devil is in the detail, and I'm going to talk mostly about New Zealand and touch on some of the GDPR. In New Zealand, by law, consent is not mandatory when you collect data: you need to inform customers what you do, but you don't need to ask for consent. But when it comes to direct marketing communication, the guidance and the legislation in New Zealand say you have to ask for consent. So I recently asked one of the experts: what do we do, do we ask for consent or not? And this is where it's really important to have a solid legal view of what the legislation means and how it is interpreted. So I always say: compliance is your baseline. That's the minimum you need to do. You need to do the right thing.

Guest Rali:

However, I don't believe organizations can allow themselves to stay at that level anymore, because privacy is about building trust with customers; it's not just about meeting legal obligations. Let me give you a couple of examples of why we need to move beyond compliance. First, ethical dilemmas. There are a lot of cases where the question is: can we collect this data, or do something with the data? Yes, we can, because it's legal and the technology allows us, but is it ethical? For example: I have different sources of information that I've collected, and I decide to link them so that I create a completely different view of the customer. Is that the right thing to do if we haven't actually asked customers, and customers don't understand what's going to happen with their data?

Guest Rali:

Then AI. I'm asked a lot: should we say in our privacy policy that we're collecting data and training AI? And I'm like, yes, of course, and you should maybe ask customers whether they're happy for their data to be used that way. So legislation and ethics answer a lot in some cases, but given the speed of change in technology, there are big ethical questions we have to face and resolve. And then there's competitive advantage. I strongly believe that embedding privacy in customer experience and moving beyond compliance will give you a competitive advantage, and Apple, for example, is taking the lead on a lot of this. You know how they use privacy as a competitive advantage; Apple has run advertising showing how data are used, and I have seen other use cases around the world as well.

Dr Dani:

Two things about what you've said. One is that there's legality and then there's ethics, and something can be legal but still be unethical; just because something is legal doesn't make it ethical when it comes to data.

Designer Peter:

That's a tension we always have to balance and navigate. Yeah, just because we can do something doesn't mean that we should. Something I picked up on there was Apple using privacy as a competitive advantage, and what we're seeing, for me, comes all the way back to the definition you gave us at the beginning, Rali, which is that every individual has the right to decide how they are going to be seen. Go back 20 or 30 years, and that would have been relatively straightforward for everyone to pick up and understand. You know, it was your date of birth and your name.

Designer Peter:

But now things have changed, and are changing exponentially right now, and there's lots of noise, whether it's about data or AI. I think, for me, this principle, this kind of founding principle, is the anchor we can come back to as designers and as people who want to help other people understand the maze and navigate through it.

Designer Peter:

And there's something about the Apple situation where maybe they're realizing that. I guess I'd be interested in hearing your thoughts on this: how might we educate all of us about the value of the information we have willingly but unwittingly given away over the last 10 or 15 years, any of us on social media? It reminds me of a documentary that came out, must have been 10 years ago or maybe longer, about Google, or about internet terms and conditions: the terms and conditions that everybody just ticks, saying yes, I accept, because to read them would take three years. The documentary illustrated, you know, just be aware that these are the things you're signing up to; you're basically giving away your privacy. What are your thoughts on that?

Guest Rali:

Apple has made a stand in saying this is a competitive advantage: we now help customers navigate this space and give them choice. A lot of organizations say choice and control need to be at the center, from a design point of view, when we're helping customers manage privacy. I do agree with that, but I have another thought as well. Giving customers too much choice, too often, even about things that are meant to be helpful, probably confuses customers a lot more. This is where organizations need to have really strong values when it comes to data and privacy, and strong governance. Saying: I will have the best security, I will know how my data is used, I will ask the question "should we collect this?", and then telling customers in very simple language: my promise to you is that I will keep your data secure, I will give you value back, and I will help you in the journey. And then giving customers meaningful choices during the journey. This is really important. We agree on terms and conditions, do we not?

Guest Rali:

Ten years ago you signed up for this email; since then, how often have you been asked about your preferences and whether they have changed? A lot of your life has changed, but we're never asked if our data preferences have changed, and the way we receive communication should change too. So embed these nudges within the journey, saying: if you tell me you like these colors, I will be able to show you clothes or furniture in your favorite colors. Explaining the why when customers share more data within the journey is much easier this way. It's relevant, very simple, and customers can almost visually understand the value immediately, and again, they feel really good. There are examples from Ikea and Fioras across the world where, within the app or online journey, they can actually demonstrate the value back to customers. Unfortunately, I think this is an underdeveloped area. There isn't a lot of focus on it, because it's complex and because it requires a lot of effort to build all of these nudges through the journey and show the value.

Dr Dani:

There is such a thing as choice overload: the more choices you have, the harder it is to choose. So when you're given a lot of choices about something you don't understand, you tend to make fewer choices, because it's just that much harder. I mean, I can never decide what I want to have for dinner, and that's a really simple choice about a topic I know a lot about. Then you take that to privacy law, and it's just that much more complicated.

Designer Peter:

How do we use our design thinking toolbox to make this simpler, so that people understand the key points on which they need to make a choice, and those choices are presented in a way that is easy and simple to understand? For me, what's emerging, I think, is that the overall design challenge to solve is maybe less about privacy and more about trust. Coming back to the maze analogy: if I'm the organization, then I need to earn and maintain the trust of Dani and Rali to lead you through that maze, and have you be able to trust that I am doing the right thing on your behalf when it comes to privacy. Because it's finding the balance between, like we said, giving customers too much information and too many choices too often, and, at the other end of the scale, not giving customers any choice at all.

Guest Rali:

Yeah, look, you guys are spot on. I often start with having the courage to ask a question. When it comes to designing a new product or a journey, you will have stakeholders who always want everything in the world. But have the courage to ask: why are we doing this? Is there a better way to do it? And can we actually use simple language to show customers how this works? It's like: if I can explain to my mom why I'm asking for this information and what she's going to get back, that is a good start.

Guest Rali:

The privacy policies you referenced, Peter, are a very good example. I still find it very common in New Zealand that what I'm reading is a legal agreement, pages and pages long, and most customers don't have law degrees; they don't understand it. And organizations hide behind these privacy policies, using them for the organization's benefit rather than actually doing the right thing. It's like hide-and-seek. I keep telling organizations: don't play hide-and-seek games. It's not a good place to be.

Guest Rali:

Be transparent, have the courage to ask the right questions, challenge yourself to use simple language, as if you were talking to your mom or grandma. Really, really important. And let's figure out how to simplify privacy policies. I have seen great examples of using videos, icons, graphics, even games for children. There's a great example from overseas where children need to understand what happens when they play games, and a game is actually used to show what information is created and collected and how it is used. So there's a massive, massive opportunity for customer experience teams to take this complex legal jargon that the legislation has created and make it accessible, transparent, simple: just for humans, for real people.

Dr Dani:

The other thing you mentioned earlier that I really want to come back to is privacy legislation. Yes, it is a legal obligation, but I've always believed it's also the way we show customers we care about them. The legal obligation is just table stakes. As a customer, I expect you to do things that are legal; I expect you to be abiding by the law. But then I also have this other expectation that you're going to take care of me. I'm giving you money for services, I'm giving you my information for services, and in return I expect to be looked after. So this is where that competitive advantage comes in, because we can actually use regulations to show customers that we are looking out for them, that we are doing the right thing by them, and then the regulation is actually the measure, the metric, that helps to show that.

Guest Rali:

Yeah, you're spot on. Governments have already said that you can't hide behind terms and conditions and privacy policies if something goes wrong. Lawyers will be looking at a lot more detail; they'll say, well, yes, it's in the terms and conditions, but it's not in accessible language, so the customer didn't understand it, and hence this resulted in a data breach or in information being shared. So there's a lot of spotlight on how we make these things, especially privacy policies, much more accessible to customers. And again, it comes back to the values of the organization: having the courage to ask questions, and making those questions welcome; getting CX involved in governance forums, so you bring diversity of thinking; and simply debating the ethical implications from different angles. All of that builds a stronger culture with a lot more transparency, and all of this results in better customer experience, more value, and stronger trust.

Dr Dani:

That's another thing you hit on, because I've been at plenty of tables where regulations are being discussed, and oftentimes it feels like I'm the lonely voice going: well, what about the customer? What about the customer? It almost seems like there isn't a space created, when we're discussing how to implement regulation, that really brings the customer into that conversation.

Guest Rali:

Privacy is a leadership decision to do the right thing, so it needs to start from the top. The exec teams and senior leaders need to create the space for people to ask these questions, and literally having the courage to ask the first question and start that conversation is probably the hardest part. Also, I spend a lot of time talking to lawyers, and I would love to coach lawyers on how to ask different questions in the customer's language and understand customer behavior, so that we facilitate a really good discussion between the different teams, one that centers on the customer, rather than one team looking after legislation and another the customer journey, with no actual debate or discussion within the organization.

Designer Peter:

You're talking our language now. Really, those are my favorite parts of a design project: when we get the cross-functional team together, often with not just different points of view but completely different and opposite points of view, and we center the conversation not around one of those points of view, debating it, but actually put the customer, or the employee, the person we're trying to help, in the middle of things, and then have everyone start to align their perspectives and realize that that's the important thing, not their individual view. So, yeah, that's definitely a human-centered approach. From my point of view, leadership needs to start this, and I imagine that for leaders and organizations this is a change. They've become successful leading organizations in one environment, and that has worked up until now, but now, as we're saying, they need to shift their organization to carve out a new competitive advantage around trust.

Designer Peter:

I imagine that's quite difficult for people.

Guest Rali:

Yes, it is.

Designer Peter:

Yeah, so what sort of things have you seen working to help them start to see the potential, and what they might need to do differently?

Guest Rali:

It starts with the decision that we want to make a difference in this space; again, it's a leadership decision on privacy to do the right thing. What has worked really well, when it comes to governance, is having the governance forums include representatives from different teams: marketing or CX, legal, technology. I have seen cases where the leaders of the governance forum are not from the technology or legal team but from customer-facing functions, like the chief marketing officer or chief customer officer. Just having senior leaders from the customer function leading data and privacy governance changes the conversation completely, absolutely completely. 
The other thing that works: I ran a workshop a few years ago, again with representatives from different teams, and I was really pondering how to bring the team together, given the different expertise and expectations in the room. I ended up deciding I needed to bring fun into the room. So I found videos of how data are used for different things, I got customer stories, I got market research, and I recorded videos with team members, real human stories, and brought them into the room. Once we started talking about these, you could just see how the tone in the room changed: instead of talking about the technology roadmap or the next piece of legislation coming, it was "oh my God, this is really happening". 
I also had to present to an exec team about the roadmap, and I thought, well, here's the pre-read, but it's quite long and complex. In the end I said: you know what, read the pre-read if you have time; if not, just watch this video. And I sent them one video of how IKEA shows privacy embedded in the customer journey, showing what data is collected along the journey and explaining the value back.
And when I walked into the room with the execs, one of the senior execs working in operations said: I watched the video, and I get it now. You have been talking about this for two years, and I get it. How can I help? And I'm like: fantastic.

Guest Rali:

So, coming back to where I started: in this space we focus on very long documents and legislation and technology and complexity, but there is a different way to look at things. One more thing: when it comes to training, usually the legal team does the training and shows us the principles, the legislation, where we are. Here's the policy, read it, it's 120 pages, all of that. How about we change that, and actually have CX experts or marketing experts do the privacy training and make it really relevant for the team? Yes, we cover legislation, and that's what I do in the training I run, but we also talk about the registration journey, we talk about preferences, and we make it really, really relevant for the team.

Guest Rali:

For many brands, the first moment of truth when a customer interacts with the brand is the registration page and the consent button. So how can you use this as an opportunity to build trust, excite customers, and tell them the value you're going to give them back? This trusted relationship starts from literally the first moment, instead of trying to follow customers across the network using different media.

Designer Peter:

Something that's been playing on my mind is a kind of reminder, or realization. The mental picture I've got is: when I go into even a retail store, let's say a furniture store, and I buy something, I hand over my card, I give the store some money, and I get, let's say, an armchair that ends up in my house. I've carried out a transaction, an exchange of value: I've given them money, they've given me an armchair. I think, hopefully, more and more people are realizing that at the same time, even in that simple transaction, I'm also giving the organization, that company, some of my own personal data, some information about me. And at the moment it's an imbalanced transaction, because I don't get any value from that. What you're saying is that part of the privacy maze is being aware that we're in a maze, but also, I think, having consumers, customers, users, et cetera start to realize that there is an imbalanced exchange of value, especially online, especially in digital environments, where we're giving away so much of our own information unknowingly, unwittingly, and not getting anything in exchange.

Designer Peter:

I just need to ask you about something you mentioned almost at the beginning, Rali. A lot of the time I'm kind of a little bit naive and, let's say, laissez-faire up to a point about digital information. It just is what it is. You know, we're living in the 21st century, our information is out there, and part of my brain imagines that, well, it's just in a massive big pool of data, so they can't possibly find who I am or information about me. But you mentioned that we only need three or four points of data to identify individuals. Tell us more about that.

Guest Rali:

Yeah, so there are different studies, and the latest I heard was that you need four different data points and you can identify a customer or individual. Generally the studies cite a range that goes from three or four up to probably six or seven different data points to identify an individual, and this is a well-documented fact. That's why private information is no longer perceived to be only your name and date of birth. It's actually any information that can identify a person. It could be religious beliefs, it could be political preferences, et cetera. All of these data points can be used to identify individuals. So when organizations collect data, the why really needs to be debated. Why do we need it? Also, how will the information be used in a safe way, so we don't actually do harm, even when the intent in the organization is not to do harm? And not just debating why I'm collecting it, but debating whether I have the right data models to get the value.

Guest Rali:

Um, I'll give you an example: carbon footprint. I decided to register for a service where I would be given prompts on how to improve my carbon footprint, and I was asked a lot of questions. Naturally: do I drive a car, do I take a train, how far do I travel, the number of people in my household, et cetera. Which was okay. I mean, it makes sense, yeah. But then I was asked what my household income is. And I'm like, I don't really trust this organization, and why should I tell them my household income?

Guest Rali:

And then, based on my experience in market research, I know that customers very often don't share their real income; they will usually report a lower income. And I'm like, well then, the information you're going to give me back is not accurate. And then I literally googled, you know, using income to predict carbon footprint. And yes, I found studies that use income data, but I also found quite a few studies that say you don't need it for that prediction. And then I'm like, okay, so I get it, but why do you need my income data to give me value back? Surely thinking has evolved and you don't need to collect this data anymore. So, coming back to the why: how will the information be used, and do you actually have the right data models to give the value back to customers?

Designer Peter:

Just bringing it back, sorry, Dani, to the maze and the signs as an alternative. That organization could have created a better customer experience. Did you sign up with them, or did you resist?

Guest Rali:

No, I didn't. I just decided that I don't trust the organization with my data, and it's data that is not necessary for them to collect, so I decided not to sign up.

Designer Peter:

They lost the opportunity to have you as a customer. Alternatively, maybe if they had explained why they were collecting that income data, you might have continued. But the fact that they put the burden of investigating that question on you, they kind of lost you. They anti-nudged you, if that's a phrase. Yeah, the trust.

Guest Rali:

That's the way I want to be seen in the world. Yeah.

Dr Dani:

When I do research for organizations, you know, they usually come with, oh, we should ask these, you know, 97 questions. But for me, as a researcher, doing research ethically is not only an obligation; it is just something that I do. For every question we ask a customer, we have to be able to demonstrate why we're asking it. Why do we need to ask this? And because we've asked this, we're going to then do X, Y, Z with it. So I think it's not just understanding why it's being collected, but also understanding the outcome that comes out of having that information. And you have to do that for every single question you're asking. If you cannot explain the why, and then the outcome of that why, then you don't need to be collecting it. But oftentimes it's such a fight to say, no, we're not asking that. I don't care that you've asked it for 100 years; you don't have to ask it anymore.

Guest Rali:

It is. It is. It's a valid point, Dani, and that's my experience as well. We want to know everything about everybody because we believe it's important. And again, referencing a study I've looked at recently: it found that 60% of the data organizations collect is not used, and that's a massive number. That's my experience as well. Most of the data collected by organizations is not used. We keep it just in case. But just in case is not a timeframe. It's not in legislation, it's not in any best practice. Guys, just in case is not a timeframe. Yeah, so again, that comes back to the values of the organization, the principles the organization wants to follow, and the culture they build, so people have the courage to ask questions and drive change.

Dr Dani:

Throughout this conversation you've mentioned having the courage to ask the question. There's always a couple of questions that get everything going, everything moving. So I wonder if you could give us some examples of questions that a leader, or really anybody in an organization who wants to get a conversation going around this and has built up the courage, could get started with.

Guest Rali:

Should we? Especially when we have a senior stakeholder saying, I want to do that, and I ask, can we collect it? Yes. Then say: but should we do that? And I would say, take the first step. It's really scary when you engage with senior stakeholders and you need to be asking these challenging questions. Start with the first step. Senior leaders are getting a lot more involved in data and privacy, and I would like to believe they understand it's a big ethical debate that needs to be had. So "should we" is a very, very good question.

Guest Rali:

Probably not a question, uh, Dani, but a piece of advice I would like to give: if you work in CX, call your legal team and have a coffee with them and just talk. Don't try to understand the legislation; just say, how can I help, what is going on? And if you are in the legal team or the compliance team or the privacy team, reach out to the marketing and CX folks and have a coffee with them. Ask how it's going; say, I just want to understand more. These are probably the two most important things for me that have made a difference. I had to build up my courage, and literally, every week I said I would reach out to three people, ask questions, and try to meet them for coffee. That is how I met you, Dani and Peter, and I'm continuing that. It gets easier, and you meet amazing people. And I promise you, I promise you, you will get more confidence, and you will identify opportunities to add value to customers and build your expertise in this space. Nice.

Guest Rali:

Follow me, I'm always up for a coffee.

Designer Peter:

Thanks, Rali. I love peak-end theory, Dani, and that feels like a nice high to wrap up on.

Dr Dani:

Yes, that was definitely a peak-end rule moment. Yeah, so I just want to say thank you so much, Rali, for making the time to come chat with us. This has been so insightful and, I think, a view of CX and design thinking that isn't talked about very often. So we appreciate you making the time for us.

Designer Peter:

Thank you, Rali.

Guest Rali:

Thank you. Thank you very much, Dani and Peter. I loved discussing this with you, and I hope, I really hope, that you find it helpful.

Dr Dani:

Thank you, definitely. All right, that is it for us in this episode.

Designer Peter:

Bye, Peter. Bye, Dani. Bye, Rali. Bye, Rali.

Navigating the Maze of Privacy Law
Navigating Privacy and Ethical Dilemmas
Simplifying Privacy Policies for Better Trust
Data Ethics and Customer Experience
Peak End Theory in CX Design