The Joy of CX

Lies, Damned Lies & Statistics

May 02, 2023 oomph agency Season 1 Episode 2

We look at the crucial role that data plays in creating and analysing customer experience. 

  • Why is data so important for customer experience?
  • How do you make data make sense to personas?
  • How can we make data simpler to understand?
  • Quant vs qual?
  • How can you tell customer stories with data?

Presented by Sue Carter. Featured guests Stephen Priestnall & Richie Hester.

Recorded & Edited by Mr Anderson Limited.

@oomphagency | it's hard to make things simple

Transcript

Hello, and welcome to the second episode of the podcast The Joy of CX, where we look under the hood of customer experience. If you haven't listened to the first episode, I'd advise going back and having a listen to that, because in that episode we really lay out what customer experience means, what it means to businesses, and why we even need to think about it. Today, however, we're going to start digging a little bit deeper. We're going to look at the role of data in CX and customer experience, and at how data can be helpful in setting your CX strategy, and perhaps in changing that CX strategy. And this might also give us a little insight into how you can persuade any CX sceptics that it really is worth looking at. I won't give too much away at the moment, so I'm going to introduce our guests. First of all, we have Stephen Priestnall, who is founder of oomph, a customer experience agency. Hi, Stephen.

Hi, Sue. Good to see you tonight.

And we're also joined tonight by guest star Richie Hester, who is a data scientist. Richie, perhaps you can take a second or so to talk about your background.

Hello Sue, by the way. I've worked in operational research and customer insight for thirty-plus years now, at various companies in various sectors: airlines, banks, healthcare, a lot of agency work, telecoms, you name it. They're all very different in how they view themselves as companies, but typically they're all trying to do the same thing, which is understand their customers.

Brilliant. Insert joke here about you not looking a day older than 21, with your thirty years of experience. And I'm Sue Carter, I'm the host of your podcast today. I've been working in customer experience for a number of years now and work very closely with both Stephen and Richie. So, I mentioned that we're going to talk about data today, which can sometimes be terribly confusing, but also sometimes extremely useful, as I'm sure Richie will tell us shortly. But first of all, I just want to step back a little bit and start with you, Stephen: tell us what the role of data is in CX. And I say that because customer experience is often seen as something that might be subjective, or based on qualitative research: understanding what people want, what they need, how they work. We don't necessarily think about looking at data and crunching numbers to shape the experience; in truth, those two worlds don't even really mesh when you think about it. So what even is the place of data in customer experience?

You make some really good points. You mentioned listening to our first podcast, where we described what customer experience is, but often the conversation around customer experience is to do with what happens between the customer and the organisation, and it becomes quite a pragmatic thing: it becomes to do with services, it becomes to do with engagement levels. And some of that can feel quite soft and quite subjective. So it becomes aspirational for organisations, but the aspiration is about: right, have we changed the experience yet? Have we put new people in the call centre, have we changed the scripts, have we built a new website? But in fact the science of CX, as we've started to look at it, can be really well informed by data. It's not just about looking at a spreadsheet and having the insight fall out.
We've got to be really thoughtful about combining the two. You talked about qualitative data, which might come from survey-style research, from talking to customers, from reviews and feedback, and then some quantitative data to give us a better handle on the trends of experiences. And there are ways that you can build those different components of data together. At some point you have to make the FD interested: the financial director, the CFO, the person responsible for the numbers, the ops director. You've got to make them interested in why a CX strategy is a viable thing for an organisation to do, and justifiable for their role. They're only going to be interested if they can see a benefit to the organisation. And that really is where we've found bringing the science of data analysis into the world of CX has been really helpful.

So Rich, I'm going to turn to you just to get your view on this. Stephen and I probably come from that perhaps more qualitative side: talking to people, understanding their needs, trying to work out what they think, what they feel, practically what they might need. But you, I know, are our numbers man, because I've seen your spreadsheets and, let's be honest, I glaze over slightly. So when you try to match data and customer experience, what do you see? What's the place of data for you? How do you make that jump?

Okay, at a top level, if we were to go in to see the financial director, it's one question: do the numbers say we'll make money from this? Quite often what you're being asked for, at a very high level, is confidence in the decisions being made. And that involves numbers; that involves using statistics about your customers to give the finance director or ops director the confidence that their money will yield a significant return. So typically, working backwards on a project-by-project basis, that's what you're going to be asked to look at. Therefore you need to get the volumes of customers and the range of data such that you can analyse, look back historically, and then try to project forward based on that.

So how do we do it? Because that all sounds brilliant in theory: we've got some brilliant ideas about customer experience, and hey, we've got some data that can back it up. But it's not obvious; these are two worlds that don't necessarily meet. So how do we, and who wants to go first, actually go about making the data fit with the experience that we've looked at?

It's a good challenge, and Richie has kindly let me go first, because normally it involves a bit of a squabble between Richie and me. The world that I start with, the one that you and I work in, usually produces a hypothesis. We scrape some secondary data, we talk to stakeholders, we run workshops, and we establish: right, this is what it looks like, this is a representation of the needs of your members or your service users or your customers. And we think we can categorise those needs in different ways, and we think they move over time like this, and we think they are related to products and services like this. So here is a hypothesis. And then it's for my world and Richie's world, me and Richie, to get into the room and go: here's my hypothesis, Richie, have we got any data to provide any evidence to support this? Or develop it, or evolve it, or disprove it?

Or disprove it! Sometimes that's the most common one. There's just great tension.
And it's good, healthy tension. What makes it really good and beneficial is that typically, if I'm brought in to prove or justify ongoing behaviours, all you're ever doing is finding better ways of doing the same thing. Coming in from your angle means you actually find better things to do, which is significantly better.

Can you give an example of that? Explain that a bit more, because that sounds like something worth unpacking.

Yeah. So typically you'll be brought in and think: okay, if we reduce this product or this service by 5%, it'll have a 3% effect on revenue. So you're just doing more of the same, just making slight tweaks to get a slightly different revenue performance. It's much better to come in your way and just ask open questions, and actually identify something which is totally different, something you hadn't thought about before, rather than just building on what you've done. If you rely only on historic data you won't get there; you've got to come at it from the unwritten side as well.

It's interesting that you made that point about historic data, though, because if I'm a customer experience manager or director listening to this, I might be thinking: well, that's all very well, but I don't have a budget to do all this extra data research, which could be quite costly. So how have you, and I know you've done this, it's not a trick question, used existing data to help support hypotheses that have been produced by the more qualitative side of things? How have you managed that? How have you worked with existing data?

Okay. Recently we did a project for a major automobile organisation, and again that was built on hypotheses: the starting point is how people behave and what needs they have, rather than what the data tells us. It's just turning it on its head. So it's a case of looking at what data we have available that could fit the conjecture, and if not, what we can use, what rules can be used to suggest, imply and infer, overlaying data to build up the story. Starting from the story and then getting the data is the difference.

And how have you seen that pan out, Stephen?

I think the conversation between Richie and me is an iterative one. It's not as simple as saying we do some cool stuff, he does some cool stuff, and we glue it together. The art in the science that we do is to create modelling equivalents of the hypotheses that we create, modelling equivalents built out of data. Now, I think it's fair to say, Richie, that we don't get 100% equivalence every time we do it. But that's fine, because we're not trying to group discrete segments of customers into little pockets. What we're trying to do is to understand the needs of customers or members or service users, and understand the incidence of those needs, and therefore how an organisation can intervene to support those needs. And so it's fine to have soft judgements about how accurate the model is, because across the whole base of customers it becomes truer, and then more effective. So the iteration for Richie and me is to go: here's an idea. Richie goes: I can't really work that into the model, look how bad the idea was; or, I think we've got some data that can support that. Okay, so if you run that through the database, what happens? Oh, that's really interesting, it looks like this. We then go back to the client.
So here's a perspective of how those needs show up on your database: does that reflect real life? Then we check back in with stakeholders, we get a positive on that, and then we extend the model. That process is finite, so you get to a point where you have confidence, and then you keep having to inform it over time. And you can build hypotheses from that. You can think, well, this seems to be working, then take a subset, say a particular needs state, go to the real data and see whether what those customers are saying is what we'd expect them to say, based on how we've pictured them.

I must admit a confession here: I'm always slightly nervous when we actually dig into the data, because I think we might literally have got everything wrong, and this could be our downfall. So Richie, reassure us that if you did find that, you would speak up, right? Because you are a pure data scientist; you're not trying to find something that matches what we're saying. You're trying to help us develop, enhance and back up what we're saying, but not at the expense of what the data says. We're not trying to massage the numbers here.

If you do find something else, it's never a correction, it's just a further insight. It doesn't mean anything was necessarily wrong before; it just means you've found something further.

And Stephen, would you ever get that slight fear? Or do you have more confidence than me?

It's all about confidence, I think. Or rather, I don't think it is about confidence or fear; it's just trust in the process. Richie and I have been around it a few times, and it's the same as with you and me on the qual side, where we might be sat in a room with lots of white paper on the walls, looking at the notes we've taken from various meetings, and the great idea we had at the start of the meeting has, by the end of the meeting, been dismissed and we've moved on to something else. The iteration that we go through with Richie has the same trust in it: it's not about where you start, it's about where you finish. What is important, though, is that we can run a trail of logic through the process, so we can give the client the confidence that you can trace it back to the beginning: we started from here, then went through a number of stages. So I think maybe I've just got to the stage where the process takes the strain, and we're confident we have a good methodology to deliver that. And if I were presenting stuff to you and you were taking it as fact, as proven, I would worry; but the fact that we're treating it as assumptions, guidelines and suggestions, and that our clients are doing the same, is great.
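To make that loop a little more concrete, here is a minimal sketch of what a "modelling equivalent" of a hypothesis might look like: translate a hypothesised need into a rule over data an organisation already holds, then measure its incidence before taking it back to stakeholders. Everything below (the field names, the threshold, the rule itself) is invented for illustration and not taken from any real project.

```python
import pandas as pd

# Hypothetical customer extract; the column names are illustrative only.
customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4, 5, 6],
    "service_calls_90d": [0, 4, 1, 6, 2, 0],
    "months_since_purchase": [2, 30, 11, 28, 5, 40],
})

# Hypothesis from the qualitative work: customers long past purchase who are
# still contacting support frequently have an unmet "reassurance" need.
def has_reassurance_need(row) -> bool:
    return row["months_since_purchase"] > 24 and row["service_calls_90d"] >= 3

customers["reassurance_need"] = customers.apply(has_reassurance_need, axis=1)

# Incidence of the hypothesised need across the base: the figure taken back
# to stakeholders to ask "does this reflect real life?"
incidence = customers["reassurance_need"].mean()
print(f"Hypothesised need applies to {incidence:.0%} of this sample")
```

If stakeholders recognise the picture, the rule stays and the model is extended; if not, the hypothesis is reworked, which mirrors the iterative, finite process described above.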
I just want to take that on a little bit then, and again perhaps reassure people who are listening. When we talk about data, we often picture numbers, essentially. And if you are not a numbers person, or even if it's just not you, you don't get it, how do you go about making that data understandable? Because we don't want to scare people off into thinking you need some really complicated, scientific way of looking at it, and that you can only do it with X, Y, Zed complicated programmes, which I'm sure you do use. How do we reassure people that it can be done, that we can make it explainable, that we can make you comfortable, that you can understand this data?

We typically provide finance directors with confidence, and what they typically ask is: is it significant? Do the statistics suggest that this is trustworthy? They don't actually know what it means; they just know to ask it, and they just want a yes or no answer. So at the top level they want to know: is this believable? A good way to bring it to life is to bring out real examples. Bringing in customer quotes quite often backs things up and paints the picture; pen-portrait projects have been going on for years. And the more you can describe it in common English, the better. People aren't interested in the statistical approaches used; it's just: what kind of people does that mean, and what kind of action can we take, in English?

So how do you feel about people not being interested in the statistical approaches? That seems a bit sad.

I'd be more than happy to build a career on people simply trusting you.

And Stephen, how do you go about making that data understandable and approachable for a client? Because, as Richie said, you're the person that has that one-to-one relationship.

Yeah, so this is the challenge at the end of our process as well, because if our information isn't accessible to the target audience, which is our client, then it hasn't done its job. So we try and do as good a job as we can of following our own principles of customer experience on behalf of our clients. We will invariably take the kind of methodology- and process-driven cycle that we've been talking about over the last few minutes, and then close that off with a creative visualisation process. Now, that might be a graphic visualisation of the data items, or it might be a more substantive visualisation of the report of the work. And we're very conscious that if we have, for example, a board of directors who are going to be the recipients of the work at some stage, then the first two or three pages have got to do a real job: they've essentially got to cut through, with all of the data that we've got sitting as evidence behind them. But the head of insight wants to make sure that the data underneath is sound and understandable, and is proud of the way it's represented. So what we've found is that the creative visualisation approach, and sometimes it's even an interactive format, is as important as the nuts and bolts of the work in driving the insight out.

Because there are always going to be a number of different stakeholders: you've got the CFO, who might want that top line, and then, as you say, you've got the insight-driven team, who might actually want to get into the nitty gritty. Do you ever find anyone actually wants to open the data spreadsheets and go, ooh, I'm really fascinated by this?

Oh, all the time.
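As an aside on the "is it significant?" question: one common way to give the yes-or-no answer a finance director is really asking for, offered here purely as an illustration rather than as how the guests work, is a two-proportion z-test on before-and-after figures for a changed step in the customer journey. The numbers below are made up.

```python
from math import sqrt
from statistics import NormalDist

# Invented example: conversions out of customers served, before and after a
# change to one step of the customer journey.
before_conv, before_n = 230, 4000
after_conv, after_n = 290, 4100

p1, p2 = before_conv / before_n, after_conv / after_n
pooled = (before_conv + after_conv) / (before_n + after_n)
se = sqrt(pooled * (1 - pooled) * (1 / before_n + 1 / after_n))
z = (p2 - p1) / se
# Two-sided p-value from the standard normal distribution.
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"uplift = {p2 - p1:.2%}, z = {z:.2f}, p = {p_value:.3f}")
print("Significant at 5%" if p_value < 0.05 else "Not significant at 5%")
```

The point made in the conversation still stands: the board rarely wants the test itself, only the plain-English answer and a customer quote or two that brings it to life.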
I'm going to move on to something now, a question that I'm interested in and where I'm not sure where I stand: that whole qualitative versus quantitative debate. Stephen, where do you see the weight falling? If you're looking at a project, how would you weight them? Does one have more importance than the other?

I think if there is a question about qual versus quant, we haven't done our job in driving the insight out. The point is the evidence for the insight. Sometimes that might be driven by a more qualitative engagement, and sometimes that might be driven by, you know, a bunch of transaction data that we've looked at through a different lens. The important thing is to track the evidence back. And one of the things that we've learned over time is that this is not about delivering fact, it's about delivering support and insight. What we don't do is present a set of "here are ten things to do as a result of this" directly. We present ideas for development to the organisation, and their expertise in knowing how to apply the insights is what matters most. So I feel quite proud that we really blur the edges around qual and quant.

That feels right to me.

Yeah, and quant typically is very good at looking backwards and understanding who people are, where they come from, what they've done. If you can merge it well with the qual side, it starts looking forward to what they will do. And the two together are much stronger than the two in isolation.

That's another way of looking at it. Richie, turning to you then and thinking about opportunities: you mentioned you've been working in this field for a few decades, and the work that you are both doing here is quite new, the way you're, as you say, blurring the edges around qual and quant. We've been talking before about opportunities, so where do you see this going, bringing data into customer experience?

Okay. Typically what's happened is that people have got quite set in their ways and think of people in terms of segments. They pigeonhole people and say this person behaves like this, that person behaves like that, and as a result you just do more of the same thing: the same people, the same message. If you start thinking wider and acknowledging that, underneath, most people are similar, that most of us are more alike than unalike, and it's just a case of capturing different needs, different moods, different requirements, then it's a much stronger way of giving your customers what they need.

And would you echo that, Stephen? What opportunities do you see?

Yeah, I would echo what Richie said. I think the idea of moving from static characteristics of individuals to needs which change over time, and which can be merged, with a Venn-diagram style crossover between individuals, is a much harder picture to form in your mind, which is one of the challenges that we try to solve in our visualisation. But if you can form that picture, and you can, as an organisation, start to influence your product development and service development with the idea that needs change over time, both through the lifetime of the relationship with you, the organisation, and through the ageing of the individual, if it is an individual, or the changing nature of the business, if your target is a business, then you can continue to be really accurate in presenting products and services against those needs, even though different people are coming to be served by them.

I love that idea of the dynamism. Not that easy to say. Or rather, the idea of your picture of the customer not being static.
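One hedged way to picture that shift from static segments to needs that move over time is to store dated need-state observations per customer and treat only the latest one as current, rather than pinning a permanent label on the person. The states, dates and helper below are purely illustrative.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class NeedObservation:
    customer_id: int
    observed_on: date
    need_state: str  # e.g. "getting-started", "optimising", "renewal-risk"

# The same customer observed at different points in the relationship.
history = [
    NeedObservation(42, date(2021, 3, 1), "getting-started"),
    NeedObservation(42, date(2022, 6, 1), "optimising"),
    NeedObservation(42, date(2023, 4, 1), "renewal-risk"),
]

def current_need(observations, customer_id):
    """Latest observed need state for a customer, not a permanent segment."""
    relevant = [o for o in observations if o.customer_id == customer_id]
    return max(relevant, key=lambda o: o.observed_on).need_state if relevant else None

print(current_need(history, 42))  # -> "renewal-risk"
```

Products and services can then be matched to the current need state, even though different people will occupy that state at different times.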
So Richie, to put the data on top of that for an organisation, is it then a case of, you know, repeating that data work, doing more research, getting more data in, to feed that ever-increasing circle of customer experience?

Typically, doing things regularly at set times is a good way of finding change, so there's definitely some strength in that. But the main thing is just accepting that people aren't all the same, and building models that allow your customers to work within that framework.

So what about stories, building customer stories? I know you talked a little earlier, Stephen, about building the picture, when we were talking about how we present the data. But organisations are probably very much stuck in that segmentation: you know, 35-year-old Tom, who lives on the outskirts of Peterborough and likes cars. They see that as the person they want to talk to, but there's no ever-evolving story about that. So talk to us a little bit about building the story of the customer, and about how you then bring the data into it, because this episode is about data.

Yeah, let's reference episode one quickly, because in episode one we talked about what CX is and what an organisation does with it, and we talked about the idea that there is an implementation consideration that comes with a CX strategy: how does an organisation deal with it? Some of that is internal; some of that is a cultural change inside an organisation, to understand its customers or members or service users in a different way. So one of the ways in which you can help the organisation on a journey of cultural change, to put customers first, is to create vibrant, interesting stories that resonate, about our members and about their experiences. And we've often turned that into an interactive tool, so that key stakeholders, key members of staff who have responsibilities in that area, can start playing around and interacting with different members' needs or different customers' needs at different points in time. But of course that tool needs to be informed by data, and then we need to challenge ourselves to make sure we keep it up to date as well.

Richie, any thoughts on putting data into a story? Because again, the two don't sound like they match.

I think Stephen covered that pretty well, really.

He doesn't say that very often. I've never heard you say that before. Well, I think that might bring us to an end of our conversation about data and customer experience, unless I have missed something, which is entirely possible.

If I just had one final thought, it would be this: for customers, sorry, for clients, to put trust in the data about their customers is a real challenge sometimes, because a lot of the infrastructure built around recording information about members and customers is legacy infrastructure that was built to do something else; it was built for a function. And many of the clients that we work with really struggle to trust the data that they have in the form that it's in, and they spend a lot of time and money trying to improve it, trying to go back to source and make it better. I think what we've found is that the work that we do ends up creating an interpretation of the data, which can be populated over time without changing it at the front end, and which creates its own dashboard of customer intelligence.
And I think if that's part of a cultural change for a client, it can put data right at the heart of a CX strategy. Quite often clients will be really keen to invest in appended data or social media analysis and so on, when, until they've worked their own data properly and really understood it, everything they need is already in their own customer data.

Brilliant. Well, that will be reassuring, I'm sure. So thank you, Stephen, and thank you, Richie, for being our special guest today. We have many more podcasts coming up: in future episodes we'll be talking about the intersection, or the relationship, between user experience and customer experience, we'll be looking at how you persuade other people in your organisation that a customer experience strategy is a good thing, and many other topics for The Joy of CX. So for today, we will say goodbye. We will see you next time.