The Product Experience
The Product Experience features conversations with the product people of the world, focusing on real insights into how to improve your product practice. Part of the Mind the Product network, hosts Lily Smith (ProductTank organiser and Product Consultant) & Randy Silver (Head of Product and product management trainer) "go deep" with the best speakers from ProductTank meetups all over the globe, Mind the Product conferences, and the wider product community.
How to achieve better results by changing your mindset - Connor Joyce (Senior User Researcher, Microsoft)
Connor Joyce, a Senior User Researcher at Microsoft and author of "Bridging Intention to Impact," joins us on the podcast this week to discuss building features grounded in evidence and collaboration, fostering not only successful launches but also long-term user satisfaction.
Featured Links: Follow Connor on LinkedIn | Microsoft User Research | Desired Outcome Labs | Buy Connor's new book 'Bridging Intention to Impact: Transforming Digital Product Development through Evidence-Based Decision-Making'
Our Hosts
Lily Smith enjoys working as a consultant product manager with early-stage and growing startups and as a mentor to other product managers. She’s currently Chief Product Officer at BBC Maestro, and has spent 13 years in the tech industry working with startups in the SaaS and mobile space. She’s worked on a diverse range of products – leading the product teams through discovery, prototyping, testing and delivery. Lily also founded ProductTank Bristol and runs ProductCamp in Bristol and Bath.
Randy Silver is a Leadership & Product Coach and Consultant. He gets teams unstuck, helping you to supercharge your results. Randy has held interim CPO and leadership roles at scale-ups and SMEs, advised start-ups, and been Head of Product at HSBC and Sainsbury's. He participated in Silicon Valley Product Group's Coaching the Coaches forum, and speaks frequently at conferences and events. You can join one of the communities he runs for CPOs (CPO Circles), Product Managers (Product In the {A}ether) and Product Coaches. He's the author of What Do We Do Now? A Product Manager's Guide to Strategy in the Time of COVID-19. A recovering music journalist and editor, Randy also launched Amazon's music stores in the US & UK.
Hello and welcome to the Product Experience. This week we visit Pendomonium 2024, where I caught up with Connor Joyce, who is a Senior User Researcher at Microsoft and author of Bridging Intention to Impact. We talked about how to achieve better results by changing your mindset. The Product Experience podcast is brought to you by Mind the Product, part of the Pendo family. Every week we talk to inspiring product people from around the globe.
Lily Smith:Visit mindtheproduct.com to catch up on past episodes and discover free resources to help you with your product practice. Learn about Mind the Product's conferences and their great training opportunities.
Lily Smith:Create a free account to get product inspiration delivered weekly to your inbox. Mind the Product supports over 200 ProductTank meetups from New York to Barcelona. There's probably one near you. Connor, welcome to the Product Experience Podcast.
Connor Joyce:It's great to be here, especially live.
Lily Smith:I know it's so nice to be in the same room. So you're doing a talk today, but we're going to have a chat about all of your experience and the book that you've written and some of the work that you're doing and your background in user research. But before we go into all of that, it'd be great if you could give our listeners and viewers a quick intro to you and, yeah, how you got into the world of user research.
Connor Joyce:Yeah, yeah.
Connor Joyce:So I started my career in consulting, implementing Workday, and that was my first introduction to big tech, but also to all the challenges that come along with actually creating and then deploying technology.
Connor Joyce:And so while I was on the implementation side, I saw the challenges that people face actually getting the value that they want from technology, and I decided I wanted to go deeper into that. I was doing some basic user research in that role, but I wanted to make a career out of understanding the users, not just researching them, but ultimately building the product for them. And so I did a master's in behavioral science. I chose that space because I saw it as a unique perspective that I could bring to product. From that point forward I've held mostly research roles, but a couple of product roles throughout my career since then, and have just really enjoyed, again, the entire pursuit of not just building product but building products that solve user problems. People don't use products just to say they used them; they do it because they're tools that they need to get a job done, and that's the philosophy that I've taken throughout my work.
Lily Smith:And it's interesting that you've done product roles and user research roles. I think a lot of product people don't have the benefit of a user researcher on their team. And even when there is a user researcher, or even a user research department, it's interesting to see the dynamic between product and user research. So in your unique experience of having done both product and user research, what would you say are the ways to work together, or to make sure that user research is really done well and done right?
Connor Joyce:Yeah, and to your point, I've been on teams, I've created startups where I was playing the product role and doing research, and at the same time I've been at Microsoft, which has gigantic groups (they call them research studios because they're so big) filled with just researchers. And so I think the through line of all that experience, and this is my personal take, but I believe it to be true, is that research is only as good as the impact that it creates. And that is no matter where you're at, no matter what product you're working on. If you're in an applied setting, the research has a purpose: it is to impact and make a better product. And so, if you take that from an abstract or holistic view, no matter what role you're in, if I'm a researcher, as I currently am at Microsoft, where I am part of a studio, I am still focused on trying to drive insights that are going to change product, the same as when I was a single researcher with a group of engineers directly impacting how to build that product. So the main tip behind that is just to make sure that research isn't seen as some additional element that's out there, where we need to figure out how to fit it into the equation.
Connor Joyce:The research is the equation. The research is the understanding from which good products are built. And so, with that mindset, you don't need to be a quote unquote "researcher" (and for those on the audio only, I'm throwing up some quotes here). You don't need to be an academic to do experimentation, and I think data science has done an amazing job showing that you don't need to be an academic to be a good researcher. You don't need to have this deep breadth of skills in research to be a good researcher. Good research is just creating evidence, and evidence helps make better decisions. So that would be the core tip. I can go into some others around collaboration and communication and how to fit research into more traditional product development pathways; those are all other tips that I can elaborate on. But the real core is that it's a mindset shift, around looking at research as creating insights, and insights are the core of understanding what needs to be built.
Lily Smith:Yeah, and is user research always necessary, from your point of view? Like, if you're in a situation where you're just launching a product and you know that you need to hit a certain baseline feature set in order to at least compete with, you know, the next nearest similar product, do you need to do research in order to understand what the baseline customer expectations are? Or is it a case of, you know, research and insights being more about discovering the things that you don't know?

Connor Joyce:That's a great question, and while you were saying it I was really mulling over what my honest answer is. When I think of research, I think of it as the creation of evidence: really deep interviews, diary studies, et cetera. That is what many user researchers excel at, but that is not the only thing that user research can do, just as data scientists, while they're great at quantitative research, can also do analysis on qualitative insights and the like. So when you ask, is user research needed in all places, I would say no, it's not needed in all places, but there should be evidence coming from somewhere indicating that something should be built. And so, at the highest level, I would just call that insights overall, and those insights can come from many sources. Maybe it's the sales team, who just ran an analysis on their deal closes and what was necessary to close certain deals, and that's what's driving a feature. Maybe it's customer service, who have done analysis on their Zendesk tickets and see a common request that's continually coming in, and that's the evidence for why a feature is being built.
Connor Joyce:Those are all, in my mind, research. Maybe it's not a user researcher doing it, but it's evidence backing why something is being created, and that, I think, is necessary. But does a user researcher have to go in and do an interview or something like that every time? No, I don't think that's necessary. That is best saved for the unique features that a product has: what is going to differentiate it from its competitors, those sorts of pieces, what makes that unique value proposition for that company. That's where I think user researchers have the most impact opportunity.
Lily Smith:And a lot of people are exploring AI at the moment, generative AI and LLMs. What are your recommendations, or the things that you've found in your work and your experience with user research, that are maybe different, or maybe not different? Like, is it just exactly the same when you're looking at how people feel about AI tools and things like that?
Connor Joyce:There's some that's the same. At its core, anybody who's building an AI product should ask themselves, as they're designing it, as they're developing it: do they think AI is going to be a completely different type of product? And if they do, then that means they should approach how they're developing it with a different mindset than they have in the past. I personally think that the best AI products are those that still follow a traditional product design. It is what I like to call AI-enhanced features. So you're finding a way, if we're talking specifically about LLMs, to integrate those LLMs into the feature so that it's still a traditional user interface, but it's using the power of the LLMs in the background. And the reason I bring that up is because I believe that AI-enhanced features are really the future, where the value will be captured from generative AI. I don't think it's that different for the most part. And when I say not that different, I'm still saying that, again, the core is about really understanding what tool needs to be created to solve the problem for the user. The difference that LLMs bring to the table (and potentially more AI advances as they happen) is really twofold. One is the overall conversational ability. Even if it's not a chatbot, just the fact that if the answer isn't exactly what I want, I may have the opportunity to tweak it in a way that was previously not possible: I may be able to change that prompt slightly, or I may have, we'll say, more overall ability to edit the output. And users have seen that they have that potential, because for most people, their first exposure to LLMs was a chatbot.
So if you tell them in a feature, this is a generative AI feature, and then they can't do that, you've taken away all of that ability. You're putting in the prompt; they have no ability to change it. I've noticed there is a reaction of, well, I don't really like this answer, where do I go to change it? I don't like exactly what this is doing. So it's this frame. To sum it up in a single sentence: users are coming in comparing LLM features against ChatGPT and the freedom that ChatGPT offers.
Connor Joyce:I think the other piece that is in line with that, but is on the output side specifically, is that there is this really interesting group of academic research that's now becoming highly relevant. Some of it shows that people have a natural bias to accept AI; some shows that they have a natural bias against accepting AI. And I've seen both play out. I don't have the answer there, but as I'm doing research, I'm thinking just as much about the frames about the technology that our user is bringing to a situation.

Connor Joyce:For example, there's research from the University of Pennsylvania that shows that if you give a person the ability to edit an algorithm, they're more likely to accept the output. The interesting part is that it doesn't matter how much editability they have (a 1% change, a 10% change, it's about the same); it's just whether or not they can have a role in editing that algorithm. So there are these interesting pieces that were once more theoretical that are now bleeding into exactly how I'm doing research and thinking about building these technologies into features.
Lily Smith:Do you think that will change as well, as people start using these tools and features more? Like, I remember I worked in a personalized search and recommendations business, and we had to kind of qualify every time we showed personalized recommendations: these have been generated by an algorithm. You know, so that people weren't like, hey, why are you recommending me so-and-so? That's not my taste.

Lily Smith:And we had to explain, this is generated by a computer, if you know what I mean. And now I feel like everyone just accepts the personalized recommendations. Sometimes they'll be right and sometimes they'll be wrong, and they're generated by an algorithm, and you don't really think about it that much, and you don't get offended when they're wrong. I feel like we're there now, and I can see the same sort of path happening with gen AI tools and LLMs where, as product people and designers, we're having to explain quite a lot to bring the user along the journey at the moment. But that will probably change in the future as people get a lot more used to these tools and how they work and what they do.
Connor Joyce:I would believe that's true, as long as the technology follows the same paradigms as previous technology. And so that's why I believe the best approach for implementing LLMs specifically, generative AI more broadly, and then ultimately AI, is to build AI-enhanced features. Build them in the same way that products used to be built, but utilize the superpowers of those technologies within the actual features (pretty much in the back end is another way to put it), because then we can hold those assumptions. We can say the likelihood is that if we build this feature so it looks like a previous feature, and then we tell them once or twice, hey, this is an LLM in the background, it's doing some action, that's why it looks like magic right now, that person will go, oh cool, when I click the magic button, the magic output comes out, and they'll get used to that.
Connor Joyce:If we do have this paradigm shift and it does become more conversational (there are those who believe that the future will be, I just lift up my phone and I start talking to it and it does something), that will be a completely new way to interact with technology, and so I don't know if we can hold those same assumptions. What you're talking about, at its core, is to some degree just learning overall, like learning philosophy: the more that someone learns, the more likely they'll be able to do it themselves. So that's probably true, but it changes how we actually communicate to users. If you're building features, you can communicate it like it has always been communicated. If you're building a brand new conversational ability or something else, it will require more attention to make sure that people really are learning it and following the same path as before.
Lily Smith:I have a funny story to tell you about my 11-year-old (who was 10 at the time), who was like, oh hey, Mum, you have the ChatGPT app on your phone. And I was like, yeah. He's like, can I just have a play with it? And I was like, yeah, sure, go ahead.
Lily Smith:And he then asked ChatGPT how he could persuade his parents to let him spend money on Robux, and has this whole conversation with ChatGPT. And luckily, you know, the LLM kind of comes back to him and says, you know, there's probably a good reason why your parents aren't letting you spend this money. And it is hilarious. You know, he says, if she's a Capricorn, is there something specific I should say to her in order to convince her? So it's very interesting seeing a child engage in that type of conversation and how they're using this tool.
Connor Joyce:Wow, you may have a lawyer in the making there.
Lily Smith:Exactly, yeah. So you've also written a book about the impact mindset. Tell me a bit about what an impact mindset is.
Connor Joyce:Yeah. So I previously mentioned that I did a master's in behavioral science and that it gave me a unique perspective coming into the product and tech space. What I really believe that unique approach is, is that the traditional philosophies (design thinking, jobs-to-be-done, a lot of these that are tremendous philosophies for building product) have one downside, and that is that they generally end at the user outcome. They say, here is what you're trying to build for; now go do it. And there have been all these great design books that have come out that talk about all the different tactics and ways that you can design to achieve those outcomes. But I still believe the missing piece is the measurement of specific behaviors. Simple examples: finance, fitness.
Connor Joyce:If you think about an app that's built to help someone lose weight, there are going to be specific behaviors that are going to do that: increasing exercise, decreasing calorie consumption, et cetera. Finance: increase someone's savings account. That might be the user outcome. What are the specific behaviors? Engage in more savings behaviors, whether that's rounding up on the dollar, or having auto-enrollment of savings, or money going from a paycheck into those savings. There are specific behaviors that will drive towards that user outcome.
Connor Joyce:That's what taking the impact mindset is: going one step further with user outcomes and asking, what actually drives that user outcome, how do we measure it, and then, ultimately, how do we build features that change that behavior? Taking that approach, I believe, is how more products will ultimately be designed. They will be designed in a way that leads people to take the actions that will drive them to see the satisfaction of whatever need or outcome they have, and that ultimately yields a positive business impact, because people go, oh, this product works. And that is a way to build loyalty, to ultimately drive retention, et cetera. But yeah, at its core, the impact mindset is the identification and measurement of behaviors.
Lily Smith:And you've given two really good examples there. But even with those, a lot of the way that we try to measure whether someone is changing their behavior is by usage within our products, and of features within our products. So we're like, okay, they're using, I don't know, the food-logging features in a fitness app or whatever; therefore they're engaging and they're motivated, and so they must be being successful. But I think, to your point, that's not necessarily the behavior change. You know, they might be logging their food but still be eating over the calorie amount, or not logging everything. So how do we take it that step further to understand, on the other side of the device or the app or the website or whatever it is, what is the user actually doing, and how do we get a really good understanding of that and start measuring it?
Connor Joyce:You teed me up perfectly, because it's how I start the book: I believe that the greatest mistake happening in product development right now is the assumption that usage equates to users satisfying their outcomes. It's the assumption that if someone is using something, that means it's working. That assumption can be true, I'm not saying that it isn't, and usage is an extraordinarily powerful metric for growth and for understanding whether something is actually being used; the best product in the world that sits on a shelf somewhere is, again, not going to have impact, talking about research like we previously were. But usage, and even usability, so things like NPS and satisfaction, those again are not actually showing whether something's working. They're just showing whether or not someone liked the experience. So, to your point about the calorie-tracking app: let's say someone is using the feature that tracks calories. That is one way to look at it. Asking them, did you like the experience of tracking that food? That's another way to measure it. The third is to literally think, from the feature sets that we have, are they actually consuming fewer calories? Are we actually seeing, over a 14-day window, that the calories consumed went down? And if not, that's a feature opportunity right there. That is the goal. If the specific behavior is a decrease in calorie consumption, then the goal is to develop a feature that actually does that. We'll say, like, personalized notifications reminding somebody why they wanted to lose weight or consume fewer calories. That could be the feature idea that comes out of recognizing: yes, people track, they're using the logging feature, but it's not actually changing their behavior; it's just causing them to log their foods. So, to your original question of how you go a level deeper.
So if you think of an ethnographer, their goal is to really dive deep, go in, and watch what a person does in the real world. So that's where I suggest product teams start: literally think.
Connor Joyce:If today a user wanted to go and satisfy their outcome, what would they actually do? Where would they go? What behaviors, what actions would they take? Those are the behaviors that are connected to that user outcome. Is it always going to be possible to measure every one of those? No, to your point.
Connor Joyce:With some products, those behaviors happen outside of the digital environment. But the place you want to start is, I mean, now with Segment and mParticle and a lot of these apps, you can get super, super raw event data down to the most micro level. So how could you build a metric that captures whatever activity you want to see in that app? That's where I would start. If there are still behaviors happening outside of the app, how do you capture those? Are there integrations to other apps that might capture them? Can you get self-reported data from your users? I don't have the full solution, but that is the framework to start: look in the product, then look for integrations, then look for self-reported data, and then wherever else, however else, you might be able to identify it. But the whole point is to first think about what behaviors could be changed that are connected to those user outcomes, and then find any data that is there, or proxies for that data (and then you get into a whole discussion of how to find proxies).
Connor Joyce:But it is not the easiest way to measure a feature; I fully recognize that. It is why, even though we live in this "measure what matters" economy right now, the main metric that teams choose is usage, because it is the easiest to create. Every product suite, whether it be Pendo or any of those other analytics suites, has that as the default metric, because it is the easiest to capture, since by nature it's created just by having something that someone could use. But it is misguiding teams, and it is leading to leaky buckets where people see skyrocketing growth and think, this is great, but then there's no switching cost, because the product isn't actually working, and so the users jump to alternative solutions as they think to themselves, I need to find something that's actually going to satisfy my outcomes. So when teams take that little bit of extra investment to identify those behaviors, they will have leading indicators that will be much more predictive of whether people will continue to use that platform.
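[Editor's note: the measurement approach Connor describes above, tracking the behavior itself rather than feature usage, can be sketched in a few lines of code. This is a minimal illustration only; the event log, field names, and numbers are hypothetical stand-ins for what a real analytics export (e.g. from Segment or mParticle) would provide.]

```python
from collections import defaultdict
from datetime import date

# Hypothetical raw event log: one record per food-logging event.
# In a real product this would come from an analytics pipeline export.
events = [
    {"user": "u1", "day": date(2024, 5, d), "calories": c}
    for d, c in [(1, 2400), (2, 2300), (3, 2500), (4, 2350), (5, 2400),
                 (6, 2450), (7, 2300), (8, 2100), (9, 2000), (10, 2150),
                 (11, 2050), (12, 2000), (13, 1950), (14, 2000)]
]

def behavior_change(events, user, window=7):
    """An outcome metric, not a usage metric: did mean daily calories
    drop in the most recent `window` days vs the `window` days before?"""
    daily = defaultdict(int)
    for e in events:
        if e["user"] == user:
            daily[e["day"]] += e["calories"]
    days = sorted(daily)
    if len(days) < 2 * window:
        return None  # not enough history to compare two windows
    earlier = sum(daily[d] for d in days[-2 * window:-window]) / window
    recent = sum(daily[d] for d in days[-window:]) / window
    return {"earlier_avg": earlier, "recent_avg": recent,
            "improved": recent < earlier}

result = behavior_change(events, "u1")
```

The contrast with a usage metric is the point: counting logging events would only show the user is logging, while this compares average daily calories across two 7-day halves of the 14-day window Connor mentions, which is a (crude) leading indicator of whether the behavior is actually changing.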
Lily Smith:It's interesting, because when I was reading about the book and about the impact mindset, I was thinking, is that how Facebook became what it is? Because I think they always wanted to focus on daily active users or whatever. Maybe that's completely wrong, and I left Facebook a very long time ago, so I actually have no idea how it works now. But whenever I see my husband using it, I'm like, what is that? It just looks like such a mess, and it looks like it's full of features to just try and get you in every single day, and not necessarily a great user experience anymore.
Connor Joyce:Well, if you think about Facebook specifically, their goal (I can't remember their mission statement verbatim, but it's something like "connect the world"), at first they were doing that really well, and people really did feel more connected to their friends and family through Facebook. And then over time, to your point, they've had a lot of feature bloat, because they focused more and more on usage, and also eyes on screens, that type of retention (there's another word for it that's escaping my mind), just consistent usage, we'll say. And then they had this pivot at one point where, whether it was Mark or another one of their product leaders, they thought, we should go back towards really connecting people. And so they came out with groups and communities, and for a while it did that. There was this resurgence, and you heard these stories of mommy groups, free-and-for-trade groups, local bike meetups, and it was like, oh cool, Facebook's getting back to its core. But then, as the product continued, it started to recommend groups, because again they were seeking higher levels of engagement. Then it started to go down these rabbit holes. If I liked one group (say I live in Washington State, in Seattle, but there are large gun groups outside the city, and maybe I join a gun group in Washington State), it's going to start to bring me down a specific path of, here's a recommended group, here's a recommended group, until I'm like, what am I getting advertised? This is so far from where I started.
Connor Joyce:And again, it was because of this perpetual pushing for engagement that it lost the purpose of why they created the feature in the first place, which was to drive higher levels of what their mission is: actually satisfying the outcome of, I want to be connected.
Connor Joyce:And now (I mean, the word was escaping me earlier, but it's radicalizing) it has a radicalizing function to it, because it's sending people down rabbit holes through the groups it keeps recommending, all in a pursuit of engagement. They're over-focused (and most social media does this) on getting people emotional, because they know that connects to higher levels of engagement. But that, again, is not why people are using the social media. They're using it to connect with their friends, to catch up. So there are these conflicting factors. And then you see something like TikTok come in, and one of the reasons why I believe TikTok exploded in popularity, yes, because it's engaging, yes, because there's short-form video, but also because it brought people back to the entertainment factor of social media. For a lot of people, that's the main reason they're on social media: they just want to have fun.
Lily Smith:Yeah.
Connor Joyce:And so TikTok was able to capture that. Where Facebook had become this pool of ideologies and conflict and emotions, TikTok came and was like hey, we're just fun. And it drove that user outcome again.
Lily Smith:Yeah, it certainly is very addictive fun. I don't have TikTok, by the way.

Connor Joyce:Nor do I.

Lily Smith:That's one I've held off from. So just coming back to your experience with product and research, circling all the way back to where we started at the beginning. One of the things I think you mention in your book is how product and research teams can become a bit misaligned and, I guess, work slightly in silos, or have different purposes maybe. How does that come about, and what's the solution to that?
Connor Joyce:I was going to say it only happens at big companies, but it can happen at any company. And one of the biggest factors, which I don't fully address in the book because honestly it's out of scope for me (it's more for leadership books and leadership guidance), is incentives. That, at its core, is something that I constantly hear talked about and very seldom actually see addressed. Right? If you have a product team whose main incentive is to ship features, they're going to ship features. That is going to be what they do. And so when you then have a research team whose main incentive is to try to avoid rework of features down the road, or to have higher levels of engagement on features, or whatever it might be, when they come and say, hey, product team, we would like to talk to you, we'd like to jump into your sprint, or we'd like to have this meeting with you to talk about how we can change features, the product team has literally no reason to listen. If all they're measured on is shipping features, they're going to prioritize that. And so I believe that many of the problems actually start there. And again, I wish I had the golden answer to this, but I don't, other than: leadership really needs to ask themselves, what are we prioritizing when we incentivize different behaviors? Now, holding that one aside, that is one big factor.
Connor Joyce:The other reason that I see is that there are different reasons why people want to build features. Commonly, I hear research thinking about it purely from the lens of, what are customers saying? What are users saying? What does the research say? From product people, I hear more of the business standpoint: what are the paying customers saying? Or, as you previously mentioned, feature parity. What are our competitors doing? What's the market doing?
Connor Joyce:So there are almost two different lenses on why features should be built, and that misalignment can be: well, this source is saying this should be built, and this source is saying the other (and my hands are showing one is the product team and one is the research team). The alignment, I think, starts with asking the question: what are we building? Having a solid definition of that feature: what is the point of what we're building here? Next: what evidence do we believe suggests that feature should exist? That is an aligning function right there, because it forces both teams to come to the table with what they believe suggests why a feature should be created. And then from there, ideally, having a unified vision of why a feature should exist and what supports its existence can start the dialogue of how, together, we fill the gaps needed to have full confidence that, when we ship this feature, it actually will satisfy whatever we're trying to achieve with it.
Lily Smith:Makes sense. Connor, we've run out of time, but it's been so great having a chat with you today. Thank you so much for sharing all of your advice and insight and, yeah, it's been fantastic.
Connor Joyce:Thank you, Lily. Yeah, it was a pleasure being here, especially, like I said, live at a conference. This is a new one for me too.
Lily Smith:So this is great.
Lily Smith:The Product Experience hosts are me, Lily Smith, host by night and chief product officer by day, and me, Randy Silver, also host by night, and I spend my days working with product and leadership teams, helping their teams to do amazing work.
Lily Smith:Luran Pratt is our producer and Luke Smith is our editor.
Lily Smith:And our theme music is from product community legend Arnie Kittler's band Pow. Thanks to them for letting us use their track.