See, Hear, Feel

EP28: Dr. Itiel Dror on emotions, cognitive bias, and the utility of linear sequential unmasking

September 21, 2022 | Christine J Ko, MD / Itiel Dror, PhD | Season 1, Episode 28

Dr. Itiel Dror, PhD, is a senior cognitive neuroscience researcher at University College London. He received his PhD in Psychology at Harvard University. He researches the information processing involved in perception, judgment, and decision-making. Dr. Dror has dozens of research publications, which have been cited over 10,000 times. His academic theoretical work is complemented by applied work in healthcare, policing, and aviation to improve human training and decision making. More information and publications are available here. Links to some papers: 1) Short piece from Science, 2) A bit more 'meat' explaining bias sources & fallacies, 3) A 'solution' too, and 4) 'Hot off the press', just published, a new paper on forensic pathology decisions

Transcript

[00:00:00] Christine Ko: Welcome back to SEE HEAR FEEL. Today, I am very excited to speak with Dr. Itiel Dror. Dr. Itiel Dror is a senior cognitive neuroscience researcher at University College London. He received his PhD in Psychology at Harvard University. He researches information processing involved in perception, judgment, and decision making. Dr. Dror has dozens of research publications which have been cited over 10,000 times. His academic theoretical work is complemented by applied work in healthcare, policing, and aviation. All of his work is really about improving human training as well as decision making. That's where I get really excited, because I think this type of work in decision making is so relevant to everyone, but especially doctors. More information on his work as well as his publications is available in a link in the show notes. There will also be links to some of his papers, including one in Science and other articles that talk in more depth about sources of bias, solutions for bias, and a paper on forensic pathology. Welcome, Itiel!

[00:01:10] Itiel Dror: Thank you. 

[00:01:11] Christine Ko: Would you be willing to share a personal anecdote about yourself as the first thing to just help listeners get to know you a little bit better? 

[00:01:19] Itiel Dror: When I do research and present my findings, especially to forensic examiners, but also to other experts - medical, aviation, policing - things that seem very clear and obvious to me are either mind-opening or outrageous to the listeners. So they either open their eyes and say, Oh my God, this is amazing. Or they say I'm an idiot; the research is wrong. They want the paper retracted, and they get very defensive. I'm just amazed that when I talk to scientists, to educated people, about scientific findings, they're not able to respond to the scientific findings. They get very emotional, and they don't see the research; which of course connects to the topic itself: how emotions and feelings and expectations and beliefs impact and bias our thinking. So it's important to really take cognitive biases seriously and be willing to acknowledge your weaknesses and listen and debate, and then make decisions based on science and not emotions.

[00:02:26] Christine Ko: You've brought up emotions, and how emotions really color the way people think. And there is a bias called the affective bias, which I think is talking about how emotions do really influence our thinking. Not being a cognitive scientist, I don't know that emotions are really separate from thinking. I know that traditionally they're separated, but can you just talk about that a little bit? 

[00:02:49] Itiel Dror: We are all familiar with the saying, Love is blind.

[00:02:53] Christine Ko: Yeah.

[00:02:53] Itiel Dror: So I got an email last night from a friend I haven't talked to in many years. We were very good friends. One day he said to me, I met the woman of my dreams. She's beautiful. She's intelligent. She's funny. She loves me to bits. I said, Great. I want to meet this woman. So we went to dinner, her, him, and me. And then after dinner he said, What do you think? And as a friend, I had to be honest. I said to him, I don't think she's very smart or funny or good looking. And I think she's a gold-digger; I don't think she loves you so much. Did he come and say, Thank you, Itiel, I'm not gonna marry her? No, he was angry at me; didn't invite me to the wedding. Now, 10 years later, he sent me an email last night. He said, Itiel, I need to apologize to you. I'm getting a divorce. All that you said was true. And he said, It's not that she's changed. She was like that before we got married; I just didn't see it. This is exactly an illustration of love is blind; it distorts how you evaluate. Our emotions are part of the thinking process, and they're not separate. You gave me just five minutes; I gave you the icing on the cake. But if you want to go into deep research - the amygdala and this and that and how it affects us - emotions are good for certain decisions... like bias! It's not that emotions are bad, right? It's a weakness not to have the emotions to help guide you. Yet again, like bias, it's a complicated picture.

[00:04:20] Christine Ko: Can I ask you to define cognitive bias? 

[00:04:23] Itiel Dror: In a nutshell, the human mind is not a camera. We are not passive in processing information, because of the brain's architecture and limited computational resources. So the brain pays attention to some information more than other information. Our decisions and our perceptions are not based only on the input, not only on what the eyes see. The brain distorts it and manipulates it. We distort and change how we process the actual data and make decisions. And even what we perceive as the data is not based only on the data, but on other factors that have nothing to do with the information. So in your domain, you are perceiving and interpreting a slide with a melanoma not based only on the slide, but on a lot of factors. And it's not one; it's a whole range of forces that impact what we see, and how we see, and how we interpret it. And these are the cognitive biases.

[00:05:20] Christine Ko: I'm not a cognitive scientist, for sure. And I realize that I'm biased to think that cognitive bias is bad, and then I'll somewhat quickly remember, Oh yeah, wait. Cognitive bias is not always bad, and actually I think it's mostly useful and positive. I think it's hard. It's an availability bias, right? Because if we're always thinking bias is about racism or sexism, the minute you hear cognitive bias, you're like, Oh, it must be like racism and sexism, et cetera. And so you're already biased against thinking about cognitive bias, because if you consider yourself not racist, you're gonna be like, Oh, I'm not biased against race, so I'm not biased cognitively, because I'm someone who's not biased. So, just to emphasize the context you've given: every thinking person has cognitive bias, because cognitive bias is part of thinking. It's the way we think. Is that correct?

[00:06:19] Itiel Dror: Absolutely. First of all, not to annoy some in the audience who are listening and saying that I will not give a definition, and they have to have one... I'll give you the definition that we have in a paper from 2013. We define cognitive biases as the class of effects through which an individual's preexisting beliefs, expectations, motives, and situational context influence the collection, perception, and interpretation of information. So that's a definition. Before we ask whether biases are good or bad: they exist, and we can't avoid them. We need to ask, Which ones are really bad? Which ones are good? Which ones are a bit bad? Which ones are a bit good? Then take actions to minimize, and try to even eliminate, some of the really bad biases, live with some that may be a bit bad, and harness the better biases. Biases developed for a very good reason, a computational, brain-architectural reason. If you don't have biases, you are going to be paralyzed. We cannot do without them. And we don't want to do without them, because they're selective and help us do what we want to do, and achieve, and be effective and intelligent and experts in our domain.

[00:07:47] We need to understand which biases are helpful, which biases are not helpful, and which biases are bad. Of course, it's a bit more complicated, because some biases are very good in certain situations, but sometimes they are bad. So it's like driving on the road. Maybe I shouldn't say that I speed when I drive. But when the road is wet, when there's ice on the road, when I'm tired, I slow down. You need to know when you are on a slippery road, when there is fog or ice, and slow down and take measures not to have an accident; take measures so that the biases will not lead you to make erroneous decisions. Awareness and willpower do not change the biases; they're unaffected by that. So we need to take this issue of biases and discuss it in medical school and among ourselves, and do research, and seek, and contemplate. Then we are moving forward.

[00:08:47] Christine Ko: Yes. I think I would have very much benefited from having some exposure to metacognition and cognitive bias and what that can mean for the practice of medicine. 

[00:09:01] Itiel Dror: Medical decision making, diagnosis, is what you do day in and day out. Or in surgery, when you make decisions, how does time pressure affect decision making? I'm expanding now from bias to something much bigger. They should have that in medical school. The underlying cognitive processes of perception, judgment, and decision making should be a whole course in any medical school. And part of it is bias, but not all. If they don't do that, then that is a sad state of affairs. We need to make it mandatory; it's part of patient safety, which we do care about.

[00:09:35] Christine Ko: The whole healthcare system, at least in the US, is becoming adversarial. Because doctors are more and more afraid of being sued for malpractice, I think they are afraid of admitting to error.

[00:09:48] Itiel Dror: I agree, I'm depressed already. So I'll increase my medication or whatever, but it's not only what you say. I think it's much, much worse. For example, it's not just that medical doctors will not acknowledge their mistakes because of lawsuits. I would say, and this is gonna be very provocative, that sometimes they make decisions where the decision of how to treat the patient takes into account, maybe without their awareness, the implications later for a lawsuit. The fear of a lawsuit contaminates the decision. It impacts the actual treatment they give patients, because in the background they say, If I do this and it goes wrong, I'm well protected for a lawsuit. But if I do this other thing, which is a better medical treatment, and something goes wrong and it goes to a lawsuit, then I have a problem. I don't want to depress you and the listener, but there are big challenges ahead.

[00:10:44] Christine Ko: Your work actually makes me less depressed because it gives me hope on how to move forward. 

[00:10:53] Itiel Dror: I'm happy to hear you say that, because I have to tell you that lately, I really feel like I'm learning and gaining insights into human decision making, but what I'm learning is very sad and depressing. The more I understand human decision making, the more I see that people make decisions based on ideology, motivation, personality, their experiences; and the data is less and less important.

[00:11:17] Christine Ko: To move forward, to improve, what's really necessary for human beings, but also doctors, is feedback. I feel, for me in particular as a dermatologist and dermatopathologist, that I don't get enough feedback. A certain number of my own cases are sent out for a second opinion. Another way that we get feedback at my institution is that we have Grand Rounds with patients, and so oftentimes, relatively soon after we give a diagnosis for a given patient slide, that patient might be brought to the Grand Rounds. And so then we do get feedback from that, from seeing the whole context, the photographs, the medical history, if we haven't gotten it already. But I think the vast majority of my cases just go out into the world, and I assume I'm right; I hope I'm right. I would say that I don't really get feedback on the majority.

[00:12:14] Itiel Dror: Let's tease it apart. You said you want feedback. Most people, when they tell me they want feedback, they want to hear how great they are. And if they don't get positive feedback, they get defensive. The problem is that even though it's great to hear good feedback, we learn from the negative feedback. If we are not defensive, this is where we actually learn the most. We need that to improve. 

[00:12:38] Christine Ko: I agree. Most people want positive feedback, but I think you're absolutely right. I learn the most, I've realized, from my mistakes. I definitely do not love it when I make a mistake. I realize I feel a fair amount of shame. And it's very uncomfortable. I want to turn away from shame. Most people want to turn away from shame. When I feel shame, people want to turn away from me, and they often do turn away from me. I actually am learning to embrace the errors. Not that I want to make more errors, but I'm learning to embrace them because that does really make me better. 

[00:13:09] Itiel Dror: If you embrace them, you will make fewer errors in the future. I've suggested using linear sequential unmasking. Linear sequential unmasking says very clearly, given that the order of information is important, that the first piece of information causes you to generate ideas and hypotheses and impacts how the brain perceives and interprets subsequent information. Even though we try to reserve judgment until we're exposed to everything, we can't. We see something, and the brain starts activity of the neurons and hypotheses and so on. Let's think: what is the order of things, of the information; what is the right sequence? Linear sequential unmasking suggests how to optimize the order. For example, let me ask you: when you are looking at the slide, do you look at the slide first, or before you look at the slide, do you read the context, the family history, and everything? Linear sequential unmasking just asks the experts to consider what order to look at the different pieces of information in.

[00:14:11] Christine Ko: I'm glad you brought up your concept of linear sequential unmasking, because I was not familiar with that term before reading some of your work. For dermatopathology, as well as dermatology, I think that we do use linear sequential unmasking. I told you this in a separate conversation, but oftentimes people will say, Look at the patient without getting the history, so you're not biased. Or, Look at the slide before you read what's on the biopsy requisition form, so you're not biased. And then look at it, and revise what you're thinking based on what the patient says or laboratory tests or other data.

[00:14:49] Itiel Dror: The only thing I would say, it's not that you are not biased. You are less biased, because you're always biased. The question is to minimize and decide which biases you want to take first versus later, which biases are more negative, and which ones are not negative and may even help you. We're always going to be biased. The question is which biases, in what order, and what we can do about it. 
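[Editor's note] To make the idea discussed above concrete, here is a minimal sketch in Python of what a linear-sequential-unmasking style workflow could look like. This is an illustration, not Dr. Dror's published protocol: the function name, the example information sources, and their ordering are all hypothetical, and in practice the ordering would be decided by domain experts in advance.

from typing import Any, Callable

def linear_sequential_review(
    sources: list[tuple[str, Any]],
    interpret: Callable[[dict], str],
) -> list[tuple[str, str]]:
    # Unmask information sources one at a time, in a pre-committed order,
    # from the primary evidence (e.g., the slide) to the most potentially
    # biasing context (e.g., clinical history), recording an interpretation
    # after each step so later revisions are explicit and auditable.
    revealed: dict[str, Any] = {}
    log: list[tuple[str, str]] = []
    for name, data in sources:
        revealed[name] = data  # reveal the next piece of information
        log.append((name, interpret(dict(revealed))))  # commit before moving on
    return log

# Hypothetical example: a dermatopathology read, slide first, context later.
log = linear_sequential_review(
    sources=[
        ("slide", "atypical melanocytic proliferation"),
        ("requisition", "rule out melanoma"),
        ("history", "changing mole, age 62"),
    ],
    interpret=lambda info: f"working diagnosis given {sorted(info)}",
)
for step, diagnosis in log:
    print(step, "->", diagnosis)

The point of the sketch is the design choice, not the code: the order of exposure is fixed before the case is seen, and an interim interpretation is committed at every unmasking step, so any change of mind driven by later context is visible rather than silently absorbed.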

[00:15:12] I also do work in marketing and branding. In branding and advertising, it's about increasing biases. I want people to stand in line and to pay a lot of money for this product. Not because it's good. Not because it's cheaper. Telling people how good the product is will not make them buy. People don't make the decision based on how good the product is. Actually, it depresses me. People are so ideological and emotional. It's a big challenge, but we have to try to do something about it.

[00:15:44] In the medical domain, how do we incorporate emotions and bias in medical training and medical decision making? Not by telling the doctor, Don't have emotion, try to fight your bias... but by acknowledging the role of bias and emotion in decision making, harnessing the positive elements, and minimizing the negative elements. So I'm trying to be positive towards the end. It's very hard. I am making an effort to end on a positive note.

[00:16:10] Christine Ko: That's perfect. So I will put that as your final thought, but I will ask you: do you have anything else you want to say?

[00:16:19] Itiel Dror: I have many other things I want to say, but I'll keep them to myself at this stage.

[00:16:25] Christine Ko: Okay. Thank you so much for spending the time with me. It's been really an amazing conversation. 

[00:16:32] Itiel Dror: Thank you.