Securities

Risk, Bias and Decision Making

Description

Recently at Lux in New York City, Josh Wolfe invited three celebrated decision and risk specialists for a lunch to discuss the latest academic research and empirical insights from the world of psychology and decision sciences. Our lunch included Daniel Kahneman, who won the 2002 Nobel Prize in Economics for his work on decision sciences. His book Thinking Fast and Slow has been a major bestseller and summarizes much of his work in the field. We also had Annie Duke, a World Series of Poker champion who researches cognitive psychology at the University of Pennsylvania. Her books How to Decide and Thinking in Bets have also been tremendously influential bestsellers, and she is also the co-founder of the Alliance for Decision Education. Also joining us was Michael Mauboussin, the Head of Consilient Research at Counterpoint Global, who has also taught finance for decades at Columbia. His book More Than You Know is similarly a major bestseller.

This is an edited four-part series from our lunch seminar, with each part covering one topic of the conversation for easier listening.

Transcript

This is a human-generated transcript; however, it has not been verified for accuracy.

Part 1: Pre-mortems

Danny Crichton:
Hello, and welcome to Securities by Lux Capital, a podcast and newsletter that explores science, technology, finance, and the human condition. I'm your host, Danny Crichton, and today, we have a really special multi-part series on risk, bias and decision making, a cluster of critical skills that have interconnections across all the areas this podcast explores. Recently, at Lux in New York City, Josh Wolfe invited three celebrated decision and risk specialists for a lunch to discuss the latest academic research and empirical insights from the world of psychology and decision sciences. Our lunch included Danny Kahneman, who won the 2002 Nobel Prize in Economics for his work on decision sciences. His book, Thinking Fast and Slow, has been a major bestseller and summarizes much of his work in the field.

We also had Annie Duke, a World Series of Poker champion who researches cognitive psychology at the University of Pennsylvania. Her books, How to Decide and Thinking in Bets, have also been tremendously influential bestsellers, and she is also the co-founder of the Alliance for Decision Education. Also joining us was Michael Mauboussin, the Head of Consilient Research at Counterpoint Global, who has also taught finance for decades at Columbia University. His book, More Than You Know, is similarly a major bestseller.

This is an edited four-part series from our lunch seminar, with each part covering one topic of the conversation for easier listening. In this first part, we discuss the concept of pre-mortems, an approach of looking ahead to the outcome of a decision and asking, if it were to fail, why we think it would fail. It's an approach that's designed to overcome groupthink and counter the fact that pessimists are really unpopular in group decision making sessions. However, recent research has shown that pre-mortems don't always help people and groups change their minds. We look at postmortems, prospective hindsight, legitimizing dissent, self-serving bias, and pre-parades or backcasts to see how this tool can affect and improve decision making, given the most recent academic literature.

Josh Wolfe:
Let's actually talk about that. So, for everybody's benefit, this is a crew, Danny Kahneman, Annie Duke, Michael Mauboussin, we've all known each other in separate ways, and then, maybe three or four years ago, it started pre-COVID-

Annie Duke:
Pandemic. Yeah.

Josh Wolfe:
... Pre-pandemic. We got together and we all had lunch and everybody brought with them their different mental models and toolkits, and we had a lot of fun and we started doing this on the regular. Decision making inside of investment firms: Danny has advised firms and insurance companies, and I think in the process of advising them and seeing things, that led to the book, Noise, his most recent, about the inconsistency of decision making. We should talk about that.

Annie has really sculpted a lot of partnership and team thinking and frameworks about how to think in bets, how to think probabilistically, decision making traps, much of which also I think was probably early inspired by Danny. And Michael has been doing this as probably one of the few voices in the public, and to an extent private, markets too, influencing arguably a generation of young practitioners who are thinking about their processes. Michael is way more humble and hates when I do this kind of stuff.

We have adopted a combination of learning from Michael and from Danny and from Annie, things like pre-mortems, where we anticipate, when we're making a decision, what could go wrong. I've indoctrinated people internally with this quote that, "Failure comes from a failure to imagine failure." So think about all the things that can go wrong and what things you might be able to affect with time or talent or money thrown at it. That is something that most of the time, when people are making decisions, particularly in the venture world, they're thinking about what can go right. How can this be a multi-billion dollar business? How could it be a fund maker? I am always [inaudible 00:03:27] the raining on people's parade in the corner. What are some of the tactics? Pre-mortem is one. Michael, would you explain what a pre-mortem is for people?

Michael Mauboussin:
Well, I want to talk about a pre... This is what I want to talk about. So this is, and I know Annie and I talk about this-

Josh Wolfe:
Go for it before you go. People cannot see this visual. When Michael started on Zoom, he had maybe a hundred books behind him.

Annie Duke:
No-

Josh Wolfe:
Then, it grew to about 150, but-

Annie Duke:
He's now being encroached upon-

Josh Wolfe:
Yeah, no, it's like it's consuming him, his physical space-

Michael Mauboussin:
It is consuming-

Josh Wolfe:
Carry on.

Michael Mauboussin:
By the way, maybe I will describe a pre-mortem, very quickly. So you are about to make a decision and you gather as a group, and you pretend that you've made the decision, and you launch yourself into the future. So let's say it's a year from now, and you pretend this decision's turned out very badly, and then, each person individually writes down the explanation for why this decision turned out poorly. And then, we combine those explanations and we think about whether we should decide differently.

So going through this, this idea was popularized by Gary Klein, who did an adversarial collaboration with Danny Kahneman, which was great, a great paper. Okay, so I know Danny's been very enthusiastic about it. So as far as I can gather, there are, sort of, two key components to this. One key component, and this is where Klein opened his HBR article, was this idea of prospective hindsight, which is based on some research done by Jay Russo and some collaborators many years ago. And the idea of prospective hindsight is that if you put yourself into the future and you pretend the actual outcome occurred, then that sense of reality allows you to open up your mind and think about more alternatives.

The second thing seems to be this idea of overcoming groupthink. We're barreling toward a decision on something, and as we get closer to the point where that decision's going to happen, doubts get suppressed. In fact, people who are doubting it are, you know, you start to question their loyalty to the organization, so on and so forth. And so, this creates an opportunity for those people to chime in essentially, to counterbalance the argument. Now, it appears this idea of prospective hindsight, the attempt to replicate the Russo et al. experiment, that has not replicated. So the idea that this future thing, looking back, creates more alternatives, has not replicated. Now, this idea of overcoming groupthink and so forth, has.

Now, the last thing I'll say before I turn it over to Annie is, every time I talk about pre-mortems with investment organ-- well, anybody, but certainly investment organizations, everybody loves this idea. People love it. So somehow there's an appeal to this, and people feel like they're getting something valuable from the exercise, but now I'm keen to understand what the mechanisms are, especially if half the argument is not replicating.

Annie Duke:
Over at Wharton, we've been working on this problem for over a year now. Actually, we talked about the sort of first results at the last lunch that we had, which was, when was that, a year ago or so? So this is work that I've been doing with Linnea Gandhi, who's the lead researcher on this, who actually Danny knows very well, and Maurice Schweitzer. So we have tried to replicate the result, the Russo result, so let me just explain what that result is: if you say to somebody, why don't you think about the things that can go wrong, and that's the framing that you use, versus, imagine that something has gone wrong, why did it happen? Okay, so the second would be a pre-mortem. The first is just a prompt to sort of imagine what might unfold.

Basically, the finding that they had was 30% more reasons, if you do it prospectively. The first thing is, it's an old study, very small N. Your alarm bell should go off whenever you have a really small N because it means that it's not very many subjects. So we've tried to replicate this across a bazillion different scenarios for people, like planning how they're going to stick to whatever they're giving up for Lent, to exercise goals, to predicting performance in basketball, to predicting how you'll do on an anagram test. We have a toy example that, actually, I developed from an actual example from my book that's coming out in the fall, which is about a public works project. So we're asking people to do this, just imagine what could happen, or framing it prospectively. And we-

Josh Wolfe:
All in anticipation of some performance toward some known goal.

Annie Duke:
We have been unable to replicate the 30% more reasons, we just haven't been able to do it. And these are modern day studies, pre-registered, really large Ns. We're talking about running 2,000 people through these studies and we've done it a bunch of different times. So we haven't been able to replicate it.
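[Editor's note: a minimal sketch of Annie's small-N alarm bell, with invented numbers (an average of six reasons generated, standard deviation of three) and a true effect of zero. It shows how easily a 20-subject experiment can produce an apparent "30% more reasons" lift out of pure noise, while a 2,000-subject study cannot.]

```python
import numpy as np

rng = np.random.default_rng(0)

def apparent_lifts(n, mean=6.0, sd=3.0, trials=10_000):
    """Apparent % lift in 'reasons generated' between two framings,
    simulated with a TRUE effect of zero, for n subjects per group."""
    control = rng.normal(mean, sd, size=(trials, n)).mean(axis=1)
    treated = rng.normal(mean, sd, size=(trials, n)).mean(axis=1)
    return (treated - control) / control

for n in (20, 200, 2000):
    lifts = apparent_lifts(n)
    print(f"n={n:4d}: 95% of apparent lifts fall in "
          f"{np.percentile(lifts, 2.5):+.0%} .. {np.percentile(lifts, 97.5):+.0%}")
```

At n=20 the noise band spans roughly plus or minus 30%, so a lift of that size is the kind of result small samples produce by chance; at n=2,000 it shrinks to a few percent.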

Josh Wolfe:
Danny, I think your enthusiasm stemmed more from overcoming groupthink and making sure that those doubts that are suppressed rise to the surface, rather than necessarily hinging solely on prospective hindsight. If I understand your view.

Daniel Kahneman:
My hunch was that you have a group, they're close to a decision, and it's based on the idea that pessimists are really unpopular.

Annie Duke:
Yes.

Daniel Kahneman:
And they are especially unpopular when a group is converging toward the decision, so you need to legitimize dissent. And this turned things around because, the way that Gary frames it, it's going to reward ingenuity. You are trying to find a good failure, one that will make you look clever, and it's that inversion that I think is very powerful. So I think that the cognitive thing is very minor and I'm not surprised it doesn't replicate, but the social thing looks really powerful.

Annie Duke:
Let me just be clear, we're just now going to start seeing what happens in groups. There's a few things that we've found that make us, kind of, skeptical. In terms of groups and legitimizing dissent, the question is, does something about framing it as prospective hindsight help you do that? The thing we've been having trouble getting is behavior change, and again, we're doing this with individuals, not with groups. So let me just be really clear about that, we haven't gotten into the group work yet. We've been having trouble getting behavior change if you do a pre-mortem, in other words. So people can sort of imagine what might go wrong, but we don't actually see them clicking on more links to do research or much changing of their plans. We also see that there's just as much self-serving bias if you work prospectively as if you work in prospective hindsight.

Josh Wolfe:
And just define what self-serving bias is.

Daniel Kahneman:
This is that you believe you're going to succeed, that you're biased in your own favor-

Annie Duke:
Yeah, so [inaudible 00:09:53]-

Josh Wolfe:
Which in our world of venture capital and entrepreneurship is a prerequisite, because you have to have overoptimism.

Annie Duke:
Yeah, so-

Daniel Kahneman:
You want to be optimistic-

Annie Duke:
Exactly, so this has to do with, when an outcome is poor, I attribute it to things that are external to me. Something happened to me-

Josh Wolfe:
Mistakes were made, but not my fault.

Annie Duke:
Yes.

Josh Wolfe:
Yes.

Annie Duke:
And when I have a success, I attribute it to things that are internal to me, things that I did, decisions that I made. So we know that this is a very strong bias in hindsight, so if I get in a car accident, I'll say it was the other person's fault. This is one from my child, right? So, he does poorly on a test and it's the teacher's fault, and the test was too hard, and everybody in the class agrees, and so on and so forth.

When he does well, he studied really hard and he was great. So that's thinking about a result that you've already gotten. It turns out, and this really shouldn't be that surprising, that if you say, imagine that you're taking a test in a week and you get the test back, it's graded, and you've done quite poorly (that's a classic pre-mortem), and you say, "Why?", they say, "The test was too hard because the teacher put things on it that weren't taught in class, and also the teacher doesn't like me." And then, when you say, imagine it's a week from now, and you get the test back, and you've done quite well, they say, "Well, I studied really hard and I'm very smart and I pay attention in class." That particular bias doesn't go away when you do things in prospective hindsight. So, that's just-

Josh Wolfe:
So just to repeat that, after the event, success or failure, there's a narrative attribution. If it was success, I'm amazing. If not, it's all the other guys' fault.

Annie Duke:
Mm-hmm.

Josh Wolfe:
And if you do the pre-mortem and say, imagine that you succeeded or failed-

Daniel Kahneman:
You have the same thing-

Josh Wolfe:
Right.

Michael Mauboussin:
But my expectation, I always say when I talk about this, is that this will not change your mind about the plan. If you've made a decision and you're committed to the decision, you're still going to go through with it. So this will not change, but what it can do is, you can find loopholes that you can close. You can find little things that you wouldn't find otherwise, so it's worthwhile. You should not expect that a pre-mortem would lead a group to change their decision.

Josh Wolfe:
When we sit around the table and we make a decision, and I actually had this conversation in our Monday meeting, just the other day, I was like, okay, let's imagine all the things that can go wrong. If one of these things goes wrong, I'm okay. If we took the process, we took the risk, we think that risk adjusted, we're underwriting it, and it happens, right? Then, fine. I would be less okay and consider it a process failure if we got surprised and didn't anticipate that thing, and then, I would've said, okay, this is a failure now. The pro of that is, we're thinking about the expansive set of possibilities of what might occur and how things can unfold. The con is, if and when it does unfold, I've already prepared myself that, oh, we expected that that bad thing would occur, and so, I don't have to update my priors.

Daniel Kahneman:
It's really not about the priors, I think. It really is about changing your plans. That, to me, is the value of the pre-mortem. It's about closing things, it's about, oh, this could happen, we haven't covered the left side. I mean, that's basically what happens in the pre-mor--... And you can see that in a military context. Where could they come from? And then, oh, we hadn't thought... You take the perspective of the other side, if it's a contest, and you see more things.

Josh Wolfe:
I always love the DARPA line, I think it was the DARPA line: their mission is to create and prevent strategic surprise. They want to surprise the other side because, obviously, it's an offensive advantage, and they want to ensure that they are not surprised.

Annie Duke:
In the work that we do on pre-mortems, we're trying to address the things that people think that they do, right? Not what Danny Kahneman thinks they do, who's much more of a realist about this, but what people who are using pre-mortems on a daily basis believe that they do. So the reason why we're looking at the self-serving bias literature and seeing if you get that same effect prospectively is that that's showing that, well, here's a bias that it definitely doesn't get rid of. If you're going to do this, you need to specifically ask about what could go wrong that's within your control and what can go wrong that's not in your control, because left to your own devices, you are going to be, sort of, imagining, not that your investment process was bad, but that something unlucky happened to you, just as an example. So it's like there's certain things that you can do in terms of changing the way that you do the pre-mortem that help to dampen that bias, but not the pre-mortem itself.

We were trying to look at, does it make you change your decision? Which you don't believe it does, but most people do. So we're just trying to show, it doesn't really get you to change your decision, it doesn't necessarily get you to do more research. Now, when it's paired with a pre-commitment contract, then you can start to get these changes in terms of behaving differently toward what you find out. That pairing, I think, is really powerful. And the thing that we have found that it actually does do really well is it helps to reduce overconfidence, which is actually huge. Not in the sense that it brings you below a control, but in the sense that it doesn't bump it up. And what I think is really interesting is we've had one group doing what's called a pre-parade or a backcast, which is the opposite of a pre-mortem. I did really well on the test, the teacher has given it back to me, what happened? Why did that happen? So that's a pre-parade.

And what we find is that pre-parades massively increase overconfidence. So they cause you to be really poorly calibrated to what is true of the world. Whereas, when you do a pre-mortem, it keeps your confidence at bay, as if you hadn't really done anything going into it. And sometimes, depending on the task, it will actually bring it below control levels.

Josh Wolfe:
Hm.

Annie Duke:
And it depends, a little bit, on whether you have control over the outcome or not, but it actually does help with that particular bias and really significantly so.

Michael Mauboussin:
Think about the distinction between just looking at scenarios and outcomes, and a pre-mortem. In the pre-mortem, we somehow pretend we're in the future and that the event has happened, and now we look back, right? So why do I need the time travel, Danny? If it's not a cognitive thing, why do I need time travel? Why don't I just say, stop, everybody's in the room. We're going to give voice to the people who might be dissenters. Write it down privately, so we're not going to talk about it. Just write it all down. Why do I need this future-to-present thing, versus present-to-future thing?

Daniel Kahneman:
This is really interesting because it links to the topic of adversarial collaboration. One of the things that I talk about, and I call it the 15 IQ points, is what happens to a researcher when they find that the hypothesis that they had was falsified by the data, and it's instantaneous, you see why it happened, and you couldn't see it earlier. And I think I understand why, and this is that, when I'm in the current state, I have my theory, and you say, this particular result, which violates my theory, I simply can't see how you can get there. You know, you can't get there from here. But now, it's happened. And now, all I have to do is tweak my theory so that it's compatible with that. That turns out to be quite easy, but you're not going to do it unless you're forced. And this is really powerful.

Josh Wolfe:
So in that case, science as an institution has a forcing function of peer review, and somebody basically reducing your status and increasing their status by saying, you are wrong. And the avoidance of wanting to be wrong induces you to want to be more correct or less wrong, and so, you're more likely to change your mind because there's punishment, there's a stick to your status, if you didn't change your mind.

Danny Crichton:
As Danny Kahneman pointed out, the point of using tools like pre-mortems is ultimately to change your mind, to choose a different direction for your decision than you might have otherwise chosen. Yet pre-mortems on their own may not be sufficient without other tools like pre-commitment contracts, as Annie's research has found, or, as Josh pointed out, systems like peer review that create adversarial collaboration, which forces decisions to improve extrinsically, outside of our own biases. Nonetheless, they're a useful tool because no failure should be unimagined. Imagining failure isn't an excuse not to make a decision, it's a method to reduce strategic surprise.

Part 2: Changing One's Mind

Danny Crichton:
In part two of our Risk, Bias, and Decision Making lunch, Danny Kahneman, Annie Duke, and Josh Wolfe discuss whether people change their minds, particularly on subjects that matter, why people care about dissonance reduction, and what circumstances lead people to change their minds at all.

Daniel Kahneman:
And that's the topic of that [inaudible 00:00:23] collaboration. I think people just don't change their mind on anything that matters, so-

Josh Wolfe:
Wait, wait, okay, double down on that. In anything, people don't change their mind?

Daniel Kahneman:
On anything that matters to-

Josh Wolfe:
That matters.

Annie Duke:
I agree.

Daniel Kahneman:
To a first approximation, they don't. That's it.

Josh Wolfe:
Because why? Their identity is so tied into what they believe?

Daniel Kahneman:
Because everything is tied.

Annie Duke:
Yeah.

Daniel Kahneman:
I mean, you know, you believe things, and that's something that you're committed to, and the people you love believe the same thing, and it's linked to other things that you believe and things that you have said. It's impossible.

Josh Wolfe:
I remember we were talking about this, and you were like, "The reasons that you think you believe the things are not the reasons that you believe them. You tell yourself a story, 'Oh, it's rational, it's empirical, it's this and that.'" But no, it's because you went to high school with this person, or your wife or lover's person...

Annie Duke:
One thing that people don't actually understand is that your actions inform your beliefs just as much as your beliefs inform your actions. I think that people think we look around the world and we're super objective, and then we form a belief and then that informs the way that we act. But your actions inform your beliefs just as much. Lots of studies have been done. There's this old work that's just in this whole body of work on cognitive dissonance that ties into a lot of this, where you can just show that if I come up to you and I say, like, "I've got this sign for this political candidate. Will you put it on your lawn?"

You put it on your lawn, and I now ask you... Beforehand I sort of measure how much you like that candidate, what the strength of your support for that candidate is, and then I come to you a week after you've had the sign on your lawn, and I measure the strength of your support for the candidate. And guess what? You think they're more awesome. Why? Because you put a sign on your lawn. And then, to what Danny said, it's like, well, what are you signaling to your friends, everybody who comes by your house, everybody who comes to your house? Now there's some scandal, something where, had you known about it before you had formed any beliefs about the candidate whatsoever, you probably wouldn't support the candidate.

But now you've got a sign on your lawn and maybe you've done some canvassing. And now you learn this new information that if you were fresh to that choice, you'd be like, "No way. No how." But now you say, "Oh, it's just the establishment is trying to get them." Because otherwise, how can you form any sense of consistency for yourself internally, but also consistency toward those who view you?

Josh Wolfe:
So in the first instance, I think that's priming, right, where you're sort of being primed with this sign. It's influencing your behavior.

Annie Duke:
It's different than priming. Yeah, it's not priming.

Josh Wolfe:
What is that tactic that's being used? Because I'm handing you something for free, you're putting it on your lawn, it's shaping your views.

Daniel Kahneman:
It's internal consistency. This is dissonance reduction. So if I'm doing this, I must love this candidate. So it works backward. I mean, in general, this is the way it works. It works backward. I'm doing this. Why am I doing this? And then you find, you develop the argument, the rationalizations. It starts from something you do, from something you say.

Josh Wolfe:
Now evolutionarily, consistency was a virtue because you're somebody that could be relied upon, and you're predictable, and you're not creating dissonance in other people by varying how you behave in certain situations. Then we had, what was it, with John Kerry, it suddenly became this liability that he changed his mind.

Annie Duke:
Flip-flopping.

Josh Wolfe:
It was flip-flopping, right? And at least Bush II, at that time, was consistent, right? He was a decider and he was consistent. Science as an institution encourages dissent and changing your mind, as you were just describing, but what are the other incentives to get people to reward changing your mind and shedding consistency?

Daniel Kahneman:
I really think it depends on so many factors. The question is too general. The things that really matter to people, forget it, I think. Just don't bother. I mean, my stepson, who is an expert on Russia, had read what I'd written about not changing your mind. And he said, "What could be done to change the Russians' minds about what's happening in Ukraine?" And I said, "It can't be done. Anything that they hear, they will say is a lie." I mean, it's defended. So it's everything that you are going to invent in order to protect what you believe in now. So in the first place, it's got to be about things that people don't care too much about. Because if they care too much about them, it won't happen. And they shouldn't be committed. And that, going back to the pre-mortem, I think, is useful. I mean, people are not attacking the project. They're not saying, "Oh, it can't be done." They're saying something clever that other people haven't thought about. That's the incentive.

Annie Duke:
I think that a lot of this needs to be handled in two ways if you're going to increase the chances that someone's willing to change their mind. Even if you increase that a little, it's really good for the individual, it's really good for society. I agree vehemently with Danny that changing the Russians' minds about this invasion, there's too much. It's like, look, we were in Vietnam way longer after we already knew we were totally losing, and we didn't withdraw. That's just very common. We'll be very protective of those kinds of resources that are put into something.

If you're thinking about it for yourself, or within an organization, if you can get a little bit more flexibility in terms of mind-changing, you're better off. And there's essentially kind of two ways that you can do it. One is to set the circumstances under which you would change your mind or change course in advance. One of the problems is that when you're sort of facing down that decision to change your mind, we're all terrible at it, right? I mean, it's a little bit what Danny was getting at with why prospective hindsight might help, because you have your theory, you've dug a pretty big trench that is your theory. As you're trying to think forward, you are defending that belief. But if you can say, "But what if it were wrong?", it turns it into something that's in the future that's sort of separate from the person that you are right now.

Josh Wolfe:
Is it also that it's rule-based, that you're-

Annie Duke:
Well, so then you combine it with rules.

Josh Wolfe:
Because you're setting a conditional. If this turns out to be this in the future, then okay. But if I get disconfirming evidence, then I will change my mind.

Daniel Kahneman:
Well, what really does it is the specificity.

Annie Duke:
Right.

Daniel Kahneman:
It is not being wrong. It is being wrong in this particular way. That is... And that's what happens in research. That's the 15 IQ points that I'm talking about. You give me this particular result, oh, I can see where this happened. In advance, seeing why my theory is false, no way. But this particular failure, sure, that's easy. It was an idiotic experiment to begin with.

Josh Wolfe:
Interesting. So if something is narrow, you might accept that specific thing versus the totality of it, an error on a page versus the entirety of the tome.

Daniel Kahneman:
Because it's not an error. I mean, in research, the kind of thing that I talk about in adversarial collaboration, you say my theory is basically all right, I mean, I need to tweak this. Or more often you say the experiment was no good. I mean, I describe this in the piece that we're talking about. I described doing that with my wife, when we were conducting a series of critical experiments where we were committed to change our minds. And then instantly, as the specific result came in, it was really immediate. She immediately knew why it had happened. Now, if you tell me that this happened, I can see how this happened. While keeping my theory essentially constant, I can either find a flaw in the experiment, or a tweak in the theory, or, very often, you have misunderstood my theory. You're applying my theory in a way that it shouldn't be applied.

Danny Crichton:
It's extremely hard and perhaps even impossible to change our minds since we have built up entire systems of knowledge and socialization to ensure that our decisions are reinforced by the evidence and people around us. But that doesn't mean we can't find strategies that open up paths to changing our minds. One key is specificity, being very clear that we aren't changing the totality of our experience, that theory, in Danny's parlance, but rather a single point of evidence. And the other, as Annie noted, is to set up criteria in advance of what should trigger our minds to be changed. Yet even with these tools, changing our minds is extraordinarily difficult. It remains a rich area of academic study.

Part 3: Overcoming the Odds

Danny Crichton:
One of the hardest parts of decision-making is assessing risk. There's the objective function of risk, the actual likelihoods of different outcomes occurring, but then there's our assessment of our likelihood for success, which is heavily weighted by our own optimism or pessimism about the situation. Many endeavors, such as building a startup, have terrible base rates for success. So does that mean that we need to be overconfident in our abilities in order to take the right level of risk? In part three of our Risk, Bias and Decision-Making lunch, Annie Duke, Michael Mauboussin, Danny Kahneman, and Josh Wolfe discuss optimism, base rates, overcoming negative expected values, population versus individual risks, calibrating risk assessments, and infectious amplification of optimism within groups. We'll kick off with Michael.

Michael Mauboussin:
Who would ever start a new company, who would ever launch a product, who would ever buy a venture capital deal, if you knew the base rates? Well, maybe in venture, because you get these outsized payoffs, but if you knew the base rates, it's kind of a depressing, sobering realization. That's the big question: how do we have optimism and base rates live in the same world happily?

Josh Wolfe:
I had a conversation, it must have been, I don't know, 20 years ago with George Gilder, who at the time was sort of [inaudible 00:01:15], and George has his pros and cons, and there was another entrepreneur that I had met, Mark Gerson, who founded Gerson Lehrman Group. Both of them said the same thing to me in two different conversations in the course of two days, and it really stuck with me. And it sounded arrogant. One was an entrepreneur; one celebrated and wrote about entrepreneurs as a pundit. And it was, "Experience is overrated." And I was like, "What do you mean? You're supposed to go and get experience." You're supposed to learn how to do that. And you study all these great people and they're like, "Experience is overrated." And when you look at Gates or Zuckerberg or some of the famous, or Elon, they had never done the thing before that they were going to do, that then they would be celebrated for, that everybody else would go study.

My answer to you on the base rate thing, I would posit, is it's frequency versus magnitude. The base rates might be a very high probability of failure in any of these things. Elon was extremely improbable. Zuckerberg was extremely improbable. Gates was... But the magnitude of the success that they achieved begot the copycats saying, "I can do that." And then it brings in your question of overconfidence or overoptimism, because it is required in entrepreneurs. Stan Druckenmiller had this great quote when he was an early investor with us. He said, "I'm giving you money for the same reason that they gave me money and let me run the desk at 27 or whatever. And it's the same reason that they put 19 year olds on the frontline of war. They don't know any better than to charge full speed ahead."

And so that increases the sample size of people that are going out there to do the improbable. And for the ones that are lucky or right or both or whatever the mix is, the magnitude of their success is what becomes the proof positive for other people to try to gamble and replicate it.

Daniel Kahneman:
I think there is a big difference between the prospective and looking back. When you look back at great successes, you find people who defied the base rates and who took unreasonable bets, and all the really successful people are major surprises. And they're major surprises because they seem to have denied all the obvious things. When you look prospectively, those things they did had negative expected value when they were doing them. And there's research on optimism, I think by [inaudible 00:03:25], that I cited in Thinking Fast and Slow, that really makes that point very nicely. The expected value of having a good idea is negative. So it takes that extra optimism, and basically the optimism is what does it. It's not that people take risks. I mean, maybe you, and there are professional risk-takers, but most people, when they take risks, it's because they don't know the odds. It's not because they want to take the risks.

Annie Duke:
I don't want to confuse the base rate for an individual versus the base rate for the portfolio. So I don't think any of those people defied the base rates if you're thinking about it as a portfolio problem. You could think about humanity as a portfolio, right? Are you going to succeed if you try to walk from Africa to Europe? On an individual basis, you're probably dead. But from humanity's standpoint, that is positive expected value. And I think that it's true in terms of innovation and entrepreneurship. Now, you're on the portfolio-owning side. For what you're doing, clearly positive expected value, but if you look across your portfolio, you're expecting something like 70% of them to just not exist in however many years. And I think that we need to not confuse that. The question is, given that it's good for humanity, or the set of people who are trying these things, and that across that set you have positive expectancy for the species as a whole, versus the individual doing it.

Does optimism allow the individual to actually try? And I think the answer to that is yes, but there's a separate question that needs not to be confused with that, which is, does that optimism actually make it more likely that the person will succeed? And that doesn't seem to be true. So William James, the father of modern psychology, was a big fan of optimism. He felt like optimism is what makes you succeed. It's amazing. And Don Moore just makes the logical point of, if you can jump seven feet and you're standing at a crevasse that's 7.1 feet, optimism might get you that extra 0.1 for sure, but it's not going to get you 25. The idea that it might have such a huge impact on the chances that you as an individual succeed, just on a logical basis, doesn't seem to hold true. And then he's actually done experiments.

And what he finds is that people have a very strong belief that people who are more optimistic will be more likely to succeed at a task. That's a very deeply held human belief. But as with many deeply held human beliefs and intuitions, it turns out not to be true, and optimistic people don't do better on tasks than people who are less optimistic. So I think that we don't want to confuse those two points. First of all, we don't want to confuse what's the base rate for me as an individual versus for the portfolio. And then also, what is optimism actually doing? It's getting me to try, which is good across the portfolio, but it's not making me more likely to succeed.
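[Editor's note: a minimal sketch of Annie's individual-versus-portfolio distinction, with invented numbers: suppose each venture returns nothing 90% of the time and 15x the other 10%. The individual base rate is grim, yet a 50-company portfolio has positive expected value and almost never goes to zero.]

```python
import numpy as np

rng = np.random.default_rng(1)
p_win, payoff, n_companies = 0.10, 15.0, 50

# Per-venture: positive expected value per dollar, even though most ventures die.
print(f"EV per venture: {p_win * payoff:.1f}x")
print(f"Chance an individual venture returns 0x: {1 - p_win:.0%}")

# Portfolio of 50: simulate the return multiple across many portfolios.
winners = rng.binomial(n_companies, p_win, size=100_000)
multiple = winners * payoff / n_companies
print(f"P(portfolio returns 0x): {np.mean(multiple == 0.0):.2%}")  # ~0.5%
print(f"P(portfolio below 1x):   {np.mean(multiple < 1.0):.0%}")   # ~25%
```

Same odds per bet; the difference is entirely in who holds the portfolio.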

Daniel Kahneman:
There is a very big difference between the group and the individual. As an individual, it's a negligible effect, the effect of optimism. In a group, it's a very big effect.

Annie Duke:
Right. Because you get to basically replicate it. If it's getting you that extra 1% or even... What if it's getting you 50 bps? Or 25 bps? Or two bps? But that's reverberating across and compounding across everybody.

Josh Wolfe:
So I'm visualizing something, when you gave the example of the person jumping across the crevasse. Let's say that they could jump seven feet, but it was seven feet on average. Sometimes it might be six, sometimes eight or whatever. But the optimist might just underweight the negative consequences. And so they might have gotten just lucky. It wasn't that the optimism actually gave them any physical strength or launched them or vaulted them that extra foot. It's just that they might have just plunged in the first place and then they got lucky. And that's Michael's domain.
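[Editor's note: a minimal sketch of this point, with an assumed spread: if jump distance averages 7.0 feet with a standard deviation of half a foot, the 7.1-foot gap gets cleared about 42% of the time. Optimism changes who attempts the jump, not the distribution of outcomes.]

```python
import numpy as np

rng = np.random.default_rng(2)
jumps = rng.normal(loc=7.0, scale=0.5, size=100_000)  # assumed variability

gap = 7.1
print(f"P(clearing a {gap} ft gap): {np.mean(jumps >= gap):.0%}")  # ~42%
# The optimist and the pessimist face the same ~42% on any single attempt;
# the survivor's success was luck in the draw, not an extra foot of ability.
```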

Annie Duke:
Well, let me give you a sort of problem to kind of ponder. So Linnea has had people doing anagrams. Give them anagrams to do, and they're going to come up on the screen, and it's how many are you going to be able to do in the time that we're giving you. And you give them a practice round so they can get a sense of it. And then you say, "What's your prediction on how well you're going to do?" People are just horribly calibrated. They'll get four on average, but the average person guesses they'll get nine. Now if you do a pre-parade, by the way, they guess they'll get 12.

So they're really poorly calibrated on this task. Now, what that means is that they're maybe more likely to start an anagramming company. So that's great, but they're going to be very poorly calibrated about that. So when we're talking about things like pre-mortems, the idea about the pre-mortem is it gets you to be better calibrated. It gets you to understand, here are the weaknesses, here are the things that might go wrong. The reason why you're so excited about it is because you have the sense that it's going to reduce overoptimism. You just have that sense in your head, right? So you sort of have two things living in your head at once. Overoptimism is good, and overoptimism is bad. So that's the thing that we have to remember, it depends on how you're thinking about it, right? Because sometimes you'll think it's the best thing ever, and sometimes you can see very clearly that it's not.

What if, for example, you were betting on doing these anagrams? And you're totally miscalibrated, you're going to lose everything, because you were like, "Oh, this is the best thing ever." Now, is it good again across the portfolio that people are willing to start an anagramming company? Of course, right? Across the portfolio. But for the individual, having the willingness to take the risk, I think, is amazing. Overoptimism I don't think is necessarily amazing, because all we talk about is ways to reduce it. And I'm not sure that we're not confusing, in that sense, risk appetite with optimism, because I, for example, am someone who is really not optimistic at all, but I have a very big risk appetite. But I am really not an optimistic person, I'm quite pessimistic.
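[Editor's note: the round numbers Annie quotes make the calibration gap easy to compute. This sketch treats her conversational figures (about four anagrams solved, nine predicted at baseline, twelve after a pre-parade, with the pre-mortem assumed to hold confidence near baseline) as illustrative, not published estimates.]

```python
actual = 4.0
predicted = {"baseline": 9.0, "pre-parade": 12.0, "pre-mortem": 9.0}

for condition, guess in predicted.items():
    overestimate = (guess - actual) / actual
    print(f"{condition:10s}: predicted {guess:.0f} vs actual {actual:.0f} "
          f"-> overestimate of {overestimate:+.0%}")
# Pre-parades push the overestimate up (to ~+200% here); pre-mortems, per
# Annie's description, mostly keep it from rising rather than eliminating it.
```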

Josh Wolfe:
So what's interesting, when you mentioned individual optimism, and Danny before was talking about the group, there is something that feels different in infectious amplification of optimism in a group. And I think about soccer hooligans and crowds. You think about armies, you think about the scenes in movies that everybody loves, which is that galvanizing speech from the coach that's like, "Let's go get them." And yeah, there's something about that amplification which feels visceral. I mean, your hormones are going, you're flooded with whatever neurotransmitters and biochemicals in your body, and you are amped up. And there's something that I feel like when you are amped up in the same way that some of these people in warring factions, these blitzkriegs, were given speed or drugs or meth or something that... Literally, right?

I mean, ISIS and I think some of the Nazis, they were given drugs and they went into war crazed. But you see these people coming out of the stadium crazed, and it's this amplification of optimism and confidence, which I actually do think skews the odds differently than the increased optimism or confidence for an individual. The individual jumping over the crevasse, my guess is that the confidence or the optimism compelled them to do the thing, but luck played a role. Whereas I could see that there's something amplifying in the group dynamic which actually skews the odds to seize the city, to win the game.

Annie Duke:
I don't know. I mean the coach is coaching them all week telling them how the other team's going to eat their lunch. Now at halftime, getting them physiologically amped up, I'm sure that that's really helpful, but the whole week before that is just pessimism. Is just like, "You need to be in this formation and they're going to walk right through that." And what are we talking about? Are we talking about being physiologically ready to go?

Josh Wolfe:
Amped up, yeah.

Annie Duke:
Okay. On meth, apparently. Josh has told me.

Josh Wolfe:
For the record, none of us are taking meth.

Annie Duke:
On meth. Right. Are we saying that it's a good thing to be miscalibrated? I just cannot believe that you would be able to demonstrate that, that we're supposed to encourage people to be miscalibrated. And I think again, it's a difference between saying, there's lots of different types of people in the world, and there are people who will take no risks and there will be people who take lots of risks. Across all of those people, if they're all more perfectly calibrated, I think that they will all do better in their life, and we as a species are so lucky that there's that spread. And there are the Elon Musks of the world or the Bill Gateses of the world who are just like, "I'm willing to take on this risk. Let's do it, because I think what I could do would be amazing and it doesn't really matter to me." And then I think we're also very lucky there are people like, "I'm not eating that, that's a new berry."

Daniel Kahneman:
It seems to me that if you think about execution, optimism is good in execution. It's not good in planning. I mean, obviously optimism in planning is a disaster. But in execution, especially with groups, then it's clear that believing that you can do it means that you persevere more. It means that you're more resilient to failure. I mean, all the good things happen. The odds versus risk taking, I'm not sure the distinction is so profound. I mean, my sense is that it's very good for somebody who is going to take the risk to believe that it will work. I mean, why worry? He's committed. So you were speaking of the spread, and I agree that it's very good that we have a lot of diversity in risk-taking, but if we have a lot of diversity in optimism, would that be very different? It wouldn't be. So it's still consistent with the theory that I'm pushing, that basically, when people take risks, it's because they don't know the odds.

Annie Duke:
I agree. I'm going to recuse myself on this opinion because, having been a poker player for so long, I'm just like, "Yeah, I was getting like five to one and I was only 23% to win, but I was making so much money on that bet, whatever." I'm like, "I was getting the right price." And I remember actually taping a tournament once, and I had no pair. I had ace high. I was pretty positive that my hand wasn't good, and the person made a bet and I went, "Oh, I'm getting 11 to one here." And I literally said out loud, because there were people watching me sort of narrate this play, I said, "I'm getting 11 to one. I've never been sure enough about anything to fold here." How could I possibly know? I can't be 90% sure my hand is no good here. That seems stupid.

So I'm just going to call now. It turned out that I won the hand, but I was fully expecting to lose the hand. I just sort of recognized the state of knowledge that I have: not being able to see the player's cards, if I'm getting 11 to one, I can't possibly be smart enough to fold. That was kind of my statement about it. So I think that I, having done so many reps, and I think that's the difference, is that a lot of people that are going into this thing, like founding a company, they've never done that rep before. This is my 10,000th rep. I know I'm going to lose, but I'm getting the right price and I'm going to win enough of the time.
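[Editor's note: the arithmetic behind Annie's call, sketched out. Getting 11-to-1 means risking one unit to win eleven, so the call breaks even at a win probability of 1/12, about 8.3%; folding is only correct if you are more than 91.7% sure you're beaten, which matches her "I can't be 90% sure" line.]

```python
def breakeven_win_prob(odds_to_one: float) -> float:
    """Minimum win probability for a call getting odds_to_one : 1 to break even."""
    return 1.0 / (odds_to_one + 1.0)

for odds in (5, 11):
    p = breakeven_win_prob(odds)
    print(f"{odds}-to-1: profitable if you win more than {p:.1%} of the time; "
          f"fold only if more than {1 - p:.1%} sure you're beaten")
# At 5-to-1 the threshold is 16.7%, so "only 23% to win" was indeed
# the right price.
```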

Josh Wolfe:
We were talking to Sebastian Mallaby earlier, and he wrote this new book on venture capital, The Power Law, and he wrote more than he got. And he has this wonderful line about venture capital, which is that it's an industry that is there to manufacture confidence. The individual founder may not know the odds, they don't know the risks that they're taking. And from a portfolio level, the venture capitalist really can encourage and tell people, "Hey, if this fails, don't worry. We're going to back you in your next thing", and can help to induce people to take more risk, even in spite of them not knowing the odds.

Annie Duke:
But I mean, I will say, I think that in venture, not all venture... In college basketball coaching for example, I think people take it too far because...

Josh Wolfe:
Take what too far?

Annie Duke:
Sort of encouraging overconfidence, encouraging just continuing on no matter what, just keep banging your head against the wall. There's a huge cost to going with the heuristic of, go at all costs. Because if it's the case that it's very clear that the venture is failing, right? So let's say that the venture has $6 million of the $40 million they've raised in the bank still. So they've got $6 million left, but it's very clear that the writing is on the wall. This thing is not going to work. From the founder's perspective, they think they've just got to keep going until the end. Returning capital would be a disaster. There are certain VCs that also sort of behave that way, that you're supposed to drive it into the ground. But the problem is, if you just return that $6 million to the venture capitalists, they can now redistribute that, sometimes back to you in another venture, but in a way that's going to be net good for everybody.

And then I think that people also aren't thinking about the knock-on effects, which are that you have employees who are taking less cash comp in exchange for equity that everybody has figured out is not worth anything at this point, and you've now trapped them. These people are productive, smart individuals, trapped now in an endeavor until you run out that last $6 million, and they can't move on to something that's going to be net better for them and for society.

Josh Wolfe:
Well, the rational thing and the fair thing, and maybe the right thing, is to give back the money, right?

Annie Duke:
Yeah.

Josh Wolfe:
But people don't do that. When we encourage a founder, "Hey, look, the writing's on the wall," right? We're basically saying we don't believe anymore, we don't want to fund it anymore. You still got some money, let's wind things down. We have an instance right now with one of our companies where the partner at Lux has been saying for quarters, "You got to cut your burn, you're going to run out of money, you're not going to be able to raise at your current valuation. It's going to be painful." And the CEO is a true believer, or is bluffing and not signaling his doubt, and is taking it pedal to the metal to the very precipice, and is comfortable if it doesn't work. At least, I know the story he's telling himself, a priori: if it doesn't work, at least I know I tried to go to the very end.

Annie Duke:
Yeah, so this is something actually Danny and I have had some talks about, which is, there's this counterfactual problem. When you're in an endeavor, what we hate, we just hate not knowing. The only way for you to know for sure how it would've turned out is if you keep going. If you quit, you're never going to know, and you're just left in this sort of counterfactual cloud of what ifs. So there's this siren song of certainty causing you to go past the point at which you ought to be doing that. And it is, it's unkind, not just to the investors, but really to the employees. But I think that this is true, and there are basketball coaches, for example, college basketball coaches or football coaches, who have that attitude of, don't have a plan B, which I think is very destructive to those individuals.

And I think that what Don Moore would say is, it doesn't necessarily mean they're going to be less successful anyway if you were to have a plan B. I think that we just have this attitude about that. And I think a lot of it comes from our own issues about sunk cost, our own issues about uncertainty, these counterfactual problems, our identity, consistency, endowment, all of this stuff that's kind of driving us into this keep going. So in my new book that's coming out in the fall, I talk about Ron Conway specifically, because Ron Conway was really well known for getting founders to return capital. A lot of it is what Danny was referring to, in terms of this idea of, don't do it when you're in it, and sort of think ahead, and be specific. You said that before, and I just want to really hone in on that: be specific.

So what Ron does is he says, "I agree with you." He doesn't disagree with them. "I agree with you. I think you're going to turn it around. Let's sit down. If we look over the next six weeks, what are the benchmarks that you're going to hit in the next six weeks that would tell us that you have successfully gotten it going in the right direction?" It's like what Danny was saying about, "Well, I could imagine specifically what would happen with this theory, but not like this second, you need to get me somewhere in the future to be able to get there." So he's saying that to them. He's saying, "I agree with you. Your theory's totally correct. You are going to completely turn this venture around. All right, so let's think about what would that look like. How would we know? What were the benchmarks?" And then he makes an agreement with them that if they don't hit those benchmarks, that they'll return the capital. Now do they always do that? No. Do they do it more than they otherwise would've? Yes. Is that good for literally the whole world? Absolutely. Including the employees.

Josh Wolfe:
So what's interesting is, historically in venture, we would fund to milestones, right? Okay, how much money is going to accomplish what, in what period of time, and who's going to care? And you fund to that point, and you overfund a little bit, maybe everything times pi in time or money. And then the milestone comes and you hit it or you didn't, and then you would avoid future losses because you don't have a sunk-cost fallacy. You say, a priori, this is the milestone. In the past decade, when the cost of capital was effectively zero and money was abundant, there's this amazing scene that captures this, when we talked about this in the context of this current moment, where you've got WeWork and Uber and Theranos, which all have these documentaries or movies about them. So if you watch the scene from WeWork, which I'm enjoying a lot, he's got this scene where he says, "Who wins in a fight? The big guy or the tough guy or the strong guy? Or the crazy guy?"

And he's at this precipice and he's going to fail and they're running out of money, and he decides he's got to go crazy. And he starts telling people, "We're not going to be a $50 billion business, we're going to be a trillion-dollar business." And everybody looks at him like, "You're crazy. You're insane." And they started to then calculate rationally, if he's this crazy and this insane, maybe he'll actually pull it off. And we talked about Elon. Elon, at so many different points, you find out a year or two after the thing happened, he was like, "We were facing death. We were staring into the abyss. We were on the precipice, whatever." And he bluffed, and he won. And it only encouraged him to bluff bigger and bigger and bigger. And so far it has worked.

Annie Duke:
Well, except, I just want to say again, survivorship bias.

Josh Wolfe:
Totally.

Annie Duke:
The problem is that then people go back and say, "That's the way that I should behave." And you need to realize, for every person who bluffs in that situation and it works, there's an insane number of people-

Josh Wolfe:
That it'll fail.

Annie Duke:
... Where it completely failed.

Danny Crichton:
Humans love optimism. We feel that people are more successful when they're more optimistic, that optimism can tear down barriers that stand in our way. But that flies in the face of evidence and statistics. Fundamental base rates aren't base rates for no reason. Across the population, those rates will sustain themselves. And so we have to carefully calibrate our risk assessments to compensate for overoptimism, perhaps with a pre-mortem or a pre-commitment. Even so, the human desire to avoid, as Annie puts it, the counterfactual cloud of what ifs, means that even when we objectively assess risk, we still have a desire to build an optimistic case and go bold. We'll fake it until we make it, even as many, many people never make it.

Part 4: Hot Hands

Danny Crichton:
In the fourth and final segment of our Risk, Bias, and Decision Making lunch, Josh Wolfe, Annie Duke, Danny Kahneman, and Michael Mauboussin discuss the phenomenon of the so-called hot hand: the idea that a basketball player or an investor can have a streak of good luck that actually increases the odds of success on their future plays. Annie is going to kick us off.

Annie Duke:
I remember my brother saying something to me in poker, like you'll see people who for a six-month period or even a year period will just do amazingly well. And then it'll be, "Oh, do you think that they're really amazing players?" And he would always say, "I don't know. I have to see what happens over the next year." So the idea is, let's see. There's going to be people who are all doing this sort of nutty stuff. Some of them are going to shake out and have some success, and then you watch those people, and you see if it's repeatable for them.
Now, obviously poker's a little different, because in business you get advantages just from sort of brand value, where in poker it just all comes down to how good you are at playing. In trading you can see that too. There's people who have been... it's always regression to the mean, right? When you look at allocators, they're always drawing down from the people who end up doing really well the next year, and vice versa, and it's because they don't understand the power of equilibrium.
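[Editor's note: a minimal sketch of the allocator pattern Annie describes, with invented parameters. If results are skill plus luck, and luck dominates any single year, this year's top decile was mostly lucky, so the same managers land much closer to average the next year.]

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
skill = rng.normal(0.0, 1.0, n)            # persistent component
year1 = skill + rng.normal(0.0, 2.0, n)    # luck swamps skill in one year
year2 = skill + rng.normal(0.0, 2.0, n)    # fresh luck the next year

top = year1 > np.percentile(year1, 90)     # the managers allocators chase
print(f"Top-decile mean result, year 1: {year1[top].mean():+.2f}")
print(f"Same managers, year 2:          {year2[top].mean():+.2f}")
# Year 2 keeps only the skill component (about 1/5 of the year-1 figure here):
# regression to the mean, not a change in anyone's ability.
```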

Josh Wolfe:
Michael's old colleague, Bill Miller, used to say, yes, reversion to the mean, but means can move. Poker, hot hands; academia, hot hands; venture capital, hot hands. In both academia and in venture capital, there is a brand value that can accrue. You might get better collaborators, you might have more resources. So there's sort of a positive feedback effect from path dependence that can increase your probability of incremental future success. In poker and in basketball, there have been so many studies on the hot hand. First they came out and said, yes, it existed. Then it came out that it doesn't exist. What is the current status?

Daniel Kahneman:
It exists.

Josh Wolfe:
It exists?

Daniel Kahneman:
Yeah.

Josh Wolfe:
In what domains?

Daniel Kahneman:
In domains where it was said that it doesn't.

Annie Duke:
Baseball, baseball batting average.

Daniel Kahneman:
It exists in basketball. There is a book that's worth reading on that. It's a fun book. It turns out that Amos Tversky made a mathematical mistake. It's a very small one. It's absolutely crazy. Very, very hard to see the mistake even when people show it.

Josh Wolfe:
Okay, wait, did he know that he made the mistake? Number one.

Daniel Kahneman:
No, he never knew.

Josh Wolfe:
And I like the way that you're framing this, by the way, very narrowly. Amos wasn't wrong. He made a little mistake.

Daniel Kahneman:
Yeah, because it was unusual, but there was something else, basically, and that's a failure of the entire field. Amos was the only person in the universe who could say the null hypothesis is true and people would take him seriously, because saying there is no hot hand is saying the null hypothesis is true, and coming from everybody else it would've been stupid. But coming from him, yeah, people bought it, and it was wrong. Now, what is absolutely true is that there is less of a hot hand than people think.

Josh Wolfe:
Less than they think, but it exists.

Daniel Kahneman:
It exists, definitely.

Josh Wolfe:
Michael?

Michael Mauboussin:
I agree with all of that, of course, and I think the evidence is it's just not a significant thing, but it exists. Statistically it exists, but it's not that significant.

Daniel Kahneman:
No, it is significant. Have you seen the book?

Michael Mauboussin:
Significant in terms of statistically significant or meaning?

Daniel Kahneman:
Meaningful.

Michael Mauboussin:
Okay.

Daniel Kahneman:
Like 7%.

Annie Duke:
I can tell you, in poker, here's a little bit of the way that it works, and I think that in poker it's intuitive. So in poker, it's really bad for you as a player if you have to win by having the best hand. And you can see why that would be, because we're all going to have the best hand an equal number of times. So that's very bad for you in poker, right? What's really good for you in poker is if you can cause people to fold before you ever have to find out whether you had the best hand or not. That's really the key. I need to be able to get you to fold your cards so we sort of never discover, okay? So that's the key to winning at poker.

Now, obviously there's other things that matter: what are the hands that you select? Do you fold at the right times compared to somebody else? These are other things that really matter, but there's this one thing that really matters: can I bluff you? Or maybe I'm not even bluffing you. Maybe I do have the best hand, but I don't know that I have the best hand with enough confidence to not hope that you fold. So what happens when I've won a bunch of hands in a row, and the table now perceives me as someone who's what's called on a rush? Being on a rush means you think that the fact that I've won in the past will predict that I win in the future. Now, statistically, absent human behavior, that's not true.

Josh Wolfe:
Every hand should be independent.

Annie Duke:
It is independent, absent human behavior. But luckily, you're playing with humans at the table. And so what happens is, now, when I come in betting, you're just less likely to want to play against me.

Josh Wolfe:
So the perception of the presence of a hot hand can actually lead to the presence of the hot hand.

Annie Duke:
Correct. So if there were no human behavior, it wouldn't help me that I'd won a bunch of hands in a row, because when I bluffed, you'd just be an algorithm that would say, here's the probability, I'm just going to look at the Nash equilibrium and I'm going to decide what I should do based on those equilibria. I can just create a table which is going to give me something that's game theory optimal. But that's not how human beings are.

So now I am winning more with weaker hands, actually, because what tells you how weak a hand you can play has to do with the probability that you'll successfully be able to bluff somebody. If I'm in a game where it's impossible for me to bluff anybody, I have to play hands that are very good. If I'm in a game where it's really easy to bluff everybody, I can play really terrible hands, because I have no need. I could have two blanks, or as we would call them, two napkins. Can I win with two napkins? If I'm in a game where I can win with two napkins, then I can play two napkins.
The hot hand then allows you to lower the quality of the hand that you're playing, because it increases the chances that you're able to bluff somebody off the hand. And now you can see how that sort of becomes circular on itself.
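[Editor's note: a minimal sketch of the fold-equity arithmetic underneath this, with invented stakes. A pot-sized bluff with no showdown value profits whenever opponents fold more than half the time; if a perceived rush nudges the fold rate above that threshold, "two napkins" become playable.]

```python
def bluff_ev(pot: float, bet: float, p_fold: float) -> float:
    """EV of a pure bluff: win the pot when they fold, lose the bet when called
    (assumes two napkins never win at showdown)."""
    return p_fold * pot - (1.0 - p_fold) * bet

pot, bet = 100.0, 100.0                 # a pot-sized bet
for p_fold in (0.4, 0.5, 0.6):          # a rush plausibly shifts this upward
    print(f"fold prob {p_fold:.0%}: bluff EV = {bluff_ev(pot, bet, p_fold):+.0f}")
# Break-even fold probability is bet / (pot + bet) = 50% here; the rush's
# effect on opponents, not the cards, is what turns the bluff profitable.
```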

Josh Wolfe:
I think we're going to have to wrap. Whenever I post pictures of us at lunch, people say, "My God, if I could only be a fly on the wall when this crew gets together," and now we get to have some flies on the wall.

Danny Crichton:
And with that, our lunch is over. The psychology of decision making has massively expanded in interest among academics over the past few decades, partly owing to the success of Danny Kahneman's collaborative work with the late Amos Tversky. Yet even as frameworks and theories are fleshed out and experiments are conducted, we are learning about ever more layers and nuances of how to improve decision making and get risk right. In other words, our understanding is evolving, and perhaps, despite all evidence to the contrary, we are changing our minds about how to change our minds. Thanks for joining us. This has been Securities by Lux Capital, and definitely click to subscribe.
