Season 1 Episode 6: Professor Jean-Pierre Benoît

In this captivating talk, we delve into the world of Game Theory for Climate Change with the esteemed Professor Jean-Pierre Benoît. Unraveling the complexities of environmental challenges, Professor Benoît sheds light on how Game Theory can offer invaluable insights and strategies to address the pressing issues of climate change.

Season 1 Episode 6: Jean-Pierre Benoît (edited transcript)


Section one: Professor Jean-Pierre Benoît and game theory

Chris Caldwell: Professor Benoît, thank you so much for taking the time to come in and speak to us. It’s a great honour and a great privilege.

Prof Benoît: Thank you for inviting me here.

Chris: So, you’ve got a great passion for maths. You’ve got a great passion for trying to understand human behaviour through maths. Could you explain how you got into economics and where your passion for game theory originated from?

Prof Benoît: When you speak of a passion for maths, well, I think I have a passion for logical thinking, and careful thinking–let me put it that way.  Economics, as you know, has vast applications to all kinds of problems and solutions. It is a very important subject for society, but what I appreciate in economics, at least the kind of economics that I do, is a mathematical approach.  By mathematical approach, again, I would emphasize not so much fancy mathematics, but just a very careful, carefully laid out reasoning, a very logical step by step approach where you can go and say, ‘Oh, that’s what I assumed. Was that a good assumption? Was that not a good assumption? This is what seems to follow. Does it really follow? Does it not follow?’  That’s the kind of thinking that I like: very precise thinking.

Chris: Where did that kind of passion come from?  What really got you into game theory?

Prof Benoît: Game theory is, to some extent, interpersonal or interdependent decision theory: so-called decision theory, which looks at how individuals make decisions, extended to situations where lots of people are making decisions at once. If you think about it, I might say, ‘everything is game theory’.  That might be a bit of an exaggeration!  But it’s true that there is a whole range of phenomena that you can analyse through that lens, even if you don’t think of it at first. Auctions, how you design an auction, might be the most obvious example, but I’ve also looked at immigration policy and its implications.  It’s just a very useful, systematic lens for analysing a range of phenomena, say the blue wall of silence. There’s just a whole range!


Section two: behavioural economics and social change

Chris: You are chair of the Economics team here. You have been at London Business School for 16 years. What changes have you seen in the field of economics over that time? I think the obvious example would be the growing evolution and importance of behavioural economics, in economic thinking and in the way that economics is presented in the press.

Prof Benoît: I think you’ve hit on a good one. Actually, when I started studying, game theory was the hot topic as well. Let me come back to that in a second.  Now, one change we’ve seen is behavioural economics and, maybe the greatest change among the students, a greater interest in social issues.  Issues such as discrimination and climate change are, as you know, very pressing topics, so we’re putting those more and more into the curriculum.  Not only does it interest students, but, of course, it is just important that people study these things.

Behavioural economics has made a big impact, and maybe not a completely uncontroversial impact, in these areas.  Loosely speaking, we could say that it’s introducing a more nuanced view of a human being than, let’s say, very rational and self-interested. We are looking at that more carefully.  Some of the work of mine that you mentioned looks at supposedly behavioural phenomena and challenges the interpretation: it questions whether that’s really what’s going on.

Chris: The fundamental premise of economics is that people are in some way rational, you know, they somehow act in their best interest.  I’m not sure that’s always entirely true!

Prof Benoît: It’s certainly not always entirely true and behavioural economics has pushed that a lot. At the same time, in what we might call traditional economics, they do not view it as entirely true either. The question was really, is it true enough? Is it a good starting point for analysing behaviour? So, should I start from that premise and see how far that takes me?  

Even before behavioural economics, many economists, certainly myself, wouldn’t try to push too hard that it is all rational just because that’s the model.   It is a good starting point; now let’s look at deviations. Behavioural economics has emphasized that maybe you should consider the deviations a bit more seriously.  I think that’s a good point.


Section three: meme coins, tulips and seemingly irrational behaviour 

(…or, why the better-than-average effect (BTAE) is actually rational)

Chris: Do you think that we’re living in a less rational time than before or are meme coins just tulips from the good old days? There seems to be a perception in the press that we’ve gone off the edge of a cliff and everybody’s lost their minds and people are doing things which are completely contrary to their own interests.

Do you think that’s true, or is it just a product of the media?

Prof Benoît: It would be hard to say that people are more rational than they used to be.  I doubt that’s true, but I haven’t really studied that question in that form.   To answer the question, I first have to understand what it means for someone to be rational or irrational.  What is the evidence that I would look at?  When you say tulips, I guess you’re thinking about tulip mania, speculative bubbles and that kind of thing.  So, is that rational or is it irrational? It certainly has a degree of both in it. There are a lot of phenomena that look irrational that aren’t necessarily irrational. From that point of view, I would just say, let’s step back.

When you’re in a particular time, things look crazier than if you take a big perspective.  Part of my work, and, again, it comes from a rational tradition–it’s not that I want to push, push, push the rational part of it, but part of my work does look at phenomena which are apparently irrational and questions that. If I can give you an example, there’s something called the better-than-average effect. Maybe you’ve heard about that.  If I bring people into a room and I say, ‘’Do you think you’re a safer driver than most people, let’s say less likely to cause an accident?’’ 80% of people say yes. So, most people think they’re better drivers than most other people. There are many studies like this. They don’t all show that, but many of them do show what is being called the ‘better-than-average effect’.  People say, okay, that proves people are irrational: they’re overconfident. 

I have a paper with my coauthor Juan Dubra that says, well, let’s be a little bit more careful.  If I want to say that shows people are irrational, that they’re overconfident, first I have to define exactly what I mean by overconfidence. Then I have to define exactly what I mean when I say, ‘’I think I’m a better driver. I’m sure I’m a better driver. I’m 70% sure I’m a better driver’’.  If you define those terms very carefully, you’ll see that the fact that 80% of the people say, ‘’I think I’m a safer driver than the average person’’ doesn’t mean they’re overconfident.  They could be perfectly rational. By rational, I mean using all the information at hand in the mathematically correct way, even being too rational.

 This may be surprising, but the better-than-average effect is perfectly consistent with everybody being rational, nobody being overconfident. There are a lot of phenomena like that which are quoted constantly as proof that people are overconfident, but it just doesn’t show it.  It really doesn’t show anything at all.
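
The professor’s claim, that a better-than-average result need not mean overconfidence, has a simple special case that is easy to check: when the distribution of ability is skewed, a large majority really can be better than the average. Below is a minimal editorial sketch in Python with invented numbers; it is not the Benoît–Dubra belief-based argument itself, just the simplest version of the point.

```python
# Illustrative sketch (numbers invented): in a skewed population,
# a large majority can truthfully be "better than average".
# Suppose 80 of 100 drivers cause 0 accidents and 20 cause 5 each.
accidents = [0] * 80 + [5] * 20

mean = sum(accidents) / len(accidents)  # average accidents per driver
safer_than_average = sum(1 for a in accidents if a < mean)

print(f"average accidents per driver: {mean}")        # 1.0
print(f"drivers safer than average: {safer_than_average} of 100")  # 80 of 100
```

Here 80% of drivers cause fewer accidents than the average driver, and every one of them is reporting a plain fact, not an overconfident belief.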

Chris: Just for my own curiosity, is there a difference between males and females in the question?  I don’t think I know a single man who would say they are a bad driver.

Prof Benoît: You’d better ask!  Yeah, there are times when you can see gender differences; it depends on the question, on what you’re doing.


Section four: rational polarization in climate camps

Chris: That brings us neatly on to rational polarization, another big topic for discussion today.  You wrote an article a few years back on the nightmare of having climate deniers coming across for Christmas dinner. Could you explain climate scepticism in the frame of rational polarization?

Prof Benoît: Well, what do I mean by rational polarization?  Could it be that we’re disagreeing more and more without it being the case that just one of us is stupid and the other one is smart or rational?  Now, first of all, if you bring up climate change, even though there is disagreement, I would point out that if I compare the situation to 20, 30 years ago, there’s a lot less disagreement.  Sometimes we forget that.  We’re actually moving to convergence on the issue of climate change much more than the opposite.  

In fact, I just saw an article, I think it was on the BBC website, about oil companies hiring PR agencies and funding scientists to spread climate disinformation: to cast doubt over whether humans are responsible for warming and to exaggerate the degree to which there’s disagreement.  Part of the point of the article is that this could explain why we are where we are, with so many people disagreeing.  Whereas, if they hadn’t done that, there would be even more agreement than there is.

So now, if I say that to you and you’re a believer in climate change, you say, ‘’yeah, okay, that all makes sense, I understand that.’’  Then, I might speak to a sceptic, who says, ‘’all the people who disagree, it’s because they don’t like oil and they’re actually captured by the renewable industry, which is paying for a lot of the research.’’ Okay, so there are a lot of vested interests on that side too. Now, this is really just a symmetrical argument.  If the argument that people have paid scientists to criticize oil is a valid one, well, how about the same argument used not to criticize oil, but to defend it?  If one is a valid argument, why isn’t the other?

I have to step back and say, well, wait a second. My starting point was science can be bought and PR can be bought, you see. So, maybe there’s a rational polarization. We’ll have to really start the discussion there. Who do we think can be bought? Why is this more plausible than that?  You see, there’s a different way of looking at it.

Chris: Now, I could be showing my own personal views on this, but there is an asymmetry of organization and wealth on one side, and a relative lack of organization and wealth on the other.  For centuries, the oil and gas industry, including the petrochemical companies and the states, have been a very, excuse the pun, well-oiled machine. They’ve been very, very good at it. On the other side, you’ve had a lot of people who have been at heart very well-meaning, but who are not industrial behemoths. There has been an asymmetry of weaponry.  It’s improving, so things are more counterbalanced, but over history, I think it’s been like this.

Prof Benoît: Let me push back for a second, and I’m not pushing back because I don’t believe climate change is man-made. I certainly do. And, I believe we’re on the precipice of a disaster. Nonetheless, let me push back.  You can say, if I do a poll, I can see scientists in the United States tend to be more left wing than right wing.  They’re probably not big fans of oil.  It probably doesn’t take that much for them to have a distorted view, and maybe even though there’s a long history of oil, renewables are catching up and, actually, some of them have a lot of money.  Maybe it’s not as clear cut as you want.

The point I really want to make is, now we’re having a kind of discussion that sounds more promising than just putting on a graph again and saying: here’s the time and here’s this and this. Now, you might say, ‘’yeah, that might be true’’.  And then, if you tell me, ‘’well, they’re better funded’’, I might say, ‘’Yeah, you’ve got a point.’’  They are well funded. That’s hard to disagree with.  Now, we’re going down a different path and maybe a more promising path. It has just changed the tenor of the discussion and it’s acknowledged that it is not necessarily that I’m denying it or that I’m just an idiot who refuses to listen to anything.

Let’s face it, I believe the science, but when I say I believe the science, it’s not because I’ve done the science. It’s really hard to do the science. It’s really because I believe the scientists. So, now you have to ask people, why do you believe the scientist and which scientist do you believe and why do you believe this outlier scientist?  And so, again, you see it is a bit more complicated.

Chris: Oh, yeah, a deeply polarized world is not a very happy place. That’s not where you want to be. 

So, another good example of rational polarization might be the current energy crisis, where you’ve got a big spike in all of our gas bills.  You have one side of the argument saying, ‘’Well, that’s because, you green fools, we are not investing in the infrastructure that we need to keep our oil and gas networks going.’’ And you have the green guys going, ‘’You fools, why didn’t you put more money into insulation and into more renewable energy?’’  Both of them take the same single fact, oil and gas prices are high, and they come to completely different conclusions about it.

Prof Benoît: You’re basically telling me they are coming to different conclusions on what to do about it, right? Should we invest in renewables or should we invest in fossil fuels? All right, let’s just be careful and step back a little bit.  When I speak of polarization, I could mean polarization in our beliefs, our understanding of the facts, or polarization in what I want to do about it.  The example you’ve given me, I would say, rests more heavily on the ‘what do I want to do about it?’ question.  In fact, both groups might agree on the facts.  If I am an environmentalist, I would say, ‘’yeah, if you invested more in fossil fuels, I guess there would be more fossil fuels.’’ And, if I’m a fossil fuels person, I’d say, ‘’okay, yeah, I guess if we put five times more into renewables, there’d be more renewables.’’  I’m not sure they really disagree on that basic fact. What they disagree about is what’s the better route to take. So that’s already a different kind of disagreement.

Now I want to speak of careful thought. If I just think of polarization, of course, in the press it can mean a lot of things.  If I’m going to be an academic, then I’ll have to exactly define what I meant by it and where it looks rational and where it doesn’t look rational.  Polarization over what we should do about it is different than our beliefs about what is the underlying cause.

Interestingly enough, your example, and, you know, we would have to flesh out exactly what you meant by it, is on the face of it consistent with everyone agreeing completely on the cause and even on the possible solutions. People are just disagreeing on which solution they prefer. Now, what might be irrational is maybe the fossil fuels person will say, ‘’renewables are just too far from being the solution’’ and the renewables person will say, ‘’no, you’re wrong on that.’’  If they’re using confirmation bias and refusing to look at the data, that’s what I might call the irrational part. So, there’s probably some of that in there, but a lot of it is just, ‘’no, you know what? I want to feed my family for the next 50 years and maybe in 100 years there’ll be another solution.’’ Something other than irrationality is going on.

Chris:  To try and break through rational polarization, you suggested that novelty, just new information that can’t be looked at in the frame of your current biases and your current thinking, might be a way through it. In the climate sphere, there’s a general call of, ‘’facts aren’t working anymore.’’  We keep on coming up with facts and more facts and minds aren’t changing because of new facts.  How do you think novelty as a concept can fit into that?

Prof Benoît: When we say facts don’t work anymore, part of the problem is that I just keep giving you the same kind of facts. You know: look at the temperature, look at this, and look how much people are driving cars, etc. And then it’s like, well, it didn’t convince me before, so why do you think it’s going to convince me now?  If you just keep pushing the same fact, the same fact, the same fact, then why are you surprised?  The sceptic says, ‘’You idiot! If you just repeated the same fact seven times in a different way, why are you surprised that I’m not convinced?’’

What you might need to do is take a different angle. In fact, we had a little bit of that in our vested interests discussion. You know, you could see how we both agree that you might be able to buy people off. There’s something called vested interest. Okay, so now maybe we should look at that, because now you gave me a different angle. You might say, ‘’you know, the scientists you think are making a lot of money, well, they’re actually not making a lot of money. Most of them have tenure at a school,’’ and there’s the common interest, you see. So maybe that’s going to convince me more.

If I look at climate change, I plot the temperature, I plot what the people are doing and some sceptics will say, ‘’well, I just don’t believe that it’s due to what humans are doing. That’s just a small factor. I don’t believe that.’’ So, maybe an argument I could have would be: ‘’oh, let’s just look in general; how much do the actions of mankind affect the environment apart from climate?’’ Maybe we can say there’s been a huge effect on the whale population. I don’t know. I’m just making something up.  Maybe, if we get that angle, we could then first agree on whether it is plausible that humans would have an effect or not.  We find a place to agree.  

When I phrased it that way, it almost sounds silly to think the same kind of argument repeated over and over is going to result in a different outcome.

Chris: But there is new information.  For example, we had 41 degrees here in London. That’s new. That’s a record, therefore it’s new. But, if you have a look at one media outlet or another, if, say, I was in the U.S. and I asked Tucker Carlson on Fox News or asked Greta Thunberg or Al Gore, I would end up with entirely different answers on this.  It’s a new data point, but, if I was a good conservative, I would go and see what Tucker Carlson said; If I was a good progressive, I’d see what Al Gore says and that would frame my belief system in and around it. How do you break through that circle?

Prof Benoît: First, you told me that’s a new data point, that’s a record, right?  What if I told you, ‘’actually, in 1974, the old record was…’’  I’m just making this up, but let’s say I said, ‘’41 is a new record, but, actually, it was 40 in 1974 and then the temperature went down’’?  You might know, but you’re probably going to hesitate because you haven’t checked 1974.  Even if what I’ve said is correct, you’re probably not going to go, ‘’well, I changed my mind completely.’’  So, this is all more complex because, actually, the science is very… well we just toss this out, but the real, correct data that we need to present is very complex and that’s where trust comes in and understanding who I need to believe to some extent. 

Now, you’re right, if I go and listen to Tucker Carlson and then I just listen to him—and I don’t want to suggest that Tucker Carlson is the only plausible source. You know he’s not, but then, on the other hand, I don’t want to suggest that everyone who goes to listen to him is an idiot. They’re not.  They have other reasons for listening to him.  So, that’s a problem. There’s a problem with who I choose to believe and why I choose to believe them. I would say that the bias, the confirmation bias, happens when I just choose to believe people because I want to believe their conclusions, you know, because I like them.

Now, what might sound like confirmation bias, but really isn’t, occurs when you say something that I already believe and I tend to think you’re probably right. That’s not really confirmation bias because that is consistent with what I’m already thinking.  It can be hard to disentangle those.

Chris: The way I looked at that was in a game theory type of frame where people are looking for their position because they’re part of a group and there’s a benefit in being part of that group. If you can understand the position of the group, you can be a better member, a better contributor; you can be a better conservative or a better progressive or whatever it might be.  And, there are personal benefits to you. Even if you might know that the facts aren’t really right, it doesn’t matter because the benefits of the rational decision are what you get from being a part of the group.  It’s almost irrelevant whether the facts are really 100% correct or not. 

You understand there is a bias, but you lean into that bias because being part of this group is to your benefit.

Prof Benoît: Yeah, I think that’s a good analysis of some things that happen. You know, if I look at climate change, of course, if I care about my grandchildren, then I’ll say there’s a lot of cost if I’m wrong about climate change. So, that’s a question: who I’m caring about.

If we look at COVID, there are a lot of costs to not getting a vaccine.  To me personally, the chance of dying is much, much greater, but you’re right: there’s a cost to being the only one in my group who has the vaccine and being ostracized by them.  Now it’s more complicated. Why would that group believe that?  What are the other underlying things? How am I going to weigh all of those?  And then, do I put in the word ‘rational’ or ‘irrational’? Does it matter whether it’s rational or irrational?  So, I just want to emphasize here that it’s all more complicated than it might look at first.


Section five: rational accidents and the free-rider problem

Chris: I was lucky enough to be at the Royal Geographic Society to hear your TEDx talk in person. Great talk! And, I will include the link in the description for anyone else who wants to go and have a look at it. Could you explain, as a first step, what a rational accident is?

Prof Benoît: Okay, let’s just think of some accidents. So, the BP Gulf oil spill, or the space shuttle exploding, or rogue traders, which I might call an accident broadly, in the sense of just a wrong thing occurring, or doctors administering the wrong medication. We could just think of all these ‘Three Mile Islands’; a typical analysis would say, okay, well, what went wrong?  Is it that the company is prioritizing profits over safety?  Is it that the workers just don’t care, or they’re lazy, or they’re distracted, or something like that? Okay, let me fix the errors that were made.  That’s what we might call an irrational accident, or just an accident.

So, what do I mean by rational accidents? I mean it is possible that actually the workers are doing everything right, and we’ll come back to what I mean by that, and they actually care. It’s not that they’re lazy; they care about avoiding the accident; everything is right.  But, somehow, even though each action is individually right, it’s the right thing for me to do, when we all individually do the right thing, collectively it becomes the wrong thing.

Okay, so let me just give an example.  This is not exactly an accident, but maybe something like this has happened to you: you have a meeting at your company and there are 15 of you and you’re going to make a decision, maybe hire someone, and there’s a 20-page memo you’re supposed to read—it’s long! So, you’ve read it; you get there; you’re prepared.  Then people start to talk. And, you know what? All the points you were going to make, well, four other people have made them already.

You are thinking, why did I waste my time reading that memo?  I mean, I could have been better off not reading the memo. Now, I said I could have been better off, but I don’t mean that I could have been out clubbing instead, or watching TV, or having dinner with my wife. Suppose I’m a good worker. What I think is, I could have done something better for the company.  It’s about a better use of my time. You could say it is literally a better use of my time not to read those memos.  The problem comes when we all think that way—nobody reads the memo.  And now, we’re going to say, ‘’well, wait a second, we were all trying to do the best thing, in some sense,’’ and I’ll come back to that again.  Nothing was done wrong.

 Okay, now, if you look at accident reports on the shuttle, chemical plants and a lot of other things, and you read the careful analyses, you’ll often see this phrase: ‘’it was an accident waiting to happen’’. And what does that mean, ‘’an accident waiting to happen’’? Well, there are all these procedures that should have been followed and half of them weren’t followed.  And, if I look at a chemical plant, well, when a chemical plant spews deadly chemicals, it’s not because someone came in and flipped the wrong switch. You know, they’re not that stupid. There are like seven different safety layers: there’s an alarm that should go off, there’s a heating blanket, and someone should be checking the temperature gauge, and this and this and this.  There are a lot of things that need to go wrong for that kind of accident to happen.  But, when you look, you see that a lot of the procedures weren’t being followed.

I look at the space shuttle. There are very definite procedures, but you find they have not been followed. Then, you ask, ‘’well, why not?’’ Your first reaction might be, ‘’they should all be fired.’’ But come back to the memo: actually, this one procedure doesn’t need to be followed if the others are being followed. There’s a lot of redundancy built in, but the redundancy itself, which makes it safer, makes me not really need to put in the effort.  In the sense of the memo, I could be doing something else to make the company safer, other than being the 14th person to check something that’s been checked 13 times.  Too many redundancies end up working against us. In that sense it’s a rational accident.

If I follow the reasoning I just gave you, I’d say, ‘’oh, I think everyone else is going to read the memo, so I won’t read the memo.’’  But everyone thinks that, so if I’m really rational, I should say, ‘’wait a second! Everyone thinks that, so maybe I should read the memo.’’   Now, maybe everyone thinks that and everyone reads it, so maybe I shouldn’t read the memo, but maybe I should. Okay, so now it’s actually more complicated.

Now in game theory, which you mentioned before, if I write it all out very carefully, I’ll see that, yes, you know what the result could be? It could be that we’re all doing the right thing. We all care about safety. We all care about everything, and yet we’re getting the accident. That’s what I’m calling the rational accident. For instance, you’ve put in more redundancy than you should when you factor in that people are going to think, ‘’Should I read the memo or not?’’
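
One standard way to write out the memo story carefully, as the professor suggests, is as what game theorists call a volunteer’s dilemma. The sketch below is an editorial illustration, not the professor’s own model, and the benefit and cost numbers are invented: each of n colleagues gains b if at least one person reads the memo and pays c to read it, and in the symmetric mixed-strategy equilibrium everyone is exactly indifferent between reading and not reading.

```python
# A sketch of the memo example as a "volunteer's dilemma" (a standard
# game-theory model; payoff numbers invented). Each of n colleagues
# gets benefit b if at least one person reads the memo; reading costs c.
# In the symmetric mixed equilibrium each person reads with probability
# p making them indifferent:
#   c = b * (1 - p)**(n - 1)   =>   p = 1 - (c / b)**(1 / (n - 1))

def equilibrium(n, b=1.0, c=0.2):
    p_read = 1 - (c / b) ** (1 / (n - 1))  # chance each person reads
    p_nobody = (1 - p_read) ** n           # chance no one reads at all
    return p_read, p_nobody

for n in (2, 5, 15):
    p_read, p_nobody = equilibrium(n)
    print(f"n={n:2d}: each reads with prob {p_read:.2f}, "
          f"nobody reads with prob {p_nobody:.2f}")
```

Running it shows the perverse effect the professor describes: as n grows, each individual rationally reads with lower probability, and the chance that nobody at all reads the memo rises, so adding more potential readers can make the ‘accident’ more likely even though everyone is behaving rationally.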

Chris: And can you see any parallels with climate change?

Prof Benoît: I’ll give you a parallel and a non-parallel. Okay, so part of what I said is kind of like what’s called the free rider problem. Right. Why should I read the memo when you’ve read the memo?  If I look at climate change, one of the things that makes it very difficult is we know we all have to reduce, but, actually, if everyone else reduces, then I don’t really need to reduce–you know, even as a country like the United States, as big as it is. If all the other countries really went down and the United States went down just a little bit, maybe that would be good enough. So that’s like a free rider problem. What I’d like to do is do less and let you worry about it.  There’s a cost being placed on everybody else and I’m going to ignore that.  And, that really drives climate change a lot. That’s a lot of the problem because we can have all the discussions and, again, I still want to walk away and say, ‘’my oil’s fine; let somebody else take care of the problem.’’ Then we’re screwed when everyone thinks that way. 

 Where the parallel breaks down is that, with climate change, part of the problem is I’m not really worried about you being hurt, just me being hurt.  And that maybe aggravates it even more. Whereas, in the rational accident, even if I am worried about you being hurt, that’s not the problem. It’s not that I’m not worried. I really don’t want the chemical plant to explode. It’s just that there’s that element of the free rider problem–I don’t have to reduce because someone else is doing it.  With climate change there is that element and, on top of that: ‘’well, do I really care what’s happening to those other countries?’’  Sometimes people advocating for climate action say, you know, you had better care, because if you are worried about immigrants now, what do you think is going to happen when the water goes up and half the planet is unliveable?


Section six: democracy, climate change and misplaced certainties

Chris: Do you think that democracy is up to the task, that our current systems are up to the task of getting through this?  It’s so much more in the interest of the smaller nations, particularly smaller island nations.  They desperately, desperately need this to happen. This change 100% needs to happen for these countries and is less of an issue for the massively wealthy countries who could be spending money on building dams and stopping sea levels from rising around their particular cities.

Prof Benoît: One difficulty is, even though we are moving in the right direction, the scientists tell us we’re not moving fast enough and that makes these kinds of problems more urgent.  We don’t really have 80 years to move in the right direction.  We saw that problem with COVID.  There was exponential growth. You know, last week we were okay; we don’t really need to do anything; Oh, we were just a little bit bad; a little bit bad last week means disaster this week and super disaster next week. So, to that extent, I really have to depend on the experts and I really have to go with what they’re saying.  And, if by democracy, you mean just really count on people making an evaluation on their own and having a plebiscite on doing that, that’s not really going to work. 

On the other hand, the populations are moving more to ‘’let’s do something about it.’’ In that sense, I don’t know that a dictatorship is going to do a better job, but it’s the kind of situation where, yeah, we can’t just wait 40 years for everybody to realize the problem—then it’s too late.

Chris: You seem to spend a lot of time thinking about the costs of misplaced certainties. We live in a very uncertain time.  I think we live in a much more uncertain time than we did even a year ago, and certainly before the pandemic.  And, obviously, looking forwards, there’s a lot of uncertainty in climate change. You know, are we going to find this great panacea, this great solution to the problem?  We think things will be bad, but we don’t know how bad.  How do we deal with the uncertainty? Do we just embrace it? Do we just roll with it?

Prof Benoît: Uncertainty is really tough. It’s really tough to understand, to interpret correctly, to know what to do in the face of, and to use when evaluating decisions, given that there was uncertainty before the decision and uncertainty after the decision. A lot of mistakes are made that way, a lot of wrong interpretations, and maybe even sometimes professional statisticians misspeak.  One thing I have in mind is, before Trump was elected, when we looked at the polls, they said Hillary Clinton would be elected. But actually, what a careful person would have said is that she had an 80% chance of being elected. And, I remember, the day before, the last poll said a 70% chance.  I went to the office and I said, folks, have you seen that there’s a 30% chance Trump will be elected?  That’s huge!

When I say statisticians made a mistake, what I mean is that after the election, a lot of them said, ‘Well, where did we go wrong?’ But I say, ‘What do you mean, went wrong? 20% is big. If there’s an 80% chance she will be elected and she’s not elected, I didn’t go wrong. I didn’t say 100. I didn’t say 99.1.’  They’re kind of wrong in saying they were wrong. One problem we have is that we tend to interpret whatever outcome the odds are leaning towards as a certainty. You know, weather forecasters say there’s a 20% chance of rain and we think, ‘Oh, it’s not going to rain.’  That’s not what they said.  A one-in-five chance is a really big chance. 

That’s one error that we make, and then we’re not sure how to evaluate whether the person was good or bad at forecasting. ‘Last time you said there would be a problem and there wasn’t.’ But no, I didn’t say there’d be a problem. I said there’s a 60% chance, or a 40% chance.  That’s a very different statement. 

Another way we have a problem is when there’s a small chance of a very bad outcome. There, if someone says, ‘Oh, I don’t know, you’re exaggerating. Climate change is only a 20% chance,’ what they’re actually saying is there’s a 20% chance the planet will be destroyed. That’s huge! You can’t focus on the 20% alone.  You have to put the probability and the severity together, but we’re not very good at doing that.  So, all of that makes it very, very difficult. You say it might happen, it might not, and then you’re really not sure what to do, and you really want someone to give you advice: ‘This will happen.  That won’t happen. This is the right thing.’ That’s not going to be the case, and it makes all the decisions a lot worse in lots of ways. 
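Putting the probability and the severity together is just an expected-value calculation. As a rough illustration (all the numbers below are made up for the sketch, not taken from the conversation), a small chance of a catastrophic loss can easily outweigh the certain cost of acting now:

```python
# Illustrative sketch only: the probabilities and loss figures are
# hypothetical, chosen to show why a small chance of a huge loss
# dominates once probability and severity are combined.

def expected_loss(outcomes):
    """Sum of probability * loss over mutually exclusive outcomes."""
    return sum(p * loss for p, loss in outcomes)

# Doing nothing: 80% chance of a minor loss, 20% chance of catastrophe.
do_nothing = expected_loss([(0.8, 1), (0.2, 1000)])

# Acting now: a certain, moderate cost.
act_now = expected_loss([(1.0, 50)])

print(do_nothing)  # 200.8
print(act_now)     # 50.0
```

Focusing on ‘only a 20% chance’ looks at the first number in each pair and ignores the second; the expected-loss comparison makes the professor’s point explicit.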

Let me give another example. Do you remember Hurricane Katrina?

Chris: Of course.

Prof Benoît: So, you know it wiped out New Orleans, and there was woefully inadequate preparation.  This was a motivation for one of my papers.  I said, ‘Well, why?  Why did the government not prepare for this?  It was a category five hurricane.’   If I do the math and I look at a president’s term and ask, ‘What is the chance a category five hurricane will strike within that term?’, the answer is something low, like 2% or 5%, if I remember correctly. So, that’s really low, but it’s a real disaster!  It’s not low in the sense that you shouldn’t prepare.  You should prepare. But it’s low in the sense that if I do prepare and I come up for election and say, ‘Well, don’t worry, I spent a lot of money preparing for a hurricane,’ the people go, ‘Hurricane? There was no hurricane. What kind of an idiot are you?’  And the opponent comes and says, ‘Don’t worry about it.’ Who looks right? Is it the person who said don’t worry about it, or the person who said worry about it? That makes our decision-making bad.

If I look at COVID and the pandemic, it’s not that there were no warnings. There were tons of warnings, but the warning wasn’t for the next four years.  A pandemic is stretched over time, and it’s a long-term decision. If there are two candidates running and one says, ‘We’ve really got to prepare for the pandemic,’ and the other says, ‘No, we don’t. Just don’t listen to those guys,’ who’s going to look right?  The guy who said, ‘No, we don’t.  No, we don’t’… until we did, and we hadn’t prepared.  That uncertainty, the combination of how certain it is with how bad it’s going to be, leads to really bad decision-making.

Chris: Yeah, which is unfortunately where our politics are at right now. Nobody looks and everyone promises everything and delivers on very little. It’s quite a disturbing time actually, to be honest.

Prof Benoît: It is! It is. And what’s disturbing, too, is a lack of accountability, which, if you speak about democracy, is something we really, really need.  So, whatever you think of Boris Johnson, let’s say, what drives me crazy is when he says, ‘Well, I got all the big decisions right. I got all the big decisions right.’ He sent sick people to the care homes!  Was that a right decision? How can it be right, unless those people don’t count? I don’t know; explain it to me!  What’s worrisome to me is that he could just make that statement. I’d be happier if he said, ‘I got most of the decisions right. That wasn’t the right decision.’ Maybe it’s not as bad as I think, because if I look, I don’t know, there might be some explanation for why he did that. But, if he can get away with just making that claim, irrespective of what we think of him, that’s worrisome. 

It gets more worrisome if I look again at Trump on COVID.   If those statements don’t impact you, then we’re in big trouble, because the one thing that democracy is supposed to do for us is at least hold the politicians accountable and say, ‘Okay, you didn’t get that right. That’s not right. Let me find somebody else.’


Section seven: To believe, or not to believe: the role of ‘experts’ in creating unity and difference

Chris: In all of these big issues that we’re talking about, be it climate change, be it democracy’s difficulties in dealing with things, and particularly polarization, all of these are very human issues. You’ve lived in North and South America. You’ve lived in Africa. You’ve lived in Europe. Do you see big cultural differences in the way people look at these things, or are we all human?

Prof Benoît:  Well, I think both are true. It’s interesting: if I look at economics, not just behavioural economics but at what role culture plays, maybe 30 years ago the field would have ignored that more or less completely.  Now, there’s a lot more work on culture and history, and difference.  But, on the other hand, there might still be an inclination for a lot of economists, myself included, to say, ‘Well, before I go the cultural route, let me step back and see how much the different prices in the country are affecting things. How much are the actual conditions different? How much would you actually be behaving in the same way if the institutions were the same?’  So, there’s a lot of commonality driving that. 

You know, on the face of it, the first thing a game theory model would do is abstract away from all those differences and see how much could be explained just by using the same model.  That would be my inclination, but I wouldn’t push it too far, in the sense that the next step should be, ‘Okay, let me take a step forward and say, well, you know what, the culture matters.’ Partly, as you were describing before, I have to live in a society with other people who have this way of doing things, for whatever reason, and it might be that if our society also had that way of doing things, we would react the same way, but we don’t.  And that needs to be my starting point: I need to start from how people are actually going to react. So that certainly can’t be ignored.

Chris: And this goes back a little bit, because we were talking about how we want to believe in experts, since you can’t do all the study yourself. It makes perfect sense. One really interesting part of your paper was about how the more of an expert you are, the more likely you are to be polarized.  We have been relying on experts to tell us things, but those experts are more likely to be polarized. 

Prof Benoît: So, it’s not exactly that.  Let me take the first part of it, which is that we want to believe in experts.  Then I said you should believe in experts.  Now, we have a lot of people who say, ‘Well, who cares about the experts? I don’t believe the experts.’  Okay, is that a good thing or a bad thing? 

First of all, we need the experts to be believable. Part of the problem when we ask, ‘Why don’t people believe in the experts, or why do they believe in them less?’ is the experts themselves. The experts want to have a TED Talk.  You mentioned I have a TED Talk, but do you know what? I hate TED Talks, even though I have one.  What I hate about TED Talks is that you typically have someone come in and go, ‘Let me show you something,’ and they make a point: boom, boom, boom, boom, boom. Twenty minutes of ‘all the data shows this.’  And, taken as a whole, the data doesn’t show it. It’s not even close. It just makes a very compelling story. And some of these people are not experts; some of them are experts. 

There’s an incentive to try to sell my stuff, and I don’t sell my stuff by saying, ‘Well, maybe this is true, maybe that’s true, and here’s a footnote; this is what I did…’  I have to push it. I don’t really know that I have to, because I try not to, but that’s the problem. The experts sometimes don’t deserve to be believed. I come in for an interview and I am ‘an expert,’ and then you start to ask me about anything and, well, I’ve got the mic, so I might as well answer.  But, you know, no, I’m not an expert on 17 different things. I’m better off saying, ‘in my opinion.’ Or, you know what, let’s shift to another kind of interview, where you go, ‘I’ve just got a regular guy off the street and I’m going to get his opinion.’

If you want my expert opinion, it should be on something that I’m an expert on, and I should give it in a believable way.  Not just believable… Sometimes I get mad at the experts. I see them and I say, ‘No, you don’t know that! You just can’t make that statement!’  So, that’s a problem. 

Now, I say you have to believe the experts. In a sense, you do have to, because you can’t do it yourself. I can look at a chart of COVID cases, but I’m not an epidemiologist.  There’s a lot I don’t understand.  When I go to the doctor and the doctor says, ‘Well, what treatment do you want to follow?’, I say, ‘Well, you’re the doctor!  You can maybe explain it to me in half an hour, but you’re distilling five or ten years of experience into half an hour.  I’m glad you gave me your opinion and explained it to me, but, ultimately, I’m not the expert.’ And, ultimately, in some sense, I have no choice. I have no choice for the climate either. That’s why, when you tell me we had a record data point, I almost push you: ‘Do you really believe that that’s important?’ I really need the expert to tell me, ‘No, it’s not that important,’ or maybe, ‘It is that important.’  I don’t know. Maybe they’ll tell me 41 is so incredible that it matters. I’m not in a panic, but we have to believe, in some sense, nonetheless.

I won’t go exactly to my paper, but experts don’t always agree, and that can make it difficult.  That can lead me to go to the ones I agree with and not the ones I don’t.  But we can see the shift, I think, in climate change.  There’s been a big shift in the consensus among experts.  Thirty years ago, there was a lot more disagreement than there is now. It doesn’t mean you can’t find one who doesn’t agree, but the science, the hard science, tends to move. You know, they’ll come on board.  They might polarize in the short run; they’ll look at the same evidence and say, ‘Oh, that was my point,’ or, ‘That proves your point.’ But with enough evidence for long enough, they tend to move together. And I think that’s what we’re seeing with climate.

Chris: There are fewer and fewer scientists who are arguing against climate change, but that doesn’t mean that it’s necessarily filtered into the general discourse. There are still a lot of people who will be dismissive of experts. Again, we’re back to the polarization side of things where people will be listening to someone who is not a scientist.  People listen to, say, Tucker Carlson—not a scientist, has strong, strong opinions on these things—and they take that on. Is there a way of breaking that cycle, of getting people away from looking towards non-experts, towards people who are just big voices?

Prof Benoît:  Part of the problem, as you said, is that people say we don’t believe the experts, but maybe it’s that they believe the wrong experts, because Tucker Carlson can sound very convincing.  People can sound like an expert, you know; they start to quote people, and they’ll say something which, maybe if you’re watching it with a relative who disagrees, you won’t be able to immediately counter.

It’s funny: when people say they don’t believe in experts, the bigger problem is that they’re believing in the wrong experts.  It’s not just that they’re going with their own experience. That’s one way not to believe experts. But they’re just believing the wrong ones, partly because they want to, partly because they think there’s a bias. That’s really the cycle that we have to break. 

Again, I really think the problem with climate change, and with COVID, is that these are the ones where we don’t have the time. I do think, in the long run, climate change is moving in that direction. People have come on board a lot, but there’s still a big resistance.  That’s really a question of restoring trust.

You see, sometimes people say, ‘Oh, how can I write something that’s more convincing? How can I make the next big bestseller?’, like Gladwell or something, which uses the science but doesn’t really use it in a way that I would like; sometimes it does, sometimes it’s too imprecise.  And then, it gets very hard if you’re reading a summary of a paper.  I might tell my students, ‘Go read the paper,’ but it’s really hard to read a paper. I can’t really say that seriously, because I understand that to go read a paper and really read it, even in your own field, is hard. And if it’s not in your field, it is doubly hard.  But if you just read the abstract, you’ll get a misleading viewpoint, because in the abstract they are trying to sell the paper, not give you everything. 

So, that’s the ‘the experts themselves are not reliable’ part. Now, I don’t want my headline to be ‘don’t trust the experts,’ but I do want to say that the experts themselves need to make sure they’re reliable, not just selling what they have to say.


Section eight: Risk assessment and the rapid unfolding of new tech 

Chris: Part of the rational accidents thesis is about understanding the risks, about making a proper assessment of risk. We’ve got an issue in tech where we as humans are really good at developing tech really quickly, and we’ve got a great desire to implement this tech and push it out there, making changes, making the world a better place.  With anything new, we want to understand what the risks are. How can we deal with that: the issue of technology and not understanding the risks that come with it?

Prof Benoît: This is a problem which goes back in time.  We can see it with the nuclear bomb, of course.  After the nuclear bombs were dropped, there were some physicists who were deeply regretful.  They were saying, ‘I was just doing my science, and this is pure science,’ and then, ‘Whoa! That’s the nuclear bomb! I didn’t know that would happen, but it did, and now there’s nothing we can do about it.’

You know, even if they had known ahead of time, it would be a little bit tricky, because sooner or later some physicists, all the physicists, would understand it, and someone else would have the nuclear bomb. So, it’s not just a simple question.  It’s not just the lone tech genius or the lone nuclear physicist, where all we have to do is time travel back and shoot that person and we don’t have that technology anymore. We do, you know; it’s going to come anyway. But that’s an old issue, and you’re right, tech is getting way ahead of us.  We could say, ‘Oh, it’s going to do this, it’s going to do that,’ but it’s not only that we don’t understand the risks; it is almost the risks of the risks, and, again, they’re not linear and they just go too, too fast. 

There are things that are very dangerous: the ones where, by the time I understand it, it’s too late. And then, there are others.  By the time I understand that a car driving at 90 miles per hour is dangerous, it’s not too late. Or, I mean, it is too late in that some people were killed, but we can make the speed limit 70 miles an hour. We know that this design was dangerous; let’s fix that design. Some of those risks are dangerous, but they’re also kind of manageable. There are others: ‘Oh, I didn’t know the Internet would do that.’ Well, that’s where we’ll have a lot of trouble. We’re not designed to stop that kind of progress, because there’s the progress and there’s the downside. So, I really wouldn’t know what to do about that.

Chris: Yeah, because there’s also the other danger of technology and the innovation being so, so powerful: we’re really good at building stuff, we’re really good at figuring stuff out, so people believe, ‘Well, don’t worry about it. We’ll figure it out. Just give us a few years. And, you know, we do understand this problem now. We’ve got lots of smart people working on the climate issue. There’s no need for us to do anything about it now, because we’ll figure it out.’

Is that a major kind of danger for action? It seems counterintuitive because you should have faith in humanity. You know, we’ve done great things.

Prof Benoît: Well, you know, if we go back in time again, further back, we had Malthus warning about overpopulation and the world getting too big. Earth’s population was so much smaller then!  What the story will say is, ‘We can anticipate technology. Technology has made incredible strides. You know what?  We eradicated smallpox.’  So, there’s a lot to that. If you’re optimistic, it’s not insane to think that it can be fixed. But, on the other hand, it is insane to count on it.

Chris: Fair point! So, say there’s a 30% chance of finding something to fix it and a 70% chance (I’m picking the numbers out of the air) that we won’t.

Prof Benoît: Let me flip it. Even if there’s a 70% chance that we will find something, even if I were to be more optimistic, there’s still a 30% chance we won’t. That’s huge!

Chris: I’m thinking about the area of polarization and how can hearts and minds be won over a bit quicker than they are at the moment to try and get us to the point we want to be at.  Is there a message that you would give to communicators on the climate or climate transition or any advice that you could give on how to break through that?

Prof Benoît: Well, I mean, if I’m going to base it on my work, and not everything needs to be based on my work: if we talk in terms of novel messages, actually, you know what? I can go back centuries to the French philosopher Pascal.  He said, ‘If you want to convince somebody of something, try looking at it from their point of view.’  And you might see things that are right from their point of view. So first, acknowledge that, and then give them the other point of view, from which it’s wrong. Maybe that’s a good starting point.

I can see, for instance, if I don’t trust scientists because I think they’re all in left-wing institutions, whatever that might mean, I can see a point there, about why you might have some scepticism.  At least that’s your viewpoint. So now I’ve understood your viewpoint, but look here at how much the oil companies have to play with. Actually, a lot of these scientists don’t. And I can show you the politics: even if generally that’s their politics, it isn’t here, because now there’s a consensus moving. So, I think acknowledging, to the extent you can, helps.  I don’t want to go whole hog and say, ‘on the one hand this, on the other hand that; everything has a point of view.’  Some things just are stupid and wrong, so I can’t go that far.  But, on the other hand, first just try to see to what extent that’s true, and see if that can help as well.

Another issue is that there are two kinds of denying. One is, ‘I truly don’t believe it’s a problem.’ The other is, ‘I don’t want to pay that cost.’  They’re not the same.  And, even if I say I don’t believe, it may be that I don’t want to pay the cost, you know.  That requires a different argument. Of course, it’s easier to get you to believe if it is all dovetailing.  I think there’s a point in looking at to what extent it’s the costs and the technologies, to what extent it would change if, you know, I subsidize renewables for people who were in oil… Well, that shifts some of the things they do as well.

Chris: Are you optimistic?

Prof Benoît: Oh, let’s say I’m hopeful rather than optimistic.

Chris: Normally I ask one question, which is: why should people care about what you care about, what you’re passionate about? I suppose the most appropriate version of that question for you wouldn’t be about academia. It’d be more like: why should people be passionate about, be interested in, game theory or microeconomics?

Prof Benoît: Well, I think it’s because you can’t just look at the surface phenomena. I just see people acting in a certain way, and I might conclude, ‘Oh, you’ve done X, that means you want Y,’ because that seems to be the immediate consequence. But it doesn’t necessarily mean that.  I have to understand the interactions of what’s going on. I could look at an outcome, as with the rational accidents, and see it as a consequence of what we all did, in a sense. And I say, ‘Oh, I guess it means you don’t really care if there’s an accident.’ But it doesn’t mean I don’t really care if there’s an accident.  I do. In fact, I care so much that instead of reading the memo, I was out back doing something else, which was twice as important. 

How can that make sense? Well, because if no one reads the memo… So now the game theory is going to put it all together, and then I’ll get to understand, because, you know, we have to have better results. So, we all need to get together in the company and chant, ‘Safety first, safety first, safety first.’ Is that going to work?  Is that the problem, or is that not the problem? Well, maybe it has an impact. I don’t want to say it has no impact, but I need to understand the game theory as well.

Chris: Well, brilliant. I think that’s a very nice way to end it. So, thank you so much for your time!

Prof Benoît: Thank you! It was a pleasure. 


Watch the full video on our YouTube Channel

Watch the full video and connect to the podcast on our podcasts page