How To Immunize Your Mind Against False Beliefs

With Andy Norman - Author of 'Mental Immunity'


Episode description:


Today’s guest is Andy Norman, philosophy professor at Carnegie Mellon University and the author of a fascinating book, Mental Immunity: Infectious Ideas, Mind-Parasites, and the Search for a Better Way to Think.

Andy argues that it’s possible to immunize the mind against harmful beliefs, just as it’s possible to immunize the body against germs. He and Eric discuss the evolutionary origins of skepticism, ideas that weaken reasoned inquiry, how to decide whether a belief is reasonable, and applications of mental immunity in real life.

Join Eric’s ‘Truth, Disinformation & The 2024 Election’ Class at The University of Chicago

It’s open to everyone via Zoom. It will discuss what’s going on in the coverage of the election, with a wonderful collection of guest speakers, educators, prominent political reporters and polling experts.

It will convene every Monday evening, Central US time, in the nine weeks leading up to the US election and one week afterwards. Don’t miss out…

Register now: https://masterliberalarts.uchicago.edu/landing-page/noncredit/trust-and-media/

 

Transcript

Eric (00:02.262)
Andy Norman, welcome to In Reality.

Andy Norman (00:05.147)
Thank you, Eric. It’s a pleasure to be here.

Eric (00:06.974)
It is nice to have you. Now, many of the people who end up on In Reality come from the policy or tech or media worlds, and their attraction to solving the problem of misinformation comes from the effects they see in their immediate world, like, you know, national security or public health or mainstream media. You teach philosophy at Carnegie Mellon. What drew you to the question of solving misinformation?

Andy Norman (00:38.353)
Yeah, well, philosophers have been trying to combat problematic information with questions. I kind of think of the Socratic method as a wonderfully powerful way to, and this is getting ahead of myself a little bit, sort of inoculate minds against some of the worst forms of infectious information. But the big picture story here: I’ve been thinking about evolutionary dynamics for a long time.

Turns out that the ability to create and spread deceptive information has been locked in an arms race with the ability to spot and ignore it for millions of years. In fact, probably before human beings became human beings, animals were trying to figure out how to gain advantage through deception, and those advantages were countered by the ability to detect and ignore deception.

Eric (01:38.414)
That’s fascinating. I want to get to the question of how these mental heuristics or mental faculties evolved. But first I wanted to talk about the idea of mental immunity. That is where you have planted your flag. You have written a book called Mental Immunity and drafted a white paper with Sander van der Linden of Cambridge, who…

has been on In Reality previously, arguing that cognitive immunity should be considered a field of science, comparable to biological immunology. My immediate reaction, and maybe the reaction of many people listening, is that mental immunity sounds like a metaphor rather than a science. Do our minds have literal immune systems?

Andy Norman (02:33.721)
Yeah, well, so I’ve been working with some of the top people in the field, including Sander, on this question for the last couple of years. And I’ve taken the somewhat unorthodox stance that the mind has an evolved set of defenses designed to protect it from information that can disrupt it. And that’s been operating right under our noses…

…for a very long time, and that we should take talk of mental immune systems literally. So Sander and I just wrote a paper that we hope will come out soon in a very respectable scientific journal, basically saying, hey guys, this thing is real. Mental immune systems are real, and we need to understand how they work if we’re gonna combat this mis- and disinformation.

Eric (03:26.726)
One way to react to that is to say that, well, of course, there are many aspects of mental functioning that have been well documented. So asymmetric risk assessment, motivated reasoning, confirmation bias, all well documented by people like Danny Kahneman, who won a Nobel Prize for doing that. But those systems are adaptive.

They helped our ancestors survive and reproduce. So they’re not dysfunctional. They’re kind of how we operate. We’re hardwired in that way to be vulnerable to misinformation.

Andy Norman (04:07.941)
I love the way you’re putting that, but I’d put it slightly differently. It’s not that they always function well, it’s that they did function well enough in what scientists call the ancestral environment. So they worked well enough for our ancestors to help our genes get through the winnowing process. But that doesn’t mean they’re particularly good at lighting on the truth.

When truth pulls in one direction and usefulness or biological advantage pulls in the other direction, a lot of times it’s the usefulness, or the ability to choose, say, adaptive fictions, that wins out in the tug of war.

Eric (04:48.782)
That’s interesting. Does that mean that, in the battle between truth and usefulness, to steer your own mind or the minds of people around you or the mind of a culture towards truth and rationality, we are basically fighting an uphill battle against our own evolutionary mindset?

Andy Norman (05:16.169)
In some ways, yes, but it’s also true that people can learn to value truth more and learn to be more skeptical of useful-seeming falsehoods; that’s typical of scientific training. So I think we’re sort of by default tuned into the usefulness of information, and we tend to favor information that’s useful and disfavor information that seems harmful or problematic in one way or another. But what a good science education or a good training in philosophy will do is actually teach you to care more about truth and to care less about selfish advantage. By going through that training, you can actually retune your mind’s immune system to filter out different types of things. So it’s an uphill battle, yeah, but…

there are millions of people around the world who have received the kind of education that helps them navigate this very complex information environment that we find ourselves in.

Eric (06:27.892)
I would argue that among other kinds of training, in addition to the scientists and philosophers you mentioned, would be journalists who are working at high-integrity institutions and have learned the art; also the judicial system, the intelligence community.

Andy Norman (06:50.701)
Excellent point, right? And these institutions have played an important role in helping us combat misinformation in previous decades. But we suddenly find ourselves in a world where bad actors can weaponize basically the new technologies of the internet to shift the balance. And right now our ability to spot and ignore bad information is out of balance with our ability to produce and distribute it.

Eric (07:16.782)
Mm.

Andy Norman (07:20.293)
And I think the great big challenge of our age is to boost our ability to spot and shed that information, which means juicing the immune system so that it can compete again on an equal footing.

Eric (07:34.986)
I couldn’t agree with you more that this is one of the challenges of our generation, right up there with climate change. Now, in your book, you mention half a dozen of what you call immune-disruptive ideas. They are ideas that are abroad in the culture that operate against forming the kind of mental immunity you’re describing, and the mental habits you’re describing as part of the training of scientists and philosophers, among others. Can you describe some of those disruptive ideas and how to counteract them?

Andy Norman (08:05.659)
Mm-hmm.

Andy Norman (08:11.981)
Yeah, so broadly speaking, there are a set of ideas that can sort of lodge in people’s minds and prevent them from really thinking things through clearly, and prevent them from filtering out problematic information with enough skill. One of them, for example, is just the idea that I’m entitled to my opinion. Suppose you trot that out. Almost everybody mouths that platitude these days, on the left and the right.

And it’s true in a legal sense. In free societies, we are legally entitled to our opinions. But that doesn’t mean we can use that phrase to simply excuse irresponsible believing, because irresponsible beliefs can harm others. And our own freedoms to do what we like run into a limit when the well-being of others comes into play. So if you’re in the habit of saying, I’m entitled to my opinion, when somebody questions your beliefs, you’re actually misusing that idea to gain yourself a degree of immunity from criticism. But that actually enhances your susceptibility to falsehood and deception.

Eric (09:25.079)
Mm.

Eric (09:41.568)
Hmm, interesting. What’s the vaccine, I guess I would say, against that immune-disruptive idea?

Andy Norman (09:52.658)
Yeah, so for that particular one, I argued that if we’re clear about the distinction between legal rights and moral rights, then we can prevent that platitude, everyone is entitled to their opinion, from basically putting an end to the kind of critical conversations that we need to filter out the worst ideas.

So if you’re trying to have a conversation and encouraging somebody to rethink something, and they just go, go away, I’m entitled to my opinion, they’re preventing the kind of conversation that could actually help them enhance their own cognitive software, so to speak. Our computers get software updates sometimes every week or every few days.

Eric (10:40.642)
Mm-hmm.

Andy Norman (10:49.937)
And they need those updates to stay adaptive in our complex information environment, to function well. If we don’t do the same thing, if we don’t continually debug our minds, our minds quickly become maladaptive and they don’t serve us well. So right now I’m thinking about writing a book about how to debug the mind, on the theory that we’re just not doing enough of it, and not doing it well enough, these days.

Eric (11:20.366)
Well, that is another attractive metaphor. Just to play it out a little bit, another bug, if you want to call it that, or at least a dead end to a reasonable, challenging conversation, would be religion or other kinds of emotional beliefs, say patriotism, both of which you’re not supposed to challenge.

And arguably both of those things have served us well and are useful in many contexts you can imagine. Are they bad things, in your opinion?

Andy Norman (12:04.197)
Yeah, so I argue in the book that the word faith has a kind of double meaning. On the one hand, it can mean trust in your fellow man; to have faith in others is, I think, for the most part, a good thing. And if we want to celebrate faith in that sense, that’s great. And to the extent that the world’s religions promote faith in one another, trust in one another, you know, more power to it. But often the concept of faith is used to excuse dogmatic adherence to dysfunctional ideas.

And by dysfunctional I mean both false and harmful. And when the concept of faith is used that way, like if that’s an article of faith for me, so don’t ask me to rethink it, if that becomes part of your thinking, you’ve compromised your own mind’s immune system. And I’ll say, yes, this aspect of my thought has some pretty challenging implications for the world’s religions.

Eric (12:42.846)
Mm-hmm.

Andy Norman (13:03.805)
and also for our political ideologies. I think if we’re gonna think our way past the kind of polarization and ideological rigidity that has our society functioning so poorly right now, we need to go pretty philosophically deep and rethink some very core questions about what it even means to be reasonable.

Eric (13:26.978)
I would love to dig into that right now. What does it mean to be reasonable? What makes a belief reasonable? What are the earmarks of a set of standards for reasonable belief?

Andy Norman (13:41.285)
Yeah, the basic story here is actually surprisingly simple. Many people, including some of the top philosophers and scientists in the business, have this idea that a reasonable belief is one that is supported by good enough reasons or good enough evidence. And that’s not wrong exactly, but it creates a mental picture in the mind where the goodness of any one belief depends on the goodness of the things supporting it…

Hold on.

But that, of course, just pushes the question down to the premises of the argument, to the evidence. But we can always ask about evidence, well, what makes that so good? What makes those reasons good reasons? So there’s an alternative way of thinking about reasonableness. And that was put forward by Socrates way back in ancient Greece. And the idea here is that a belief or a claim is reasonable if it can withstand tough questioning.

Eric (14:42.422)
Hmm

Andy Norman (14:43.973)
And so if you can subject a belief to rigorous questioning and it withstands that and still looks pretty tenable, that’s a far better measure of reasonableness than the one that many of us default to reflexively. And to shift your thinking in that way, I think, can go a long way towards inoculating your mind against some really seductive but problematic ideas.

Eric (15:15.688)
We’ll talk about that. But I assume you’re familiar with Jonathan Rauch’s Constitution of Knowledge, Jonathan Rauch being another guest on In Reality. And he certainly agrees with the premise that subjecting any claim to sort of rigorous disproof, you know, falsifiability, if you will, is the measure of…

…the solidity of any particular claim. Opening your mind to that kind of constant questioning from others, and also within yourself, is a difficult mental habit to cultivate. How do you do it?

Andy Norman (16:05.091)
That’s a really perceptive question, Eric. I think to many, it can feel destabilizing, right? Like, if I’m supposed to question everything, what if the bottom falls out? What if my whole belief system falls into disarray? I think that’s a common fear. And it’s a fear that most undergraduates bring into their first philosophy classroom. And what you have to do to reach them and teach them how thrilling and beautiful…

…and enlightening philosophy can be is to teach them to let go of that fear. You don’t have to cling tenaciously to your core beliefs, because it won’t all go to hell. Your beliefs won’t go to hell in a handbasket if you start questioning even the core ones. It turns out that it can be done in a respectful, collegial way; you know, the key is to keep the questioning friendly rather than acrimonious.

If you can surround yourself with people who will gently challenge you to rethink things, but be supportive in your search to find alternatives that work, that continue to function well in the way that you need them to, that’s the key to continual intellectual growth.

Eric (17:30.062)
Andy, can you suggest some tactics to do that? I can imagine, and we’ve certainly observed this in political discussions, that when you challenge people’s beliefs, however unreasonable they seem, if those beliefs are tied up with their identity, their political identity, their sense of self, their self-esteem…

…it becomes a very difficult conversation to have. So in a practical sense, you could have the best of intentions, but it would be hard to communicate that when you’re challenging someone else’s beliefs. What practical advice would you give?

Andy Norman (18:12.177)
That’s exactly right. I think the most important thing we can do is learn to become better listeners. So I make it a point to always try to listen first and try to understand the point of view I’m facing in a conversation. And if you understand it sympathetically first, then you can begin the process of just asking gentle clarifying questions. Socrates was really good at this. He would ask basically clarifying questions: tell me more about what you mean by that. Turns out if you

deliver questions like that in ways that don’t seem combative, if you can keep a curious tone of voice, and if you can either affect or genuinely say, you know, I really want to understand your point of view because I’m sure there’s something to it, I’m sure that you’ve got valid insights to share, just help me understand them. If you come into difficult conversations with that attitude and postpone the moment where you say, yeah, I’m not so sure about that.

Show the other person that you respect them, show the other person that you care about them, show the other person that you’re willing to understand their reality. And then you can ask them to consider alternatives. And when you do, you don’t just attack and leave them with no alternative. You say, hey, what if we reframed your point, which I agree has some point, what if we reframed it this way? Because to me, that seems much less problematic but it still gives you what you want. Does that make sense? So one of the real masters of this is the blues musician, Darryl Davis, who you should have on your show, by the way. Do you know about Darryl Davis?

Eric (19:56.29)
Mm-hmm. Yes, it does.

Eric (20:04.814)
Yes, the KKK

Andy Norman (20:08.465)
Yes, exactly. He deconverted, I think, 100 Klansmen. Brilliant. Using basically the listening strategies I’ve just detailed. Actually, there’s more to Darryl Davis than that. He’s also just enormously charismatic and persuasive. He’s a master at this.

Eric (20:29.024)
Uh-huh, uh-huh. So Darryl embodies, and what you’ve just described embodies, the way to have a conversation with someone who may be harboring unreasonable and unhelpful beliefs. But there are also your own tendencies, the sort of default setting, as we said earlier, maybe towards things like motivated reasoning and other sorts of mental shortcuts that

lead you down a path towards unreasonable beliefs. What are the mental exercises you should do to make sure that you’re building your own mental immune system?

Andy Norman (21:14.021)
Yeah, I mean, I think mind-debugging has to start at home. You should debug your own mind before you presume to try to debug others.

Andy Norman (21:28.505)
And look, nobody has a belief system that’s free of mind bugs, that’s free of falsehoods and motivated reasoning and confirmation bias. So problematic ideas creep into all of our belief systems, which means that every conversation we enter into, we’re bringing some baggage to it. And that baggage can distort our point of view. So you have to bring humility and a willingness to change your own mind.

So even if you reach a point in the conversation where you really want to change the other guy’s mind, you have to keep in mind that maybe it’s you that needs to change your mind. And so the kind of dialogue that really works to dissolve destructively rigid thinking is profoundly open and charitable. And you have to maintain that vibe.

Eric (22:03.843)
Ha ha.

Andy Norman (22:24.581)
throughout these difficult conversations if you want to change other people’s thinking. But that means you have to be willing to let them change your thinking. That’s the price. Price of admission.

Eric (22:37.296)
Let’s scale this question up to the culture at large. There are organizations like the News Literacy Project and school systems that try to teach the art of critical thinking and news literacy. Are those the paths to building up a society’s mental immunity?

Andy Norman (23:05.327)
Yeah, I think so. So we’re big fans of the NewsLit folks. We’re working closely with some of the best critical-thinking instructors on the planet. And we’re big fans of both critical thinking and media literacy. So yes to all of that. But I also think that the concept of critical thinking is in some ways outdated.

I think we need to move beyond it. It’s kind of a black-box thing. Nobody quite defines what they mean by critical thinking. There was a study of University of California professors, and they asked them, do you teach critical thinking? And 98% said yes. And then they said, well, what do you mean by critical thinking? It turns out only 20% could define what they meant by that. So it’s kind of a feel-good term.

We all just assume that the kind of thinking we do, that’s the truly critical thinking; the kind of thinking they do, yeah, that’s not sufficiently critical. Now, we have to get past those tired and kind of self-congratulatory beliefs. Critical thinking is a good thing, but it’s a limited concept that doesn’t do much to clarify what it means to think well. And to see that, it helps to realize that thinking can be too critical…

…for its own good. Think about the conspiracy theorist who thinks to challenge all kinds of things that you just take for granted. In some ways, they’re being more critical than you are, but are they thinking better? No. In other words, out-of-control questioning can consume your mind in the same way an autoimmune disorder can consume your body. So you need to learn judicious questioning, questioning that goes far enough but not too far.

Eric (25:05.742)
…Well, that does raise the question of trust. And I, for example, believe that vaccines by and large are great inventions and have saved millions of lives over history, and probably saved tens of thousands of lives during the COVID crisis. But I believe that not because…

…I’m an epidemiologist or have done research into, you know, mRNA molecular realities, but because I trust that the scientists who came up with these vaccines and the journalists who reported on their effects are trustworthy. There’s a leap of faith there. And I wonder how, in a mentally immune state, you handle those leaps of faith.

Andy Norman (26:02.799)
Yeah, I’m gonna push back gently on this and suggest we not use the phrase leap of faith here, since that can be confused with the kind of obstinate irrationality we talked about before. But there is an extending of trust where you don’t have complete information. That’s certainly true. And that’s an element of good thinking. The best thinkers trust that other people who have studied matters more get it basically right, for the most part.

Eric (26:12.898)
Hmm.

Andy Norman (26:32.719)
Yeah, in fact, there’s no way in this complex world, with so much division of cognitive labor, that anyone can master and become expert in everything. We all have to rely on the expertise of others to think and function well. So yes, I’m using slightly different language to describe what you’re calling a leap of faith, but I’m acknowledging your insight here. I myself have a great deal of trust in what the community of immunologists has done to improve public health.

Eric (27:06.836)
When it comes to the question of trust, how do you give it judiciously? What are the measures you wanna see before you extend trust to immunologists, or anyone else you decide you can rely on, for the best approximation of truth?

Andy Norman (27:30.747)
This is kind of the great big million-dollar philosophical question, right? How do you balance skepticism, suspicion, and trust? How do we balance criticism and faith? And I think the way to think about this is to be very context-sensitive, to treat each case on its own merits. Instead of just bringing broad principles into the situation and imposing them, the trick is to really be very present in the particular context you’re operating in, and look and see whether trust is merited. Is this source I’m trusting truly reliable? Well, let’s see what the evidence says.

So if you can resist the temptation to just bring sweeping assumptions into scenarios, and instead say, let’s actually get fully present and try to learn all the relevant things we can about this particular case, that’ll guide you well.

Eric (28:39.115)
If you don’t mind, Andy, let’s take a specific example. I care about media a lot, and media’s credibility, and worry about the trust crisis that professional journalism faces. Take some news reports, say, stories about the war in Ukraine. What…

…you know, what are the criteria that you would use in any specific situation, or about any specific source of news? You know, obviously we can’t go on the ground in Ukraine. We’re aware that media bias kind of persists across even the most well-intentioned journalism institutions. Give me an example of how you apply that sort of situational awareness when it comes to trusting the news.

Andy Norman (29:39.493)
Yeah, I don’t know if this will fully address your question, but there’s a wonderful story dealing specifically with Ukraine, with Russia’s attack on Ukraine. So prior to Russia’s invasion of Ukraine, the US intelligence community caught wind of Putin’s intentions to invade. They also learned that he intended to spin false narratives about…

…Ukrainian aggression. So they were gearing up to do a massive disinformation campaign to sway public opinion around the world in favor of their assault on Ukraine. The Biden administration decided to share this intel with leaders of nations all around the world. And what they did is they essentially pre-bunked Putin’s preferred narrative,

which had to do with Ukrainian aggression rather than their own. So by getting there first and priming people to be on the lookout for deceptive information, you can inoculate minds against some of the worst deceptions through pre-bunking, or mind inoculation, mind vaccination, if you will. And it was an amazingly successful operation

Eric (30:42.146)
Mm-hmm.

Andy Norman (31:08.773)
that helped to ensure that Russia has very few allies in its assault. So that’s, I think, one of the shining examples of the use of information to pre-bunk, or inoculate against, later-arriving deception. So I think there are morals there. But again, I’m not sure that fully addresses your question, but perhaps it partially does.

Eric (31:16.887)
Mm-hmm.

Eric (31:38.05)
Well, applying that to sort of your daily news consumption, you might say, and I’m freelancing here, but you might say, I am going to make an effort to be aware of the biases of the news sources I have. And I should anticipate, just to draw an example from recent history as we are recording this, that the…

…the nomination of Kamala Harris is gonna be greeted with a great deal more joy among certain left-leaning media sources than it will be by right-leaning media sources, and that I should, you know, dial down my belief system on both sides. That might be one way to sort of pre-bunk news that you can anticipate, or news attitudes.

Andy Norman (32:31.725)
Yeah, I like the way you’ve developed that. I think being constantly aware of the motives, or the likely motives, of the sources you’re trusting is important, because there are people out there with less-than-pure motives, and they will use misinformation to deceive. We call that propaganda in some cases. There’s pseudoscience that will try to sell you nostrums that don’t really work.

Eric (32:44.077)
Hmm.

Andy Norman (32:58.885)
There are a thousand different ways to deceive people. And our minds are in a constant battle to develop immunity against the new variants, if you will.

Eric (33:13.53)
Let’s talk then about the science of mental immunity, which you believe should develop a lot of adherents and scientific credibility. Tell me about the research that you envision needs to be done, and the questions that need to be answered by this fledgling science.

Andy Norman (33:35.035)
Yeah, so my term for this new science is cognitive immunology. And I think it can be as revolutionary as the original immunology was. What original immunology did for our bodies, I think cognitive immunology can do for our minds. And by the way, it’s not just millions of people that vaccines have saved, it’s billions. If you look into it, the numbers are amazing. So cognitive immunology has three basic questions as

I conceive it. And again, this is a discipline that’s just taking form now, and I’m privileged to work with some of the people who are on the cutting edge of this. But I would say that what we need to understand scientifically is, how do our minds filter out problematic information? So how do mental immune systems work? Why do they fail?

Why is it that they sometimes fail in such spectacular fashion? How is it that our minds’ filtration systems can sometimes get things so horribly wrong? And what sorts of forces hijack or manipulate them to make them go wrong? I’m gonna wrap those together into question number two. The third one is, how can we make them work better? So what do we have to do to our minds to boost mental immune function, to get healthier

Eric (34:48.617)
Mm-hmm.

Andy Norman (35:00.454)
idea filtration and information filtration. It turns out that by looking at the history of immunology of the body, you can learn all kinds of interesting things about how we can approach this problem.

Eric (35:16.985)
Interesting. For example?

Andy Norman (35:20.699)
So you had mentioned earlier something that I talk about in my book, that certain ideas can disrupt mental immune systems. This is a term I coined just by thinking through the analogy. Once you realize that the analogies between the two domains are extremely strong, you start to experiment with concepts like mental immune disruptor. There are things that enter our bodies and disrupt our bodies; well, it turns out there are things that enter our minds and…

…disrupt the functioning of our minds. So simply by coining the term mental immune disruptor, you begin to become more aware of the ways in which ideas manipulate us. If we want to maintain our autonomy, our freedom, if you want to be somebody who can think of yourself as a genuinely free thinker, then you can’t let bad ideas just pull all your marionette strings.

Right? So we need concepts like mental immune disruptor and mental immune booster, mental immune disorder, mental immune dysfunction, and healthy mental immune function. We need these concepts to think clearly about thinking. And I’m suggesting that we develop a new paradigm that brings all of these concepts to bear to help us think more clearly and more capably.

Eric (36:34.412)
Hmm. Hmm.

Eric (36:45.632)
Looking back over history, there are, as you’ve pointed out, and as you pointed out in the book as well, many, many instances of mental viruses doing great damage. You can think of many examples from the 20th century: communism, Stalinism, and Nazism, the religious wars in Europe.

Eric (37:14.418)
The list goes on forever. Myanmar, in just the past couple of decades. Are you optimistic about the future of the species gaining mental immunity, in the face of tendencies that have evolved over millennia and also the sort of digital environment that makes infection widespread?

Andy Norman (37:19.281)
That’s right.

Andy Norman (37:36.591)
Yeah.

Andy Norman (37:42.833)
I love your questions, Eric. That’s such a great question. So look at history and examine the times in which deceivers grabbed onto a powerful new information technology and weaponized it. Religious zealots weaponized the printing press, and that led to 150 years of religious warfare that tore Europe apart and ended up killing about a third of…

…all Europeans. Hitler and Goebbels latched onto radio and television and used them to completely derange an entire society. And that led, of course, to both the Holocaust and World War II. And in Myanmar, it was zealots who weaponized Facebook, leading to genocide. So when new technologies come along…

…bad actors will seize on the possibilities and begin to develop ways to take advantage of others. And time and again in history, human beings have evolved defenses to rebalance things. So we eventually brought the persuasive powers of the printing press into balance again and overcame the wars of religion, and that led to the Enlightenment, a huge, wonderfully progressive time.

We eventually beat back Hitler and Goebbels, and that led to the great peace. It’s been a wonderful time of human progress since then. I think we can do this again. The internet is a big can of worms and the challenges are many, but I think that in the same way we evolved immunity to those other forms of deception, we can do it again and emerge from this

happier and healthier than ever. I don’t want to seem like I’m whistling past the graveyard here. I have days where it all seems too much as well, but on the whole I’m hopeful.

Eric (39:54.818)
All right. I think that’s a great place to end it. Andy Norman, your book is Mental Immunity. I really enjoyed this conversation. Thanks so much for being on In Reality.

Andy Norman (40:06.703)
My pleasure, Eric. Let’s do it again sometime.


Created & produced by: Podcast Partners / Published: Aug 8 2024

