00:00
[Music] Welcome back to Misguided: The Podcast.
00:16
I'm your host, Matthew Facciani, and today I'm joined by physicist and author David Robert Grimes. David's career spans medical physics and cancer research, statistics, public health, and science communication, along with a past life in music and acting.
In this
00:31
conversation, we dig into why the "just give people more facts" approach falls short, how conspiracies take hold, and what effective media literacy really looks like. We also talk about the social and structural forces that shape health far more than any single hack.
00:46
And also why changing our minds should be seen as a strength, and how to rebuild trust in institutions without simply demanding it. Let's dive into our conversation.
David, thank you so much for joining me. It's great to talk to you and meet you and explore all the really cool things
01:02
you've been doing over the years. And before we dive into some of your most recent work with science communication and critical thinking, I'm curious if you could just share more about your journey and how you got into science,
because my understanding is you did some acting and music and how did that turn
01:18
into medical physics? So, all I'll say is my career path has always been haphazard and confused, and that actually could be the story of my life.
As an undergrad I studied physics. I almost studied law, then somehow decided that law sounded terrible and went
01:34
into physics instead. Didn't really think about it.
Just thought that's interesting. I'll play with that.
I like doing maths, so I guess that's why I did that. Or as the Americans say, math, singular, but we say maths. Anyway, I ended up in university and was offered a PhD in medical physics, looking at
01:50
radiation dose modeling, or non-ionizing radiation dose modeling. I found that really interesting, ended up doing a postdoc in cancer research, and then moved into communication. But the whole time through college I was supporting myself with acting gigs, which is still a passion of mine. I mean, obviously I
02:06
have friends who went professional, and some of them are well known and they're great, but I didn't have whatever it takes to go that way. So, I stayed doing music and acting as a hobby, and I still do it and I love it.
But yeah, the science became a career path. Although I always joke that, as an academic career
02:23
path, it's about as unstable as a career in the arts. So, I might as well have gone either way.
So, I guess with your background looking at cancer research, how did you focus on that specifically from medical physics and
02:39
math and all that stuff? Did you start more topically, or more in the methods of math and physics?
So, methods are really important. I always joke, I mean, I'm not really a medical physicist. My PhD was partly medical physics, but it was never
02:54
something I practiced. It was just a mathematical question about how we get a dose that's beneficial for a patient while minimizing the chances of inducing cancer.
And that's a classic minimization problem if you want to put the really nerdy slant on it. But my
03:09
real interest was using mathematics and physical methods to answer medical questions. So when I moved to Oxford to do cancer research with Mike Partridge, my old professor, a brilliant man.
He was originally a physicist as well and he'd been in cancer research for years. And the idea
03:25
was, you know, we were looking at radiotherapy and what could we do to improve it? What can we do?
What do we know about how tumors grow, how they use oxygen? Things like that, that we could apply these methods to, that might actually help patients.
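The dose question described here can be sketched as a toy minimization. Everything below is invented for illustration (the logistic curves and all their parameters); real tumor-control and tissue-complication models are far more sophisticated:

```python
import math

def tcp(dose, d50=50.0, k=0.2):
    """Toy tumor control probability: rises with dose (logistic, invented params)."""
    return 1.0 / (1.0 + math.exp(-k * (dose - d50)))

def ntcp(dose, d50=65.0, k=0.2):
    """Toy normal-tissue complication probability: also rises with dose."""
    return 1.0 / (1.0 + math.exp(-k * (dose - d50)))

def best_dose(doses):
    """Grid search: maximize P(tumor controlled) * P(no complication)."""
    return max(doses, key=lambda d: tcp(d) * (1.0 - ntcp(d)))

doses = [d / 2.0 for d in range(0, 201)]  # 0 to 100 in steps of 0.5
d_star = best_dose(doses)                 # lands between the two curves
```

With these symmetric toy curves the optimum sits midway between the two logistic midpoints (57.5): enough dose for control, not so much that complications dominate. That is the flavor of the minimization problem, not a clinical result.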
So I've always been really fascinated by
03:42
solving problems, and what I do is I have this nice arsenal of different tools. I'm a chartered statistician as well.
So I have statistics tools, mathematics tools, and a little bit of physical intuition, whether it's public health, which I do a lot of now, or vaccine research, or cancer research.
03:58
They're all very disparate problems, but you can use very similar tools to try and shed light on them. So I work a lot with medics, with oncologists, with clinicians of different types, with public health doctors, and they'll say, "Look, let's say with public health, how
04:13
do we vaccinate the most amount of people in a certain time?" And I'm like, "Aha, there's a branch of maths for that." And that's really what I like doing. It's coming at medical problems from a slightly different angle.
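A toy version of that kind of question, vaccinating the most people in a fixed amount of time, might look like the greedy sketch below. The clinic names, throughput rates, and hour caps are all invented; the real versions of these problems live in operations research:

```python
def allocate(hours_available, clinics):
    """Greedily assign staff-hours to clinics with the highest throughput.

    clinics: list of (name, doses_per_hour, max_hours) tuples.
    Returns (total_doses, plan) where plan maps name -> hours assigned.
    """
    plan = {}
    total = 0
    # Highest throughput first: each hour goes where it vaccinates the most.
    for name, rate, cap in sorted(clinics, key=lambda c: -c[1]):
        h = min(cap, hours_available)
        plan[name] = h
        total += h * rate
        hours_available -= h
        if hours_available == 0:
            break
    return total, plan

# Invented example: 12 staff-hours across three hypothetical sites.
clinics = [("city", 60, 8), ("suburb", 40, 8), ("rural", 25, 8)]
doses, plan = allocate(12, clinics)
```

Greedy is optimal here only because doses scale linearly with hours; real planning adds queueing, supply, and geography, which is exactly where that "branch of maths" earns its keep.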
But to answer your question, you get a similar set of tools. And even though the problems are very different in their scope and their
04:30
application, there are underlying mathematical and physical truths that you can map from one to the other. And I guess that's probably what drove me to physics when I was an undergrad: it gave you a passport to do a lot more than just physics, because I would be a terrible physicist.
I guess as a jobbing physicist
04:47
I would suck because I'm only really interested in medicine but I'm very interested in using mathematics to help medicine. Gotcha.
Yeah, that makes a lot of sense. That's a really interesting connection.
So, in your academic work, you you're bridging, you know, math and the medical
05:03
world, and I'm curious how that translated then into your science outreach and science communication. Was that always an interest of yours, or, as you were doing your PhD and postdoctoral work, was that something that kind of evolved over time, seeing, you know, all the pseudoscience out there?
Was
05:19
there one particular thing that got you interested in doing that? There's a few things that kind of triggered it.
And again, like I said, haphazard and disorganized would be the story of my life. This is a chaotic, non-optimized system. But what I would say is that I guess maybe the
05:35
theater, or maybe the fact I was big into English literature and things like that, but communication was always something I found relatively intuitive, and I was always very good at repackaging concepts for people in ways they could understand.
Even when I was, you know, still doing a PhD and I was teaching or
05:51
demonstrating, I found that I was pretty good at that. But I started off with a mistake, a mistake a lot of people in science communication make.
I started off in my head with an information deficit model of the world. I realized I was incredibly privileged to get the education I was getting and the
06:08
opportunities I was getting. And around the age of 24 or 25, I naively assumed that if we just gave people more and clearer information, better decisions would be made.
And that is partially true, right? And that's the information
06:24
deficit model where people just don't have enough information. But this is before the rise of social media to the extent that it now dominates.
We now realize that the opposite problem is partially true: too much information makes it impossible to parse. People don't respond to messaging in a fully
06:40
intellectualized way. I mean, obviously you've written about this.
I've written a book about this as well. People, you know, there's a big emotional component.
So I started doing outreach work, just writing pieces, because occasionally journalists would hear me say something and go, can you explain that? That's a really easy
06:55
explanation, can you write something? But I would get feedback: people would read the piece online, I want to say it was in the Guardian or whatever else, and they'd write horrific things underneath it. That really struck me, and made me really fascinated, particularly with conspiracy theories. They were particularly obsessed with the notion
07:11
that if I wrote something, say, pro-vaccines or pro-nuclear energy or pro-fluoridation, that I was obviously a vested interest of some kind, and they had a weird conspiracy in their mind about it. And that got me really interested in studying the mathematics of how conspiracies would work, and,
07:28
spoiler alert, I did a paper on this in 2016 and it went viral. It turns out that even if you want big conspiracies to last, it's just really hard to do, for a whole load of, you know, nice mathematical reasons. But that back and forth, and realizing that my model, that this is just an information
07:44
deficit problem, was wrong. Then I got really interested in what is correct here, and it turns out that human psychology is a massive factor in this. We don't just interpret messages in a dispassionate, intellectualized, homo economicus kind of way.
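The conspiracy-longevity point a moment ago can be gestured at with a much simpler toy than the paper's actual model: assume each of N conspirators independently leaks with some small probability per year. Both numbers below are invented for illustration.

```python
def p_exposed(n_people, p_leak_per_year, years):
    """Probability at least one of n_people leaks within `years`,
    assuming independent leaks at p_leak_per_year per person per year."""
    p_all_silent = (1.0 - p_leak_per_year) ** (n_people * years)
    return 1.0 - p_all_silent

# Even with very discreet conspirators (a 1-in-2000 annual leak chance),
# scale is fatal: a small plot might hold, a huge one almost surely won't.
small = p_exposed(n_people=50, p_leak_per_year=0.0005, years=10)
large = p_exposed(n_people=100_000, p_leak_per_year=0.0005, years=10)
```

Here `small` comes out around 0.22 while `large` is effectively 1: a crude version of the "nice mathematical reasons" why big conspiracies tend to surface.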
All the information we get we parse through our
08:00
own emotional filters, our own ideological worldviews. We have a nasty tendency toward, you know, confirmation bias and motivated reasoning, where we will bend things to fit our worldview.
So I've now gone back to the idea that if we can just in some ways get the spirit
08:16
of science in there, and that is scientific inquiry: that we are willing to change our worldview or change our ideas. But then we have to divorce ourselves from our ideas, and the problem is, for a lot of us, our ideas become ourselves.
You know, the silly example I always give is, you know,
08:33
you support your local football team, you love your local football team, and someone goes, those guys suck. Why do you feel personally offended when they say that?
Because you've put emotional capital and intellectual capital into identifying with that banner. Ideas are not a million miles off.
We get very
08:48
passionate about our ideas. We shouldn't. We should be promiscuous with our ideas, and willing to discard them at a moment's notice if a better idea comes along.
But that's not how we've been, you know, cognitively built. So that's a bit of a challenge.
Yeah. Yeah.
Absolutely. I had a similar
09:04
journey of grappling with the information deficit, thinking, oh, if only I could just give clearer and more concise scientific knowledge to people, that would help. There is a bit of a learning curve in seeing that, and yeah, so much of
09:21
it is filtered through the lens of identity. That's a big part of my work: understanding how these identity processes influence us and how we can understand them. So I'm curious, as you've been in the space for a while. I think you wrote your book in 2019 if
09:37
I'm correct. Yeah.
The first the first edition. The first edition.
And I'm now working on another book, so you know the joy of it.
I do. The joy.
The world is burning and you're trying to hit deadlines. I know.
It's a surreal experience.
09:54
so I'm curious about your thoughts on the evolution of this space, because, you know, there were so many problems in that world of pseudoscience and misinformation and conspiracism, broadly speaking,
10:10
10, 20 years ago, and now it seems like in some ways it's gotten worse. There's also been more attention to it.
So I'm just curious about any insights you want to share from the last 10 years or so of being in this space. Have
10:25
you seen good things or bad things, or changes in a meaningful way? So I wear two hats on this, because I'm still very much an academic who researches this stuff.
I always think I do science communication as an outreach thing, but I'm technically just a
10:40
scientist who is okay at communicating. But what I have noticed is this, and I don't mean to sound Cassandra-like, but at the end of my book, and I actually said it at the beginning as well, I said this is going to get a lot worse.
So I wrote this before the pandemic and I wrote you know the problems we're
10:56
discussing here are going to get a lot worse. And I remember an editor saying to me, yeah, but we're getting better, so is that not a bit negative?
And I said, okay, I will justify that statement, because it is going to get a lot worse. The same editor rang me just three months into the pandemic and went, you know,
11:12
I think you were right. And I said, I know. But the reason why, and this is really important to realize, is that the signs have always been there that this is going to get worse, and we're not at the zenith, or the nadir depending on how you define it, of what the worst will be. We are
11:29
overwhelmed with information that we are particularly bad at parsing. We are particularly interested in information that makes us angry, information that makes us react, that elicits some kind of visceral reaction.
However, we're very bad at asking the fundamental questions of that information. Where did
11:46
that come from? Is it reliable?
Is it in context? Has it been framed in such a way as you're trying to provoke an emotive reaction rather than a rational one?
Right? So we have social media, which is, you know, absolutely blasting us with this stuff, and we don't
12:02
have the critical thinking skills and tools yet as a society to not be affected by that. We have terrible information hygiene all that kind of stuff.
But the seeds of that were very clearly sown, I think, years ago. I remember being in 2016 when that conspiracy paper went viral.
I was on
12:18
the BBC, and it was about conspiracy theorists. And the host of the show asked me at one stage, you know, conspiracy theories though, it's a minority thing, isn't it?
It's all these weird people that live in their mother's basement and wear tinfoil hats. And I was like, well, look, the data would suggest
12:33
that a quarter of the population, roughly, are susceptible to conspiratorial ideation, which is, you know, that they will resort to conspiracies as an explanation even when more mundane explanations are possible. Well, that's a very simple, handwavy version that Matthew can definitely correct
12:49
better. But I said, if a quarter of the population is susceptible to that, well then conspiracy theorists are your brothers, your mothers, your cousins, your friends.
And again, those seeds were always there. We saw it politically in the 2016 election, which was absolutely dominated by disinformation
13:05
and people falling for that. And again, the pandemic, I think, has laid it bare.
We currently live, we're recording this in 2025, in a time where Robert F. Kennedy Jr.
is somehow the Health and Human Services secretary. Right?
We are not at the worst of it yet. But it's all
13:21
symptomatic of the same problem: that we have a terrible ability as a society to cope with being overwhelmed with bad information, that we fail at those basic critical thinking skills, and that it has allowed bad actors to bring us to where we are. So, you know, as for the evolution of that space, it's
13:38
all gotten worse, but the prognosis was always that it would get worse. I don't think we should be shocked by that.
I think we should be shocked at how poorly prepared governments have been for this, how poorly prepared learned institutions have been. I mean, every week I'm advising, you know, different
13:56
learned bodies who don't seem to realize how much of a problem this is. They'll go, "Oh, but people won't misunderstand that." I said, "They won't have to.
It will be misunderstood for them." I go, "You have to be on guard for how this information will be understood." At the moment, you're on the back foot. So, we still have a very
14:13
reactive rather than proactive approach to misinformation and disinformation. Yeah, I completely co-sign all of that.
I've been having a lot of similar thoughts, as I also saw a lot
14:28
of this coming and getting worse with time. You know, you speak of mathematical modeling; there are probably some papers you might have seen about the asymmetry between misinformation and factual information on social media, just
14:43
mapping out how it would go over time, and how the network clusters of misinformation would dominate because they're so emotional, so viral, while the factual ones are kind of insular and don't really spread as well. That was in 2015; I think it was Walters, I can't remember his surname, there was a paper
14:58
on that that literally looked at that, and people said it was being pessimistic, and I remember reading that paper going, I think he's being optimistic. Yeah.
Right. Like, if anything, some of those papers were underestimating how quickly this would happen, and the scale.
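The asymmetry described here can be sketched as a toy branching process. The share probabilities, audience size, and number of generations below are all invented for illustration; the point is only that content re-shared slightly more often per view ends up with a vastly larger reach:

```python
import random

def simulate_reach(p_share, followers=20, generations=6, seed=0):
    """Each sharer exposes `followers` people; each exposed person
    re-shares independently with probability p_share."""
    rng = random.Random(seed)
    sharers, reached = 1, 0
    for _ in range(generations):
        exposed = sharers * followers
        reached += exposed
        sharers = sum(1 for _ in range(exposed) if rng.random() < p_share)
        if sharers == 0:  # cascade dies out
            break
    return reached

runs = 50  # average over seeds, since single cascades are very noisy
calm = sum(simulate_reach(0.03, seed=s) for s in range(runs)) / runs
viral = sum(simulate_reach(0.08, seed=s) for s in range(runs)) / runs
```

With 20 followers per sharer, a share probability of 0.03 gives a subcritical cascade (reproduction number 0.6) that fizzles, while 0.08 gives a supercritical one (1.6) that compounds: a crude version of why the emotive clusters dominate.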
So it's really unfortunate, and yeah, I was thinking the same thing during the
15:15
pandemic. I was just like depressed but also not completely surprised by the lack of institutional preparedness and the lack of investment in communications and social science.
And here in the United States there were people that admitted that after the fact. I think Francis Collins, the leader of the
15:32
NIH at the time, was saying, "Oh, we didn't think that, you know, there'd be so many people that were against the vaccine," and maybe we should stop at "we didn't think." End the sentence there. Well, yeah.
It's just like there's this
15:47
lack of incentives. There's this whole macro discussion of why that's the case, why we don't incentivize science communication and science outreach, and how it's always reacting instead of being proactive.
And now we're to the point where yeah,
16:02
universities are being attacked very aggressively in a way we haven't seen in the states for a while. And research funding is cut in a way that's so much more extensive and at scale.
And now finally universities here are like maybe we should actually be more
16:19
proactive and learn how the communication landscape works in this digital age and this AI age. And it's like, okay, great, finally. It makes me wonder: are we always destined, as humans, for cycles of boom and bust, where things get good and then we
16:35
forget about being thoughtful and proactive and then we wait until they get bad and then we try to fix them. I just wonder if we could ever break that cycle.
I think, weirdly enough, there's a philosophical answer to it in some ways; it just popped into my head there, and I've mentioned it before, but I always think of
16:53
Popper's paradox of tolerance, right? You probably know this well, but I'll say it for listeners who don't.
He was writing after the fall of Nazism; he'd lived through it and he'd had to flee and everything else. And he was asking what happens to a society, you know, where we want to be open-minded and tolerate everything.
You know, we want to be
17:09
liberal and progressive, but what happens if you have a set of actors in there who are willing to undermine that? How long do you tolerate them?
And his argument was, if you tolerate intolerance in a society, in his model the intolerant will eventually come to dominate, and there goes your entire tolerant society. So
17:24
he said you should actually resist, say, fascist views. You can be tolerant, but you should still reject things that would undermine that tolerance. I think we have a similar parallel with information.
I think we have a thing, and this goes back to the flaw of the information
17:40
deficit model, where we assume more information is always better.
But what if the actors putting out information are putting out poisonous stuff? If you get a vat of wine, right, and you add one cup of poison to it, you've just poisoned the wine.
You haven't suddenly got like mainly okay
17:56
wine, right? And I think information is similar.
And again, we have a problem where we have our, you know, our social media echo chambers, but we also have, you know, things like Joe Rogan, 100 million listeners, right? You have people like that platforming the worst people.
And the defense is
18:12
always, oh, we're just listening to alternative points of view. And I'm like, but you're not, though.
That's advocacy without any pushback, without any editorialization. For example, you know, Joe Rogan, whether you like him or don't like him.
I'm picking him for example. I could pick on Charlie
18:28
Kirk. I would love to pick on Charlie Kirk. I could pick on any of these guys, right?
But they all do the same thing like, well, just asking questions. I go, but you're not qualified to ask the question, right?
You're not a good interlocutor, because you don't push back and go, sorry, where did you get that information from? A good TV host or good radio host, for all
18:44
the flaws of it, their job is actually to push back. If they have a politician on their show and the politician makes a bunch of statements, they should be, if they're doing their job correctly, going, "I'm sorry, that's not actually correct or how can you justify that statement?" They're supposed to probe.
But we've gone into the all information
19:00
is entertainment phase of existence where people will listen to a podcast and think they've learned something because, you know, they heard something about vaccines on it. I'm like, you probably didn't though.
It's unlikely you learned anything from that. And then people, you know, are appalled by that.
Like, but I spent like three hours
19:16
listening to that. I'm like, yes, and you probably learned nothing.
If anything, you probably polluted your information source. You've added poison to your wine.
And because you don't have the skills or the wherewithal to parse that and to and to play that like skeptical agent, you're now disinformed.
19:32
So, it goes back to that weird parallel to the paradox of tolerance. We've allowed bad actors to absolutely overwhelm us, and this is not a defense of traditional media; in many ways that was failing for years.
But the one thing that we should give them credit for is that most journalistic
19:48
organizations have some modicum of fact-checking, some modicum of journalistic standards and integrity. We've thrown that by the wayside and saw it as freedom, and actually it's imprisoned us, and that's a real problem.
Yeah, it's definitely been concerning to see this most recent wave of
20:05
podcasts hosting all types of people without any pushback. And again, maybe this is an overcorrection of some of the issues with traditional media, where people go to independent media and the openness that comes with it, and then
20:21
we become so open that we just let whoever is on there say whatever without any pushback. I then wonder if there's going to be a correction to that, where instead of just hosting, you know, whoever is spreading all kinds of lies or whatever
20:36
agenda they have, and having them on just because they're controversial, it's getting to the point now where at least some audience members of, like, Joe Rogan are saying, "Maybe you should push back whenever you have, you know, the FBI director on your platform instead of just letting him say whatever." Like, you're just being used
20:52
as a political tool at that point. Yeah.
But that requires a savviness on the part of the listenership. And I have to phrase this carefully, because if I do it wrong it'll sound snobby, and that's not what I mean.
What I mean is, the skills you need to critically parse information are not intuitive. They have to be trained in.
21:09
So we have to learn from an early age to go, hang on, you know, what questions should be asked here? So in Finland they started doing this very early on.
They're teaching kids media literacy, and they do it as an exercise, going, look, let's push on this, and they start very
21:25
young, I think around the age of six, doing this. It turns out that Finland, for that reason, is a country with much higher trust in traditional media institutions, because people learn very early on something about fact-checking, something about journalistic obligations and
21:40
integrity. Your job, if you're the Joe Rogan, if you were actually a journalist, would be to push back when people make a claim, or at least ask for clarification and go, sorry, you've said this, is this true, where did you get it from? The hard questions, as we used to
21:55
call them back in the day. I make myself feel very old saying that, but it requires training and understanding to know why we ask those questions. It's not just to be pedantic; it's so that we don't get taken for a ride by bad actors. The problem starts in our education system: from day one we don't
22:12
teach media literacy, or adult literacy, full stop. Now, I can't speak for the States, but I know in Ireland, which is a very highly educated country, about 23% of our population will struggle to read the directions on a packet of aspirin. All right? And that's basic literacy. When you go into
22:27
greater forms of literacy, digital literacy, media literacy, the adult literacy goes way down.
So, for media literacy in Europe, about 50% of people are media literate, meaning 50% aren't, right? It depends on the country, it varies
22:44
a bit, but it's usually around there. So, for example, I will know that something that's in the New York Times is probably more reliable than something on my racist uncle's Facebook page.
A lot of people won't. Now, I don't actually have a racist uncle.
That's very mean. But it's the analogy, you see.
I
22:59
mean, when the alt-right stuff gets shared, if you don't understand source integrity, you're going to get taken advantage of. So, we need to be training people from day one in our education system.
The only way we can immunize ourselves against this stuff is that we
23:15
have to start really early going, "Look, there are people online and offline who will take advantage of you, who will feed you a diet of nonsense, to make you pliable, to make you buy stuff, to make you scared, because it generates profit. Here are the questions you can ask, not to be
23:31
cynical and just dismiss everything, but when you hear information, here is how you should stop and wait and think and analyze it." So I think that's a massive gap in our system. I see it in Europe and I see it in the States when I'm over there as well. We need to look at
23:48
media and digital literacy, and at the moment we struggle with basic adult literacy, which is a real problem. It means we've underinvested hugely in our educational systems.
And I think there's a huge argument that we need to go right back to the drawing board and make sure that everyone is protected against this
24:05
immunized against this information that will harm them and harm their society in the long run. Yeah, absolutely.
I mean, I I've been studying media literacy and working on different games and tools and pre-bunking stuff to to try to get that
24:20
at scale. But even if you develop some nice game that helps people learn media literacy online, it's a tiny drop in the bucket, a drop in the ocean compared to the waves of disinformation and misleading content online. So, I
24:38
always think, how do we do this at scale? And you're right: ultimately it starts with the public education system, having curricula where media and digital literacy are built in, instead of trying, after the fact, to push these games or tools out to people who, you
24:55
know, may not click on them if they're not interested in media literacy. So it's always, how do you reach the people who need it?
Yeah. And the cynical argument you sometimes hear from people, you used to hear it occasionally years ago and I always hated it back then, was, what do
25:11
you care if some people are, you know, idiots? So firstly, I don't think they're idiots, right? I think they're unfortunate; they've been misled on things. But I always point out that that's like saying, what do you care if some people have this disease? We all live in a society, we all vote, right? We
25:26
all shape the future for ourselves and our children and everything else. So it's not enough to just dismiss people and say, oh, those people are silly, and that's a polite way of saying it; people say much worse.
I'm like, we need to understand why, and we have to realize that these aren't just
25:44
people that we can dismiss as being fools. These people have been taken advantage of.
They've been exploited. They've been manipulated.
They've been misled. And because of that they are likely to do harm.
This goes back to, and this is why,
26:00
the libertarian model has never worked for me. We are all part of a society.
No one works in isolation. So, you know, for years it was the tinfoil hats in the basement thing we used to laugh at.
I'm like, well, actually, we're all interconnected and if people have, you know, it's like a disease
26:16
outbreak, if a rake of people get infected. I love the term viral disinformation, because I think it fits in a lot of ways. I work a bit with Sander van der Linden, you probably know him, and I think it's a great term, because it actually does transmit. You can spread
26:31
it, you can infect people, innocent people, with falsehood, and they can make terrible decisions about their well-being. I work mainly in the health space, so I usually see it in terms of health.
I mean, I've dealt with patients who have died because they've gone down the road of
26:48
believing something they read in a Facebook post or on TikTok. And I mean, I wish I could say I've only seen that a few times.
Over 10 years, I'd say I've seen it about 100 times.
And it's horrific, you know? So, you can't just dismiss people as being idiots.
People are
27:03
scared. They're vulnerable.
They're looking for information, but if they don't know how to look in the right places, and they don't realize that there's bad actors who will manipulate them, and they don't know what questions to ask, they're vulnerable. So, we should be looking to places like Finland and going, "What can we do early on?" Because if you learn these skills early
27:19
on, and Matthew, you obviously have them brilliantly. I mean, this is bread and butter to you, right?
But it becomes much less effort if these skills are intuitive to you, like you've learned them so early.
Like, if I read a newspaper article now, even in a reputable newspaper, I will often ask:
27:37
have they asked the right questions of the sources they're quoting? Have they shown due skepticism and due fact-checking on the back of it as well?
Right? And sometimes they fall short and you can see and go, well, that was bad.
But it's become so natural that you don't think about it. If I read a post online, I'm like, okay, that's wrong.
I know what
27:54
source to look at and I can check it. There's a disparity there.
Or they're obviously doing this for rage bait or clickbait, whatever else, and I can dismiss them entirely from then on. But if you don't know that stuff, how would you ever parse it?
Like, the most common comments I get on Instagram and
28:10
Twitter and things are people saying, I'm overwhelmed, I don't know what to believe.
And that's a problem because we live in a site, it's only going to become more overwhelming, particularly as AI generated slop and disinformation starts to feed into the already terrible amount of disinformation we had. So, we needed
28:26
those skills like yesterday, but we definitely need them now. Yeah, I totally agree.
And I've been trying to do my part in sharing these skills with as many people as I can. And I think one of the big challenges is also addressing those other social and emotional factors
28:44
that influence why people are susceptible to these falsehoods. So there's the deficit of critical thinking skills, but I also often think about the needs that a lot of this health misinformation can
29:01
meet for people, in terms of feeling a sense of control and agency. And I was curious about your thoughts on how science communicators could do a better job of providing that sort of agency. I've been thinking a lot
29:17
about how even the term "do your own research" is now a pejorative, like, "oh, people don't really know how to do their own research." But what I think is behind a lot of that is a sense of agency: I took control over my health, for example, and I
29:32
was the person who sought this out and that's why I feel personally connected to this information. And in comparison, if they're just reading about some health guideline from a medical organization that's just telling them you should do this, it doesn't feel as personal.
They don't feel as in control
29:49
of it. So I'm curious how we navigate that emotional need for agency and control, which can somehow become disconnected from reputable health information.
Whereas like we have this like human need where we want to feel like we're the ones actually doing
30:06
the stuff and feel personally connected to it, of course. So the first thing I'd say is, I understand that urge, but I'm someone who knows a fair bit about medicine from what I do, and when I get sick, I go talk to a doctor, right? And I don't tell them what to do.
I
30:21
suggest and have a conversation with them: "I've read something.
What about that?" But whatever; when my car is damaged,
I might know a bit about how a catalytic converter or an engine works, but I go to the mechanic, right? I might know what I want my house to look like, but I get a carpenter in to
30:38
do the woodwork. I don't try to do it myself, right?
Because we realize, at least intuitively, in most situations that we can have some agency, but the experts in those fields, the reason we pay them well and go and solicit their advice, is because they
30:53
actually know more than us. They know more than us after watching a few TikToks.
So firstly, we need to stop denigrating expertise, because watching three TikTok videos does not a medical degree make. Right?
And this is not to put people down. This is the reality of it.
There are subjects I think I know until I go and speak to people
31:09
who really know them, and I walk away going, "Yep, I don't know that at all. That person is so much better informed than me," right? And that's okay.
We're not experts in everything. So the first thing is to realize that no one's trying to do you out of anything.
So a lot of the "I took control of my own health" narrative
31:26
is because, and you'll see it a lot, the most common forms of health disinformation start with "here's what doctors don't want you to know." Mhm.
Why would a doctor not want you to know that? Of course they would if it was true.
But what you're doing is you're setting up a narrative to put you at odds with the experts that are trying to guide your healthcare and trying to help
31:42
you, right? And so the first thing is that agency is working with experts.
It's working with your medical team, not working against them. And you can be proactive.
I think it's really good when patients go, "Hey doctor, I read this." In my experience, particularly in cancer research, a lot of patients go, "I saw a video
31:58
about vitamin C or this or whatever. What do you think?" And a good oncologist will go, "Look, that's really interesting that
you read about that, but the studies wouldn't support it, you know." And then they go, "Okay, cool." And then they've been proactive. But it's not like they're going to suddenly discover something and the oncologist will slap his forehead and go, "Eureka, why did I
32:15
never think of that?", right? You know, that's one thing. But the other thing is the determinants of health. I might lose an American audience on this, because I go very socialist on it, but it's true.
The amount of your health that you control, as in the things
32:31
that you do, the levers you can pull for your own health, constitutes about 20 to 30% of your total health outcomes. The other 70 to 80% are genetic, institutional, environmental, and hugely socioeconomic.
32:47
The biggest predictor of how your health is going to turn out is the postcode you were born in, because that tells you how rich you were, right? It's why America spends way more on healthcare than Europe does and has way worse health outcomes, even for the richest people.
The average American
33:04
dies at 79. The average Brit dies at 85.
Right? And again, you're spending four times as much on healthcare.
That is because the determinants of health are socioeconomic and they are psychosocial. They are complicated.
So this idea of "I took control of my health, I optimized it" seems
33:21
to forget that healthcare doesn't occur in a vacuum, that well-being doesn't occur in a vacuum, that things are interlinked. If you don't have, say, good primary care, or people in your society can't afford it, or there's a lack of environmental regulation that's making people sick, you
33:36
can spend all the money you want on healthcare, or look up all the videos on TikTok you want, and you're still going to die younger. And this is really powerful if you look at maps of US states. I had a talk recently where I watched a professor, who's Irish but works in the States in public health, and
33:51
he showed an area, I won't name the state, where in a single city, in areas separated by just 8 miles, the life expectancy in one area was 65 and in another was 82,
34:06
right? He goes, "8 miles away, these are different stories." And that's a problem, because what it's showing you is massive inequality. Systemic inequality is bad for healthcare in general. But this kind of libertarian model, that we can optimize our health, we can choose our health, is the most middle-class thing I've
34:22
ever heard, right? Because if you want to optimize health, get the poorest people out of poverty and get them good healthcare. You'll improve all of society, even the well-being of the richest.
And that blows people's minds. But I'm like, it's a system.
It's, you know, like there's no magical formula that you can just apply
34:39
to you. You have to apply it to a system.
And I think it's really hard for people to understand intuitively, particularly in a very individualistic society like the US, where it's all about "me and mine." I mean, the joke is, sure, America was set up on tax avoidance. So this libertarian mindset runs
34:55
deep, but it's behind a lot of health disinformation: the idea that you can somehow "optimize", and I hate that term, your health, when most of the determinants of health are way beyond an individual's control. Yeah, that's so spot on.
So I appreciate you sharing that, because
35:11
that's certainly been weaponized in our most recent health leadership: focusing on individual actions, ignoring all these social determinants of health, and making changes like whether French fries are
35:28
fried in seed oils versus beef tallow, as if that could have any meaningful impact on health. It's frustrating to see that focus on things that are not going to have tangible health outcomes, and no focus on what you speak to,
35:43
you know, this 70%: even the social components of it that we could actually change. Even while we're still trying to understand how genes and such influence our health, we can at least see the zip codes having such a strong impact. So I'd like to see the United
35:59
States get to a point where we can at least have those types of conversations and have policy changes that could hopefully make a positive change in our public health. I should be clear:
I'm not just singling out the States as a punching bag here. It's having a hard enough time.
I don't want to put it that way, because this has crept into Europe too.
36:15
Even though Europe has mainly socialized medicine, we now see the class distinction a bit more, because I will see people sharing things. They'll send me a link to a video on Instagram and go, "Is this true?" And it's always someone saying that if you do this, you'll optimize your health.
And I always joke that it's the most middle
36:31
class thing I can see, because you have to be fairly wealthy and well-off to care about optimizing your health, right? To be able to spend sometimes several hundred quid a month on snake oil means you're usually pretty healthy already, or you've had all the socioeconomic advantages that allow you
36:48
to squander your money on nonsense. Whereas if you actually look, even in Europe, at the distribution of areas where poverty is very high, you have much earlier death as well. So it's not just an American problem; it's just that America has a uniquely libertarian mindset to it, and doesn't have any public health... well,
37:03
it does have some public health, but it has less socialized health than most of Europe. Yeah, we're constantly defunding any efforts for public health. But it's such a tough time at the moment, so I'll try to challenge you: find where there is some optimism
37:21
in this space, if people are looking for some ways we can move forward. I know we've been talking about how it's probably going to get worse before it gets better. And it will, but that's not necessarily hopeless. It's like when you first realize you're sick: your
37:37
fever is going to get higher before it breaks, right? There is some optimism, I would say, and this is a weird kind of optimism, but for years we did not recognize this problem. Obviously you have been saying it for a while; I've been sounding alarm bells for
37:53
10 years over this, but people would say it was alarmist. 10 years ago they'd be like, "Oh, it's not. It's a minority sport.
It's only 5% of people." The classic example is they'd look at the number of people who always avoid vaccination, and it's been between 3 and 5% in most countries.
And they go, "Well, that's your problem." And I'm like, "But that's contagious."
38:09
And they're like, "No, it's not. It's always consistent." And I'm like, "You have no idea what's coming down the road." The thing is, I remember sitting at a panel meeting about 8 years ago, and there was someone from Public Health England, and I was talking about antivaccine disinformation online, which I do a lot of work on, and they were like,
38:25
that's not a big deal, because not that many parents use Facebook or whatever else. And I was just like, but it is a big deal. I'd love to find that executive now and ask, "What do you think is happening?" But here's the thing.
We didn't take the problem seriously, and we said at the
38:41
beginning of this recording that institutions are still struggling to deal with it. Governments are still struggling to deal with it.
But now we can't deny it. The bad analogy here is, you know, hitting rock bottom to realize you have a problem.
38:57
This should be our rock-bottom moment, where we go as a society, as a collective: no, not again. We cannot live in a world where bad actors can manipulate politics.
They can manipulate society based on a few viral
39:13
disinformation videos. We need to be on our guard for that.
What are we going to do? And you're right, there's no magic bullet.
There's no silver bullet, but there are things we can do that, long-term, will immunize us against it. Things like bringing in media literacy education very early
39:29
on, encouraging critical thinking in school, and also being more honest about when we get things wrong. Because that's one thing I say, I mean, I say in the book constantly, and I say it in life constantly, there is no shame in being wrong.
There's only shame in refusing to
39:44
update your worldview when it is shown that the belief you had is no longer viable. Right?
This is where we divorce our emotions from our ideas. And I'm very lucky in my job:
I'm surrounded by people a lot smarter than me, and we're all opinionated, and I'll often come in
40:00
with an opinion and go, "I think this, and I think I've really thought it out," and someone just quietly devastates it. They pick away at it and go, "No, actually, that wouldn't work, would it, because of this, that and the other," and I'm left there feeling, oh yeah, that's not right at all.
And it's really
40:16
easy to get humiliated by that feeling, but I've redirected myself over many years.
Instead of getting humiliated, I get a little bit euphoric, because suddenly that person has tempered me, by fire, towards a better idea. They've stopped me wasting time on a bad idea.
And as long as you don't emotionally put
40:33
yourself into that idea, identify with it, and say, "this is the rock I will die on," you will always be able to refine it. And I think that's really liberating.
So there's the reframing of the education thing, and the other thing I say is: can we make changing our mind a liberating thing? A thing where everyone should
40:48
high-five us and go, "You're doing awesome," instead of it being seen as a weakness. It should be seen as a strength. If someone comes in with a better argument, you go, "Well, I'm convinced.
That's good. That's how it should work." But that's still not normal to us.
Even politicians hate it. I would be more inclined to vote for a politician who said, "I
41:05
used to have this policy, but I've talked to experts, I've learned a bit, and I honestly think that we need to change it."
I'd be more inclined to vote for a politician who could say that than one that would just say, "No, no, no, this has always been our policy," or who never admits to being wrong. Because actually, we're always wrong.
There's no shame in that. It's
41:22
because information always changes. We should always be getting less wrong, not more wrong.
Yeah. Yeah.
For sure. And I hope that collectively we can update our cultural norms to celebrate admitting that you were wrong about something, or
41:37
celebrate not knowing something and not having a strong opinion before you're well-versed in it. And I do think that people... Can I add one thing here?
Sorry. You've said it more eloquently than me.
But can we also add that not having an opinion is something that's really liberating?
41:53
Because social media in particular forces you: "what do you think of this?" It binarizes things, polarizes things, which is really unhealthy, and it misses nuance. Sometimes the correct answer is, "I actually don't know enough to have an opinion, but when I read up a bit more on it, I'll get back to you." Right?
Someone
42:09
said it to me a few months ago, and I wanted to shake their hand, because it was such a perfect way of expressing it. And it was a topic that was very important over here; it was in politics. But they wanted to go and read more on it before they had a knee-jerk opinion.
I just remember thinking that is so not what we're conditioned to do but it is exactly what
42:27
we should be doing. Often just going, "Well, that looks bad on paper, but I actually don't know enough about it.
So, I'm going to read about it and then I'm going to get back to you and then I'll have an opinion. Or maybe I still won't have an opinion, but I'm going to at least get the information first independently of the channel
42:42
that's trying to polarize me into having a strong point of view immediately. Yeah, absolutely.
And yeah, we're not always incentivized to do that, either. And that speaks to understanding some of those media literacy skills: how, as soon as we go on social media, we're
42:59
only seeing the extremes because of how the algorithm works, but then also understanding how these platforms are incentivized to keep us on there as much as possible. So they're going to outrage us and get us upset. Trying to be mindful of that when we use social media, I think, is
43:15
also really critical. Yeah, absolutely.
I mean, you said cause for optimism. I think there's a sea change in how governments see this, at least for the most part. Well, maybe not the current American administration, who've benefited from this, but hopefully when the Democrats
43:31
get themselves out of whatever the hell they're doing at the moment. There is a point where we have to realize: okay, we cannot go on like this. We cannot solve the problems the world faces when we can't even agree on basic facts. Everyone should be entitled to their own opinion, but no one should be entitled to their own facts.
43:47
For example, take something like climate change. There are a million and one things we could do to address climate change, and there are compromises that will have to be made all across the board for things like that, right? But if we can't even accept the reality of climate change, we're just going to suffer the worst possible outcomes.
Whereas, you know,
44:03
someone might go, "Well, I believe in climate change, but I don't want to change this much." Well, there's a compromise.
We could change x amount here, x amount here, and we can make up the balance. But we can't even have those discussions if we can't all agree on basic facts.
And we need to get to the point where we realize that basic
44:19
facts should be beyond what is sometimes called identity politics. I'm probably using that term wrong, because you would know this better than me.
But it needs to get out of the ideological bubbles. And let's face it, at the moment, there's only really one, but it could go either way.
I'm keeping my mind open for it. But at the same time,
44:35
we need to get there, because where we are at the moment is not conducive: highly polarized, unable even to agree on basic reality. That's a problem.
The spectrum of opinion is where policy comes in. You can have loads of policy opinions, okay?
And then you can find compromise
44:51
solutions that fix them. But not if you can't agree on the problem.
"Alternative facts," as someone once said: we need to avoid that. We need to have the same facts, and then the debate should be about what we do about them, not whether those facts are true or not.
45:06
Yeah. Well put.
And I think this provides an opportunity for us to rebuild our scientific institutions in a way that connects with folks and resonates with them. Again, paying more attention to communication, incentivizing communication, and trying to work
45:23
directly with communities. Something I've been thinking about a lot lately is having universities and medical centers work with their local communities in a way that makes people feel there's less psychological distance between themselves and the experts,
45:38
because so much of trust is social connection. If you don't know a lot of doctors or scientists, and you don't speak to any of them, but someone on your TikTok is directly speaking to you and answering your questions and concerns, of
45:53
course it makes sense that you might trust them more, even though they don't have valid expertise. So we need to understand those social components of getting good information out to as many people as possible.
Yeah, absolutely. And the thing that you know better than me, Matthew,
46:09
is the fact that people are drawn to disinformation for different reasons, right? Some of us just want some kind of epistemic certainty.
The world scares us, and the person that gives the simple answer with the least complication? That's really sexy.
But I
46:25
think it's really important that we all acknowledge and we come to embrace, not just acknowledge, that the world is really complicated. Most things have multiple factors.
There's usually caveats and nuance. So anyone who's selling us a simple answer for things generally, not always, but generally is
46:43
vastly oversimplifying or misleading us into the wrong thing. But I understand the appeal. I always separate the propagators of misinformation from the victims of misinformation, right? You
46:58
know, when people share misinformation, I don't always go, "You're being villainous." Oftentimes they just don't know better.
They've heard this and they now believe it's true. And then there are other people that profiteer off it or do it for narcissistic reasons or whatever else.
So I think it's really important, as you do
47:15
in the book and in your research, to understand the emotional aspects of how information appeals to us. And it's okay to acknowledge our feelings. I think it's really good to say, "Look, I want this to be true."
Okay. I myself, now, when I read a newspaper
47:31
article that agrees with my perspective, I stop myself and go, "I really want this to be true, so I have to check this twice as hard as I'd check it if it said the opposite." Because we can give a pass to information that seems to fit
47:46
our worldview, when actually maybe the toughest interrogation we need to do is of our own preconceptions, and sometimes you'll find them shifting. And the idea that we can just convince people, that we change other people's minds, I don't think that's true either. I say this in the book and I'll say it again: I think we sow seeds. I think
48:03
no one has ever changed your mind; they've given you the tools and the freedom to arrive at better decisions, better information. All we can do is sow seeds and go, "Look, here's another way of looking at it." I often deal with vaccine-hesitant parents, a good example, or
48:19
people who maybe won't be getting their chemotherapy because of something they read online. And I know arguing with them isn't going to help.
So what I'll do is I listen to what they're scared about. And often I'll go, "Look, I totally acknowledge where you're coming from." I go, "And your concerns are valid." I go, "But
48:34
here's an idea, just hear me out." I say, "I'm not trying to sell this to you. I just want you to think about this, to see if it describes things as well as your model or not." I go, "blah blah blah," and I'll give whatever it is.
And I'll say, "I don't want you to go away and tell me which one you think is right or wrong. I just want you to think about
48:50
it. Would that explain things better than what you read in that Twitter thread about new cancer cures?
And I go, "You can call me anytime, but just think about it," because you're always planting seeds. People change their minds in their
49:06
own freedom when they feel free to do so. But again, we need to, you know, normalize people changing their minds.
And I don't let us off the hook on this. We're not saints.
As active researchers, we often find out we've gone down a whole road that's
49:21
just not right. That's okay, too, you know.
Yeah, and I think it speaks again to the social conditions. We're both in bubbles where people celebrate this kind of changing your mind and being open to new evidence.
Not as much as they should, Matthew, not as much as they should. You come into a work
49:38
meeting and you say, "Hey, the thing we've been doing for the last 6 months doesn't work," and you will find the ones that start digging their heels in, you know. Well, that's true.
And I think that's also an important point. The human element of science is something worth mentioning.
It's a broader discussion of like, you know, we're humans doing this
49:53
science, too. And that's why we need this self-correcting institution that can push us towards truth.
Because a lot of scientists will be biased to protect their own pet theories. They build a career finding X equals Y or something, and someone says, "No, you didn't look at Z," and they go, "Well, you
50:10
know..." It's personally attached to them. They're human, and that's going to affect how they behave.
And to be fair, my other hat, which I haven't mentioned yet, is that I am a sleuth in residence for Retraction Watch. So I spend an awful lot of time finding scientific papers that are fraudulent or
50:26
faked or otherwise not reliable. So the idea that scientists are always perfect is wrong too.
It's very wrong. But the scientific method is how we inquire and we find things out.
So I often get asked, "Well, individual scientists can lie. Think
50:42
about Andrew Wakefield, or think about X, Y or Z." Yes, they can, but it's scientists that catch them out. That doesn't make their behavior okay, but it's the scientific method that will find this stuff out. It's inquiry and skepticism, and no one is
50:57
above that law, even scientists. They can't just go, "I'm a scientist, trust me." No: I trust you based on the evidence, not based on your say-so, right? And I think that's one thing more scientists and science communicators are trying to do a better job of avoiding: that, you
51:14
know, "trust me, bro" kind of top-down messaging that doesn't seem to work very well. People really want to see you show your work. Whenever you're describing why you think vaccines are safe and effective, for example, just saying "get vaccinated" and leaving it at that is probably not going to be
51:31
very effective if someone's skeptical. So instead, actually walk them through the process, again providing that sense of agency and understanding: "oh, I understand more about this process too."
So it's a challenging path to walk but I
51:47
think it's critical that we at least try to incorporate those types of communication strategies in our work. Absolutely.
And the behavioral component is really important; that plays a lot into your work as well. Absolutely.
And again, it goes back to the fact that
52:03
"because I say so" is not a very good model of putting science out there, particularly because science is evolving; it's often wrong. At the beginning of the pandemic, I wrote a piece for the Irish Times, because I kind of saw this coming, not that it did much good.
I
52:19
said this is an emergent threat. We don't know much about it.
A lot of the advice you get now is going to change. A lot of the things you're told about the virus will change.
That's not science failing. That's science doing its job.
That's science correctly updating itself
52:34
as it gets more information. And yet a lot of people struggle with that.
They were like, "Oh, well, scientists said 3 weeks ago we should do this, and now they're saying this." I said, "Yes, because in the 3 weeks in between, which is very quick for science, we learned something new about the transmission of COVID-19.
So therefore, we changed the advice about it." So that's okay. But
52:51
people still hold onto this psychological model of "you can't change your mind." You should be changing your mind as the evidence emerges, and science is a great example of doing that; it's a systematic method of doing exactly that. Yeah, and it also speaks again to science
53:08
literacy, media literacy, and how it's often taught. Here in the States you kind of memorize facts, like we're learning science as a series of conclusions and results, and there's less focus on the
53:24
methodological process of how things change and evolve. The first classes I got on this were really in graduate school, where I leaned into the method and really dove into that.
And you know what, one of the things I've noticed with scientists, not
53:41
to kick my fellow scientists here, but there is a kind of snobbery about philosophy of science. I will often point this out to scientists, because again, I spend a lot of my time finding scientific fraud or scientifically invalid results, and often it's more wishful thinking.
People have found a result where they
53:57
could only have got there if they crossed their fingers, tapped their little red shoes, and said "there's no place like home." And the reason that happens is that they don't know the scientific method well enough, which is philosophical.
It's like, look: you are trying to disprove your own work. You're trying to whittle away all the
54:14
alternative explanations until you're left with the one that's strongest. But like you said earlier on, people get their pet theories and they're like, "This is my favorite theory." And I keep pointing out, to my students in particular, that's not a scientist's job.
You go into the lab and you find a null result? That's high-fives all around. I mean, yeah, Nature mightn't give you the
54:31
paper on it, because scientific publishing is all weird, but finding that a drug doesn't work is every bit as important as knowing that it does work. And it's far more important than falsely thinking it works.
So I keep saying: our job here as scientists is to arrive
54:46
at the closest explanation for events that we can, not to misguide ourselves and other people. And then when you see these bad results, you're like, did these people ever study their Karl Popper?
Did they ever go through their Thomas Kuhn? Who knows? They probably did not.
Probably not. Yeah.
I don't know
55:03
how many people take classes like that. One of my favorite classes in graduate school was a philosophy of science class, where we just learned: what are theories?
How does that connect to methods? How do we know what we know?
And I don't even know how popular or how common those types of classes are even
55:19
at the PhD level, let alone undergrad or high school. I didn't do one.
I just remember thinking it's mad that I have a PhD and I'm only learning this because I'm reading random books on it, and a lot of my colleagues didn't. Because, again, very early in my
55:35
career, when I started finding dodgy papers, it was almost always people setting up an experiment in such a way that guaranteed they'd get a result, presumably because they didn't know any better: their job was supposed to be testing their idea, not just setting it up so their idea might look like it had some signal. And again, I think that's where
55:52
philosophy comes in. I mean, this is where I'm not letting all scientists off the hook: we need to be very careful about our messaging too, and realize that our messaging can often change, because scientists are fallible. And that's okay. Yeah.
But we're still better than random people on TikTok telling you random stuff about cancer. I think we can get
56:08
there. I think we can be there. Absolutely.
Uh well, David, thank you so much for joining me. I really enjoyed our conversation.
Where can people find you and your work? I don't want people to find me; they can find my work.
You'll find me on Instagram, David_Robert_
56:24
Grimes. On Twitter, the cursed platform, at DRG1985.
I'm on Bluesky, which I never use, but I'm going to try to use it more. And my website is davidrobertgrimes.com.
And if you're in the States, my book is called Good Thinking. What's the subtitle?
Um, Why Flawed Logic
56:40
Puts Us All at Risk and How Critical Thinking Can Save the World. And in the rest of the world, it's called The Irrational Ape.
But apparently my American publishers were not happy with an ape-related title when 43% of Americans don't believe in evolution. So that's why it was retitled.
That was not...
56:55
I was curious about that. So that's the backstory.
Okay. They claimed that it was because Americans like a more optimistic title, and I'm like, is this nothing to do with market research showing that "ape" would be controversial here?
They never fully answered
57:12
that, but I'm pretty sure I know what the answer is. Huh.
Well, that's interesting. But yeah, so definitely check out those books and all of David's work.
And uh yeah, thank you so much for joining me today. I really enjoyed our conversation, Matthew.
Keep up the good work. Thank you.
57:27
[Music] [Applause]