Bonus Episode: Group Psychology & The Path to Extremism
One of the things I love about studying history is how understanding events of the past provides a lens not just into one particular era but into the peculiarities of the human species - how we think, how we form groups, what we’re capable of - both the good and the bad.
That’s one of the reasons I believe a good history education has to incorporate some elements of social psychology - it has to force us to place the narrow topic we’re studying into some larger context and grapple with what it tells us about the broader story of our species.
In my years of research on Reconstruction, I found myself asking a lot of questions not just about the details of the era but about the psychology underlying it. How were white southerners able to convince themselves that the Lost Cause was true? Why were northerners so open to embracing it as well and why were they willing to look the other way when terror threatened to reverse the gains of Reconstruction?
I don’t think you can fully answer those questions without asking deeper questions about human psychology - questions about how we as individuals form beliefs, how groups construct narratives, and how competing narratives lead to violence and division.
In the summer of 2021, I spoke about those topics with Steve Fein, a professor of social psychology at Williams College who focuses on the role of threat in shaping our beliefs, biases, and actions. It was one of those conversations that truly changed the way I think about how society works - and to some extent the value I place on studying history.
The research he described was engaging and illuminating - helping me not only better understand Reconstruction but also many of our modern struggles, whether it's the widespread embrace of false narratives like QAnon and Donald Trump's Big Lie, the ways in which partisanship seems to work its way into nearly every issue, or the challenges we have in forming coalitions to address long-term problems like climate change.
The studies that Dr. Fein described reveal these challenges as inherently human, connected in some way to our deeply embedded tendency towards tribalism and desire for belonging. But they also offer some degree of hope. If we understand what's behind all of the division in modern society, we might chart a more effective, and more united, path forward. History and psychology show us that we're quite capable of believing false, divisive narratives, but they also show us that we can rewrite those lies and construct more accurate and more inclusive ones.
And exploring that tension - between these forces that divide and unite - is in some ways what this podcast is all about.
Here’s Steve Fein.
Nick: Hey, Steve. Can you hear me?
Steve: Nick. Yes.
Nick: Hey, how are you doing?
Steve: All right, how's it going?
Nick: It's going well, it's going well, thank you so much for taking the time.
Steve: My pleasure.
Nick: I did want to start with - and this sounds like kind of a strange question - this idea of how do we decide what to believe?
Steve: That's a terrifically interesting and complex question, of course. And it's really well put - how do we decide what to believe, rather than what do we believe? Because I do think there's a huge motivational component to reasoning. One of the things that first got me interested in social psychology was the marriage of motivation and cognition. So I think it's really fascinating mental gymnastics that people often perform, where they're trying to be rational, they think they're rational, and yet what's driving the bus is often a motivation. You want to see the world a certain way, or you want to not see the world a certain way, and we can justify all kinds of conclusions that don't seem very rational in the service of that motivation.
One of my undergraduate thesis advisors is sort of famous for a really interesting study on confirmation bias that I think nicely illustrates this. They had proponents and opponents of capital punishment read a set of arguments based on evidence - basically summaries of research. Some of the summaries were in support of how the death penalty is an effective deterrent against crime.... But they also read the same number of summaries of research that concluded the opposite.
So you can imagine you have people who are on one side or the other of the fence, reading this mixed bag of evidence. And you could imagine some different outcomes of that. One is that it's just a wash, and it doesn't have any effect on their actual attitudes about capital punishment - the two sets of arguments just cancel each other out, so there's no change. Another possibility is, 'Well, the evidence is mixed,' so maybe the attitudes become a little bit less extreme, a little more moderate, because it's not so clear whether capital punishment is effective as a deterrent against crime. So they get a little bit less confident in their positions.
But of course, what happens after they read this mixed evidence is neither of those; instead, you get polarization. The proponents come out of this more convinced than ever that capital punishment is a great idea, and the opponents do the opposite - they're more convinced that capital punishment is not a good idea….
So they take a much more critical eye to the methodology and interpretations of the data when the conclusions went against the side that they favored. They put a lot more work in and become much more sophisticated at seeing the problems, the flaws in the research, when the research didn't support their agenda.
And I think people are just very good at that kind of thing, where their motivation drives their cognition. So they suddenly become much more critical thinkers to negate information that goes against their preferred outcome, and a little bit more blindly accepting of the information that's in support of their preferred outcome. And I think that's just a natural tendency for a lot of people. There are certainly individual differences on this, and situational factors that make you more or less likely to do that. But in general, we're very good at steering our thoughts and interpretation of information in ways that we want to, without realizing we're doing it. We think we're being bias-free; we think the other people are biased. There's a concept of bias blind spot, where we see bias in other people, especially if they're outgroups, but we think that we're the ones being pretty rational.
So that's just a part of the story. I think that motivation can be the sort of guiding star that directs, you know, how you're investing your cognitive resources.
Nick: So it sounds like we kind of decide what we want to believe and then there's this rational process that is basically just justifying it.
Steve: I think often that's the case - certainly not always, of course, and it depends on the issue. There can be issues where you're not particularly invested motivationally, and you're going to take a much more rational approach and build up from the data toward a conclusion. It's not as simple as we're always looking at the motivation first and the cognition second. But when you're particularly invested and committed to a particular outcome, then we're much more likely to work in that direction.
Nick: Before we dive into that - because I'm fascinated by that and want to understand it a little bit better - do you have a sense, or do psychologists have a theory, for why that's the case? Like, why have minds evolved this way?
Steve: I don't think there's any one answer to that. I think there's a whole host of reasons. Evolutionarily speaking, we're motivated to be tribal, and we're motivated to feel safe and protected. So we're going to be biased toward information and decisions that seem to keep us safe, keep us part of a coalition or tribe, and make that tribe good. So there's a lot of ingroup-outgroup stuff that's often behind the things we're interested in.
And part of it is just ease - it's just easier to continue to see things a certain way, and it's difficult to suddenly realize you're wrong and have to rethink things. So we're sometimes motivated purely by a sort of cognitive laziness to continue to see things a certain way.
And then add that you've invested a lot of yourself into something, and you'd have to admit that all that investment was maybe ill spent. Wait a minute, all this time and thought energy I put into X? Well, maybe X was wrong. And that's embarrassing. There's a drive I think a lot of us have - I'm sure you've heard of cognitive dissonance - to reduce that dissonance of feeling, 'I've thought X all this time, and I've invested in X, but I should have been investing in Y.' And we don't want to admit that unless the evidence absolutely overwhelms us.
The sunk cost trap is another example of that kind of thinking: I've invested money and time in a project, and I can't get that back no matter what, and it looks like it's a bad investment, so I shouldn't continue to invest more energy and time and money in it. But if I stop - even if it ultimately saves me more pain - I'm admitting I did something wrong and that all that money was wasted, and we just don't want to admit that unless we absolutely have no choice. So that's why we often keep investing more and more money, trying to salvage something that, at some level, if we can be dispassionate about it, we realize, okay, we've got to cut bait here and move on. So I think there's a host of reasons, but we're just primed to want to seek confirmation, because it's easier, it feels better, and it's much more taxing to suddenly shift.
Nick: So that order of operations in some ways feels flipped from the way that we maybe think of ourselves, or think of our minds working, where we imagine we're processing information like a judging jury, weighing the evidence and coming to an informed decision. In reality, it sounds like we're actually making the decision beforehand, maybe before we're even conscious that we've made it, and then we're just looking for arguments to confirm the decision we've already made.
Steve: Yeah, and it might not always be quite that clear cut that we made a decision, but we're certainly stacking the deck toward a particular decision. Again, like the example of the capital punishment study, people might go into it with kind of an open mind: I want to see what the evidence says, I'm not necessarily making up my mind immediately, I want to try to reserve judgment. But how are you processing the information? How are you investing your thoughts in terms of whether or not you are being as careful as possible to evaluate this information intelligently? You just put a little bit more energy and a little bit more commitment toward disconfirming things you don't want to believe. Eventually, you know, if the information is really clear cut one way - it turns out the data suggest capital punishment is or isn't an effective deterrent against crime - you might be convinced. But you're processing that information in a biased fashion; the scales are a little bit unbalanced in how you're processing it, so you're more likely to come to the conclusion you wanted to. It's not that you've actually decided it and you're just going to do anything to justify it, but you're biased in a way that makes it more likely than not that you'll conclude in this particular direction.
There's some really interesting research, if you bring this into the social media world, on what tweets go viral, for example. There's a lot of compelling, very recent research on how tweets that have a moral or emotional component are the ones that tend to be sticky, the ones that tend to go viral and get passed on from person to person and become a big part of the social media discourse. And tweets that are based more on rational arguments, with less of that moralizing or emotional component, tend not to. So even at that macro level, you see how emotions can be primary in how people think and how people discuss things. And the rational part of our brains is often trying to catch up and then explain post hoc: "Okay, here, this justifies the original conclusion."
Nick: You mentioned tribal thinking and the search for belonging - when we say 'appeal to the emotive side,' is it those kinds of appeals that that part of the brain is looking to latch onto?
Steve: I think that's probably a good part of it, absolutely. Just to take us back a little bit before social media - to evolutionary thinking about this, because I find it fascinating where some of this stuff can be traced back to. The only way that humans could have survived is by working together, because we're pretty puny animals out there, right? We're not particularly strong, we're not fast; we would have been easy lunch meat for the wildebeests out there. The only way that our proto-human ancestors could have not only flourished but even survived was basically to work together in what you could consider coalitions - to get a group of chimp-like beings to band together to hunt together, to throw rocks at the lion. One chimp-like being couldn't have done it, but a bunch could survive. So the early chimp-like ancestors that liked working together in groups, and were drawn to that, were much more likely to survive and flourish and pass on their genes than the ones that wanted to be more isolated and just run and hide from the lions and wildebeests out there. So it seems like we evolved to be group-cooperating, coordinating animals, and that seems to be completely responsible for our survival and then our eventual thriving.
And you can see in contemporary research how even a symbolic sense of being ostracized by others is really devastating to us. There's research where you have people play a really, really boring computer game with two other people allegedly in other rooms. So you don't see them, but you're each represented by an avatar, and you're just playing catch with each other on a really rudimentary video game. You don't know who these people are, you've never seen them before, you're never going to see them later. But what happens during the game, after a little bit where everyone's just tossing the ball to each other, is that eventually the other two subjects, for no apparent reason, stop throwing the ball to you. Let's say you're the subject; for some reason, they're just not including you in the game of catch. And they found, in study after study after study, that when you're ostracized in that way, for no apparent reason - and it doesn't mean anything, it's not a fun game anyway - people feel really, really bad. And if you put people in an fMRI setup where you can see brain activity, you see the same kinds of patterns of brain activity as if you got punched in the gut.
So it's physically painful to be ostracized. And it seems like we're just hardwired to recognize that because ostracism back in the day would have been devastating, would have been incredibly threatening - you wouldn't survive. So you have to be able to work together in groups.
And what's really interesting is that you feel this devastation of ostracism even if you find out that those other two subjects are from a group you don't like or even despise. It doesn't matter; you still feel equally bad. And I think the story there, going back to your question about emotions and motivation, is that there's often a sense of threat in being off on your own, isolated from your group members. And I think that can partly tell the story of how we're just very quick to latch onto things that make us feel comfortable and safe, including something like: 'Is my attitude X or Y?' You have other X members and other Y members, and you form these ingroups and outgroups. And it's extremely important to be considered a good group member and to be allied with people of your group. So a purely rational reading of information might not always get you there; you might realize you feel differently, but emotionally or motivationally you really want to be accepted, with the right attitude, the right fashion, whatever it is. So I think we're just programmed as social animals to be ready for these sorts of threats and ready for the sense of: 'Am I considered a good group member or not?' And I think that's part of the story - there's a bunch of other things there, too - but that's part of why I think our cognition is very easily steered by our motivation. Does that make any sense?
Nick: That makes a lot of sense. When you say that cognition is steered, the implication there might be that we could change our beliefs based on the pressure or perceived pressure of the group?
Steve: Yeah, I think so. And again, some groups have different norms about whether or not everyone has to think the same way, how accepting they are of differences of opinion. There's much more pressure for conformity in some groups than others, and so you're going to get differences. In some groups it's okay to think differently, to wear different clothes, to behave in a different way, to change your mind about something and try to persuade others that we were wrong all this time. In other groups, singing off the same sheet of music is extraordinarily important. And then you have to decide: do you want to continue to be in that group and sing off the same sheet? Or do you realize, okay, this isn't for me, and I'm going to try to make a move to different kinds of groups?
Nick: One of the big themes is the unification of the White South. How does the White South come together after a pretty divisive war, where you've got poor whites who in many cases weren't quite as supportive of secession or were outright opposed to it, and then you've got the wealthy class who sort of led the charge and obviously benefited much more from the system of slavery? And you've got this combination of social ostracization; also very real physical violence, or the threat of very real physical violence; and the economic piece, where if you're voting for the Republican Party, you're paying extra taxes, your employment might be threatened, your livelihood might be threatened. And so it seems as though the combination of those things might weigh pretty heavily on the mind of a white southerner who's trying to decide: do I support the Republican Party, or do I stay with the planter class?
Steve: Right. And it actually reminds me that one variable we haven't really talked about explicitly is threat, and perceptions of threat. That's actually a lot of what my own research has been about for my whole career - looking at threat, and how, when people feel threatened, it activates a much more extreme version of everything we've been talking about and kicks in these evolutionarily rooted attitudes and judgments.
So when people feel some sense of threat, they're much more likely, for example, to engage in negative stereotyping toward other groups, and prejudice and discrimination - and often in weird ways that they have no idea they're doing. Because most of us feel like we want to be fair, we want to be as bias-free as possible; we don't want to make judgments and decisions based on something like stereotypes or prejudice. Now, some people are proud to do that, of course - and it seems like more today than in decades - but most of us want to be as fair and egalitarian as possible. Yet under threat, all of a sudden you see these biases become exacerbated.
And that threat could be something like a self-esteem-based one, where suddenly you feel like you just got dumped by your significant other, you failed the test, or you didn't get the promotion. Those kinds of things kick in this desire to repair your self-image. And how do you do that? Ideally, you do it by confronting the threat directly and fixing it - you fix the breakup, or you get the promotion, or what have you. But often, of course, we can't. So what we often do is redirect our energies to feel superior to something else and retain a sense of 'I'm good and effective.' And one easy, automatic way of doing that is to derogate outgroups, consciously or not - or do what we call downward social comparison. We still feel superior to some other group.
And a stereotype, for example, can be a very useful tool - like a hammer to knock down somebody else - especially if you don't realize you're doing it; you just genuinely see this other person or other group in a more negative fashion, and stereotypes give us a lens through which we can see that in a seemingly justifiable way.
And the threat could be economic, or the threat could even be something really primal. Like if you put people in the lab and have them evaluate members of stereotyped groups - under normal conditions, they might be able to rise above that and not show prejudice toward some outgroup. But what if you put them in a pitch-black room? All of a sudden, that condition triggers an evolutionary sense of threat, because in the dark we can't see as well as other animals out there; we're prey, we're vulnerable. That kicks in our threat system, and in some studies that have done that in the lab, people become more prejudiced toward outgroups. Or any suggestion of a pathogen kicks in that kind of thing - or even thoughts of death. If you make people think about what happens to their bodies as they decay after death, there's a lot of research showing that if you prime them with those kinds of thoughts, they become more likely to derogate outgroups.
I just did a study with a senior thesis student last year where we had people read letters of recommendation written for a grad student who was applying for research jobs. The student was described as either white or black. And the letter was either 100% glowing, or it had a couple of little red flags - sort of damning-by-faint-praise kinds of phrases, things that, if you're really looking to exclude people because you need to winnow down the applicant pool, you could use to justify why you wouldn't take a chance on a student. And what we found, contrary to what people might have predicted, is that those little red flags did not hurt the black candidate more than the white candidate. If anything, people were a little bit more positive toward the black candidate in those conditions than the white candidate. So these white, or non-black, subjects didn't use this excuse to justify derogating the black candidate.
But if, before they read these letters, we had them read a newspaper article indicating that by the year 2040 the shifting demographics of the country will be such that whites are going to be in a clear minority - so they read this kind of demographic-based threat to their status as the majority group - then they used the red flags against the black candidate much more than the white candidate.
So that's just one example of how people are often able to overcome their biases - but you add threat to the mix, and suddenly they don't do that as much, and things become much more primal and, again, tribal. There's a whole bunch of research showing the effects of threat. So you could think about the South after the Civil War and the massive amounts of threat - again, economic, social, cultural - that people were experiencing. That's much more likely to trigger a very strong counter-reaction to what had just happened in the war - trying to find some way of being superior again over these outgroups that you can take it out on.
Nick: Yeah, it seems like in some ways kind of the perfect storm of having a landscape destroyed, a lot of your family members and friends killed - and then, you know, you pile economic depression on top of that.
Steve: And the psychological reality that you lost, right? In addition to all the real stuff that you're talking about, there's just the fact of 'Oh, we were humiliated,' and you want to somehow spin that in a way that lets you feel superior again. There's a real sense of: we lost to this enemy, and how humiliating that is to our sense of pride - especially for men, the sort of masculine thing that goes along with that - and the need to reassert your dominance over something. And I think that, symbolically, in addition to the absolute physical and economic devastation people were experiencing, that's a big part of it as well.
Nick: I'm glad you brought that up, because one of the big things to come out of this era is a rewriting of the history of the war and of slavery. But I'm curious, when we put all of this together society-wide, how do we come up with historical narratives, or common sets of beliefs? How does that process work in practice? We've talked a lot about how this works on an individual level, but I'm curious how, as a society, we come up with these narratives and these sets of beliefs in the first place.
Steve: I think you have certain people who are leaders of a particular group who can run different kinds of narratives up the flagpole and see which ones people want to salute. Sort of like the tweet research I mentioned - certain kinds of tweets are going to go viral, other people are going to retweet them, and they become this thing that all of a sudden large groups of people are going to endorse.
I think that, for whatever reason - whether they have political power or just other kinds of capital, where people are going to pay attention to them - certain people like that, who can capture a lot of attention, can say, 'Okay, here's how we're going to reconstruct this. We're going to decide the war really was about this or that - we didn't lose, we won in this sense.' People are so looking for those kinds of messages of hope, or at least something that minimizes the threat they've experienced, that they latch onto that and it becomes a collective reality.
I mean, think about the presidential election that just happened, and how Trump sort of previewed this: that he wasn't ever going to accept defeat; that if he was declared the loser, it was only because of fraud; that there was no conceivable way he could lose the election. So he had people ready to use that as soon as that outcome might happen. And there's a huge swath of the country that still believes, against all evidence, that he really won the election.
And then, going back to what we were discussing earlier about how motivation and emotion can overwhelm cognition, people will grab onto any bit of misinformation out there that's consistent with that. If other people in the group are saying, 'Yeah, this is the story,' they put all of their energies into remembering and recycling that information and promoting it. And people are amazingly adept, as I said, at that sort of mental gymnastics - they can discount everything that goes against it. So certain leaders can promote particular ways of conceptualizing reality, and individuals - especially in groups where reason and careful analysis of data are not necessarily promoted - are very good at jumping on that and using the same talking points. So collectively, we all see this - 'we' meaning whoever is in that group - as the only possible way to understand what happened.
So we redefine what the Civil War was about; we redefine what government means; we redefine what loyalty is, and all those kinds of things that in the abstract would seem really difficult to do. But as soon as there's a sort of slogan, or a certain way of thinking, or a new narrative that's promoted by certain leaders, it sort of snowballs.
There is this concept in group psychology called group polarization, which is not just polarization between groups but polarization even within a group. So if you have a group where most of the individuals kind of lean toward one decision - let's say you're all members of a business that's deciding whether to do something really risky, and most of the individuals in that group slightly lean toward doing it but are still concerned about it - what tends to happen after group discussion is that the group becomes more extreme in the direction in which it was leaning. So if most of the individuals were leaning toward risk, they become riskier after group discussion. If most of the individuals had been leaning toward conservatism, they become more conservative after group discussion.
The researchers who've looked at this general tendency have two explanations that are not mutually exclusive - they work off of each other - to help that along. One is what they call persuasive arguments theory. When most of the members lean, let's say, toward risk, most of the arguments are in that direction. So let's say I had two arguments in favor of risk, and you had two, and one of them was the same as one of mine, but you had a separate one - so now we have three. And then another person mentions one, and the number of arguments toward risk increases. So therefore you become more confident in the risky decision.
There's also the social comparison angle: independent of what the arguments are, independent of what the evidence is, you just know everybody thinks this way. So that must be the cool way to think in this group, and I'm going to really promote that and be a really good group member - independent of what the arguments are.
So for both of those reasons, the group tends to become more extreme. And I think in rewriting history, or creating certain narratives that a group really wants to adhere to, both of those elements - persuasive arguments and the social comparison angle - feed into: 'Well, okay, this is the reality, and I'm going to be a really, really strong proponent of it, because that will also make me a good group member.' And by doing that I distance myself from the outgroup as well, which also makes me feel like a good group member. So through those sets of processes, we can become more and more polarized.
Nick: That's fascinating. This is one of the questions I've posed to a lot of historians, and I feel like we as a field maybe struggle to answer it as well as what I just heard you describe. I've been asking: is the spread of the Lost Cause top down, or bottom up? Is it the elite planter class, who clearly have a political and an economic agenda, spreading this new narrative - meeting in back halls and figuring out how to push it out into the market? Or is it the poor families who lost their loved ones in the war trying to create this narrative? And it sounds like what you're saying is it's actually maybe both. There are the leaders, the Jefferson Davises of the world, who very quickly after the war say, actually, this wasn't about slavery, it was about states' rights - they sort of give permission. They may not then go out and market it a ton, but they're giving permission, and there's very fertile ground for that idea to take root.
Steve: I think that's exactly right. It's a great way to recap that. Think also about something like wearing masks during COVID, and the big political split there between Democrats and Republicans, or liberals and conservatives, on mask wearing - how so many people on the conservative side saw mask wearing as not only offensive, being told you have to wear this, against your individual rights and freedom, but would actually be angry when other people wore them and see that as some sort of insult.
Now think about last summer, when all this stuff was kicking in. For random reasons I just happened to be reading some books that were set in World War II. I was reading about Winston Churchill, and I was reading about a writer who came of age in Brooklyn during World War II - all these stories about how citizens felt their patriotic duty was to support the war effort and make amazing sacrifices in terms of, you know, what food you could put on the table. And there's one really poignant story of this kid - he was like eight years old in Brooklyn at the time, and there was a movement to donate every piece of metal that you had to the war effort. He had just saved up all of the money he earned delivering newspapers to finally buy an early version of a skateboard, whatever they were.
And he'd just had this for like a week, and people came around collecting metal, and he threw his roller skates in to donate to the war effort. This was the most precious thing that this kid had - and he did it because that's what you do; it's for the cause. So you could easily imagine - talking about counterfactual worlds, like the 1876 election, and what could happen if we did this versus that - it could easily have gone the other way, where conservatives were the ones in 2020 saying, 'Look, we need to be patriotic for the national cause, so that the United States weathers the storm better than any other country. The patriotic thing to do would be to wear masks - don't be liberal snowflakes who are too scared to wear these things and too self-centered to make sacrifices.' The good conservative, patriotic thing would have been to wear masks and to socially distance and so on; it easily could have been seen as the more conservative thing. But it wasn't, and why wasn't it? I think it starts from the top down - Trump and others who were promoting a completely different agenda. And then it was fertile ground for people on the ground to take that and run with it, and come up with all the justification for why mask wearing was just an attempt to somehow catastrophize the threat of COVID in the service of trying to have Trump lose the election, etc., etc.
So I think that narrative from the top took root in very fertile ground, where people were ready to see every possible conspiracy theory as all about trying to take down our man and our cause, and to run with that. And how do you get people to take vaccines now? There's a recent study that just came out showing that if leaders on the Republican side would come out in support of vaccination, it might have a huge effect. But absent that, it's going to be very, very difficult to get the numbers you need on the conservative side vaccinated - all while the leaders are privately getting vaccinated but don't want to share that story with the citizenry.
Nick: …One of the things that you see play out in Reconstruction is that a lot of the White South initially allies, at least politically, with some members of the Black South. And then, decades later, after really formalizing these outgroups, you see people watching, or participating in, or not standing up to really heinous cases of violence. And I'm curious how we as humans justify that - how we can even have cases where these are fairly public displays, where people are bringing their children to see bodies hanging from trees, or families going to watch the Tulsa massacre and the houses burning. I'm curious, from a psychological perspective, how we get from this somewhat ambiguous place of what the groups are going to look like to that - to where now we've so fully dehumanized one group that we can watch something that I would have to imagine we know to be problematic on some level, but that we then in some ways also seem to celebrate.
Steve: Yeah, I think there's a bunch of stuff there. Part of it is a concept called moral disengagement, which, simplistically, is how we're sometimes capable of taking what we know to be moral and somehow removing it from the equation of what we're thinking. Like we almost turn off the switch and become less moral people - and we're much more capable of doing that when we're talking about outgroup members than ingroup members.
There's research where, if you see someone getting blood drawn - they get a needle stuck in their arm - we have an automatic tendency as humans to feel empathy with that person who is undergoing some pain. Or you see a hammer hitting their finger, and we flinch when we see someone else experiencing some sort of pain. And you can see that even in brain activity - people show the same kind of brain activity as if they got hit by the hammer or poked by the needle. And it's automatic; it happens immediately. But we don't do that so much when the person is from an outgroup rather than an ingroup. We're somehow able to shut that down a little bit - this automatic human tendency toward compassion and empathy, we just don't do that as much with outgroups.
And then I think what happens, especially if we know we treated an outgroup badly, is that we need to justify that and come up with a moral story, because, again, we want to think of ourselves as good, moral people. So we justify it, and we often exaggerate the negatives of the group to justify how you could do something that seems so immoral and make it a moral good. You have a cancer in your body, you need to surgically remove it; if there's a cancer on society, you need to do something about that, you need to do something aggressive. And there's no moral quandary, because you're serving the greater good - you're saving the body by sacrificing this part of it that's sick, and you're saving the body politic by removing this cancer on society.
And, you know, at some level people might think: how do you bring a kid to a lynching? Or how do you expose kids to horrific attitudes - like, again, this past summer, some of these rallies where people were protesting wearing masks, and you see these little kids just shouting at people wearing masks, and you see the indoctrination? And these parents think they're being good parents, that they're teaching their kids the right values.
So we as humans have this capacity to justify our behaviors, to retain a sense that we're doing something moral even if the behavior seems very aggressive and immoral. We're very good at recalibrating and justifying.
So I think people are able to recalibrate and say, 'Okay, yeah, this looks like a horrific act of violence, but I'm not the kind of person who would endorse violence. So why am I doing this here? Well, it must be because it's for the greater good. It's actually an act of moral goodness to kill these people, or do whatever this immoral act is. Because otherwise, how could I live with myself, supporting this horrible thing, if I'm a good person?'
So you're able to reconstrue this in a way that makes sense of it. And the more heinous it is, and if you can't distance yourself - like, 'I had nothing to do with that' - if you are part of the group that does this and you can't really minimize how bad it was, you're more motivated than ever to come up with a reason that makes sense, to support it, and even to make it more extreme: we need to do more of this. Because if it's good in this case, it's even better to do more of it, to wipe out a whole group of people. So I think that aspect of feeling threatened - 'I know, maybe this is wrong' - just motivates you to double down on it, to convince yourself it's right.
Nick: That's really interesting. I mean, it sounds like a pretty terrible cycle of exclusion and violence, whatever you want to call it, because allowing the bad thing to happen is going to make you more likely to allow even worse things to happen in the future.
Steve: Right.
Nick: This has been fascinating. The last question I wanted to ask is: if we put all this together - how narratives arise and how people buy into those narratives - how do we think about actually getting people to change their minds? How do we break that cycle of violence and extremism that we just described? Are there types of strategies that we know work, or maybe don't work?
Steve: Yeah, I was afraid you were going to ask that, because there's just not an easy answer - obviously, if we knew the answer, we might be in much better shape. Traditionally, one obvious, effective strategy is, to the extent you can, to reorient people toward how much the competing factions have in common - especially, to go back to the motivation side we've been emphasizing this whole time, how we have common goals. So there's this idea - there's almost a whole literature on it - of superordinate goals: goals that transcend the group division, like we both want this kind of thing.
So there's a really famous study that was done in the 40s or 50s where they took these young boys at a summer camp and put them in two different groups, and the groups would meet only for competition. They'd have all these different competitions at the camp, and they would hate each other - the two groups would just go to war; they raided each other's bunks, absolute, you know, huge conflict. You see the ingroup / outgroup hatred. And then the researchers asked, how do we defuse this? They tried stuff like, well, let's educate them about each other; that didn't work. Nor did all these different standard attempts to get the groups to de-escalate their conflict. So what they decided to do is have them all work together on something - like they would take a trip somewhere in a bus, and the bus broke down, or, you know, it was stuck in a ditch, and the only way they could get out was if both groups worked together to push the bus out of the ditch. So they rigged up certain situations where the two groups had the same goal and had to work together to get the same thing they wanted. And all of a sudden they started really liking each other, once they had these superordinate goals.
And there's a whole bunch of research since then on how powerful that can be. You know, if a country with a lot of internal divisions is suddenly attacked by another country, we band together to fight the common enemy. In the past, there were often ways of promoting that sense that we're all in this together. It's so much more difficult now than it ever has been, I think, because of social media - and not just social media, but media more generally - and the echo chambers they can create.
So if your entire source of information about the world is, let's say, right-wing media or left-wing media, it's nearly impossible to overcome that and get to where you can see that we have certain goals that we share. So I'm a little pessimistic about how we can get to that point today, maybe more than ever, because there's so much echo-chambering that goes on. You would hope that something like COVID could have done it - we had this common enemy, the virus, and we needed to work together - and unfortunately, I think because of the way the media and social media work, that only exacerbated things rather than making them better. But that's one thing: if there's a way in which groups can be reoriented toward the things they actually share in common - the same kinds of goals.
I do think it often comes down to, again, leadership. If you had the right leaders in place on either side - and how you get those people there is the big challenge - I don't think it would be that difficult to create different narratives. I think many people in the country are ready to go there and to see a different way, but it's going to take the right leaders to do that.
So again, there's a recent study, where if important people in the Republican Party would promote vaccination, there's evidence that people who otherwise were very skeptical would suddenly become much more receptive to becoming vaccinated. So people who seem deeply entrenched against it would suddenly become much more open to it if people they trusted, leaders they trusted, endorsed that very clearly and seemingly voluntarily.
So, you know, to me, that's the biggest thing: if you can have the right people in place - people who have a voice, who are heard, who promote a different way of thinking - I think it can be extremely powerful. Because, again, all the reasons, all the rationales, all the evidence for why we should only think Y instead of X can be redirected if you want to think X now and leaders are pushing you in that direction. So just as motivation can steer us toward really bad attitudes and judgments and behaviors, it can also steer us the other way, toward a much, much better way of thinking and acting.
Nick: It's really interesting. I'm thinking about this in the context of Black Lives Matter last year, seemingly crossing the chasm and becoming something that a huge percentage of the country was willing to grapple with - and then thinking to this year, as I'm moving into the classroom as a history teacher, and the debates around what can be taught in our history. Can we teach, you know, critical race theory, or even just anything around the problematic history that we've had on race in this country?
Steve: Right. Yeah. And I think people are capable of moving their attitudes. Think about attitudes toward same-sex marriage. Unbelievably quickly, it went from a majority of the country being opposed to a majority of the country being in favor - within a decade, that switched. Because once norms change, all of a sudden it carries a huge wave of people along with it. And, you know, the Supreme Court and the president went from being skeptical of it to being in favor of it and making important decisions. And it was amazing how opinion - and certainly big swaths of the country are still opposed to it, of course - but majority opinion switched very, very quickly. So I think it shows that that kind of thing can happen. But it had the right leaders and the right Supreme Court decisions and all those kinds of things that made it more possible. So I think there is that potential for certain dramatic movements in the correct, more progressive kind of direction, away from hate, but it takes that kind of, I think, top-down promotion.
Nick: Well, Steve, again, I can't thank you enough. This has been fascinating. Even as we were talking, I was thinking, I need to find a way to bring some of this psychology into the classroom. History curricula tend to be about memorizing dates and battles and wars - here's a timeline, just get all that down. And I think understanding why we would believe a certain thing, why groups might form, and how they might interact with each other is so important to understanding the history.
Fireside History is produced by me, Nick Fogel. It is edited by Iris Adler. Scoring and sound engineering by Jason Albert and Hannah Barg. Music in this episode by Blue Dot Sessions. Special thanks to Steve Fein.