
Beliefs are sticky. While many of us like to think we are at liberty to alter our beliefs according to the facts, contemporary psychology and cognitive science disagree. Belief change and belief persistence depend far more on social factors than on actual evidence.
Cognitive Division of Labour
Social factors have a huge influence over our beliefs in part because we cannot investigate the grounding of every belief we hold: we don't have the time or the expertise. If I were to ask you why you believe Earth is the third planet from the sun, or why you believe the MacKay Bridge is structurally sound, or why you did or didn't get a vaccine in the last two years, your answer would likely be "someone told me so" rather than "I did the research." While you might think you did some research, you would be mistaken to think the little you or I could do with the help of Google compares to expert testimony. The skills required to conduct research in astronomical physics, or architectural and structural engineering, or immunology are simply not skills we have. Even if you are an astronomer, a structural engineer, or an immunologist reading this, I bet you aren't all three!
More than this, we don't have the literacy in these fields to evaluate contemporary debates. Think about the expertise required to make an informed decision on which bodies should be recognized as planets in our solar system, or what qualities allow a bridge to adequately support physical loads, or what makes a vaccine safe and effective. It takes years of research and training to properly understand and offer insight on these issues. Again, you and I cannot make pronouncements in these conversations: we lack the expertise.
This feature of our knowledge systems reflects a broad cognitive division of labour. Since we, as individuals, could never discover all these things about the world ourselves, we assign researchers the role of knowledge production. This comes with a paradoxical consequence, though: the more we know as a collective, the smaller the share of that knowledge any one of us can verify first-hand. Trust, therefore, plays a significant role in what we can know.
Political Partisanship and Trust
If we all trusted the same authorities, the above observations would be entirely banal. However, social factors significantly affect who we perceive to be trustworthy. This is part of why we see extreme polarization around contentious issues. While social discourse often paints the political ‘other’ as entirely lacking reason, this narrative fails to consider the social dynamics of belief. The problem is not, as you sometimes hear folks say, that our co-partisans are reasonable while those outside our camp lack intelligence. The problem is that information is unlikely to be received as reliable when it comes from non-partisans.
Many psychologists have found this to be the case. For example, one study found that individuals valued the source of a political policy over its content. Another study found that people can accurately report what scientists say, yet struggle to hold corresponding beliefs themselves. A further study found that religiosity and partisanship moderate the extent to which Americans recognize scientific consensus and, subsequently, whether they assert beliefs that contradict their own perceptions of that consensus. These, and many other studies like them, indicate that political partisanship has a stranglehold on what we believe.
While it might seem irresponsible to trust partisans over non-partisans, we have consistent and compelling reasons to do so. If I encounter a person who goes on at length endorsing the harmful practice of sexual orientation conversion therapy, I have very good reason to consider them a bad judge of what is politically important. We make judgments like this all the time, and we tend to think it is a good strategy for finding trustworthy information. For example, if my friend consistently misidentifies the grass outside our office as red when it is clearly green, I will likely stop trusting them on matters of colour identification. Likewise with the person endorsing conversion therapy: why would I trust them for the best politically relevant information when I hear them going on about blatantly problematic policies?
We take political affiliation to embody our values in some way, and those values indicate to us who is trustworthy and who isn't. This is the reasoning behind our endorsement of political partisans, on both sides of the political spectrum. It is also why reasoning alone won't change minds. We can try to convince people that vaccination is an effective way to reduce the harms of a global pandemic, for example, but if those people find themselves in communities that deliver false or misleading information, those of us in the out-group likely won't be able to move the needle (pun intended).
The Social Dynamics of Belief
Here we arrive at a dilemma. When experts, who have accurate and nuanced information, belong to the out-group, their message will almost certainly be met with hostility. Since experts' messages are often politicized, individuals are left unable to separate the intended message from the social distortion surrounding it. This is why we should be careful about blaming individuals for their bad beliefs: it is difficult for anyone to sever the link between the facts and the messaging received from political partisans.
Understanding these social dynamics is important. When scientists are conducting research with tangible implications for our collective welfare (think of the harms of under-vaccination), we need ways to communicate their findings to the public effectively. When partisanship obscures the message, though, belief polarization quickly becomes political gridlock. One factor contributing to today's political volatility is that important information is not always received as coming from a friendly source.