
Misinformation is more prevalent than it has ever been. Some estimates suggest that as many as 87% of social media posts contained misinformation related to health. The Canadian Institutes of Health Research defines misinformation as inaccurate or misleading information, which individuals often spread without knowing that it is inaccurate. This is distinct from disinformation: inaccurate information intentionally spread to confuse or distract from evidence. The spread of misinformation and disinformation increases when factual information on a topic is scarce, or when there is mistrust of the sources sharing factual information. It may be especially likely when an individual has had a negative experience related to the issue at hand.
Perhaps the most obvious and relevant example of these phenomena is the COVID-19 pandemic. Because the virus was novel, there was a significant lack of information at the start of the pandemic, and health professionals and public health officials became targets of doubt as guidance changed. Coupled with the public’s frustration with public health guidelines, this doubt allowed misinformation (and disinformation) to thrive. Misinformation can blur the line between what is real evidence and what is not, with potentially serious consequences in a range of contexts, health certainly being a primary concern.
Knowledge mobilization (KM) can serve as an effective tool in managing misinformation. The American Psychological Association has shared strategies for managing misinformation. One of the main approaches is called prebunking, which involves “inoculating” people against misinformation by showing them an example before they encounter it in the real world, building their awareness of what misinformation or disinformation may look like. For example, individuals may be shown a social media post containing misinformation, along with an explanation of why the claim is false and how the evidence is being misrepresented. Researchers have created games like Bad News to deliver prebunking in an engaging and effective way. Research shows that this approach helps people spot misinformation in the short term, but, much like the annual flu shot, its effectiveness wears off over a few months and people do need a booster.
Strategies like prebunking involve KM because they require a targeted understanding of what questions people might have about evidence and information, where they might run into misinformation, and how best to communicate the explanation. These are key considerations when planning KM activities, which may therefore lend themselves well to addressing these needs. Furthermore, KM processes can also help us share factual information more effectively and impactfully to counter misinformation. Gaining a breadth of perspectives on why people engage with misinformation, why they reshare it, and what they need in order to assess the accuracy of information could yield key insights for improving our knowledge sharing efforts, both to debunk misinformation and to share factual information more effectively.
Misinformation is a challenging topic; however, relying on strategies that help identify relevant needs and answer the right questions can support this endeavour. KM practices offer key considerations that can guide how we approach meeting these needs.