It’s inevitable that there will be some things you think you know that you don’t actually know: everyone gets overconfident and makes mistakes sometimes, and every one of us has had to eat crow occasionally. However, a recent study reports that a significant number of people in the United States face this problem of thinking that they know more than they do about a number of key scientific issues. One of these beliefs is not terribly surprising: while the existence of human-made climate change is overwhelmingly supported by scientists, beliefs about climate change diverge from the scientific consensus largely along partisan lines.
Another issue that sees a significant amount of divergence between laypeople and scientists, however, is the safety of genetically modified foods, or GM foods for short. The study reports that while there is significant scientific consensus that GM foods are “safe to consume” and “have the potential to provide substantial benefits to humankind”, the predominant view amongst the general population in the US is precisely the opposite: while 88% of surveyed scientists said that GM foods were safe, only 37% of laypeople said they thought the same. Participants in the study were asked to rate the strength of their opposition to GM foods, as well as the extent of their concern with such foods. They were then asked to rate how confident they were in their understanding of various issues about GM foods, and were also asked a series of questions testing their general scientific knowledge. The crucial result from the study was that those who expressed the most extreme opposition to GM foods “knew the least” when it came to general scientific knowledge, but thought that “they knew the most.” In other words, extreme opponents of GM foods were seriously bad at knowing what they did and did not know.
The consequences of holding extreme attitudes toward issues one is also overconfident about can be significant. As the Nature study reports, the benefits of GM foods are potentially substantial, being able to provide “increased nutritional content, higher yield per acre, better shelf life and crop disease resistance.” Other scientists report numerous further benefits, including aiding those in developing countries in the production of food. However, a number of groups, including Greenpeace, have voiced various objections to the use of GM foods and GMOs (genetically modified organisms) in general, despite backlash from numerous scientists. While there are certainly many open questions about GM foods and GMOs in general, maintaining one’s beliefs in opposition to the consensus of experts seems like an irresponsible thing to do.
Apart from the potential negative consequences of holding such views, failing to properly take account of evidence seems to point to a more personal flaw in one’s character. Indeed, a number of philosophers have argued that humility, i.e. a proper recognition of one’s own strengths and limitations, is a virtue generally worth pursuing. People who lack intellectual humility – those who are overly boastful, or who refuse to acknowledge their own shortcomings regarding what they do not know – often seem to be suffering from a defect in character.
As the authors of the Nature study identify, a “traditional view in the public understanding of science literature is that public attitudes that run counter to scientific consensus reflect a knowledge deficit.” As such, a focus of those working in science communication has been on the education of the public. However, the authors also note that such initiatives “have met with limited success,” and their study might suggest why: because those with the most extreme viewpoints also tend to believe that they know much more than they do, they will likely prove unreceptive to attempts at education, since they think they know well enough already. Instead, the authors suggest that a “prerequisite to changing people’s views through education may be getting them to first appreciate gaps in their knowledge.”
It’s not clear, though, what it would take to get someone who greatly overestimates how well they understand something to appreciate the actual gaps in their knowledge. Indeed, it may be just as difficult to tell someone who is overly confident that they are lacking information as it is to teach them about something they already take themselves to know. There is also a question of whether such people will trust the experts who are trying to point out those gaps: if I take myself to be extremely knowledgeable about a topic, then presumably I will consider myself to possess a degree of expertise, in which case it seems unlikely that I will listen to anyone else who calls themselves an authority.
As The Guardian reports, compounding the problem are two cognitive biases that can prevent those with extreme viewpoints from changing their minds: “active information avoidance,” in which information is rejected because it conflicts with one’s beliefs, and the “backfire effect,” in which being presented with information that conflicts with one’s beliefs actually results in one becoming more confident in those beliefs, rather than less. Together, these factors make it very difficult to determine how, exactly, people with extreme viewpoints can be convinced to change their beliefs in the face of conflicting evidence.
Perhaps, then, part of the problem with those who take an extreme stance on an issue while greatly overestimating their understanding of it is again a problem of character: such individuals might lack a degree of humility, at least when it comes to a specific topic. In addition to attempting to address specific gaps in their knowledge, then, we might also look toward having people attend to their own intellectual limitations more generally. We are all, after all, subject to biases, false beliefs, and general limitations in our knowledge and understanding, although it is sometimes easy to lose sight of this fact.