Annie Murphy Paul, the author of “Origins: How the Nine Months Before Birth Shape the Rest of Our Lives,” is a science writer at work on a book about the extended mind.

Science is taking it from all sides these days. On the right are those who question the reality of climate change and doubt the theory of evolution. On the left are those who inveigh against vaccines and fear genetically modified foods. Those who do accept the authority of science watch helplessly as funding for research is threatened, all the while bemoaning the warping influence of political ideology on the beliefs of their compatriots.

Into this sorry state of affairs arrive two new books, each of which draws on a different body of research to make the same surprising claim: that the misunderstanding and denial of science is not driven exclusively or even primarily by ideology. Rather, scientific ignorance stems from certain built-in features of the human mind — all of our minds.

Granted, this news is not exactly cheering. Research in cognitive science has revealed that “human capacity is not all that it seems, that most people are highly constrained in how they work and what they can achieve,” write Steven Sloman and Philip Fernbach, authors of “The Knowledge Illusion.” Sloman and Fernbach argue that “there are severe limits on how much information an individual can process” and that “people often lack skills that can seem basic.” What’s more, the authors contend, it’s unclear that such skills “can ever be learned.”


And yet this book, along with another recent title, “Scienceblind,” offers readers a few crumbs of hope. If it’s not political ideology but rather cognitive errors that produce scientific illiteracy, then perhaps we can find ways to fix our mental glitches without wading into politics — that place where good intentions go to die.

The author of “Scienceblind” is Andrew Shtulman, an associate professor of psychology and cognitive science at Occidental College. His research focuses on intuitive theories, which he calls “our untutored explanations for how the world works.” Such explanations constitute “our best guess as to why we observe the events we do and how we can intervene on those events to change them,” Shtulman writes. Intuitive theories help us get by; they work well enough, according to the crude calculus of survival.

The problem is that our intuitive theories are, scientifically speaking, often wrong. We suspect that we came down with the sniffles because we were drenched by a cold rain. We surmise that the weather is hotter in the summer because the Earth is closer to the sun. We embrace intuitive theories because, in Shtulman’s words, “we are built to perceive the environment in ways that are useful for daily living, but these ways do not map onto the true workings of nature.”

Many of our intuitive theories are formed early in life, before formal science instruction takes place. And because all children encounter the same physical world, interpreted through the use of the same limited biological equipment, they tend to formulate similar ideas about how that world works (creating a shared social reality that further entrenches intuitive theories).

Take, for instance, a typical child’s understanding of heat. Kids commonly conceive of warmth as a property of a particular object: a baked potato is hot, while an ice cube is not. Their visual and tactile senses give them no hint that heat is actually a general property of matter, arising from the motion of its molecules. “We do, of course, teach children a molecular theory of matter, but not until they have reached middle school, and by that time, they have already constructed an intuitive theory of heat,” Shtulman observes. Starting science education earlier wouldn’t work, he continues, because “children lack the concepts needed to encode the scientific information we might teach them.” Moreover, studies conducted with very young infants “suggest that many of our expectations about motion and matter are innate.”

Even after years of education, our expectations about the physical world — those we’re born with and those we develop very early in our lives — continue to exert a tenacious hold. Indeed, researchers like Shtulman have come to believe that it’s not that we ever replace our intuitive theories so much as that we become progressively better at inhibiting them. This means that attempts to educate students and the public at large must “forge a path,” in the author’s phrase, between the novice’s intuitive theory and the expert’s scientific understanding.


This can be accomplished, he suggests, by employing “cognitively-informed instruction” — that is, empirically supported teaching strategies, such as “bridging analogies.” Such analogies begin with scenarios that make intuitive sense (a spring exerts an upward force on a book resting on that spring) and gradually extend to include notions that, while scientifically unassailable, initially strike us as implausible (a table exerts an upward force on a book resting on that table).

Although such “conceptual change” is difficult to achieve and perhaps never complete, it’s worth attempting, Shtulman concludes, because scientific theories “furnish us with fundamentally more accurate conceptions of reality and thus fundamentally more powerful tools for predicting it and controlling it” than intuitive theories ever could. The better we understand thermal equilibrium, the more likely we are to optimize our home heating and cooling practices; the better we understand the biological mechanisms of cold and flu transmission, the more likely we are to take precautions against getting sick.

To seek out a more scientifically informed view of the world, however, we must recognize that our current view is inaccurate or inadequate — and such epiphanies, argue the authors of “The Knowledge Illusion,” are all too rare. Research in cognitive science demonstrates that “individual knowledge is remarkably shallow, only scratching the surface of the true complexity of the world, and yet we often don’t realize how little we understand,” write Sloman, a professor of cognitive, linguistic and psychological sciences at Brown University, and Fernbach, a cognitive scientist and professor of marketing at the University of Colorado’s Leeds School of Business.

Like intuitive theories, the “knowledge illusion” — the sense that we understand more than we do — grants humans a rough-and-ready way of dealing with an intricate universe. We should be glad that our ancestors did not pause, in the face of a rock slide, to contemplate how little they understood about gravity. In our modern world, however, false assurance can come with its own set of dangers.

The knowledge illusion — which academics refer to as “the illusion of explanatory depth” — was first revealed by studies of how well (or rather, how poorly) people grasped the workings of everyday objects such as staplers, speedometers, piano keys, door locks and flush toilets. Psychologists Frank Keil and Leon Rozenblit discovered that, although people confidently assumed that they understood the operation of these items, in fact they had no clue. In attempting to explain how zippers and other objects work — to think through processes they had previously only skimmed over — Keil and Rozenblit’s subjects came to recognize that they really did not understand them after all.

Building on Keil and Rozenblit’s research, Sloman and Fernbach have found that this same misapprehension applies to more abstract phenomena as well, such as tax policy and foreign relations — and to politically charged topics such as climate change and genetically modified organisms. As with our intuitive theories, the development of deeper understanding is not a straightforward process of replacing or supplementing inaccurate or incomplete information. Unsophisticated efforts at education are often ineffective and can even backfire. Sloman and Fernbach describe a study in which parents who were shown images of children with measles, mumps or rubella, or who were given an emotional story about a child who contracted measles, actually became more, not less, convinced that vaccines are dangerous.

Once again, research in cognitive science offers a savvier way around our mental blocks. Simply asking people to explain how vaccines work, or how a single-payer health-care system operates, or how a national flat tax would function, immediately renders them more cognizant of how little they understand these issues. It makes them more humble and more receptive to information that challenges the beliefs they previously expressed with such vehemence.

The authors of “The Knowledge Illusion” and “Scienceblind” both describe the research of Michael Ranney, a psychologist at the University of California at Berkeley. In Sloman and Fernbach’s account, Ranney “approached a couple of hundred people in parks in San Diego and asked a series of questions to gauge their understanding of the climate change mechanism. Only 12 percent of respondents were even partially correct, mentioning atmospheric gases trapping heat. Essentially no one could give a complete, accurate account of the mechanism.” Ranney then gave participants a short text (or, in later studies, showed them a video) explaining how climate change operates. This two-step process, Sloman and Fernbach report, “dramatically increased their understanding and their acceptance of human-caused climate change.”

In this experiment, Sloman and Fernbach see the knowledge illusion being exposed and corrected. Shtulman, meanwhile, sees our intuitive theories being addressed and reframed. Intuitively, “we don’t really think of the earth as something that needs protection,” he observes. “How can we harm an object that is seemingly eternal?” Ranney’s clear explanation of how we are, in fact, harming the Earth with the production of greenhouse gases prompts our intuitive impulses to shift in the direction of protecting an endangered planet.

Though they focus on different defects in the human operating system, the authors of these two books arrive at the same solution: To move away from ignorance and toward understanding, we need to address directly what Sloman and Fernbach call “the driving forces” behind our obtuseness. Surprisingly, for once, that obtuseness is not produced by our politics but by the evolutionary history of our brains. Who knew, as President Trump remarked recently of health care, that things “could be so complicated”?

Scienceblind
Why Our Intuitive Theories About the World Are So Often Wrong

By Andrew Shtulman

Basic Books.

The Knowledge Illusion
Why We Never Think Alone

By Steven Sloman and Philip Fernbach

Riverhead. 296 pp. $