Histories Of...

Scientists Don’t Have a Monopoly On Objective Thinking

Teaching students to draw informed conclusions from objective data is vital–but it's never been exclusive to science

Michael Faraday lectures at the Royal Institution, 1856. Image credit: Wellcome Library, London. Wellcome Images.
Start with this well-meaning article by Peter Salovey, a social psychologist who is currently the president of Yale University. Much of the trouble lies in the title (though as a longtime internet journalist, I’m wary of complaining about an article’s title–the fault rarely lies with the writer). “We Should Teach All Students, in Every Discipline, to Think Like Scientists,” the title asserts, but Salovey is making a better argument, warning against the siloing of scientists and scientific thought in the STEM disciplines, especially in a post-fact world in which a significant number of people rely on feelings rather than conclusive evidence to make decisions.

This, Salovey suggests, goes both ways. For scientists, this means learning to contextualize their work to better communicate it to the world at large: “to be aware of the psychological, social and cultural factors that affect how people understand and use information.” You’ll get no arguments from me on that front–endless studies about science communication show that the truth of your work doesn’t mean much if you can’t present it to a general audience in a compelling way. (The same, surely, is true of all things that need to be communicated, but it feels more dire if it’s about, say, the rising seas, or vaccinating your children).

But for those outside the sciences, Salovey writes:

“…the best way we can transcend ideology is to teach our students, regardless of their majors, to think like scientists. From American history to urban studies, we have an obligation to challenge them to be inquisitive about the world, to weigh the quality and objectivity of data presented to them, and to change their minds when confronted with contrary evidence.”

This was the part, in conjunction with the title, that drew a deserved bit of critique: the idea that objective fact was the purview of the sciences, rather than a bedrock of everything from “American history to urban studies” to a whole lot in between. English professor Aaron Hanlon broke this down in a great thread:

[Screenshot of a tweet that reads: “3) Being inquisitive, weighing the quality and ideological bent of evidence, and changing our minds according to the evidence is not”]

At the risk of this seeming like some sort of interdisciplinary squabble, it’s worth taking a step back and seeing how all of this applies to all of us, whether we have much to do with academia or not. There are a few things at work here, first among them the idea that drawing informed conclusions from objective data is something specific to science: that science is a world of facts and everything else is a world of feelings. I understand how easy it is to fall into this kind of thinking, especially when good science communication relies on tapping into people’s feelings to spur them to action.

I’ll grant, too, that the methods aren’t the same: when I study history, I’m not explicitly proposing a hypothesis and repeatedly running experiments with control groups to prove that hypothesis out. But I’m also not basing my conclusions on feelings or instincts. Obviously some historians do, but that brings me to another way this sort of thinking is flawed: with something like history, it’s often easier to see bias and manipulation than it is in the sciences, but the idea that science comes from a place of pure objectivity is a commonly held assumption–and a dangerous one. Science, past and present, is full of unchecked bias and uninterrogated belief (I’m reminded of a fascinating episode of “This American Life” from last year, about scientific “fact” that’s taken at face value).

All this sits side-by-side with my frustration over the idea that pure truth, unencumbered by ambiguity or by emotion, is the only thing we need in the world. Some of this is weariness with the Neil deGrasse Tysons of the world, so far past the point of self-parody that it’s almost funny again. I say this as an English major who’s endlessly frustrated by the people who want to find the neurological roots of our love of good prose, as a person who finds scientists’ conversations on the radio show “On Being” about the nature of reality often more enlightening than anywhere else. As someone who recommends Aimee Bender’s exquisite short story, “The Doctor and the Rabbi,” to anyone whose anxieties about faith are about the idea of not knowing.

I wholeheartedly agree with Peter Salovey: I wish that the STEM fields weren’t so cloistered from the rest of the academy, and by extension, I wish STEM professionals didn’t wind up sectioned off, in labs and on dev teams, separate from conversations about historical context, or ethics, or the way their work shapes society. But I want to make sure this doesn’t rest on an idea that science owns objective truth–or that the grey spaces of the world should be obliterated.

How We Get To Next logo

How We Get To Next was a magazine that explored the future of science, technology, and culture from 2014 to 2019. This article is part of our Histories of… section, which looks at stories of innovation from the past. Click the logo to read more.