By Hugh Miller and Casey Bukro
Ethics AdviceLine for Journalists
Think of objectivity as a philosopher might, a person engaged by definition in the study of reality.
Now think of objectivity as a journalist concerned with ethics might, engaged in the search for right or good conduct.
This article will travel down these two roads, explored by Hugh Miller and Casey Bukro. Miller is professor of philosophy emeritus, Loyola University Chicago, and long-time AdviceLine advisor who takes calls from professional journalists seeking guidance on ethics. Bukro manages AdviceLine. He is a former national ethics chair for the Society of Professional Journalists.
The Ethics AdviceLine for Journalists has reported on objectivity before. But it is a controversial topic, one that escapes much attention for a time until it explodes back to the surface.
Reexamination time
This is one of those times, brought on by the Trump presidency and roiling discontent among journalists over diversity in the ranks of news gatherers. The Trump presidency forced journalists to struggle over whether to call Trump a liar. Black journalists led a reckoning over objectivity.
But what is objectivity? Let’s start with Miller’s thoughts on that subject, as a philosopher might see it. As a discipline, philosophy goes back millennia, to the time of Socrates and Plato and beyond. Here is what Miller says:
What do modern philosophy and science have to say about objectivity? To be sure, this is an enormous question. Indeed, I think it’s fair to say that the attempt to ground and secure genuine objectivity of knowledge and perception is pretty much coextensive with most of modern philosophy, and figures prominently in the methodologies of the modern sciences.
But some reflection on the historical origins of the modern problem of the objectivity of knowledge might help shed some light on current controversies concerning contemporary issues such as diversity in the workplace of a journalistic outlet.
“Natural philosophy” in the seventeenth century (as the sciences called themselves at that time) was deeply concerned with finding a method that would allow reliable, reproducible and accurate knowledge to be winnowed from the chaff of the rest of human experience — emotions, passions, beliefs, dogmas. Geometry was the great key: unlike other intellectual disciplines, it alone seemed to provide a rigorous, axiomatic, “truth-preserving” structure for knowledge, one which could scaffold the rest of the sciences.
Consensus appears
Across a wide spectrum of philosophical schools, a converging consensus began to appear: only those aspects of experience which could be measured and quantified, and thus subjected to mathematical analysis, would count as “objective.” All others would be relegated to the realm of the merely “subjective.” The objective properties of a lemon, for example, might be its mass, its size, its density, its physical components and chemical composition, all of which could be analyzed, measured and quantified in standard ways. But its warm yellow color, its citrony fragrance, the acid-sweet bite of its juice upon the tongue – these were to be excluded as merely “secondary” subjective elements, not the stuff of real knowledge.
“Objective” was honorific, “subjective” pejorative. Under Isaac Newton and his successors, physics and the other “hard sciences,” with their mathematical theories, quickly became the gold standard for knowledge. The construction of scientific knowledge and of the body of individual sciences became a quest to describe natural beings in quantifiable ways, using careful experiment and measurement. And those experiments and measurements had to be devised so as to exclude subjective bias and idiosyncrasy by the investigators, as contaminants that would vitiate inquiry and block scientific advance.
Consider the telescope of Galileo. With it, what had previously seemed an inverted dome of dim twinkling lights that moved in a precise, clockwork fashion, with the earth at the center, turned out, upon inspection, to be a vast field of celestial objects which bore startling resemblances to terrestrial phenomena. (Jupiter even had moons, like earth! Surely this meant that it was made of the same kind of stuff as earth, and was no longer entitled to any divine status.) And as Newton proved in his Principia Mathematica, celestial bodies and terrestrial ones alike could all be described with a single mathematical model of motion, universally applicable and open to use by everyone with sufficient intelligence and education, without exception. Here at last was “objective” knowledge. Wholly impersonal, purged of all the dross of subjectivity, it alone could count as the royal road to the future.
Model prevails
This model of the relation between objective knowledge and subjective error prevailed for several centuries. The sciences which were founded upon it made such strides that they seemed invincible. But by the end of the nineteenth century, the model was tottering. Thinkers like Hegel, Marx and Nietzsche provided, each in their own way, trenchant criticisms of this way of conceiving of the relationship between the objective and the subjective. In the early twentieth century, physics itself underwent an epistemological crisis with the development of the theories of relativity and quantum mechanics, whose results, while experimentally validated, seemed to defy all traditional intuitive “objective” sense.
In biology, the study of the human brain and sense organs through physiology and neurology began to lay bare the mechanisms by which perception and thought were constructed, as well as the apparent contingency of what had previously been thought to be necessarily true – the so-called “laws of thought” themselves. Darwin’s theory showed that the development of thought had itself been a contingent historical event, and remained contingent, despite its having already occurred. Modern psychology, too, has shown that what we commonly think to be the sanctum sanctorum of scientific reason is, in fact, riddled with various forms of unreason.
Readers of Michael Lewis’s book, The Undoing Project, will be familiar with the work of Amos Tversky and Daniel Kahneman, whose research uncovered the regular and systematic ways in which everyday reasoning that seems clear and convincing turns out to be erroneous. This seemingly hardwired tendency to make predictable mistakes can be observed, and corrected for; but its ingrained nature and persistence is unnerving.
Modern philosophy and the sciences remain in this latter-day condition of disenchantment. But the old model of “objectively good” and “subjectively bad” continues to have a strong influence over our thought. Contemporary thought seeks to bridge the gap between this early modern model and later critiques of it. Truth, it is thought, must somehow be made out by understanding the interplay between objectivity and subjectivity, without casting either side in a necessarily negative light in advance.
Objectivity in journalism
It is in this context that we can perhaps appreciate, from the standpoint of the history of modern philosophy and science, the problem posed by the ideal of “objectivity” in a practice like journalism. Journalism likes to style itself “the first draft of history.” History itself is a modern human science, and sets itself the goal of recording, as the German historian von Ranke said, “how things really happened” (wie es eigentlich gewesen ist)—not how we would like them to have, or think they should have, happened. History and its apprentice, journalism, have not themselves been immune to the glamour of the mathematical model of the hard natural sciences, with its coolheaded and impartial scientific investigator at the wheel. But, as our all too brief narrative above indicates, this model has its deficiencies.
The journalist Jelani Cobb reminded us, in 2018, that after massive civil unrest in the United States in 1967 and ’68, the Kerner Commission recommended, among its conclusions, that the ranks of journalists should be diversified, so as to include more journalists of color. The scarcity of such journalists, with their distinctive journalistic viewpoints, had greatly contributed to the shock and surprise at those disturbances on the part of white Americans. White reporters had missed what was quite literally before their eyes and under their noses, but which was as plain as the nose on their faces to black Americans: namely that America was, and had always been, a land of violent, coercive, oppressive white supremacy. By assuming that their reporters could adopt what the philosopher Thomas Nagel later called “the view from nowhere,” a perfectly impartial, neutral, impersonal – and race-free – standpoint, white Americans had deceived themselves.
In his polemical book, On the Genealogy of Morals, Friedrich Nietzsche wrote, in 1887:
Let us be on guard against the dangerous old conceptual fiction that posited a “pure, will-less, painless, timeless knowing subject;” let us guard against the snares of such contradictory concepts as “pure reason,” “absolute spirituality,” “knowledge in itself:” these always demand that we should think of an eye that is completely unthinkable, an eye turned in no particular direction, in which the active and interpreting forces, through which alone seeing becomes seeing something, are supposed to be lacking; these always demand of the eye an absurdity and a nonsense. There is only a perspective seeing, only a perspective “knowing;” and the more affects we allow to speak about one thing, the more eyes, different eyes, we can use to observe one thing, the more complete will our “concept” of this thing, our “objectivity,” be.
Nietzsche is not saying that there is no truth, no knowing, no concepts, no objectivity. All of these things are real, he is saying. But they do not flow from some transcendental ego or disembodied, ahistorical scientific observer. They are built up by a collaboration and occasional conflict between different interpretive “eyes.” Not only values, but facts themselves come into existence only by means of an interpretive act. The key here is not to limit the scope of interpretive acts to one type of vision, or one mode of hearing. A multiplicity of visions is the surest road to truth.
A journalist
Here is where Casey Bukro takes over this narrative, after a 63-year career as a journalist – as a reporter, writer and editor, most of that time at the Chicago Tribune.
I’ll say it right up front: I always believed the essence of objectivity is to remove ourselves from the story – to stand apart from it, except for the wisdom and the experience we can bring to it. For most of my time at the Tribune, I was the environment writer, the Tribune’s first reporter assigned to cover the environment full-time. People sometimes asked me if I was the Tribune’s environmental crusader. No, I said, I was a reporter like the rest of them, looking for the truth. I did not favor environmentalists any more than I favored polluters.
To those who say we all have our biases, I say yes, and a professional journalist should recognize that and keep them out of a story.
I was born in Chicago to a Polish-American family, and raised in the Humboldt Park neighborhood, made up largely of people with European backgrounds. I thought of it as the best of Chicago neighborhoods, like a small United Nations. It was a mix of nationalities and religions. Everyone was different, and we learned to get along with each other. That is one of the first recollections of my early life.
I doubt my Polish-American heritage had anything to do with being hired at the Chicago Tribune after a stint at the City News Bureau of Chicago, a grueling training ground for young reporters. Before that, I had worked for the Janesville (Wis.) Gazette, after graduating from the Medill School of Journalism, at Northwestern University. I was schooled and trained for journalism, where I spent most of my life.
First SPJ code
I believed in objectivity, and said so when I wrote the Society of Professional Journalists’ first code of ethics, adopted in 1973, with the help of members of the national Professional Development Committee, which I chaired. After the adoption of the code of ethics, I became the society’s first national ethics chair.
The 1973 code said journalists should perform “with intelligence, objectivity, accuracy and fairness.” The code was revised several times, and in 1996 the word “objectivity” was stricken from the code. My recollection is that academics largely pushed for that change, saying human beings are captives of their biases and cannot be objective.
It was not only academics who resisted calling journalists objective. My own boss at the Tribune, the late Jack Fuller, the Tribune’s president and publisher, wrote a book, News Values: Ideas for an Information Age. In it, Fuller wrote: “No one has ever achieved objective journalism, and no one ever could. The bias of the observer always enters the picture, if not coloring the details at least guiding the choice of them.” Fuller won a Pulitzer Prize for editorial writing. I sometimes wonder if that taught him the value of opinion over objectivity. He was a man of opinions.
For years, news organizations staffed their newsrooms with specialists covering the environment, education, science, government, politics, sports and other beats, relying on their talent and ability to recognize critical developments and to attract trustworthy sources of reliable information. These are not the activities of remote automatons. It takes engagement and thought about what is important.
Diversity brings change
The recent emphasis on diversity brought new meaning to a reporter’s ethnic background and racial insights. It is based on the belief that a newsroom should reflect the diversity of the communities reporters cover, so that they understand the customs and the trials of those communities. But how is that playing out?
The Pew Research Center in 2019 reported that the public places high value on journalists’ connection to the community, but Americans offer a more mixed assessment of journalists’ actual connection to their community.
This disconnect, over whether minority journalists actually make an impact in their workplaces, became clearer in an op-ed in the New York Times by the journalist Wesley Lowery, who wrote:
“Black journalists are publicly airing years of accumulated grievances, demanding an overdue reckoning for a profession whose mainstream repeatedly brushes off their concerns; in many newsrooms, writers and editors are now also openly pushing for a paradigm shift in how our outlets define their operations and ideals.”
Black journalists
In an article headlined “A reckoning over objectivity, led by Black journalists,” Lowery writes that “while these two battles may seem superficially separate, in reality, the failure of the mainstream press to accurately cover black communities is intrinsically linked with its failure to employ, retain and listen to black people.”
To listen. In the past, reporters were told to keep their opinions to themselves. Now, as they see it, they are being hired for the opinions and insights they bring to their jobs. Editorial writers, of course, always wrote opinion pieces, which distinguished them from the news pages.
It’s a treacherous, changing landscape. Sometime after Lowery’s op-ed appeared, Lauren Wolfe, a New York Times freelancer, appeared to have been dismissed for expressing a political opinion, although the newspaper said that was not true but offered no further explanation.
Lowery’s boss, Dean Baquet, executive editor of the Times, told Jon Allsop of the Columbia Journalism Review that he believed Lowery’s op-ed was “terrific,” and didn’t believe that he and Lowery were far apart on the objectivity question.
Fair and independent
“Baquet — who has repeatedly stressed the importance of objectivity in the past — said that he doesn’t love the term, and that he would rather frame his view of journalism around ‘fairness’ and ‘independence,’” wrote Allsop. “The independent and fair reporter, he said, ‘gets on an airplane to pursue a story with an empty notebook, believing that he or she doesn’t fully know what the story is, and is going to be open to what they hear.’”
Objectivity, and the fear of appearing biased, were at the root of a long hesitation by the New York Times and other media to say flat-out that President Trump was a liar.
“It’s not just his outrageous stuff…he says things that are just demonstrably false,” Baquet told Ken Doctor in an interview appearing in niemanlab.org. “I think he’s challenged our language. He will have changed journalism, he really will have.”
It took a long time, Baquet admitted, to understand how to deal with falsehood. “We didn’t know how to write the paragraph that said, ‘This is just false.’ We struggle with that. I think that Trump has ended that struggle. I think we now say stuff. We fact-check him. We write it more powerfully that it’s false.”
Liar presidents
This hesitancy to call a president a liar is strange, since it would not be the first time.
“When Richard Nixon was president, most journalists knew he was a thoroughly dishonest man,” wrote David Greenberg in politico.com. “Notably, though, it wasn’t until the Watergate investigations proved that Nixon had deliberately uttered his falsehoods with the intent to deceive the public that journalists rolled out the heaviest rhetorical artillery available to them: Calling the president a liar.”
As generations of journalists change, some lessons must be learned anew.
In the Washington Post, Paul Farhi reported that news organizations across the country were starting to describe Trump’s falsehoods that way.
“It’s (almost) official: The president of the United States is a liar,” wrote Farhi.
Other news media hesitate to use “lie” for Trump’s misstatements, writes David Bauder in apnews.com. It’s a question of intent. Editors believe it’s important to establish whether someone is spreading false information knowingly, intending to deceive, and it’s hard to get inside a person’s head, writes Bauder.
Let facts speak
At the Associated Press, “we feel it’s better to say what the facts are, say what the person said and let the audience make the decision whether or not it’s an intentional lie,” said John Daniszewski, the news cooperative’s standards editor.
“Lie” is considered a loaded word. However, Trump’s birther movement questioning former President Barack Obama’s citizenship led both the New York Times and AP to use the word “lie.”
Clearly, the controversy over objectivity rages on. Andrew Kirell, writing in mediaite.com, says: “There is no such thing as objectivity in journalism. And it’s time to get over it.”
The Media Ethics Initiative recognizes the trend toward “a new understanding of journalism, one which allows for the inclusion of a journalist’s personal voice,” though others believe ditching the ideals of objectivity and neutrality is dangerous.
“For a journalist to include their own voice is to risk exerting influence over their audience,” writes the author. “Whereas the publication of ‘only facts’ allows for the consumers to make judgments for themselves, not be told what to think by a reporter.”
Seeking bias
There are always conflicting views. Kelly McBride, a Poynter Institute expert on journalism ethics and standards, points out that in today’s polarized world, people judge media accuracy by their own biases.
“If a news consumer doesn’t see their particular bias in a story accounted for – not necessarily validated, but at least accounted for in a story – they are going to assume that the reporter or the publication is biased,” McBride said in a story about controversial news media bias charts.
There will never be a final answer to this controversy, which involves opinions about expressing opinions. In my experience, journalists can be a very opinionated bunch. They love to argue and quibble about details. It’s their job.
Let me tell you my long-time bottom line on objectivity: Those who say it is impossible will never be able to achieve it. Once a person says something is unattainable, they usually stop trying. I always keep trying, keeping Roger Bannister in mind.
Record mile run
On May 6, 1954, Bannister, a 25-year-old medical student, ran the mile in three minutes, fifty-nine and four-tenths seconds at Oxford, England. He was the first in recorded track and field history to break the four-minute mile.
Until that time, some doctors and scientists insisted that no human could run the mile in less than four minutes. They said it could be fatal. I remember clearly a perfectly reasoned article at the time written by a doctor explaining why the human body could not reach such a goal. Lactic acid would build up in the blood during extreme exertion, along with oxygen depletion in the heart and lungs. Seemed perfectly reasonable.
But Bannister did not believe it, and avoided conventional coaching and training methods of the time. Wikipedia reports that 1,400 male athletes have broken the “four-minute barrier” since Bannister did it. Breaking that barrier is now the standard for male professional middle-distance runners.
Upon finishing the record-breaking race, Bannister said: “Doctors and scientists said breaking the four-minute mile was impossible, that one would die in the attempt. Thus, when I got up from the track after collapsing at the finish line, I figured I was dead.”
It pays to be skeptical of what people say is impossible.
*************************************************************
The Ethics AdviceLine for Journalists was founded in 2001 by the Chicago Headline Club (Chicago Professional Chapter of the Society of Professional Journalists) and Loyola University Chicago Center for Ethics and Social Justice. It partnered with the Medill School of Journalism at Northwestern University in 2013. It is a free service.
Professional Journalists are invited to contact the Ethics AdviceLine for Journalists for guidance on ethics. Call 866-DILEMMA or ethicsadvicelineforjournalists.org.
Visit the Ethics AdviceLine blog for more.