How do you know what you read or hear is true? Finding an answer to this question seems more urgent now than ever. The Collins Dictionary 2017 word of the year was “fake news.” Made popular by President Trump, the term “fake news” increased in use by 365 per cent that year. The term itself is misleading, since it can refer to satire or hoaxes, and since Trump uses it to discredit any news outlet critical of him. A better term for the phenomenon of “fake news” is “disinformation,” defined as “the deliberate creation and sharing of false and/or manipulated information that is intended to deceive and mislead audiences, either for the purposes of causing harm, or for political, personal or financial gain.” The amount of disinformation circulating in our world has exploded.
We’re living in the Age of Lies.
The leader of the “most powerful nation on earth” has made an average of 10 false or misleading claims per day since he took office—increasing to 15 false claims a day in 2018—according to the Washington Post’s fact checker database. The Post doesn’t call them “lies,” because to lie one has to know the truth and intend to deceive. The New York Times has no such reservations. According to them, Trump lied almost every day of his first 40 days as president. The Toronto Star too keeps a running tabulation of every false claim Trump has made since he took office (4,210 as of January 23, 2019) and provides corrections. “If Trump is a serial liar, why call this a list of ‘false claims,’ not lies?” says the Star. “We can’t be sure that each and every one was intentional. In some cases, he may have been confused or ignorant. What we know, objectively, is that he was not telling the truth.”
According to Trump, “Canada has taken advantage of our country for a long time.” He says that US farmers have been “shut out of Canada” because “Canada is charging tremendous tariffs.” He’s claimed that the deal with Mexico was “one of the largest trade deals ever made,” and that trade with Canada is smaller. He’s called our prime minister “weak” and accused Canadians of being shoe smugglers. For the record, Canada is the biggest importer of US farm products ($20.5-billion worth in 2017), almost all of them tariff-free under NAFTA. America’s two-way trade—exports plus imports—last year was $680-billion with Canada and $622-billion with Mexico.
In a Politico article titled “Trump May Not Be Crazy, But the Rest of Us Are Getting There Fast,” the authors talk about the effect of Trump’s “breezy indifference to reality” on our mental health:
Some people can’t just roll their eyes at obvious bullshit—they experience an assault on truth at a more profound psychic level. “Gaslighting is essentially a tactic used by abusive personalities to make the abused person feel as though they’re not experiencing reality, or that it’s made up or false,” said Dominic Sisti, a behavioral health care expert at the University of Pennsylvania. “The only reality one can trust is one that is defined by the abuser. Trump does this on a daily basis—he lies, uses ambiguities, demonizes the press. It’s a macroscopic version of an abusive relationship.”
Trump’s distortions, exaggerations and outright deceptions certainly contribute to the unreliability of our verbal world. A worse culprit, though, is social media (Facebook, Twitter, Instagram, Snapchat, YouTube etc.), defined as websites and applications that enable users to create and share content or to participate in online networking. These are sources without gatekeepers—“platforms” where users can say whatever they want. Canadians are second only to South Koreans in using social media for news. But false information swirls unchecked on social media. On Twitter, truth doesn’t stand a chance against lies. People are much more likely to retweet a false story than a true one. According to an MIT study published in Science, “Falsehood diffused significantly farther, faster, deeper and more broadly than the truth in all categories of information, and the effects were more pronounced for false political news….”
And every lie sows doubt. Once you’ve heard something false, even if it’s corrected, the false story sticks. A report that claimed a vaccine caused autism was withdrawn as fraudulent; its author was discredited and lost his medical licence. No researcher has found evidence of a link to autism, yet surveys show that 53 per cent of the public believes there is good evidence to support a connection. The persistence of falsehood is like the pink elephant. When you’re told not to think of a pink elephant, it’s the first image to spring to mind. Research has shown that even after people have accepted evidence that it is wrong, false information lingers in their minds.
Worse yet, Facebook, with its huge database of personal information in the profiles of its users, has been co-opted to purposely deceive and manipulate voters. Political parties and candidates can micro-target misinformation to different sectors of the electorate to discredit their opponents and enhance themselves. Cambridge Analytica worked on Trump’s election campaign with data illicitly acquired from millions of Facebook users. Their private information was used to create psychological/political profiles and then they were targeted with political messages tailored to work on their psychological makeup. Different messages were framed in different ways to be most effective with different vulnerabilities.
“Instead of standing in the public square and saying what you think, so everyone has a shared experience of your narrative, you’re whispering into the ear of each and every voter, and you may be whispering something different to each of them,” said Christopher Wylie, whistleblower in the Cambridge Analytica scandal, speaking to The Guardian in March 2018. “You risk fragmenting society. There’s no shared experience, no shared understanding, and without shared understanding, how can you have a functioning society?”
Russians interfered in the 2016 US election by spreading false information via social media. Twitter hosted more than 50,000 Russian imposter accounts. Russians also paid for dozens of propaganda ads on Facebook, but far more harmful was the organic spread of malicious misinformation online by unsuspecting American users, bots and algorithms. “Of the 470 Facebook accounts known to have been created by Russian saboteurs during the campaign, a mere six of them generated content that was shared at least 340,000,000 times,” according to the New Yorker.
The new technology of social media is not neutral; it fuels outrage, it polarizes, it’s fast and spontaneous and therefore discourages slow and thoughtful reflection or investigation or questioning. Lies spread faster than truth because “fake news” is more novel and strange, it surprises us, it evokes more emotion—amazement, horror, disgust. The temptation to retweet is just too great.
The spread of lies takes a toll. According to the 2018 Edelman Trust Barometer, which sampled more than 33,000 respondents in 28 countries, trust plummeted in 2017 across all institutions. When social media is included, media is the least trusted institution. Globally, nearly 7 in 10 respondents among the general population worry about fake news or false information being used as a weapon, and 59 per cent say that it is getting harder to tell if a piece of news was produced by a respected media organization.
A decline in trust is a serious problem. Trust is essential to civilization, which is, after all, a giant exercise in mutual co-operation. Clean water comes from the tap, garbage is picked up, buses take us where we need to go—because many people co-operate. In frozen mid-February Alberta, we can wake up to oranges. We have fresh fruit and vegetables, though they can’t grow here at that time of year, because growers and truckers and grocers have trusted each other’s agreements and co-operated to get them here.
We have to trust what others tell us because we simply can’t know everything we need to know from our own experience. How do we know anything? Most of what we “know,” we have learned from others. We take it on faith. We have no direct evidence for it. Milk comes from cows, we are told. We may never have seen a real cow up close, never mind milked one. We believe what people tell us. In our formative years, our parents, teachers, pastors and other authorities shaped our worldview. Later, we accept the reports of media and government. At one time we were informed by our city newspaper, national radio and TV, and we considered these reliable sources. Still, typically it’s not evidence and arguments that determine how we think and vote; it’s the values of our family, friends and colleagues.
We absorb what our parents teach us before our minds are sufficiently developed to evaluate what we’re learning. Indeed, education is for the most part acquiring the “truths” of others and accepting the authority of others. In The Politics of Truth, Michel Foucault says that it’s those with power who get to say what “truth” is. The questioning of authority, particularly religious authority, began only in the Age of Enlightenment.
Immanuel Kant said: “Enlightenment is man’s release from his self-incurred tutelage. Tutelage is man’s inability to make use of his understanding without direction from another… ‘Have courage to use your own reason!’—that is the motto of enlightenment.” The Enlightenment gave rise to science and the scientific method of testing hypotheses through careful observation and meticulous recording of what was observed.
The responsible scientist and the reliable reporter have this in common: Both observe carefully, record their observations in unemotional language and seek independent verification for their observations. They use similar methods in order for their conclusions to have an objective and public validity. Valid scientific studies can be replicated by other researchers, and truthful reports of events can be corroborated by other news outlets.
According to Foucault, desubjugation means “not accepting as true what an authority tells you is true, or at least not accepting it because an authority tells you it is true, but rather accepting it only if you consider valid the reasons for doing so…. The subject gives himself the right to question.”
We used to assess whether something was true according to the reliability of the source. A reputable newspaper or other publication would not risk losing its credibility by being sloppy. Careful investigation, accurate reporting, thorough fact-checking and gatekeeper editors helped to prevent misinformation from spreading. If something published was found to be untrue, readers would cancel their subscriptions. Therefore, we felt we could have a certain trust in what did make it into print. With the decline of newspapers and the rise of social media, reliable sources of news are disappearing.
Ironically, social media provide the opportunity for the ultimate democratization of expression. Anyone can say anything on social media. No gatekeeper, no editor, no authority figure stands in the way. Until recently these media took no responsibility for what was transmitted via their platforms. For someone sounding off on Facebook, this might seem to be the logical extension of the motto of the Enlightenment: Use your own reason. Unfortunately, what circulates on social media is often not reason but virulent emotion.
Much of the erroneous information circulating today comes from flawed thinking. Trump provides a continuous stream of examples of faulty reasoning:
“They don’t look like Indians to me,” he says. If they don’t look a certain way to Trump, then, he asserts, they are not what they say they are. Here his own narrow subjective viewpoint is being confused with objective reality.
“Immigrants are dangerous.” This is generalizing from too small a sample. That some immigrants have done something harmful doesn’t mean all immigrants are dangerous.
“Bringing back coal will bring back jobs.” This idea is based on a faulty assumption. Coal mining used to be labour intensive, but today coal is strip-mined using huge machines and very few workers.
Oxford Dictionaries’ 2016 word of the year was “post-truth,” defined as “circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.” Those are dangerous circumstances. They lead to division, polarization, fragmentation, breakdown. In order for society to function, we have to be able to come to agreements. As citizens concerned with our shared public life, we need valid information based on sound reasoning and evidence. In a secular pluralistic society, we have agreed on criteria for determining whether a statement is credible.
We can begin to assess if what we read or hear is true by examining the language. What does the speaker mean by the words s/he uses? If someone says, “Beets aren’t doing well this year,” we should ask, What beets do you mean? The ones in your garden? The sugar beet crop in all of southern Alberta? What do you mean by “doing well”? Are they stunted? Is the price low? How do you know they aren’t doing well?
Words are not the things they stand for. There is no necessary connection between the word “moon,” for example, and the astronomical body that orbits planet Earth, which the French call “la lune” and the Plains Cree, “tipiskâw pîsim.” The best way to show what the word means is to point to the object in the sky. If someone can point to something in the tangible world to show what they mean, we have a chance of ascertaining whether what they say is true.
When a word has no referent in the tangible world, we are on shakier ground. Take for example the word “angel.” In Language in Thought and Action, S.I. Hayakawa says the statement, “angels watch over my bed at night” is pointless to discuss.
This does not mean that there are no angels watching over my bed at night.… [It means] we cannot see, touch, photograph or in any scientific manner detect the presence of angels. If an argument begins on the subject of whether or not angels watch over my bed, there is no way of ending the argument to the satisfaction of all disputants … the pious and the agnostic, the mystical and the scientific. Therefore, whether we believe in angels or not, knowing in advance that any argument on the subject will be both endless and futile, we can avoid getting into fights about it.… Arguments of this kind may be termed “non-sense arguments,” because they are based on utterances about which no sense data can be collected.
We can assess whether an assertion is true or not by determining what kind of statement it is. Consider these three statements:
There are 10 dill pickles in that jar.
I hate dill pickles.
Dill pickles are disgusting.
These are, respectively, a fact-type statement, an opinion and a judgment. A fact-type statement is one that an independent observer can verify. Anyone else can open the jar, count the pickles and verify whether or not there are 10. A fact is an assertion about objective reality, and as such it is something everyone can test, measure and agree on. Facts are relevant to our shared world. Opinions and judgments are expressions of subjective reality. They are about the person who makes them and not about the external world. If indeed you do hate dill pickles, then “I hate dill pickles” is an accurate statement.
The judgment “dill pickles are disgusting” purports to make an assertion about objective reality—the nature of dill pickles—but really it is only an expression of the speaker’s subjective disgust for dill pickles. The statement reveals nothing about dill pickles, only the attitude of the speaker. Since some people enjoy eating dill pickles, the statement is not true.
It’s tempting when you hear someone say, “That’s a fabulous movie,” to think the statement is about the movie. But the statement tells us nothing about the movie, except that the speaker likes it. Someone else might say, “That’s the dumbest movie I’ve ever seen.” Which statement is true? Neither. Until we hear some facts about the film—it had six scenes with car chases, it starred Clint Eastwood, it scored 76 on Rotten Tomatoes—we know nothing whatsoever about the film to help us decide whether to see it. Judgments tell us about the likes and dislikes of the speaker, and nothing about the topic under discussion. In that way, all judgments that purport to tell us something about the external world are, strictly speaking, not true.
A fact-type statement could still be untrue. It requires fact-checking, for example on Snopes, Politifact or factcheck.org. An opinion or a judgment is about the speaker’s feelings, values, preferences and subjective viewpoint; it provides no useful information about the objective world we share. That doesn’t mean opinions and judgments don’t affect the world we live in. Political outcomes are changed when people accept opinions and judgments as true. A survey conducted by the Pew Research Center in early 2018 found that most Americans could not consistently differentiate facts from opinions.
This is an election year in Alberta. Politicians are trying to persuade us to vote for them by making assertions about themselves and their opponents. In widely circulated Alberta news stories, they’ve said: “The carbon tax kills jobs.” “Anti-pipeline activism is a disaster, not only for working people but for effective climate action as well.” “Polls show that the NDP trail Jason Kenney’s UCP.” “This is the worst government ever.” Are these statements true? Are they facts, opinions or judgments? We might ask these politicians: What do you mean? How do you know?
We need to be able to rely on each other’s words. Therefore we need to be very careful that what we, and others, say is trustworthy. Without truth, there can be no trust. Without trust, everything collapses.
Jackie Flanagan is the founding editor of Alberta Views. Her essay “Woman and Politics” ran in AV’s June 2017 issue.