The Number of People Who Say the Mueller Investigation Should Continue Is Shrinking

When the public finally sees Robert Mueller's special counsel report, will the facts he lays out change anyone's minds?

For a hint, look no further than responses this week to Attorney General William Barr's summary of the Mueller probe.

In our latest PBS NewsHour/Marist/NPR poll, we asked Americans if Barr's view — that there was not enough evidence to establish President Donald Trump obstructed justice — should be the final verdict or if Congress should continue investigating.

The responses split essentially down the middle — 50 percent said Trump was in the clear and 45 percent said the investigation should continue.

Yet without seeing the full Mueller report, which Barr said won't arrive in Congress until at least mid-April, how can anyone feel strongly one way or the other about the special counsel's conclusions?

One answer is partisanship. According to the latest neuroscience research, political affiliations have a measurable effect on human cognition — that is, these ideologies warp everyone's ability to weigh the information presented to them.

People rush to make judgments based on incomplete pictures because of the human desire to belong. In our poll, 89 percent of Republicans agreed with Barr's view on the lack of obstruction evidence, while 75 percent of Democrats wanted further investigation by Congress. People who identify as independents split evenly. Only 5 percent of Americans said they were unsure.

That's because tribalism is one of the strongest motivators for human behavior. We derive pleasure and social capital from being members of a clique, and consequently, our minds exert tremendous energy to signal our beliefs to those who might agree with us.

A protestor (right) from the Stand Against Communism rally, an event organized to oppose anti-fascist demonstrations and to support U.S. President Donald Trump, among other causes, argues with a counter-protestor (left) during May Day events in Seattle, Washington, U.S. May 1, 2017. Photo by REUTERS/David Ryder

"The domain of politics is just one example of where we're irrational," said psychologist Jay Van Bavel, director of the Social Perception and Evaluation Lab at New York University. "We have all kinds of preferences, biases, emotions and ways of thinking that shortcut logical decision-making."

As Van Bavel outlined in a 2018 commentary, our ideologies can alter our basic perception of the world around us — such as what we see — and also what we remember.

Partisanship creates blind spots in your mind — regardless of your political party. When researchers had a group of Republicans and Democrats watch the same video of a nondescript protest, both groups doled out judgments and punishment based on their political leanings. For instance, when they were told they were watching an anti-military protest, Republicans became more likely to call for police intervention. Democrats more often asked for police intervention when told the protestors were demonstrating outside an abortion clinic. Reminder: It was the exact same video!

Such divergent reactions help explain the confusion and counter-narratives over videos of the Nathan Phillips–Covington Catholic High School standoff earlier this year, or the 2014 Ferguson protests. Our minds create different interpretations because people often see things first through a political lens.

Research by Van Bavel and others suggests that most people favor their partisan leanings over weighing evidence objectively, and potentially always have.

How partisanship shapes the mind and vice versa

The notion that humans are rational is woefully outdated, and our partisanship sprouts almost from birth.

"Kids as young as 4 or 5 years old start to show a preference for their own group over another group," Van Bavel said. "Very young kids start to develop these identities early in life, without much socialization or training. It's something they take to very automatically."

Our childhood adoption of an "us versus them" mentality suggests a biological predisposition toward being political. Large studies comparing identical and fraternal twins report modest correlations between genetics and political attitudes — on topics like school prayer, property tax and capitalism — as well as with voter turnout.

Some of this behavior is learned. Kids start latching onto their parents' leanings as infants and preschoolers, which can translate into full-fledged political beliefs later in life. That doesn't mean children fully comprehend partisan decisions, and parents who push too hard run the risk of pushing a child toward a rival party. But naive minds are clearly drawn toward choosing sides.

As adults, these biases evolve into instinctive prejudice, according to research conducted by Van Bavel and others.

"We've done studies where you can simply flip a coin to divide people on two teams, and within minutes they are willing to discriminate in favor of their team," Van Bavel said. "They pay more attention to who's on their team and have better memories for what those individuals do."

Multiple studies argue that if you see someone on the street, your brain makes a partisan judgment about them — "Are they friend or foe?" — within 200 milliseconds, based purely on their face.

To unpack which parts of the brain govern these mental leaps, Van Bavel and a colleague at Harvard University — neuroscientist Mina Cikara — scanned people's brains as they reacted to being split into two teams. As stereotypes started to brew in their subjects' minds, the researchers saw one core brain area light up: the orbitofrontal cortex.

Located right behind your eyebrows, the orbitofrontal cortex controls a number of mental tasks, including how we value one thing versus another. It does so by collecting nerve inputs from our fear, pleasure, memory and sensory centers.

MRI scan of a human brain with the orbitofrontal cortex highlighted. Photo by Paul Wicks/via Wikimedia

"The orbitofrontal cortex puts all of these onto a common metric, so you can make decisions about what's worth doing," Van Bavel said. Let's say you're trying to decide between purchasing a new pair of jeans or saving that money to buy lunch for the week. The orbitofrontal cortex weighs the pleasure you might receive from both activities and then makes a call.

"What we're suggesting is that [the orbitofrontal cortex] places value on certain identities and beliefs," Van Bavel said. "So when Donald Trump says he has the largest crowd at any inauguration ever, if [someone is] highly aligned with him, they might be more likely to value it," even if photographic evidence says otherwise.

The GOP isn't alone. Van Bavel writes that Democrats are also prone to spreading falsehoods that support their partisan identity.

Democrats in the 1980s, for example, were more likely to claim that inflation and unemployment rose during Ronald Reagan's presidency; more recently, they were more likely to overstate military spending under Obama.

And as you may have noticed from social media, misinformation breeds more misinformation. There's a cognitive basis for this pattern, too.

"Our identities are most likely to bias our judgments when there's ambiguity or uncertainty because then we fill in the gap," Van Bavel said.

Which brings us back to the Mueller report …

The first cut is deepest when it comes to facts or misinformation

In the last two decades, there has been an explosion in fact checking — from independent websites like Snopes.com and PolitiFact, as well as from news watchdogs at the Washington Post, the Associated Press and NPR.

In an ideal world, one might expect fact checks to act like a toilet plunger — allowing people to flush away their memories of fake news. Emily Thorson, a political scientist at Syracuse University, has discovered that's not always the case because of partisanship.

Half a decade ago, she noticed the growing popularity of fact checks, in which the correction of a piece of misinformation would often receive much wider distribution than the initial falsehood.

But in a set of three experiments involving hundreds of people, she found exposure to negative misinformation about a politician of an opposing party is hard to shake.

Even after her participants learned about the correction and could explain it back to her, they continued to think negatively about the politician 36 percent of the time, despite knowing the original story was not true.

"It was very, very clear unequivocal correction — honestly more clear than we usually get at the real world," Thorson said. "But when we asked them, 'Well, how do you feel about this candidate?' They felt more negatively towards the candidate than the people who had never heard the misinformation."

Thorson uses the example of spreading a rumor about a cockroach infestation at your nearest Chipotle. Even if you tell people later on that you misspoke, they'll still feel hesitant about those burritos.

Thorson describes these lingering doubts as "belief echoes" — because the misinformation keeps coming back to push against the truth.

Both she and Van Bavel can easily see these belief echoes in play with the Mueller report. Because parts of the investigation were made public as Mueller continued his work — through indictments and trials — now, as we wait for the report to be released in full, a cloud of uncertainty hangs over what to believe about the final findings.

Without even seeing the Mueller report, pundits and politicos have concluded both that there is substantial evidence of obstruction against President Trump (though Barr said there is not enough evidence to bring charges) and that Trump is exonerated (though Mueller, according to Barr's summary, specifically wrote that the president was not exonerated).

"With the Mueller report, the longer they go without reporting it and the less people trust William Barr as an objective observer, the more people feel a sense of uncertainty and the more likely that their identities are going to come in," Van Bavel said. "So Republicans are going to believe everything he [Barr] says and Democrats are not."

That may explain why most Americans — 75 percent — want the full Mueller report to be released, according to our latest NewsHour/Marist/NPR poll.

On that question, Republicans are split. Half said they want to see the report (likely, if this neuroscience is a guide, to validate their opinions about Trump's innocence), while the rest of GOP supporters are ready to move on. Democrats overwhelmingly — 90 percent — want the public release of the report, likely because their identities have hinged for nearly two years on the prospect of a political conspiracy with Russia and possible obstruction of justice.

Can anything make people think objectively if they don't want to?

If I may break the fourth wall for a moment:

As a journalist and former scientist, it has been hard to watch misinformation take such massive hold on the American public. I believe in the value of the objective truth … which, as Van Bavel pointed out, is my bias. In our communities — science and journalism — there's a high value in maintaining an identity that reinforces the pursuit of truth, he said.

"Unfortunately I don't think that characterizes the average person," Van Bavel said. "Most people are guided by all kinds of other goals and values and emotions to elevate their status in a group."

One 2015 study led by researchers at Yale University found that when asked a political question, both Democrats and Republicans prefer spreading a falsehood that parallels their partisan ideology rather than simply admitting they don't know the answer.

In surveys, "people may not even be answering the question: 'Do I think President Trump committed a crime?'" said Gregory Huber, a Yale political scientist who led the study. "They're saying whether or not they think he is a good guy or a bad guy."

His team found a simple remedy: Create incentives for the truth. When the researchers offered participants the chance to win $200 Amazon gift cards for truthful responses to political questions, the spread of partisan falsehoods declined by 60 percent. When they allowed respondents to win money by admitting their ignorance on an issue, biased information dropped by 80 percent.

Paying everyone isn't a feasible (or affordable) solution for convincing the public to strive for the objective truth, but Huber said he has found another possible remedy: Let people vent. His team has preliminary evidence that if you allow people to express a partisan message before you ask a survey question, their responses become more truthful.

Meanwhile, Van Bavel said social media companies could do a stronger job of limiting hyperpartisan content in people's feeds. Research shows partisanship has consistently increased alongside social media use, though it was also rising before platforms like Twitter and Facebook became popular. He thinks Facebook has made strides in this arena, but believes YouTube and Twitter still send people down rabbit holes of extreme content.

He said people should also try to interact with groups unlike their own, but in a strategic way.

"People having a positive contact with somebody who's different than you can be healthy," Van Bavel said. "But unfortunately, there's also research showing that if you follow people who are hyperpartisan and different from you, it can actually amplify partisanship."

Given the mental snares of belief echoes and partisanship, Van Bavel said Americans would have been better served if all of the information from the special counsel investigation had been released at once. No dribs and drabs over time. No spin by various politicians.

Van Bavel, Thorson and Huber said the key now is transparency — the sooner, the better. Individuals in the public need to make more of an effort to seek out non-partisan stories about the Mueller report and news in general if they value objectivity, Van Bavel added. (May I recommend the PBS NewsHour?)

"To increase believability, you need absolute transparency," Van Bavel said. "At an individual level, you can cultivate a sense that grounds your beliefs in truth. People who are more analytical and less intuitive are less likely to believe and share fake news."

Source: https://www.pbs.org/newshour/science/you-may-not-believe-the-mueller-report-no-matter-what-it-says
