The Most Depressing Discovery About the Brain, Ever

Gaby

Some research on emotional thinking, though from the analogies he gives, it seems to me the author is afflicted by it himself:

The Most Depressing Discovery About the Brain, Ever

_http://www.alternet.org/media/most-depressing-discovery-about-brain-ever?paging=off

September 16, 2013 |

Yale law school professor Dan Kahan’s new research paper is called “Motivated Numeracy and Enlightened Self-Government,” but for me a better title is the headline on science writer Chris Mooney’s piece about it in Grist: “Science Confirms: Politics Wrecks Your Ability to Do Math.”

[Paper found at _http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2319992]

Kahan conducted some ingenious experiments about the impact of political passion on people’s ability to think clearly. His conclusion, in Mooney’s words: partisanship “can even undermine our very basic reasoning skills…. [People] who are otherwise very good at math may totally flunk a problem that they would otherwise probably be able to solve, simply because giving the right answer goes against their political beliefs.”

In other words, say goodnight to the dream that education, journalism, scientific evidence, media literacy or reason can provide the tools and information that people need in order to make good decisions. It turns out that in the public realm, a lack of information isn’t the real problem. The hurdle is how our minds work, no matter how smart we think we are. We want to believe we’re rational, but reason turns out to be the ex post facto way we rationalize what our emotions already want to believe.

For years my go-to source for downer studies of how our hard-wiring makes democracy hopeless has been Brendan Nyhan, an assistant professor of government at Dartmouth.

Nyhan and his collaborators have been running experiments trying to answer this terrifying question about American voters: Do facts matter?

The answer, basically, is no. When people are misinformed, giving them facts to correct those errors only makes them cling to their beliefs more tenaciously.
Here’s some of what Nyhan found:

People who thought WMDs were found in Iraq believed that misinformation even more strongly when they were shown a news story correcting it.
People who thought George W. Bush banned all stem cell research kept thinking he did that even after they were shown an article saying that only some federally funded stem cell work was stopped.
People who said the economy was the most important issue to them, and who disapproved of Obama’s economic record, were shown a graph of nonfarm employment over the prior year – a rising line, adding about a million jobs. They were asked whether the number of people with jobs had gone up, down or stayed about the same. Many, looking straight at the graph, said down.
But if, before they were shown the graph, they were asked to write a few sentences about an experience that made them feel good about themselves, a significant number of them changed their minds about the economy. If you spend a few minutes affirming your self-worth, you’re more likely to say that the number of jobs increased.

In Kahan’s experiment, some people were asked to interpret a table of numbers about whether a skin cream reduced rashes, and some people were asked to interpret a different table – containing the same numbers – about whether a law banning private citizens from carrying concealed handguns reduced crime. Kahan found that when the numbers in the table conflicted with people’s positions on gun control, they couldn’t do the math right, though they could when the subject was skin cream. The bleakest finding was that the more advanced that people’s math skills were, the more likely it was that their political views, whether liberal or conservative, made them less able to solve the math problem.

I hate what this implies – not only about gun control, but also about other contentious issues, like climate change. I’m not completely ready to give up on the idea that disputes over facts can be resolved by evidence, but you have to admit that things aren’t looking so good for reason. I keep hoping that one more photo of an iceberg the size of Manhattan calving off of Greenland, one more stretch of record-breaking heat and drought and fires, one more graph of how atmospheric carbon dioxide has risen in the past century, will do the trick. But what these studies of how our minds work suggest is that the political judgments we’ve already made are impervious to facts that contradict us.

Maybe climate change denial isn’t the right term; it implies a psychological disorder. Denial is business-as-usual for our brains. More and better facts don’t turn low-information voters into well-equipped citizens. They just make them more committed to their misperceptions. In the entire history of the universe, no Fox News viewers ever changed their minds because some new data upended their thinking. When there’s a conflict between partisan beliefs and plain evidence, it’s the beliefs that win. The power of emotion over reason isn’t a bug in our human operating systems, it’s a feature.
 
Same study, different coverage:

Science confirms: Politics wrecks your ability to do math

_http://grist.org/politics/science-confirms-politics-wrecks-your-ability-to-do-math/

Everybody knows that our political views can sometimes get in the way of thinking clearly. But perhaps we don’t realize how bad the problem actually is. According to a new psychology paper, our political passions can even undermine our very basic reasoning skills. More specifically, the study finds that people who are otherwise very good at math may totally flunk a problem that they would otherwise probably be able to solve, simply because giving the right answer goes against their political beliefs.

The study, by Yale law professor Dan Kahan and his colleagues, has an ingenious design. At the outset, 1,111 study participants were asked about their political views and also asked a series of questions designed to gauge their “numeracy,” that is, their mathematical reasoning ability. Participants were then asked to solve a fairly difficult problem that involved interpreting the results of a (fake) scientific study. But here was the trick: While the fake study data that they were supposed to assess remained the same, sometimes the study was described as measuring the effectiveness of a “new cream for treating skin rashes.” But in other cases, the study was described as involving the effectiveness of “a law banning private citizens from carrying concealed handguns in public.”

The result? Survey respondents performed wildly differently on what was in essence the same basic problem, simply depending upon whether they had been told that it involved guns or whether they had been told that it involved a new skin cream. What’s more, it turns out that highly numerate liberals and conservatives were even more – not less — susceptible to letting politics skew their reasoning than were those with less mathematical ability.

But we’re getting a little ahead of ourselves — to fully grasp the Enlightenment-destroying nature of these results, we first need to explore the tricky problem that the study presented in a little bit more detail.

Let’s start with the “skin cream” version of this brain twister. You can peruse the image below to see exactly what research subjects read (and try out your own skill at solving it), or skip on for a brief explanation:

[Image: study_image_1.png – the skin cream problem as presented to subjects]

As you can see above, the survey respondents were presented with a fictional study purporting to assess the effectiveness of a new skin cream, and informed at the outset that “new treatments often work but sometimes make rashes worse” and that “even when treatments don’t work, skin rashes sometimes get better and sometimes get worse on their own.” They were then presented with a table of experimental results, and asked whether the data showed that the new skin cream “is likely to make the skin condition better or worse.”

So do the data suggest that the skin cream works? The correct answer in the scenario above is actually that patients who used the skin cream were “more likely to get worse than those who didn’t.” That’s because the ratio of those who saw their rash improve to those whose rash got worse is roughly 3 to 1 in the “skin cream” group, but roughly 5 to 1 in the control group — which means that if you want your rash to get better, you are better off not using the skin cream at all. (For half of study subjects asked to solve the skin cream problem, the data were reversed and presented in such a way that they did actually suggest that the skin cream works.)

This is no easy problem for most people to solve: Across all conditions of the study, 59 percent of respondents got the answer wrong. That is, in significant part, because trying to intuit the right answer by quickly comparing two numbers will lead you astray; you have to take the time to compute the ratios.
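The ratio computation described above can be sketched in a few lines of code. The exact cell counts from Kahan's table are not reproduced in this thread, so the figures below are illustrative values chosen to match the "roughly 3 to 1" and "roughly 5 to 1" ratios mentioned in the text:

```python
# Illustrative counts, NOT the actual values from Kahan's table;
# they are chosen to be consistent with the roughly 3:1 (cream)
# and roughly 5:1 (control) ratios described in the article.
cream = {"improved": 223, "worsened": 75}    # patients who used the skin cream
control = {"improved": 107, "worsened": 21}  # patients who did not

def improvement_ratio(group):
    """Ratio of patients who improved to patients who worsened."""
    return group["improved"] / group["worsened"]

# The intuitive shortcut -- comparing raw "improved" counts -- points
# the wrong way here: more cream users improved in absolute terms,
# simply because the cream group was larger.
assert cream["improved"] > control["improved"]

# The correct comparison uses the ratios within each group.
print(f"cream:   {improvement_ratio(cream):.2f} to 1")
print(f"control: {improvement_ratio(control):.2f} to 1")

# A lower improvement ratio means the cream group fared worse, so
# these data suggest the cream made rashes more likely to get worse.
cream_made_it_worse = improvement_ratio(cream) < improvement_ratio(control)
print("Cream made rashes worse:", cream_made_it_worse)
```

This is the trap the study exploits: the quick one-column comparison gives the politically or intuitively convenient answer, while only the within-group ratios give the correct one.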

Not surprisingly, Kahan’s study found that the more numerate you are, the more likely you are to get the answer to this “skin cream” problem right. Moreover, it found no substantial difference between highly numerate Democrats and highly numerate Republicans in this regard. The better members of both political groups were at math, the better they were at solving the skin cream problem.

But now take the same basic study design and data, and simply label it differently. Rather than reading about a skin cream study, half of Kahan’s research subjects were asked to determine the effectiveness of laws “banning private citizens from carrying concealed handguns in public.” Accordingly, these respondents were presented not with data about rashes and whether they got better or worse, but rather with data about cities that had or hadn’t passed concealed carry bans, and whether crime in these cities had or had not decreased.

Overall, then, study respondents were presented with one of four possible scenarios, depicted below with the correct answer in bold:

[Image: study-image-3.png – the four possible scenarios, with the correct answer in bold]

So how did people fare on the handgun version of the problem? They performed quite differently than on the skin cream version, and strong political patterns emerged in the results — especially among people who are good at mathematical reasoning. Most strikingly, highly numerate liberal Democrats did almost perfectly when the right answer was that the concealed weapons ban does indeed work to decrease crime (version C of the experiment) — an outcome that favors their pro-gun-control predilections. But they did much worse when the correct answer was that crime increases in cities that enact the ban (version D of the experiment).

The opposite was true for highly numerate conservative Republicans: They did just great when the right answer was that the ban didn’t work (version D), but poorly when the right answer was that it did (version C).

Here are the results overall, comparing subjects’ performances on the “skin cream” versions of the problem (above) and the “gun ban” versions of the problem (below), and relating this performance to their political affiliations and numeracy scores:

[Image: study-image-2_0.png]

Full study results comparing subjects’ performance on the skin cream problem with their performance on the gun ban problem. Vertical axes plot response accuracy; horizontal axes show mathematical reasoning ability.

For study author Kahan, these results are a fairly strong refutation of what is called the “deficit model” in the field of science and technology studies — the idea that if people just had more knowledge, or more reasoning ability, then they would be better able to come to consensus with scientists and experts on issues like climate change, evolution, the safety of vaccines, and pretty much anything else involving science or data (for instance, whether concealed weapons bans work). Kahan’s data suggest the opposite — that political biases skew our reasoning abilities, and this problem seems to be worse for people with advanced capacities like scientific literacy and numeracy. “If the people who have the greatest capacities are the ones most prone to this, that’s reason to believe that the problem isn’t some kind of deficit in comprehension,” Kahan explained in an interview.

So what are smart, numerate liberals and conservatives actually doing in the gun control version of the study, leading them to give such disparate answers? It’s kind of tricky, but here’s what Kahan thinks is happening.

Our first instinct, in all versions of the study, is to leap instinctively to the wrong conclusion. If you just compare which number is bigger in the first column, for instance, you’ll be quickly led astray. But more numerate people, when they sense an apparently wrong answer that offends their political sensibilities, are both motivated and equipped to dig deeper, think harder, and even start performing some calculations — which in this case would have led to a more accurate response.

“If the wrong answer is contrary to their ideological positions, we hypothesize that that is going to create the incentive to scrutinize that information and figure out another way to understand it,” says Kahan. In other words, more numerate people perform better when identifying study results that support their views — but may have a big blind spot when it comes to identifying results that undermine those views.

What’s happening when highly numerate liberals and conservatives actually get it wrong? Either they’re intuiting an incorrect answer that is politically convenient and feels right to them, leading them to inquire no further — or else they’re stopping to calculate the correct answer, but then refusing to accept it and coming up with some elaborate reason why 1 + 1 doesn’t equal 2 in this particular instance. (Kahan suspects it’s mostly the former, rather than the latter.)

The Scottish Enlightenment philosopher David Hume famously described reason as a “slave of the passions.” Today’s political scientists and political psychologists, like Kahan, are now affirming Hume’s statement with reams of new data. This new study is just one out of many in this respect, but it provides perhaps the most striking demonstration yet of just how motivated, just how biased, reasoning can be – especially about politics.
 
Depressing indeed!

It would be useful if they also questioned the participants using Altemeyer's RWA scale, but then again in this day and age anyone who calls themselves a "republican" or "democrat" is likely an authoritarian anyway.
 
Psyche said:
People who said the economy was the most important issue to them, and who disapproved of Obama’s economic record, were shown a graph of nonfarm employment over the prior year – a rising line, adding about a million jobs. They were asked whether the number of people with jobs had gone up, down or stayed about the same. Many, looking straight at the graph, said down.

<snip>

I hate what this implies – not only about gun control, but also about other contentious issues, like climate change. I’m not completely ready to give up on the idea that disputes over facts can be resolved by evidence, but you have to admit that things aren’t looking so good for reason. I keep hoping that one more photo of an iceberg the size of Manhattan calving off of Greenland, one more stretch of record-breaking heat and drought and fires, one more graph of how atmospheric carbon dioxide has risen in the past century, will do the trick. But what these studies of how our minds work suggest is that the political judgments we’ve already made are impervious to facts that contradict us.

Maybe climate change denial isn’t the right term; it implies a psychological disorder. Denial is business-as-usual for our brains. More and better facts don’t turn low-information voters into well-equipped citizens. They just make them more committed to their misperceptions.

Yup, this guy sure is thinking emotionally... he's trying to parlay 1 million pseudo-jobs into a boost to the economy under Obama, and his now defunct position on human caused global warming is blatant ignoring of facts.
 
Laura said:
Psyche said:
People who said the economy was the most important issue to them, and who disapproved of Obama’s economic record, were shown a graph of nonfarm employment over the prior year – a rising line, adding about a million jobs. They were asked whether the number of people with jobs had gone up, down or stayed about the same. Many, looking straight at the graph, said down.

<snip>

I hate what this implies – not only about gun control, but also about other contentious issues, like climate change. I’m not completely ready to give up on the idea that disputes over facts can be resolved by evidence, but you have to admit that things aren’t looking so good for reason. I keep hoping that one more photo of an iceberg the size of Manhattan calving off of Greenland, one more stretch of record-breaking heat and drought and fires, one more graph of how atmospheric carbon dioxide has risen in the past century, will do the trick. But what these studies of how our minds work suggest is that the political judgments we’ve already made are impervious to facts that contradict us.

Maybe climate change denial isn’t the right term; it implies a psychological disorder. Denial is business-as-usual for our brains. More and better facts don’t turn low-information voters into well-equipped citizens. They just make them more committed to their misperceptions.

Yup, this guy sure is thinking emotionally... he's trying to parlay 1 million pseudo-jobs into a boost to the economy under Obama, and his now defunct position on human caused global warming is blatant ignoring of facts.

Yup, exactly. It would be useful to include in the study design how people can't see their own emotional thinking/biases even when they're looking at the issue of emotional thinking/biases in a "scientific study context."
 
That is what happens when we mix up our centres. Most of those biases are so well hidden we have a hard time finding them on our own.
As my alias indicates, I always figured I was a (brace for it, trumpets sounding) logical thinker. 4th Way work showed me that I could scarcely have been farther from the truth.
 
I found similar coverage of the study this morning. Pretty intriguing stuff. I didn't think of it as depressing. Maybe it is to someone who thinks the problem with the world is that people aren't educated enough. If humanitarian initiatives spent a fraction of what they spend on raising people's knowledge on raising their being instead, I'm sure we'd see a drastically different world.
 
Psyche, thanks for sharing this; it gives me a lot to think about. This "must win" dynamic isn't limited to partisan politics: it shows up in religion, sports, and the military-industrial complex too. It may have a pathological origin, a drive to compete for limited resources rather than cooperate, and it seems to be part of the control programs that generate impressions and identifications, an "us" against the "others". That is pretty much the official history of religions; the divide-and-conquer pattern speaks eloquently, I think. And normal people identified with a creed (which they need to be true, especially if salvation was promised!) will go against any other group that says it isn't so, that no one is going to save you, that life is more complex. Perhaps we are so anthropocentric that truths, to count as true, must be recognized by the largest possible number of humans (or by the humans "that matter", in a racist society). I'd like to see some psychological studies on anthropocentrism, not only in religion but also in science, art, and so on, since I know nothing about it. Or maybe we are so anthropocentric that such studies remain tacit and no theories exist about them yet :P (that's a joke; surely there are many and I should look for them).
Also, where religion in a region is more or less homogeneous, say mostly different Christian versions, bi-partisanship then serves to divide.
 
Good research to show to those who don't believe in the theory that Caesar was Jesus.
 
Psyche said:
Some research on emotional thinking, though from the analogies he gives, it seems to me the author is afflicted by it himself:

The Most Depressing Discovery About the Brain, Ever

_http://www.alternet.org/media/most-depressing-discovery-about-brain-ever?paging=off

September 16, 2013 |
[...]

The power of emotion over reason isn’t a bug in our human operating systems, it’s a feature.

sarek said:
That is what happens when we mix up our centres. Most of those biases are so well hidden we have a hard time finding them on our own.
As my alias indicates, I always figured I was a (brace for it, trumpets sounding) logical thinker. 4th Way work showed me that I could scarcely have been farther from the truth.

Yes sarek, mixed centres, designed that way as a 'control' feature no doubt. Utilized extensively in propaganda as we know, for thousands of 'years'. What a bonus, having the works of Gurdjieff and Mouravieff to help us see the esoteric reality of this design feature.


whitecoast said:
[...]
I didn't think of it as depressing. Maybe it is to someone who thinks the problem with the world is that people aren't educated enough...

The trick, whitecoast, is to not engage emotively with facts on the ground, which is what you have done, methinks. As Laura has pointed out so often, "the universe is not broken", and as the C's so pointedly repeat: All is lessons. We recognise this 'design feature' of the matrix, investigate and share our findings, and act to prevent it from blocking our entry through the door to the inner circle.

This Sott article http://gawker.com/this-three-minute-commercial-puts-full-length-hollywood-1309506149@trotter is a case in point of how this emotive design feature is employed. The video made tears come to my eyes, but it glosses over the harsh reality that the very technology it advertises is part and parcel of why we cannot afford health care in the first place. Perhaps it would be a useful tool to gauge a subject's susceptibility (or not) to psychopathological thinking. Not much use otherwise.

That an 'interest' in politics results in cognitive error as well is a clear symptom of the pathological process that politics really is. You know, the C's did not say exactly how the 'man behind the curtain' would reveal itself, but to my opening awareness, recent cognitive science is doing just that.

Thank you Psyche :)
 
Sorry Tumble, I tried but I couldn't see any connection between what you quoted of me and what you wrote afterward about it. Did you find my reply emotional for some reason? :huh:
 