  1. #1 The Science of Why We Don't Believe Science 
    Our widdle friend. Wei Wu Wei's Avatar
    Join Date
    Jan 2010
    Posts
    6,414
    This is very long, so I'm only going to quote snippets, but there is much more at the link.

    This is a topic I'm very interested in, I studied psychology with an emphasis on research, and much of the research work I did was in cognitive stuff like this.

    "A MAN WITH A CONVICTION is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point." So wrote the celebrated Stanford University psychologist Leon Festinger (PDF), in a passage that might have been referring to climate change denial—the persistent rejection, on the part of so many Americans today, of what we know about global warming and its human causes. But it was too early for that—this was the 1950s—and Festinger was actually describing a famous case study in psychology.


    Festinger and several of his colleagues had infiltrated the Seekers, a small Chicago-area cult whose members thought they were communicating with aliens—including one, "Sananda," who they believed was the astral incarnation of Jesus Christ. The group was led by Dorothy Martin, a Dianetics devotee who transcribed the interstellar messages through automatic writing.


    Through her, the aliens had given the precise date of an Earth-rending cataclysm: December 21, 1954. Some of Martin's followers quit their jobs and sold their property, expecting to be rescued by a flying saucer when the continent split asunder and a new sea swallowed much of the United States.


    Festinger and his team were with the cult when the prophecy failed. First, the "boys upstairs" (as the aliens were sometimes called) did not show up and rescue the Seekers. Then December 21 arrived without incident. It was the moment Festinger had been waiting for: How would people so emotionally invested in a belief system react, now that it had been soundly refuted?

    At first, the group struggled for an explanation. But then rationalization set in. A new message arrived, announcing that they'd all been spared at the last minute. Festinger summarized the extraterrestrials' new pronouncement: "The little group, sitting all night long, had spread so much light that God had saved the world from destruction." Their willingness to believe in the prophecy had saved Earth from the prophecy!

    From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. "Their sense of urgency was enormous," wrote Festinger. The devastation of all they had believed had made them even more certain of their beliefs.



    An array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called "motivated reasoning [5]" helps explain why we find groups so polarized over matters where the evidence is so unequivocal:

    The theory of motivated reasoning builds on a key insight of modern neuroscience [7] (PDF): Reasoning is actually suffused with emotion (or what researchers often call "affect"). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we're aware of it.


    In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt [10]: We may think we're being scientists, but we're actually being lawyers [11] (PDF). Our "reasoning" is a means to a predetermined end—winning our "case"—and is shot through with biases. They include "confirmation bias," in which we give greater heed to evidence and arguments that bolster our beliefs, and "disconfirmation bias," in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.



    That's a lot of jargon, but we all understand these mechanisms when it comes to interpersonal relationships. If I don't want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else—everybody who isn't too emotionally invested to accept it, anyway.
    http://motherjones.com/politics/2011...-mooney?page=1
    Quote Originally Posted by Adam Smith - Wealth of Nations
    It is not very unreasonable that the rich should contribute to the public expense, not only in proportion to their revenue, but something more than in that proportion.

  2. #2  
    Wei Wu Wei
    The Backfire Effect

    Sure enough, a large number of psychological studies have shown that people respond to scientific or technical evidence in ways that justify their preexisting beliefs. In a classic 1979 experiment [12] (PDF), pro- and anti-death penalty advocates were exposed to descriptions of two fake scientific studies: one supporting and one undermining the notion that capital punishment deters violent crime and, in particular, murder. They were also shown detailed methodological critiques of the fake studies—and in a scientific sense, neither study was stronger than the other. Yet in each case, advocates more heavily criticized the study whose conclusions disagreed with their own, while describing the study that was more ideologically congenial as more "convincing."


    And it's not just that people twist or selectively read scientific evidence to support their preexisting views. According to research by Yale Law School professor Dan Kahan [14] and his colleagues, people's deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place—and thus where they consider "scientific consensus" to lie on contested issues.


    In Kahan's research [15] (PDF), individuals are classified, based on their cultural values, as either "individualists" or "communitarians," and as either "hierarchical" or "egalitarian" in outlook. (Somewhat oversimplifying, you can think of hierarchical individualists as akin to conservative Republicans, and egalitarian communitarians as liberal Democrats.) In one study, subjects in the different groups were asked to help a close friend determine the risks associated with climate change, sequestering nuclear waste, or concealed carry laws: "The friend tells you that he or she is planning to read a book about the issue but would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert." A subject was then presented with the résumé of a fake expert "depicted as a member of the National Academy of Sciences who had earned a Ph.D. in a pertinent field from one elite university and who was now on the faculty of another." The subject was then shown a book excerpt by that "expert," in which the risk of the issue at hand was portrayed as high or low, well-founded or speculative. The results were stark: When the scientist's position stated that global warming is real and human-caused, for instance, only 23 percent of hierarchical individualists agreed the person was a "trustworthy and knowledgeable expert." Yet 88 percent of egalitarian communitarians accepted the same scientist's expertise. Similar divides were observed on whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime. (The alliances did not always hold. In another study [16] (PDF), hierarchs and communitarians were in favor of laws that would compel the mentally ill to accept treatment, whereas individualists and egalitarians were opposed.)

    In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views—and thus the relative risks inherent in each scenario. A hierarchical individualist finds it difficult to believe that the things he prizes (commerce, industry, a man's freedom to possess a gun to defend his family [16]) (PDF) could lead to outcomes deleterious to society. Whereas egalitarian communitarians tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can't handle their guns. The study subjects weren't "anti-science"—not in their own minds, anyway. It's just that "science" was whatever they wanted it to be. "We've come to a misadventure, a bad situation where diverse citizens, who rely on diverse systems of cultural certification, are in conflict," says Kahan [17].



    And that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.



    Take, for instance, the question of whether Saddam Hussein possessed hidden weapons of mass destruction just before the US invasion of Iraq in 2003. When political scientists Brendan Nyhan and Jason Reifler showed subjects fake newspaper articles [18] (PDF) in which this was first suggested (in a 2004 quote from President Bush) and then refuted (with the findings of the Bush-commissioned Iraq Survey Group report, which found no evidence of active WMD programs in pre-invasion Iraq), they found that conservatives were more likely than before to believe the claim.

    So the researchers set up a study [20] (PDF) in which they discussed the topic with some of these Republicans in person. They would cite the findings of the 9/11 Commission, as well as a statement in which George W. Bush himself denied his administration had "said the 9/11 attacks were orchestrated between Saddam and Al Qaeda."

    As it turned out, not even Bush's own words could change the minds of these Bush voters—just 1 of the 49 partisans who originally believed the Iraq-Al Qaeda claim changed his or her mind. Far more common was resisting the correction in a variety of ways, either by coming up with counterarguments or by simply being unmovable:

    Interviewer: [T]he September 11 Commission found no link between Saddam and 9/11, and this is what President Bush said. Do you have any comments on either of those?

    Respondent: Well, I bet they say that the Commission didn't have any proof of it but I guess we still can have our opinions and feel that way even though they say that.

    ...



    A key question—and one that's difficult to answer—is how "irrational" all this is. On the one hand, it doesn't make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. "It is quite possible to say, 'I reached this pro-capital-punishment decision based on real information that I arrived at over my life,'" explains Stanford social psychologist Jon Krosnick [23]. Indeed, there's a sense in which science denial could be considered keenly "rational." In certain conservative communities, explains Yale's Kahan, "People who say, 'I think there's something to climate change,' that's going to mark them out as a certain kind of person, and their life is going to go less well."

    This may help explain a curious pattern Nyhan and his colleagues found when they tried to test the fallacy [6] (PDF) that President Obama is a Muslim. When a nonwhite researcher was administering their study, research subjects were amenable to changing their minds about the president's religion and updating incorrect views. But when only white researchers were present, GOP survey subjects in particular were more likely to believe the Obama Muslim myth than before. The subjects were using "social desirability" to tailor their beliefs (or stated beliefs, anyway) to whoever was listening.
    annddd just a lil more

  3. #3  
    Senior Ape Articulate_Ape's Avatar
    Join Date
    Jan 2010
    Location
    NJ, Exit Only
    Posts
    7,967
    I don't buy it.
    "The efforts of the government alone will never be enough. In the end the people must choose and the people must help themselves" ~ JFK; from his famous inauguration speech (What Democrats sounded like before today's neo-Liberals hijacked that party)

  4. #4  
    Wei Wu Wei
    last one

    Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information—through the Facebook links of friends, or tweets that lack nuance or context, or "narrowcast [25]" and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan's Arthur Lupia, are "not well-adapted to our information age."


    If you wanted to show how and why fact is ditched in favor of motivated reasoning, you could find no better test case than climate change. After all, it's an issue where you have highly technical information on one hand and very strong beliefs on the other. And sure enough, one key predictor of whether you accept the science of global warming is whether you're a Republican or a Democrat. The two groups have been growing more divided in their views about the topic, even as the science becomes more unequivocal.

    So perhaps it should come as no surprise that more education doesn't budge Republican views. On the contrary: In a 2008 Pew survey [26], for instance, only 19 percent of college-educated Republicans agreed that the planet is warming due to human actions, versus 31 percent of non-college educated Republicans. In other words, a higher education correlated with an increased likelihood of denying the science on the issue. Meanwhile, among Democrats and independents, more education correlated with greater acceptance of the science.



    Other studies have shown a similar effect: Republicans who think they understand the global warming issue best are least concerned about it; and among Republicans and those with higher levels of distrust of science in general, learning more about the issue doesn't increase one's concern about it. What's going on here? Well, according to Charles Taber and Milton Lodge of Stony Brook, one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. "People who have a dislike of some policy—for example, abortion—if they're unsophisticated they can just reject it out of hand," says Lodge. "But if they're sophisticated, they can go one step further and start coming up with counterarguments." These individuals are just as emotionally driven and biased as the rest of us, but they're able to generate more and better reasons to explain why they're right—and so their minds become harder to change.




    So is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism. Its most famous proponents are an environmentalist (Robert F. Kennedy Jr. [29]) and numerous Hollywood celebrities (most notably Jenny McCarthy [30] and Jim Carrey). The Huffington Post gives a very large megaphone to denialists. And Seth Mnookin [31], author of the new book The Panic Virus [32], notes that if you want to find vaccine deniers, all you need to do is go hang out at Whole Foods.


    Vaccine denial has all the hallmarks of a belief system that's not amenable to refutation. Over the past decade, the assertion that childhood vaccines are driving autism rates has been undermined [33] by multiple epidemiological studies—as well as the simple fact that autism rates continue to rise, even though the alleged offending agent in vaccines (a mercury-based preservative called thimerosal) has long since been removed.

    Yet the true believers persist—critiquing each new study that challenges their views, and even rallying to the defense of vaccine-autism researcher Andrew Wakefield, after his 1998 Lancet paper [34]—which originated the current vaccine scare—was retracted and he subsequently lost his license [35] (PDF) to practice medicine. But then, why should we be surprised? Vaccine deniers created their own partisan media, such as the website Age of Autism, that instantly blast out critiques and counterarguments whenever any new development casts further doubt on anti-vaccine views.

    It all raises the question: Do left and right differ in any meaningful way when it comes to biases in processing information, or are we all equally susceptible?


    There are some clear differences. Science denial today is considerably more prominent on the political right—once you survey climate and related environmental issues, anti-evolutionism, attacks on reproductive health science by the Christian right, and stem-cell and biomedical matters.


    Some researchers have suggested that there are psychological differences between the left and the right that might impact responses to new information—that conservatives are more rigid and authoritarian, and liberals more tolerant of ambiguity. Psychologist John Jost of New York University has further argued that conservatives are "system justifiers": They engage in motivated reasoning to defend the status quo.


    This is a contested area, however, because as soon as one tries to psychoanalyze inherent political differences, a battery of counterarguments emerges: What about dogmatic and militant communists? What about how the parties have differed through history? After all, the most canonical case of ideologically driven science denial is probably the rejection of genetics in the Soviet Union, where researchers disagreeing with the anti-Mendelian scientist (and Stalin stooge) Trofim Lysenko were executed, and genetics itself was denounced as a "bourgeois" science and officially banned.


    The upshot: All we can currently bank on is the fact that we all have blinders in some situations. The question then becomes: What can be done to counteract human nature itself?



    Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn't trigger a defensive, emotional reaction.

    In other words, paradoxically, you don't lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.
    That last sentence is the ticket. This is why I love right-wing pundit shows, because they figured this out long ago and have mastered the art of using this knowledge effectively.

  5. #5  
    Wei Wu Wei
    Quote Originally Posted by Articulate_Ape View Post
    I don't buy it.
    :D

  6. #6  
    KhrushchevsShoe (Banned)
    Join Date
    May 2010
    Posts
    812
    I read this too, Wei... it's been making the rounds.

    CU certainly leapt to mind.

  7. #7  
    PORCUS MAXIMUS Rockntractor's Avatar
    Join Date
    Apr 2009
    Location
    oklahoma
    Posts
    42,333
    The difference between pigs and people is that when they tell you you're cured it isn't a good thing.
    http://i.imgur.com/FHvkMSE.jpg

  8. #8  
    Wei Wu Wei
    Quote Originally Posted by KhrushchevsShoe View Post
    I read this too, Wei... it's been making the rounds.

    CU certainly leapt to mind.
    ;)

  9. #9  
    CU's Tallest Midget! PoliCon's Avatar
    Join Date
    Aug 2008
    Location
    Pittsburgh PA
    Posts
    25,328
    Quote Originally Posted by KhrushchevsShoe View Post
    I read this too, Wei... it's been making the rounds.

    CU certainly leapt to mind.
    Oh is this the latest talking point the left is sending around?
    Stand up for what is right, even if you have to stand alone.

  10. #10  
    Wei Wu Wei
    Liberals do it too, bud.

    Just try showing evidence to a liberal that isn't Politically Correct.

    This is real stuff: psychological phenomena that affect everyone.
