Veritas Super Omnia 

Summary Comments 2

8. Really, for most believers, in the final analysis, evidence that challenges their belief system is simply not accepted. Why? Recent articles tease out the various reasons.

 

"A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point". ~ Leon Festinger, Stanford University

 

How facts backfire

Researchers discover a surprising threat to democracy: our brains

 

By Joe Keohane

July 11, 2010

 

It’s one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. “Whenever the people are well-informed, they can be trusted with their own government,” Thomas Jefferson wrote in 1789. This notion, carried down through the years, underlies everything from humble political pamphlets to presidential debates to the very notion of a free press. Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.

 

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

 

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

 

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

 

These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.

 

This effect is only heightened by the information glut, which offers — alongside an unprecedented amount of good information — endless rumors, misinformation, and questionable variations on the truth. In other words, it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.

 

“Area Man Passionate Defender Of What He Imagines Constitution To Be,” read a recent Onion headline. Like the best satire, this nasty little gem elicits a laugh, which is then promptly muffled by the queasy feeling of recognition. The last five decades of political science have definitively established that most modern-day Americans lack even a basic understanding of how their country works. In 1996, Princeton University’s Larry M. Bartels argued, “the political ignorance of the American voter is one of the best documented data in political science.”

 

On its own, this might not be a problem: People ignorant of the facts could simply choose not to vote. But instead, it appears that misinformed people often have some of the strongest political opinions. A striking recent example was a study done in the year 2000, led by James Kuklinski of the University of Illinois at Urbana-Champaign. He led an influential experiment in which more than 1,000 Illinois residents were asked questions about welfare — the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct — but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)

 

Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem” in a democratic system. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”

 

What’s going on? How can we have things so wrong, and be so sure that we’re right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn’t. This is known as “motivated reasoning.” Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.

 

Pages 3 and 4 of the article continue here (from: www.boston.com/bostonglobe/ideas/articles)

Another discussion of how our emotions and our evolutionary selection for pattern recognition can lead us to hold onto wrong beliefs can be found here: What Drives Irrational Rhetoric? The Case of Childhood Vaccinations

 

Also, see this link (A Tale of Three Creationists) to an article about a Christian Ph.D. geneticist who acknowledges the evidence for evolution but then rejects it solely on a faith argument. We must again ask: does evidence matter to some?

 

The Science of Why We Don't Believe Science (2011)

 

What is Motivated Reasoning? How Does It Work?

 

Antievolutionism: Need for Closure, Fear and Disgust (2011)

 

“One of the things cognitive science teaches us is that when people define their very identity by a worldview, or a narrative, or a mode of thought, they are unlikely to change – for the simple reason that it is physically part of their brain, and so many other aspects of their brain structure would also have to change; that change is highly unlikely.”

~ George Lakoff

 

~ Biomed