President Obama spoke to President Bush to mark the end of combat operations in Iraq; hopefully the conversation went better than the polarized commentary since.
Obama in his presidential address Tuesday night:
“This afternoon I spoke to former President George W. Bush. It’s well known that he and I disagreed about the war from its outset. Yet no one can doubt President Bush’s support for our troops or his love of country and commitment to our security. As I said, there were patriots who supported this war and patriots who opposed it. All of us are united in appreciation for our servicemen and servicewomen and our hopes for Iraq’s future.”
“The greatness of our democracy is grounded in our ability to move beyond our differences, to learn from our experience as we confront the many challenges ahead.”
And as night follows day… Commentators on the left have been angry since the address that Obama would give Bush anything, given what they see as the catastrophic nature of the decision to invade Iraq and the falsehoods that led to it. From the right came accusations that he had no class because he didn’t outright credit Bush for the surge (on Limbaugh’s show, the guest host called it “a small speech by a small man”).
In this environment, it’s hard to know how anyone can lead us.
Joe Keohane writes a powerful piece on how entrenched political opinion resists facts that contradict it. Here’s a snip of an article so good that it’s going straight into the Village Square library, but we’d strongly recommend you head straight to Boston.com and read the whole piece.
Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
This bodes ill for a democracy, because most voters – the people making decisions about how the country runs – aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
The general idea is that it’s absolutely threatening to admit you’re wrong, says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon – known as “backfire” – is “a natural defense mechanism to avoid that cognitive dissonance.”
Read the rest of the article HERE.