Universal Translator

Wednesday, November 28, 2012

With Respect to Rationality


            By and large, humans are not good at thinking rationally, but, really, who can blame us? We have not had much practice at it. According to the accepted wisdom of anthropologists and paleontologists, by no later than 12,000 B.C.E. – approximately 14 millennia ago – humans already had spread to and settled in every continent on the planet except Antarctica. Yet we only started to make rational sense of our existence a few hundred years ago. Prior to the mid-seventeenth century, nearly all human understanding of ourselves and our world arose out of a mythic, poetic, narrative sense – not a rational one. Of course, that is because “rationality” – properly understood – is not a concept easily embraced by humans. We are drawn to certainty and to firm, fixed answers, but for an idea to be rational it must be capable of being proved false. To embrace rationality means to embrace uncertainty, rationality’s essence.

            This injunction runs directly counter to the goal of nearly every person who seeks to understand anything. When a person asks, “Why did such-and-such happen?” or “How does such-and-such work?” or, more generally, “What is the right thing to do?” that person is unlikely to be persuaded by a response that begins, “Well, to the best of our understanding . . . .” As a general rule, people prefer conviction and certainty to hedged equivocation; in fact, they prefer them so much that they often will embrace terrible policies and follow awful leaders, so long as those leaders speak forcefully enough in support of whatever rotten ideas they happen to be pushing. As Bill Clinton once famously observed, people would rather follow someone who is “strong and wrong” than someone who is “weak and right.”
             It is to the human race’s great credit that we are slowly learning to give up our natural desire for certainty when it conflicts with our desire to better understand the world around us – an understanding that can be achieved only by embracing the ever shifting, uncertain terrain of rational thought. The great philosopher of science Karl Popper developed this idea in his writings on what he termed “empirical falsification.” On Popper’s account, no idea, concept, or assertion qualifies as rational unless it has the potential to be proved false. If the idea, concept, or assertion is incapable of being proved false, then it is not rational but is instead simply an unfalsifiable belief.
            Consider the European discovery of Cygnus atratus, the black swan. For millennia, every swan ever seen by any European had been white. Accordingly, Europeans rationally believed that “all swans are white.” The supposed proof of this assertion lay in the fact that every time a new swan was seen – sure enough! – it, too, was white. So it came as quite a shock to Europeans when they landed in Australia and beheld a black swan for the first time. Nevertheless, what made the Europeans’ idea “all swans are white” a rational concept was not its correctness or incorrectness, but the possibility (later realized) that it might be proved wrong. Sighting after sighting after sighting of white swans, for years and for decades and for centuries, certainly suggested that all swans are white. But it took only a single example of one black swan to falsify that suggestion. Having been proven false, the idea then could be revised and refined until it more closely approximated what now appears to be the case: most swans are white.
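The asymmetry at work here – confirmations can never prove a universal claim, while a single counterexample refutes it – can be sketched in a few lines of code. This is only an illustration of the logic, not anything drawn from Popper or the original sources:

```python
def claim_survives(observations):
    """A universal claim ("all swans are white") is never proved by
    confirming instances, but a single counterexample falsifies it."""
    return all(color == "white" for color in observations)

# Centuries of confirming sightings leave the claim standing...
european_sightings = ["white"] * 10_000
print(claim_survives(european_sightings))   # True: consistent so far, yet unproven

# ...but one black swan in Australia is enough to falsify it.
print(claim_survives(european_sightings + ["black"]))   # False
```

No number of `True` results can settle the matter, because the next observation might always be the one that returns `False` – which is precisely what makes the claim rational in Popper’s sense.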
            Unfortunately, it is difficult to internalize the concept that an idea must be capable of falsification in order to be rational, and this difficulty has led to some catastrophic consequences. For example, on June 6, 2002, U.S. Defense Secretary Donald Rumsfeld held a press conference at NATO headquarters. At the time, the Bush Administration was arguing that going to war against Iraq was a necessary response to the supposed threat posed by Saddam Hussein’s possession of weapons of mass destruction. However, that argument was undermined by the Bush Administration’s inability to present any actual evidence that Hussein did, in fact, possess such weapons. When a reporter asked Rumsfeld upon what basis the Bush White House was making this claim, Rumsfeld replied, “the absence of evidence is not evidence of absence. . . . Simply because you do not have evidence that something exists does not mean that you have evidence that it doesn't exist.”
            According to reporters Michael Isikoff and David Corn, this was not merely an off-the-cuff remark uttered by Rumsfeld in the middle of a press conference. In their 2006 New York Times bestseller, Hubris: The Inside Story of Spin, Scandal, and the Selling of the Iraq War, Isikoff and Corn reveal that Rumsfeld’s assertion about the absence of evidence was actually an argument routinely made by Undersecretary of Defense Douglas J. Feith, who had been charged with making the case internally for going to war with Iraq, and whom General Tommy Franks – the man who led both the 2001 Afghanistan invasion and the Iraq War – once called, “the dumbest fucking guy on the planet.”
            According to Isikoff and Corn, Feith argued that if Saddam Hussein possessed weapons of mass destruction – as the Bush Administration believed – then it only made sense that Hussein would hide those weapons. Accordingly, the failure of U.S. intelligence and UN inspectors to find any actual evidence that Hussein possessed WMD was perfectly consistent with the idea that Hussein did possess WMD. As a result, Feith argued, the lack of evidence that Saddam Hussein was a dangerous threat could in itself be considered evidence that Hussein was a dangerous threat. Relying upon such arguments, George W. Bush invaded Iraq, tens of thousands of U.S. troops were wounded or killed, and hundreds of thousands of Iraqis – men, women, children, and infants – died horribly. Of course, no WMD were ever found.
            Even if one puts aside the dubious nature of Bush’s good faith in invading Iraq and simply accepts that he and his advisors honestly believed the rationale they were mouthing to one another, one still is struck by how completely irrational these people were when they committed the United States and Iraq to war and terrible bloodshed. They posited that the United States had to invade Iraq because Saddam Hussein had weapons of mass destruction, then claimed support for this assertion by pointing to the fact there was no evidence Saddam Hussein had weapons of mass destruction. Simply stated, they made a case in favor of war that could not, under any circumstances, be proved false without actually going to war – which was the entire point to begin with. It is difficult to imagine an argument more irrational, or one with more tragic consequences.
* * *
            Our understanding of the world is and can only be a mere model of the real thing. Our ideas exist only within the confines of our skulls, and it is something of a miracle that within such a small space – little more than a thousand cubic centimeters – we are capable of representing the Universe. Unfortunately, we often forget that the representation we create for ourselves is not the reality, and that the map we draw in our heads is not the territory. Every map, no matter how finely drawn, is only an approximation of what it represents; some information is always lost when the map is created, some terrain is never represented entirely correctly.
            To have the surest guide, we must be willing always to embrace the idea that the map we have drawn might be wrong and might need to be revised. Of course, that also means embracing the idea that we sometimes will get a little lost, and that at times we may even have to reverse course, but we can accept such setbacks if we also recognize that changing our map when necessary is the only way by which we can progress, stumblingly and haphazardly, to the Truth. If we give in to our natural impulse to sacrifice doubt for certainty, to give up rationality for belief, then we will end up drawing our map so that its errors cannot be repaired; if that happens, then eventually we will wander where “there be dragons” and, ultimately, we will perish.

3 comments:

  1. This comment has been removed by the author.

  2. REVISED -- typos corrected.

    Actually human beings are quite good at rationality, and have been for a long, long time. Hitler, Goering and Goebbels come to mind. The Greeks were no slouches either — determining the earth was round millennia ago and theorizing the existence of the atom. Philosophies and religions have their groundings in rationality, albeit they may at times woefully stray. Rationality is no more than a tool connecting dots. The more information we have the more dots we connect. But rationality, or reasoning if you will, is only a tool.

    Now those little maps we carry around in our heads. Those attempts to create some sort of map of reality that we can use. Therein lies a problem. If your map is different than my map, why then we disagree. And if different enough I may have to go to war and kill you.

    When we accept the world as it is by welcoming it and others in, all is good. This is not the case when we have expectations, when we carry around a pocketful of shoulds and should-nots. Then we run into trouble.

    Most dangerous of all, if we allow ourselves to do it, is to create that little map of ourselves, which we then go to great lengths to justify and defend.

    Rationality allows us to build jet airplanes and bake tasty casseroles. We risk much if we ask more from a tool than that it serve its master.

    Tools do not tell us how we should lead our lives or if we should or should not go where indeed there may or may not be dragons.
