Monday, February 28, 2005

The Only Constant Is Change

As some of you have no doubt noticed, I've changed the color scheme of the web site rather dramatically. It seems that there was a small, yet vocal minority of individuals who were incessantly whining about the white-text-on-black-background look that preceded what you see now. Apparently, it was giving them headaches or seizures or something. Anyway, the squeaky wheel gets the grease.

If you'd like to comment or vent about the new look, this is the place to do it. I'm fairly pleased with the transformation, but clearly I can be persuaded if there's a groundswell against it. So, speak now or forever hold your peace.

Sunday, February 27, 2005

Traumatic Reverberations, Part I

With the passage of the 13th Amendment in 1865, 250 years of slavery on American soil came to an end. In the intervening 140 years, approximately five generations have been born without ever directly experiencing America's original sin. There are perhaps a handful of individuals alive today who ever met a living former slave or slaveholder. The institution is as dead as something can be, a discarded relic of our distant past.

But, in the immortal words of William Faulkner, "The past is never dead. It's not even past."

As much as we wish it were not the case, the legacy of slavery lives on today in many different forms. The economic and educational disadvantages faced by the newly freed slaves in 1865 have yet to be fully erased. Certain laws enacted during this period, designed to preserve the marginalization of the African-American community, remain on the books today (most notably the Florida statute barring convicted felons from voting). Of course, the racism that became culturally entrenched during this period still exists. It may be less intense and less overt than it once was, but it continues to percolate beneath the surface. And more often than we would like to admit, it boils over.

But what of the psychological legacy? Is it possible that, after all these years, the descendants continue to bear the emotional scars of their bonded ancestors? Again, we might cling to the notion that time heals all wounds. But to do so would be to ignore a large body of evidence demonstrating the persistence of trauma across generations. That cliché might provide a small measure of comfort, but only by turning a blind eye to the reality we face today.

POST-TRAUMATIC STRESS DISORDER (PTSD)

In the last quarter century or so, the field of psychology has come to recognize the persistent long-term effects of trauma exposure. It was originally a condition associated with the battlefield and was referred to as "shell-shock." More recently we've come to understand the genesis of PTSD to be far more expansive. Victims of kidnapping and violent sexual or physical assaults frequently display PTSD symptomology. Even witnessing violence can trigger the onset of PTSD. In short, nearly any exposure to violence has the potential to leave a lasting mark on the victim or witness.

While it remains difficult to predict which experiences will lead to PTSD in which individuals, some correlations have been noted. For example, individuals with pre-existing mental health conditions (such as borderline personality disorder) or who have low self-esteem tend to exhibit PTSD symptomology at higher rates than the population mean. Exposure to previous traumatic events also appears to increase the incidence of the disorder. The quality and competency of one's social support structure plays a role both in onset and in ultimate recovery. And, naturally, the nature and severity of the triggering event is a significant determining factor. That said, PTSD can emerge in individuals who lack all of the previously mentioned risk factors. Ultimately, there's no telling.

The range of symptoms is quite wide and can be dramatically affected by the nature of the triggering event and the environment in which it occurs. These symptoms can be broken down into three general categories: intrusive, avoidant, and hyperarousal. Within these categories, the symptoms break down in the following manner.
  • Intrusive
      Dissociative states
      Flashbacks
      Intrusive emotions and memories
      Nightmares and night terrors
  • Avoidant
      Avoiding emotions
      Avoiding relationships
      Avoiding responsibility for others
      Avoiding situations that are reminiscent of the traumatic event
  • Hyperarousal
      Exaggerated startle reaction
      Explosive outbursts
      Extreme vigilance
      Irritability
      Panic symptoms
      Sleep disturbance
Typically the symptoms are compounded by the following complications:

    Alcohol and drug abuse or dependence
    Chronic anxiety
    Depression and increased risk for suicide
    Divorce and separation
    Guilt
    Low self-esteem
    Panic attacks
    Phobias
    Unemployment

While we have come a long way in our understanding of this disorder, treatment remains as much an art as a science. Various psychotherapeutic modalities, in conjunction with medication (including antidepressants, antipsychotics, anti-anxiety agents, and mood stabilizers), have had some success in ameliorating the more serious symptoms. Yet a true cure remains elusive even in this modern age. And those of an earlier era would have been forced to rely on nothing more than community support and their internal constitution. Ultimately, for them, the condition would have remained largely unresolved.

    ITEMIZING TRAUMA

    Too often, I think, we blithely acknowledge the brutality of the peculiar institution without consciously facing it. But for reasons that will soon be clear (if they are not already), we can't ignore the details in the course of this discussion.

    Those entering bondage during the early history of American slavery largely did so as prisoners of war. Whether they were captured by a warring tribal faction or directly by slave traders, the ultimate result was the same: a forced relocation that almost universally obliterated all familial and social ties. The process of relocation was quite barbaric in and of itself. Potential slaves were packed into tiny compartments for the transatlantic voyage. The ceilings in these compartments were often so low that the Africans could not stand erect. Ventilation was poor, mobility severely truncated, and sanitation nonexistent. These were the conditions they were forced to endure for a journey that lasted between 90 and 120 days. The mortality rates on these voyages were shocking, with some estimates ranging as high as 30%. Finally, efforts to segregate the diseased and the dead from the living and healthy were lax at best, which often meant that the prisoners spent long periods of time crowded up against seriously ill individuals and corpses.

    The experience upon reaching American shores improved slightly if at all. Once slavery had devolved into its most virulent form, extremely draconian methods were employed to retain control over the slave population. Whippings, maiming and mutilation, branding, and castration were common behavior management techniques. Uncontrollably rebellious slaves were murdered by decapitation, hanging, or outright torture. Their bodies were often publicly displayed as a warning to other potential malcontents.

    However, within this environment, familial structures began to reemerge. Slaves married and bore children. Yet, this created little stability. Considered property of the slaveowner, family members were frequently sold to distant plantations. All families lived in constant fear of such forced removals.

    Female slaves faced an additional threat within their life of bondage: rape. Sexual assault perpetrated by white slave owners was extremely common. All too often these encounters produced mulatto offspring. When this happened, these children were accepted into the slave families. However, they also served as a constant reminder of the traumatic method of their conception.

    Finally, everything that the slave experienced they experienced more or less in the open. This meant that everyone (white, black, slave, free, adult, child) was a witness. Since the laws of man and God at the time supported the institution, there was no need to conceal the methodology of its maintenance. Nothing was hidden.

    DIAGNOSIS

    In one sense, I'm making a fairly obvious argument today: slaves were treated poorly and suffered for it psychologically. I'm not exactly going out on a limb on this one. But, I am trying to take the argument a little further and suggest that the type of trauma that they endured would have led to a high incidence of a very specific psychological disorder. And I think that if you look at the triggering mechanisms and risk factors for post-traumatic stress disorder and compare them against the experience of an American slave, the population's predisposition for the disorder becomes clear.

Kidnapping and exposure to violence (as a victim or as a witness) frequently induce PTSD. The forced relocation from Africa and the common practice of selling slaves were essentially nothing more than forms of kidnapping. Violence (physical and sexual) was often directed at slaves, and those who were not the object of this violence were regularly witness to it. The inhumanity of the transatlantic voyage was surely a traumatizing experience. Likewise, the displayed remains of rebellious slaves would have left a deep mark on anyone exposed to them.

But even more significant than the actual trauma was the susceptibility of the population. The chronic nature of violence in the slave experience meant that any exposure was preceded by previous traumatic events. Familial and social structures were constantly being dismantled, thus depriving slaves of the support structures that might reduce the negative consequences of any traumatic experience. Low self-esteem, a recognized risk factor for PTSD, was undoubtedly rampant in slave populations. And many common slave experiences (physical/sexual abuse, neglect, parental loss or separation) predisposed individuals to developing borderline personality disorder, yet another PTSD risk factor.

    In short, if you were attempting to induce PTSD in a large population, you would design a system not unlike the American institution of slavery.

At this point, my conclusions are speculative. No psychologist would ever attempt to diagnose an individual sight unseen. Moreover, I know of no historical sources that describe symptoms commonly associated with PTSD. Therefore, it is impossible to know with any certainty how this disorder manifested in the population in question. However, the confluence of trauma and risk factors suggests that its incidence could have been extremely high. If that were the case, the consequences for the community at large would have been devastating.

This entry is turning into quite a monstrosity, so I'm going to stop here and continue this argument in a new post. At that time, I will investigate the potential ramifications of widespread trauma disorder for the slaves, for their descendants, and ultimately for us all.

    Proceed to Traumatic Reverberations, Part II.

    Return to Slavery and History Index.

    Saturday, February 26, 2005

    Best Friday Cat-Blog. Ever!

    I'm about to publish the next segment in the slavery series that I'm working on. But, since it's going to be a few more hours, here's something a little lighter to chew on. Enjoy!

    (Be sure to read the comments when you get there -- you'll see what I mean.)

    Thursday, February 24, 2005

    A Downward Progression

Perhaps the most startling revelation that I have come to these last few days regarding the institution of slavery was how little I knew about the institution itself. The little time that my high school American History class spent discussing the subject largely focused on its political implications in the years preceding the Civil War. I seem to recall brief mention of the institution's inhumanity, but these references were rather oblique and lacked any real examination of its horrors. It was as though its existence in the historical narrative served only as contrast to the progress achieved following its abolition, its evil presented so that we might take pride in its eradication.

Thus, for me the details of slavery were provided by popular culture. Gone with the Wind, Roots, and The Blue and the Gray television miniseries were my primary sources of information. And since two of those three sources focused primarily on white Americans (and one is a factually challenged romanticization of plantation existence), these presentations did little to correct my ignorance. Even so, I did come to understand some basic facts. Slavery was an institution that transformed African Americans into property, valued only for their ability to work. The institution was maintained via unspeakable cruelties, and freedom, for both the individual and his or her descendants, was unachievable. At the time, that seemed to be all one needed to know.

Of course, while that is an apt description for a certain time and place, it obscures the depth and complexity of slavery's 250-year existence in what is now the United States. What actually occurred during those years is incredibly revealing and, in fact, challenges one of the most central notions of American mythology. And so, a fresh look is overdue.

    THE BEGINNING

    The first 20 African slaves arrived in the Jamestown colony during 1619. At the time, Jamestown was struggling to become an economically viable outpost in the New World. After several failed attempts at various colonial industries, a strain of tobacco began to show a potential for profitability. However, tobacco cultivation required a massive and inexpensive labor force, something in short supply. To remedy this dilemma, the colonists had two choices: indentured servitude and slavery.

Despite the distinction, there was initially little difference between the two labor categories. Indentured servants were bound to service for a specific time period in exchange for passage across the Atlantic. Upon completion of their contract, they were released. Likewise, though slaves did not enter into the arrangement willingly, their term of service was not lifelong. Moreover, since the English did not traditionally enslave fellow Christians, slaves could occasionally gain their freedom through conversion.

At approximately the same time, slavery emerged in Dutch New Amsterdam (which would eventually become New York City). The first 11 slaves arrived during 1624 and were immediately put to work constructing the colony's infrastructure. The terms of their servitude were somewhat stricter than those of their southern brethren, as there were no explicit routes to freedom. However, they were allowed to earn and keep wages. Also, they had access to the courts and used them to recoup wages that they were owed. And more generally, they realized the critical role that they played in their society and used this knowledge to negotiate favorable settlements on many issues. Ultimately these negotiations led to a change in their status from enslaved to what was known as half-freedom. This allowed them to lease and work their own property. They could still be called upon to provide free labor for the colony, and their children were born slaves. Yet, it did represent a small degree of upward mobility.

    Make no mistake. Life as a slave during this era was exceptionally difficult. Your primary value to the society in which you lived was determined by the labor you performed. As such, slaveholders did all they could to extract this value from you. Days were long, hard, and largely thankless. But for all the hardships, freedom was an achievable goal. Hope was not lost.

    DEVOLUTION

From the perspective of the Virginia tobacco farmer, there were certain problems inherent in their adopted labor system. As slaves and indentured servants completed their obligations and became free, there was a constant need to replace the lost laborers. This attrition was compounded by the slaveowner's obligation to free those who converted to Christianity. Finally, those released from service began to compete with the established landowners for access to the most productive acreage. To preserve the social order they had constructed, changes would have to be made.

    Economic necessity began to drive cultural change. The notion that Christians could not enslave fellow Christians allowed far too many to escape bondage. That tradition vanished. Suddenly, one's race determined eligibility for enslavement. Likewise, the limited term that had characterized the preceding era disappeared. Slavery was now a permanent condition, as immutable as the color of one's skin. And, in one cruel and final insult, the Virginia colony passed a law in 1662 declaring that children would share the status of their mother. Freedom was now all but impossible, even for one's descendents.

    With these changes, the profitability of slaveholding skyrocketed. Slowly but steadily, slavery replaced indentured servitude as the exploited labor force of choice. As this occurred, the demand for slaves increased dramatically, leading to an explosion of the international and domestic slave trade. By 1720, slaves outnumbered free whites in some territories by nearly 2 to 1.

However, there were other consequences of these changes. With all hope of legitimately acquired freedom lost, slave resistance increased dramatically. Slaves began to search out the means to escape bondage -- and when they could they ran. Slaves lashed out against their masters, burning their barns and poisoning their food. Outright rebellion was a constant threat, and many uprisings were put down during this era. Whites responded with ever-increasing acts of cruelty. Captured runaways were punished with whipping, branding, and, for male offenders, castration. Even tangential association with a slave uprising was punished with decapitation. These grisly trophies were mounted on spikes along high-traffic roadways as a warning against revolt.

    Of course, the barbaric treatment only served to intensify the slaves' quest for freedom. And so, the system began to cycle rapidly downward. Increasing violence led to increased resistance, which led to even darker depths.

But increasing profits, especially in the Carolinas, changed the calculus for the slaveholder. Whereas before, the replacement value of a slave had deterred the most inhumane treatment, this was no longer true. If a slave was maimed or killed, the deterrence that act provided against misbehavior in the population at large was worth the relatively small cost of his or her replacement. With moral, legal, and now economic constraints lifted, slaveholders were limited only by their imagination. And they were very, very creative.

    THE MYTH OF EVOLUTION

One of the most frequent misunderstandings with regard to Darwin's Theory of Evolution is what it actually means to evolve. Typically people view evolution as a process whereby organisms move from less perfect to more perfect forms. Implied within this view is the notion of progress. Earlier organisms are seen as less advanced than those that exist today. But, as any biologist will tell you, this is a fundamentally flawed perspective. Each organism is adapted to the environment of its time. As the environment changes, so do the creatures that exist within it. The quality of any adaptation is measured against its suitability for the current environment and not against preceding forms. Progress is a myth created to bolster the notion of human exceptionalism. It doesn't really exist.

    However, this myth is not confined to the biological sciences. James W. Loewen, author of Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong, spent several years examining some of the most popular American History textbooks. During this research, he discovered that every textbook presented narratives infused with an unyielding faith in American progress. A case in point:

Three textbooks offer appendixes that trace recent trends, all onwards and upwards. These efforts are undistinguished. They do not use constant dollars, for one thing, so their bar graphs of rapidly rising family income or health care expenditures show far more "progress" (if spending more on health care is progress) than occurred… No textbook charts phenomena that might be negative, such as frequency of air pollution alerts, increased reliance on imported oil, or declining real wages.
    The story told by these textbooks is one of ever increasing enlightenment, wealth, morality, and freedom. Progress is the overarching structure into which all historical events must fit.

But, the story of slavery does not fit into this mold. As bad as slavery was in 1619, it was infinitely worse a hundred years later. Despite the rhetoric of American patriots in 1776, for a substantial portion of the population, freedom was not on the march. Quite the opposite. And while we currently stand far above the bar set early in the 17th century, the myth of steady improvement obscures the wild oscillations that have occurred at points in between.

    This was an important lesson that I was deprived of. True, things often get better. But sometimes they get much, much worse. Moral advancement isn't automatic -- it requires awareness and perseverance. And even then we sometimes fail to escape our darker tendencies. It's a crime that one can earn a high school diploma without facing this irrefutable fact.

    We aren't served by the American fairy tale. The truth is that the United States has frequently been a force for good in the world. There is no need to embellish our accomplishments. But we should also recognize that many have suffered at our hands. It is part of who we are. Slavery is a stark reminder of our weakness in this regard and, if it isn't taught accurately, a tremendous opportunity for growth is squandered.

    And that is something that we truly cannot afford.

    Postscript: While much of the information in this essay was derived from the PBS documentary Slavery and the Making of America, I also discovered this resource. Both were invaluable sources of knowledge and I highly recommend them to you.

    Proceed to Traumatic Reverberations, Part I.

    Return to Slavery and History Index.

    Tuesday, February 22, 2005

    Revealing America's Peculiar History

If you had asked me in high school what my least favorite subject was, I would have said Spanish. But that would only have been because I was so bad at it. If we had restricted the selection to those classes where I demonstrated reasonable aptitude, I would have said American History. There was simply nothing in the disjointed presentation of names, dates, and events that seemed relevant to the life I was living. It was so boring, so lifeless, and such an exercise in cliché and platitude that it's amazing that I didn't lose my mind. However, while slumped in my chair in the back of the class, I did get plenty of sleep. So, I guess it wasn't a total loss.

    Over the last few days, though, this simmering bitterness has been reawakened. The catalyst for this rediscovery has been my exposure to a recently aired PBS documentary titled Slavery and the Making of America. I learned more about our Peculiar Institution during the first five minutes of the documentary than I did in all my years of school. And when you consider that slavery was the single most important socioeconomic institution in the history of our nation, you can see how worthless my history education was.

    Slavery isn't a footnote in our history. It wasn't a quaint tradition utilized by a handful of Southern plantations. Rather, it was an absolutely critical component of the birth of the American nation. The economies of the early settlements (in the North as well as in the South) depended upon the low-cost labor provided by slavery. Our fortunes during the American Revolution were nearly turned by slavery's existence, as offers of freedom convinced many slaves to join the British against the American patriots. It fueled the southern agricultural juggernaut, sustaining an otherwise uncompetitive economy. It was the single most controversial political issue from the advent of independence through the conclusion of the Civil War. And the consequences of slavery reverberate through history, affecting the lives we live even today.

    In short, one cannot even begin to understand who we are without facing this shadow from our past.

Over the next few days, I'm going to attempt to remedy this deficiency. I haven't planned this out entirely, so I don't know how many posts this will actually generate. But, I have a few ideas that I want to explore with respect to both history education in general and slavery in particular. Hopefully, I'll be able to do it justice.

    Stay tuned.

    Proceed to A Downward Progression.

    Return to Slavery and History Index.

    Monday, February 21, 2005

    Intellectual Biowarfare

There are a lot of cheap and dishonest tactics that people use in the course of a debate. But without a doubt, one of the worst of these is "The Strawman." For the uninitiated, this is when you misrepresent your opponent's position so that his argument is easier to defeat. It works best when your opponent is unable to correct your misrepresentation, such as when he is not in the room and is unaware of how you have characterized his reasoning. That's the cheap part (the dishonest part is when you lie).

    It's always a sad moment when The Strawman makes an appearance. To the informed audience, it's the moment when the advocate admits that his position cannot stand on its own. It's sort of like saying "your argument is so good that I wish I was arguing with someone else -- so let's pretend that I am." And consider the value of the victory so achieved. I mean, I could be heavyweight champion of the world if I could select all of my opponents. But, would it mean anything?

Given what this technique says about those who employ it, it is initially surprising that it gets as much play as it does. Yet, not a day goes by without Rush and Hannity beating the shit out of some liberal strawman. Even Bush makes a point of trotting out this rhetorical canard from time to time. So, it makes me wonder. If using it makes you look bad and, since it only works when the opposition is absent, it converts no one, what's the point?

    Well, it turns out that there is one.

Have you ever used the phrase "a germ of an idea" or described a quickly spreading intellectual concept as a "thought virus"? If so, you're probably already primed to view the spread of ideas and opinions through the lens of the germ theory. If not, it's really very simple. The transference of ideas from one person to another is closely analogous to the spread of a biological infection. As with an infectious agent, ideas spread person to person with exposure. The factors that control the speed and efficacy of transfer, such as the strength of the idea and the hospitality of the host, likewise parallel the biological model.

    However, the similarities don't stop there. Just as you can develop resistance to viral infection through exposure to a weakened strain of the virus, exposure to weak arguments makes you resistant to the full strength version. This is what is referred to as inoculation. In the biological manifestation of this phenomenon, your body develops antibodies when first exposed to the weak strain, thus preparing your immune system for the full-scale assault that might come later. And while the details of conceptual inoculation are not quite so clear, the resulting effect is the same.

    And so, with this information, the true function of The Strawman is laid bare. The fact that this technique has no persuasive power is irrelevant because it is, by design, focused on allies rather than on opponents. It is a method of preaching to the converted, with the intention of cementing their allegiance.

I cannot say with any certainty whether everyone who employs this technique is aware of the inoculation effect. There are certainly times when an opponent's argument is distorted because it is honestly misunderstood. At other times strawmen are constructed simply to create an easily defeatable opponent. But once we move out of the amateur ranks, you can be certain that the effect is known. When national elections turn, as they do, on the smallest sliver of the electorate, it is essential that each side avoid attrition by any means necessary. If that means stooping to what is, in effect, brainwashing, then that's the game we're playing.

    We often lament the state of political discourse in this country. But even as we protest, we remain largely unaware of how low we have sunk. There was a time when I fantasized about cornering a “dittohead” and converting him with the power of logic and charm. Experience has shown me what a fool's errand that was. Until we discover a method to pierce through this encrusted layer of intellectual antibodies, no real discourse between the right and left can occur. And while we wait for such a discovery, we must inoculate our own just to stay in the fight.

    Thursday, February 17, 2005

    Bad Arguments for Bankruptcy Reform

    As the debate over bankruptcy reform has evolved, defenders of the proposed legislation have begun to emerge. One of the most vocal of these defenders has been Todd Zywicki of The Volokh Conspiracy. In two separate posts (here and here) he argues that credit card companies are not responsible for increasing bankruptcy rates and that medical expenses are not responsible for 50% of bankruptcies. To address these issues, I shall employ the always entertaining "So What?" defense.

    Let's proceed.

Zywicki discounts the role of credit cards in bankruptcy by arguing that there has been no measured increase in household indebtedness.

    First, the argument doesn't make much sense from an economic perspective. Unless credit cards have somehow removed the borrowing constraint on individual credit (and no one has provided any evidence that it has), there would be no reason to believe that credit cards would increase overall household indebtedness.

Instead, economic theory would predict that the primary effect of the introduction of credit cards would be to shift around patterns of consumer credit use, by substituting credit card debt for other less-attractive forms of credit, such as pawn shops, personal finance companies, and retail store credit (such as from appliance and furniture stores). In fact, this is what the evidence indicates has actually happened.
He continues by suggesting that the measured correlation between high credit card debt and bankruptcy is explained by the credit card's role as the credit line of last resort and the strategic debt repayment employed by those intending to file bankruptcy.

    First, the correlation between credit cards and bankruptcy may reflect the unique role of credit card borrowing in the downward spiral of a defaulting borrower. Credit cards provide an open line of unsecured credit to be tapped at the discretion of the borrower. Thus, for many debtors credit cards are a "credit line of last resort" to stay afloat to avoid defaulting on other bills…

    Second, a debtor's increased use of credit cards preceding bankruptcy also may reflect strategic behavior taken in anticipation of filing bankruptcy… Given the choice between defaulting on secured or nondischargeable obligations on one hand versus dischargeable credit card debt on the other, the incentive is to use credit cards to finance payment of nondischargeable and secured debt.
    To these arguments, I respond So What? The crux of these arguments is that individuals, for various reasons, are shifting debt so that it might be dischargeable through a Chapter 7 filing. This offers no proof that individuals are being financially irresponsible. In fact, it's quite the opposite. People who are deeply in debt are transferring as much of that debt as possible to dischargeable credit lines, thus putting themselves into the best possible financial position post bankruptcy. It would be foolish to do anything less.

    Moreover, the credit card industry has been knowingly handing out dischargeable credit lines of their own accord. Surely, they must have realized that their clients might choose to use this credit in such a fashion. And if they didn't, So What? Why is their recklessness cause for systemic reform?

    Zywicki then moves on to the role of medical expenses in bankruptcy. To challenge the often cited 50% figure, Zywicki questions the methodology used in the supporting research.

In fact, the "finding" in this article of a massive rise in medical bankruptcies appears to actually be a result of the way in which medical bankruptcies are counted, rather than an actual change in the numbers. They draw their data from two sources. First, self-identified bankruptcy filers who say that some medical event "caused" their bankruptcy. [Emphasis in original]



    Among the self-identified factors that are listed as "medical" causes of bankruptcy in Exhibit 2 of the article are the following: illness or injury, birth/addition of new family member, death in family, alcohol or drug addiction, uncontrolled gambling. First, it is surely open to question whether uncontrolled gambling or a death in the family really should count as a "medical" problem. More generally, the category "illness or injury" is very broadly defined in the study, and there is no apparent limit on the time frame over which the illness or injury occurred, or the severity. So classifying all of these factors as medical problems that have "caused" bankruptcy certainly seems open to question. [Emphasis added]
    First, I'm a little puzzled by his assertion that it is "open to question" whether or not "death" should count as a serious medical problem. I mean, if there is a medical problem more serious than death I'd like to hear about it. But, in a more general sense, I'm not certain how valuable the opinion of a bankruptcy specialist is in establishing the seriousness of various medical conditions.

    However, in the spirit of fairness, I will grant him that the study possibly overstates the role of medical expenses in bankruptcy. Likewise, I will stipulate that the causal relationship between medical expenses and bankruptcy is not established by the study and that media claims to the contrary are misleading.

    Which brings me to my core argument -- So What? Even if medical expenses aren't causing bankruptcy directly, they are unquestionably a significant contributing factor in many cases. Given that fact, what matters is not the number of medical expense related bankruptcies, but the manner in which those bankruptcies are handled by bankruptcy law. David Welker of Ex Parte (which is managed by the Harvard Federalist Society, if you can believe that) backs me up on this point.

I am skeptical of the 50% number, but on the other hand, the whole debate is irrelevant. Does the analysis change if a mere 40% of bankruptcies are "caused" by medical problems? How about 30%? What if it is a mere 10%? Regardless of the number, any reform legislation should deal with such cases appropriately. Even a 10% number would be over 150,000 people a year. Zywicki concedes that some bankruptcies are caused by medical conditions and claims that the bankruptcy legislation deals with these appropriately. But whether these cases are dealt with appropriately or not, that is the real question! If the legislation deals with these cases appropriately, then it would not matter if the number were even much higher than 50%. After all, those cases are resolved appropriately. On the other hand, if these cases are not handled appropriately, you are going to be dealing with tens (maybe hundreds) of thousands of people a year in an unfair and inappropriate manner.
    And so Zywicki may have caught the opposition playing politics with the numbers on this issue. To that I say "Touché!" However, I fail to see how this advances the argument for bankruptcy law reform.

Approximately 1.5 million people file Chapter 7 bankruptcy every year. Some of those are high-income earners who could repay at least some portion of their debts. I would happily support reforms designed to eliminate that sort of abuse. However, if those reforms prevent legitimate use of bankruptcy protections, I'm out. Given the enormous profitability that is ubiquitous in the industry, I'm willing to accept some abuse to provide protections to the truly vulnerable (besides, all systems have noise). Until there is an affirmative demonstration of the need for reform and the ability to employ it effectively, I cannot support this endeavor.

    I'm still waiting.

    Wednesday, February 16, 2005

    Witchhunt 2005

There are many intelligent and important individuals alive today who represent the best of left wing political thought. Ward Churchill is not one of these people. Yet, he is now frequently held up as an example of modern leftist academia. As the controversy surrounding his radical views reached a fever pitch, and as calls for his resignation/termination began resonating across the conservative echo chamber, he has also become the poster boy for advocates of academic freedom. Churchill himself revels in the spotlight, having garnered more attention in this moment than he has in his entire career. Whether or not he retains his post at the University of Colorado matters, I suspect, very little to this man. Unfortunately, though, the repercussions of this controversy will matter to the rest of us -- and in some ways that I think are not immediately apparent.

    BACKGROUND

    The process by which Ward Churchill came to the attention of the mainstream media is a fairly typical tale for such things (Kevin Drum connects the dots for us). But, for our purposes, what's important is how it began.

    Shortly after September 11, 2001, Churchill published Some People Push Back: On the Justice of Roosting Chickens. In this article he argues that the attacks of 9/11 were a predictable response to American cultural, economic, and military intrusions into the Middle East. However, he also attempts to discredit the notion that the victims of the attacks were purely innocent.
    As to those in the World Trade Center… Well, really. Let's get a grip here, shall we? True enough, they were civilians of a sort. But innocent? Gimme a break. They formed a technocratic corps at the very heart of America's global financial empire – the "mighty engine of profit" to which the military dimension of U.S. policy has always been enslaved – and they did so both willingly and knowingly… If there was a better, more effective, or in fact any other way of visiting some penalty befitting their participation upon the little Eichmanns inhabiting the sterile sanctuary of the twin towers, I'd really be interested in hearing about it.
    While it took a few years for these comments to reach the ears of the general public, once they did the result was predictable. Blaming the victim rarely makes you popular and in this instance you had to take a number to join the lynch mob.

    ACADEMIC FREEDOM

    As his views became widely circulated, and as the public became aware of his position as a tenured professor, calls for his termination began to grow. The combination of his radical opinions and the fact that his salary was paid at least in part by public tax dollars was too much for many to bear. He simply had to go.

    It was then that a small cadre of defenders began to emerge. While they universally detested his controversial views, they were unwilling to accept his termination based upon them. In their view, the removal of a tenured professor based upon the unpopularity of his ideas would undermine the principle of academic freedom. And on this point, they are exactly right.

While it might seem odd to those of us who toil away as "at-will" employees, the principle of tenure is a crucial component of intellectual progress in academia. A newly hired professor is expected to spend the first few years of his career demonstrating his value to the host institution. Each institution measures value in its own idiosyncratic way, be it through teaching excellence, research and publishing, or the like. At the completion of this period, if the candidate has met the institution's expectations, he is granted tenure and cannot be terminated for anything less than academic fraud or gross misconduct. With these protections in hand, our newly tenured professor can venture out in any direction that captures his attention. Nothing is off limits.

    This freedom allows academics to explore controversial issues in ways that would otherwise not be possible. They are expected to leave conventional wisdom behind and to embark into uncharted territory. Ideas are ultimately judged on their merits, but without the freedom that tenure provides the vibrancy of debate would be greatly diminished. The "marketplace of ideas" would be transformed into the "Wal-Mart of comfortable notions."

    Thus, allowing Churchill's views to be the basis of his termination would set a precedent that would threaten the entire system. Academics everywhere would begin to self-censor their investigations, closing off any topic that might potentially draw popular criticism. Universities would rapidly become institutions proficient only at indoctrination and would leave behind a long tradition of intellectual inquiry.

    So it would seem that, despite his views, Churchill's termination would have to be fought.

    ACADEMIC FRAUD

If you're going to be in the business of defending the freedom of expression in this country, you inevitably are going to be rubbing shoulders with some fairly unsavory characters. It's simply the nature of the beast. And as this case demonstrates, defending academic freedom often places you in the same company. The problem with dealing with such individuals is that they tend to behave in rather unscrupulous ways.

    This appears to be the case with Ward Churchill.

    Thomas Brown, an assistant professor of sociology at Lamar University, recently published an essay that strongly challenged claims made by Churchill regarding the 1837 smallpox epidemic. According to Churchill, this epidemic was a biowarfare event perpetrated by the United States Army. Brown's essay dismisses this assertion and concludes the following:
    Situating Churchill’s rendition of the epidemic in a broader historiographical analysis, one must reluctantly conclude that Churchill fabricated the most crucial details of his genocide story. Churchill radically misrepresented the sources he cites in support of his genocide charges, sources which say essentially the opposite of what Churchill attributes to them.

    One of Churchill's primary sources regarding the smallpox epidemic is Russell Thornton, a professor of anthropology at UCLA. Inside Higher Ed addressed this directly.

    Thornton, who is a Cherokee, has written extensively about the horrors of U.S. treatment of Indians. But his study of the Mandan concluded that the epidemic was not intentional.

    Thornton said in an interview last night that Brown's essay was correct. He said that people have periodically told him over the years that Churchill has "misrepresented my work."

    "Issues like Ward Churchill cast aspersions on legitimate Indian scholars," Thornton said. Of U.S. treatment of Native Americans, Thornton said, "The history is bad enough -- there's no need to embellish it."

    That doesn't seem to be the end of Churchill's deceptions. For much of his career he has presented himself as a Native American. This has played no small role in establishing his credibility with respect to his field of inquiry. Yet, according to the American Indian Movement:

    …Ward Churchill has fraudulently represented himself as an Indian, and a member of the American Indian Movement, a situation that has lifted him into the position of a lecturer on Indian activism.

    …Ward Churchill has been masquerading as an Indian for years behind his dark glasses and beaded headband. He waves around an honorary membership card that at one time was issued to anyone by the Keetoowah Tribe of Oklahoma. Former President Bill Clinton and many others received these cards, but these cards do not qualify the holder a member of any tribe. He has deceitfully and treacherously fooled innocent and naïve Indian community members in Denver, Colorado, as well as many other people worldwide.

    Indian Country Today is also critical of his claims of native heritage.

    Churchill's claim [of Native American heritage] is so seriously in question…that it offends some as much as the galling insults and the opportunistic political reactions. Churchill, it would now seem, is neither claimed by sensible liberal scholars nor by any of the American Indian tribes, including Cherokee and Creek, to which he has claimed affiliation.

    These deceptions, academic and otherwise, serve to undermine any principled defense of Churchill's position. As much as we might like to defend his right to freely express his admittedly inflammatory opinion, his dishonesty in and of itself justifies his removal from the University of Colorado. And, as it now stands, it can be done without threatening the tradition of academic freedom.

    SEEING THE FOREST

    Actually, I'd like to take back that last sentence.

    Over at Inside Higher Ed, commenter Louis Proyect noted:

    …Churchill's sins [with respect to academic fraud] pale in comparison to what I have seen around me since my undergraduate days.

This raises a troubling issue. What if, despite appearances, Churchill's level of academic fraud is more common than we had initially been led to believe? What if his poor scholarship fails to completely distinguish him from his peers?

    Now, consider that question alongside the following David Neiwert observation.

    …Ward Churchill is hardly the only academic in America with genuinely repulsive views that deserve renunciation. Indeed, there are a number of right-wing professors who could face similar academic firing squads if the punditocracy chose to raise their cudgels against them…

    So why are they not every bit as eager to expel [conservative] radical academics from our midst? Their silence has been longstanding; if anything, you'll find so-called mainstream conservatives actually defending thinkers like this (see, e.g., the long-running right-wing apologia for Charles Murray's repulsive theories about race.)

    In this country, drug use rates are virtually identical in the Caucasian and African American communities. However, despite comprising a mere 11% of the population, African-Americans represent a majority of those imprisoned on drug offenses. While many factors contribute to this phenomenon, one of the most important is the fact that drug prohibition criminalizes activity engaged in by a wide segment of the population. Since law enforcement lacks the resources that would be required to universally prosecute drug offenses, selective enforcement is the result. Inevitably this enforcement falls primarily on minority communities, who tend to be poor, easy to prosecute, and generally undesirable. In such an environment, it is their membership in an undesirable community rather than their criminal activity that determines their eligibility for prosecution.

    And this is the situation with Ward Churchill. He may be a sloppy academic and that may justify his termination. But we shouldn't ignore the reality that it was his views that brought these ancillary issues to light. Fraudulent scholarship was a rationale developed ex post facto and most likely would never have been unearthed were it not for the withering scrutiny that Churchill endured. In truth, few of those calling for his dismissal have any stake at all in preserving the integrity of academia. That isn't what this is about.

Irrespective of the current rationale, the effect of his termination will be indistinguishable from a termination based upon his controversial views. In the future, academics will have to carefully consider whether or not their personal and professional history can withstand intense media focus before they venture beyond popular conventions. Many will choose to err on the side of caution. Invaluable contrarian viewpoints will vanish from the public sphere, and our intellectual palette will be restricted to the ideas that comfort the majority. And we shall be poorer for it.

    Ultimately, I don't like Ward Churchill or what he stands for. His views are poorly justified and, frankly, rather pedantic. I hardly think that his contributions to academia would be missed. Moreover, as things stand, the fraudulent quality of his research limits his claim to the protections of academic freedom. At the same time, we shouldn't kid ourselves. This was a witch hunt, plain and simple. And as he burns at the stake, many others tremble in fear, knowing that the forces brought to bear against Churchill are easily redirected.

Update: There's a fairly interesting dialogue developing in the comments section, so be sure to check it out.

    Keep Your Pants On

    Sorry things have been a bit slow here recently. The post I'm working on has taken a little extra time to put together. It should be ready later today.

    Remember people, it's all about quality…

    Monday, February 14, 2005

    What's in a Number

    In an effort to justify the blogosphere's reputation as a "self-correcting" medium, I offer you the following.

    An entry I posted last Thursday included this juicy morsel:
    They were showing support for the policies of George W. Bush -- the same policies that have led to the deaths of as many as 100,000 innocent Iraqis.
    A few days later, during one of my semi-regular excursions onto the dark side, an intellectual jouster followed me home and came upon the phrase quoted above. He then chastised me for advocating the "long debunked" 100,000 Iraqi civilian death figure.

    Given where I was lurking and who was reprimanding me, I wasn't terribly concerned. But, I realized that, in truth, I didn't really know all that much about where that figure came from or what it really meant. In the context of the post in which it appeared, the validity of the figure was not terribly consequential (accepting the 25,000-30,000 figure proposed by my critic would not have substantially undermined my argument). Still, I felt it was worth my while to research the issue so that I might approach it in the future from a position of authority.

    And, as it turns out, a correction is in order.

    The figure (which turns out to be, in fact, 98,000) arises from a study (pdf) that was published this fall in The Lancet. The study was an epidemiological investigation that used a sampling methodology to determine the death rate in Iraq before and after the American invasion. The study did generate a fair amount of controversy and condemnation from both the left and the right. Unfortunately for the naysayers, much of the criticism is nothing but -- well -- horseshit.

    So much for it being "long debunked."

    That said, there are some clarifications that should be made.

    First, as I said above, the study was attempting to determine the net change in the death rate in Iraq before and after the start of the war. The 98,000 figure, therefore, was an extrapolation of the study's results and not an actual count. Since this is a completely valid statistical method to employ, this is a rather nuanced distinction to make. It is worth knowing, though.
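
To see what that kind of extrapolation involves, here is a toy sketch of the arithmetic. The rates, population, and time span are placeholders I've chosen purely for illustration, not the study's actual inputs:

```python
# Toy illustration of how a sampled change in death rate becomes a national
# estimate. The rates, population, and time span below are placeholders
# chosen for illustration -- NOT the Lancet study's actual figures.

pre_invasion_rate = 5.0       # deaths per 1,000 people per year (assumed)
post_invasion_rate = 7.9      # deaths per 1,000 people per year (assumed)
population = 24_000_000       # rough national population (assumed)
years_elapsed = 1.5           # period covered by the survey (assumed)

excess_rate = post_invasion_rate - pre_invasion_rate        # per 1,000 per year
excess_deaths = excess_rate / 1000 * population * years_elapsed

print(f"Estimated excess deaths: {excess_deaths:,.0f}")
# With these made-up inputs the projection lands near 100,000. The point is
# that the headline figure is a rate-based projection, not a body count.
```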

More important than that, however, is what the study is not trying to say. Due to the macroscopic focus of the study, the authors made a deliberate decision to record all deaths, regardless of causality. No attempt was made to differentiate between combatants and noncombatants. Therefore, it is not accurate to say, as I did, that the 98,000 figure refers to civilians. Likewise, the data was not restricted to deaths caused by coalition forces. Deaths that occurred as a result of the Iraqi military, the insurgency, and natural, nonviolent events were all included in the study's data set. Therefore, one must be careful not to imply that 98,000 deaths resulted directly from actions taken by coalition forces (although it is fair to argue indirect responsibility).

    Finally, it is important to acknowledge that this is a single study and that it should not be the final word on the subject. If we truly care to know the answer to this question, other research using different methodology must be applied to the situation. This study takes an important first step, but it hardly puts the issue to rest.

    OK -- enough with the caveats. Let's dig in to the actual statistics.

    One point that is frequently misinterpreted by both sides is the nature and meaning of the study's confidence interval. For those of you who are not conversant in the details of research statistics, a confidence interval is a statistical calculation which reveals how certain researchers are that a measured value represents the actual value of the phenomenon being studied. Traditionally this is expressed by citing a range of values between which the actual value is believed to reside. The size of this range is determined by the methods used in the study and by the degree of confidence (hence the name) that the authors are attempting to express. Generally the level of confidence must be 95% before the results are considered statistically significant.

For many, this confidence interval is where the validity of the study appears to break down. Due to the methods employed (most notably, the small sample size), the 95% confidence interval for the study is 8000-194,000 excess postinvasion deaths. This range of values causes otherwise reasonable people to lose their minds. Case in point:
    Readers who are accustomed to perusing statistical documents know what [that range] means. For the other 99.9 percent of you, I'll spell it out in plain English—which, disturbingly, the study never does. It means that the authors are 95 percent confident that the war-caused deaths totaled some number between 8,000 and 194,000. (The number cited in plain language—98,000—is roughly at the halfway point in this absurdly vast range.)

    This isn't an estimate. It's a dart board.

Cute. According to Mr. Kaplan, any value between 8000 and 194,000 is equally likely. For everyone out there who hasn't taken Introduction to Statistics, that assertion is ridiculous. Despite this wide range of possible values, 98,000 is still the most likely value. It is considerably more likely than values closer to the edges of this range.

However, the range does demonstrate the existence of uncertainty with respect to the 98,000 figure. Due to the small sample size involved (an unavoidable consequence of performing research in a war zone), it is difficult to say exactly how many excess deaths have occurred postinvasion. It could be 8000 or less (a 2.5% chance exists for this possibility). It could also be 194,000 or more (again, 2.5% probability for this outcome). Honesty requires that we acknowledge this reality.
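
To make those probabilities concrete, here is a rough sketch that simply treats the estimate as normally distributed around 98,000 -- a simplification of my own, not the study's actual model -- using the published interval to back out the spread:

```python
# Rough normal-approximation sketch of the study's 95% confidence interval.
# This is my own simplification for illustration; the published analysis
# rests on a more involved cluster-sampling model.
from scipy.stats import norm

point_estimate = 98_000
ci_low, ci_high = 8_000, 194_000

# Under a normal approximation, a 95% interval spans about 1.96 standard
# errors on either side of the point estimate.
std_err = (ci_high - ci_low) / (2 * 1.96)
dist = norm(loc=point_estimate, scale=std_err)

# Values near the center are far more likely than values near the edges --
# the interval is not a "dart board."
print(dist.pdf(point_estimate) / dist.pdf(ci_low))   # roughly 6.8x more likely

# Probability that the true excess-death figure falls below 8,000 ...
print(dist.cdf(ci_low))          # ~0.025
# ... and the probability that it exceeds 8,000.
print(dist.sf(ci_low))           # ~0.975
```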

    That said, the study makes one inarguable point that is frequently lost upon those who seize upon the implied uncertainty in the confidence interval. Here's Daniel Davies to educate us:
    Although there are a lot of numbers between 8,000 and 200,000, one of the ones that isn’t is a little number called zero. That’s quite startling. One might have hoped that there was at least some chance that the Iraq war might have had a positive effect on death rates in Iraq. But the confidence interval from this piece of work suggests that there would be only a 2.5% chance of getting this sort of result from the sample if the true effect of the invasion had been favourable. A curious basis for a humanitarian intervention; “we must invade, because Saddam is killing thousands of his citizens every year, and we will kill only 8,000 more”.
I lose track of the current justification for the Iraq invasion. But it seems to me that we floated the "improving lives of the Iraqis" rationale at some point. Nothing improves one's life as little as death does, so death reduction appears to be a fair metric by which to measure how we are faring in this regard. Unfortunately, this study unequivocally demonstrates our failure to accomplish this goal. A 97.5% probability exists that at least 8000 people have died who otherwise would not have if we had not invaded.

    Personally, on issues as complicated as the justification for war, I am reluctant to employ cost/benefit analysis. There is simply no rational way to evaluate, in the moment, the costs and benefits of such an action. However, my reluctance is shared by few of those who would justify its cost by pointing to the tyranny of the previous regime. To these individuals we can say, with near certainty, that death is on the rise now that we are on the scene.

    And so, I stand corrected. Never again shall I imply, directly or indirectly, that 100,000 civilians have died in Iraq. I will be clear that we do not know how many deaths we are responsible for. I have learned my lesson. I will, from this point forward, merely state that it is very likely that we have greatly inflated the ranks of the Iraqi dead.

    I guess he showed me.

    Friday, February 11, 2005

    Bailing out the Bringers of Bankruptcy

    Via Dr. Atrios, we learn that "Bankruptcy Reform" is in the pipeline again.

    Republican leaders in Congress began clearing the way yesterday for swift passage of legislation backed by the credit card industry and opposed by consumer groups that would make it harder for consumers to wipe out debt through bankruptcy.
    This is just plain sick. The credit card industry is in no way being hurt by rising bankruptcy rates. In fact, as the article states, this last year has been "one of their most profitable years in more than a decade." Credit card divisions are typically the most profitable components of the financial institutions to which they belong. There is simply no affirmative evidence demonstrating the need to change the existing system.

    This is especially true when you consider the reason most people file bankruptcy. These individuals typically incur debt that has arisen from unexpected medical and employment crises. It isn't irresponsibility that led them there. It's a combination of fate, low economic reserves, and usurious interest rates. Moreover, the whole reason that bankruptcy law exists in the first place is that it is economically disadvantageous for all of us to have otherwise responsible individuals trapped under an avalanche of debt. Does it make sense to change this equation so that the credit card industry might experience a slight increase in profits?

    There's one other thing to consider as you evaluate this issue. Whenever a credit card company issues a credit card to an individual, they are, in essence, investing in that individual. The interest that they earn is justified by the depreciation of the loan's value due to inflation AND the risk they are incurring. If these investments weren't risky, such outrageous interest rates would never be tolerated.

    But, for years now, the industry has been targeting a wider and wider pool of individuals, many of whom are clearly risky investments. These individuals are provided with high lines of credit and are encouraged, via advertising, to spend freely. Finally, the companies withhold information that would help their clients manage their debts responsibly (such as revealing how long it would take to erase a debt by paying the monthly minimum).
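    To see why that last omission matters, here's a quick sketch of the disclosure the companies decline to make. The balance, the interest rate, and the minimum-payment formula are all invented for the sake of illustration; actual card terms vary.

        # Hypothetical example: how long does it take to pay off a balance
        # making only the minimum payment each month?
        balance = 3_000.00     # hypothetical starting balance
        apr = 0.18             # hypothetical 18% annual interest rate
        min_rate = 0.02        # hypothetical minimum: 2% of the balance, $20 floor

        months = 0
        total_paid = 0.0
        while balance > 0:
            interest = balance * apr / 12
            payment = max(balance * min_rate, 20.0)
            payment = min(payment, balance + interest)   # final-month adjustment
            balance = balance + interest - payment
            total_paid += payment
            months += 1

        print(f"{months} months ({months / 12:.0f} years), ${total_paid:,.2f} paid in total")

    Under these made-up but not unusual terms, the answer comes out to roughly a quarter century, with more than three times the original balance paid out along the way -- precisely the sort of figure the industry would rather its customers never see.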

    Now, if any other industry was making risky investments in shaky businesses, while at the same time giving those businesses poor financial advice, we would expect them to accept the negative consequences of their unscrupulous practices. We would laugh at the idea that it was our responsibility to bail them out. Yet, that is exactly what the credit card industry is asking us to do.

    In truth, it doesn't matter how people are entering bankruptcy. The point is that the credit card industry has always been aware of that risk. If they didn't want to pay for the negative consequences of that risk, then they shouldn't have been loaning money to such high risk clients. If they're being hurt (which they are not) it is due to their own imprudent business practices. It isn't our job to clean up their mess.

    Sadly, the third time is probably the charm on this issue. What with the fight brewing over Social Security reform, little political capital will remain to beat back this atrocity. It is an unpleasant truth that we must pick our battles and let others pass by uncontested. But, make no mistake about it -- this is a dirty deal that we will all pay for eventually.

    Thursday, February 10, 2005

    Clinton Started It

    Here's something else that I should have included in last night's post.

    In an attempt to discredit Democrats who are refusing to buy into the "crisis" as described by the Bush administration, a lot of noise is being made about Clinton's seemingly similar rhetoric as he attempted to shore up the system's finances. Unfortunately for this line of argument, the situations are so different that comparison is meaningless. Mark Schmitt (via Kevin Drum) has the details.

    Wednesday, February 09, 2005

    Untangling Social Security

    If we are going to have a debate about Social Security reform, it really would be nice if everyone actually understood what was going on. Alas, the Bush administration is doing all it can to keep us all in the dark. Fortunately, we aren't completely reliant upon them for our information.

    Most readers of this site are probably aware of the work being done on this issue over at Legal Fiction. But if you yourself are not, you should be. Publius has been breaking down the Bush plan into bite-size pieces. Even if you feel that you have a pretty good handle on the proposal, you should check it out. Part I is here and Part II is here (I'll update this post once the remaining sections are posted).

    (Update: Part III has been posted.)

    Kevin Drum also has a post up today which explains some of the issues regarding the Social Security Trust Fund. As the administration attempts to convince the nation that the trust fund doesn't exist, it's worth your while to see what Kevin has to say about it.

    And let me add just one thing. Some time ago, Josh Marshall pointed out (I can't find the link right now -- I'll update once I've located it) that the crisis we're facing in 2018, the year revenues from payroll taxes will fail to cover the cost of Social Security benefits, isn't a Social Security fund crisis -- it's a general fund crisis.

    You see, in 1983 payroll taxes were increased in order to cover the costs of the retiring baby boomers -- starting in 2018. This created a surplus in the system which would have been able to cover the costs of this demographic bubble. The problem is that instead of being saved, the money was spent. In other words, the fund surplus was diverted to the general fund. So now, the general fund needs to pay back that loan. And that is where the problem lies. The general fund doesn't have the cash to make good on its debts.

    There are a host of ways to deal with this problem, but it is important to know what the problem actually is. And, since this administration has made a career of exacerbating the currently existing general fund crisis (through tax cuts and enormous spending increases), it might give you pause as you consider their plans for rectifying it.

    Darwin's Great Conspiracy

    The New York Times, in its infinite wisdom, decided to provide Michael Behe with some valuable real estate on the op-ed page yesterday. Unfortunately, upon this ground he attempted to construct a temple to Intelligent Design, a false idol if there ever was one. Of course, an editorial as ridiculous as this one screams out for rebuttal. Thankfully, Nick Matzke and PZ Myers are out there doing the heavy lifting for us all.

    But beyond the flawed logic and outright deceptions presented in Behe's editorial is an insinuation that deserves to be directly confronted.

    The prominent placement of this editorial implies that a debate between evolutionary theory and Intelligent Design rages within the biological sciences. While Intelligent Design's proponents frequently parade about lists of scientists who harbor doubts regarding the theory of evolution, the overwhelming majority of scientists agree that evolution is an extremely well supported explanation for the diversity of life on this planet. This devastating truth is demonstrated in the most amusing fashion by the good people over at Project Steve. In reality, Intelligent Design is not taken seriously by any reputable biological scientist. And that is why this debate takes place at school board meetings and on the New York Times op-ed page rather than within the scientific community.

    Now, there can really be only two reasons why Intelligent Design has been unable to gain traction within the field of biology. The first is that the theory is wholly without merit. But, understandably, that isn't an explanation that Intelligent Design proponents are comfortable with. The second is that the theory is valid (or at least worthy of full and open debate), but that it is being suppressed by those beholden to the dominant evolutionary establishment. If only those with a vested interest in perpetuating the evolutionary myth would allow the voice of Intelligent Design to be heard, it would find its rightful place in the halls of scientific truth.

    Unfortunately, we live in a world where far too many people are willing to accept this second explanation. One of the great triumphs of the conservative movement has been to advance the theory that bias regularly overwhelms scientific objectivity. When uncomfortable and/or inconvenient scientific evidence bubbles to the surface, it is frequently dismissed as being the product of political agendas. Global warming, the effectiveness of safe-sex education in reducing the spread of HIV, and evolutionary theory have all been subjected to this dismissive treatment. And that is just the beginning of the list.

    The success of this tactic reveals how little is understood about the process of scientific advancement. A brief explanation exposes the folly of this line of reasoning.

    If I, as a scientist, conduct an experiment whose results prove or disprove a scientific hypothesis, the first thing I will do is publish the results in a scientific journal. My article will be consumed by a population of my peers. Those who find my results interesting will either attempt to replicate them or attempt to design an experiment to build upon them. Regardless of which path they choose, the results of my experiment are being independently validated. If my data is cooked, this process will reveal it.

    There are, of course, circumstances under which bad science can slip through the cracks. There are thousands of journal articles published every year. Because of this fact, some research will not be exposed to independent validation. This is especially true if the research falls into obscure or otherwise uninteresting venues.

    However, research that generates high levels of interest in the scientific community cannot escape widespread scrutiny. False theories can enjoy temporary acceptance, but once they fail to demonstrably explain observed data, they cannot but fall from grace. Evolution is one of the core principles of modern biology, so it is absurd to suggest that it has been excused from rigorous independent analysis.

    Thus, Michael Behe is asking us to believe the following: Evolution is a false theory, unsupported by experimental data, that is being intentionally preserved by a widespread conspiracy of biologists that has persisted, unbroken, for 145 years.

    Occam's Razor, anyone?

    The scientific process is by no means perfect. There are numerous examples where it failed, for a time, to accurately reflect objective reality. But the existence of exceptions does not disprove the rule. Over time, scientific consensus approaches truth with a precision that dwarfs all other intellectual methodologies. The assertion that science as a whole is subservient to the bias of individuals is absurd and reflects either a gross misunderstanding of the nature of science or a desire to undermine its message. The proponents of Intelligent Design are guilty of both these sins.

    No wonder nobody takes them seriously.

    Monday, February 07, 2005

    Unforgivable Overreaching: The Fall of George W. Bush

    I'm halfway through watching the new Ken Burns documentary, Unforgivable Blackness: The Rise and Fall of Jack Johnson, and it has left me thinking about a subject I rarely dwell on: boxing. Unlike the typical cinematic portrayal, in which the combatants swing from the heels and land roundhouse after roundhouse, a high-level boxing match is dominated by defense. The reason for this is that each aggressive attack has a tendency to leave the attacker exposed. If the attack fails (something that good defense frequently achieves), the defender has an opportunity to freely inflict damage on the attacker. At the highest levels, openings like these can quickly turn a fight. Thus, strategy demands an emphasis on defense, with cautious attacks applied only when the opportunity presents itself.

    It strikes me that these principles can be applied to high-level politics as well. Typically, politicians play defense. They would prefer to say nothing at all on an issue rather than risk opening themselves to criticism. As a rule, mistakes are punished far more than boldness is rewarded. This is one of the things that leaves so many people frustrated with politics and politicians (and, frankly, with boxing and boxers as well). Unless you are invested in the minutiae (in politics or boxing), it rarely seems as though anything at all is happening. But, as unfortunate as it is for the spectators, that's the way it has to be played if you want to win.

    However, George W. Bush has been breaking that mold over the last four years. Whatever you might say about the wisdom of his various policy initiatives, they have been quite bold. His tax cuts, his prescription drug plan, and the Iraq invasion were all extremely aggressive moves on his part. Even some of his failed initiatives (Mars, anyone?) were quite striking in this way. This, coupled with the manner in which he then wielded his power against the Democrats, portraying their principled resistance as weakness and unpatriotic treachery, again demonstrates his willingness to swing for the fences. Nothing, not even reality, could persuade him to temper his "go for broke" style of play.

    But, as the saying goes, if it ain't broke, don't fix it. Up until this point, Bush's aggressive political style has been rewarded by two election victories, increasing Republican majorities in Congress, and an extremely successful legislative record. And, in spite of numerous points of vulnerability (the record deficits, the extremely poor job creation record, the blatant deceptions of Congress and the American people, the quagmire in Iraq, etc.), nary a blow has been landed against him. It has been quite a remarkable performance.

    The thing is, though, whether the strategy is working or not, it's still bad strategy. Sure, when you're battling a lesser opponent, defense isn't particularly critical. If circumstances conspire and provide you a dramatic advantage, you can make errors and still expect to dominate. The problem is that, after a while, you'll start to forget that external factors allowed you to play the game this way. You'll start to think that the rules are different for you and that you are invulnerable.

    As they say, hubris comes before the fall.

    I am now starting to think that Bush's Social Security plan is the moment of overreaching. Dismantling Social Security has long been the fantasy of the American political right. Yet, for just as long, it has been considered a political third rail. The program has been incredibly successful in achieving its intended goals and has enjoyed tremendous popularity, especially within older demographics (who, of course, vote in exceptionally high numbers). Until now, conservative politicians have accepted these facts as evidence that Social Security abolition is politically infeasible.

    On the other hand, George is playing by his own rules now. And the game plan looks familiar. Invent a crisis, present a solution, portray opposition as irresponsible naysayers leading us all to destruction. It may be a strategy that is completely divorced from reality, but that's never been a problem before. Once again, he is swinging from his heels and trying to land the knockout.

    This time, though, circumstances have changed. This isn't a national security issue and therefore Bush can't play the 9/11 card. The so-called crisis is at least 38 years away. And the nature of the crisis is not immediately clear. Unlike the immediate terrorist threat that Bush has used unceasingly to his advantage, this crisis is nebulous.

    Moreover, the opposition appears to be uniting against him. Again, with national security, too few Democrats were willing to risk standing against Bush for fear of being portrayed as unpatriotic. This time no such fears exist. Many Democrats are proudly standing up and declaring that "there is no crisis," unconcerned that they might pay for their resistance in 2006.

    These are the facts that arise before we even begin discussion of the plan specifics. Will Bush's "take no prisoners" tactics carry the day once it becomes clear that he intends to massively increase the deficit, reduce benefits, and require that individuals bear all of the risk in the new system?

    At long last, George W. Bush has gone too far, and in doing so he has left himself exposed. For some time now, Bush has viewed himself as politically invincible, and thus far the evidence has vindicated that opinion.

    This time, though, his chin is showing.

    Thursday, February 03, 2005

    Giving Them the Finger

    When Iraqi citizens dipped their index fingers into the ink wells over the weekend, it was more than a simple act of accounting. It was a mark of solidarity. It was a mark of defiance. It was a mark that they proudly displayed in spite of the fact that it literally placed their lives in danger. It represented the risk that each participating citizen was willing to take in order to make their dream of self-rule a reality. Knowing this, who could fail to be inspired by the images of Iraqi citizens dancing in the streets, waving their blue fingers in the air?

    And what could be more disgusting than witnessing congressional Republicans attempt to exploit this beautiful symbol for partisan gain?

    If tomorrow I decided to walk down the street wearing a fake Purple Heart in order to "show solidarity" with American veterans, I would be spit upon -- and rightfully so. Wearing a medal that I did not earn, a medal that represents the deep sacrifice that far too many of our soldiers willingly offer in defense of our nation, is sacrilege of the highest order.

    Tom DeLay and his cohorts did not participate in the Iraq election. They risked nothing as they sat in their Washington, DC townhouses watching the CNN election coverage. And if this was truly an act of solidarity, rather than an act of partisanship, why were no Democrats invited to participate?

    Solidarity? No, I don't think so. They weren't demonstrating support for the Iraqi citizens. They were showing support for the policies of George W. Bush -- the same policies that have led to the deaths of as many as 100,000 innocent Iraqis. Could anything be more offensive than that?

    I'm sure that many Iraqi citizens witnessed this cheap political stunt. I'm sure that they saw it for what it was. And, it is my hope that they all felt free to give us the finger right back.

    Wednesday, February 02, 2005

    Risky Business

    One of the featured topics for tonight's State of the Union address is sure to be President Bush's plan for Social Security reform. While it is unlikely that any real specifics will be presented, tonight's address promises to be the "shock and awe" moment of the reform campaign. I would expect there to be a lot of "imminent crisis" talk ("imminent threat" in Iraq, anyone?) along with a fair amount of choice/ownership society effluvia. If you're hoping to hear about his plans regarding the $2 trillion transition costs, well -- don't hold your breath.

    But even without access to the specific details of any reform package, it strikes me that the current Social Security debate encapsulates one of the most fundamental differences between Republican and Democratic ideology.

    Here's what I mean.

    If we were to strip out of the debate the pragmatics of implementation and the questions regarding the nature of the "crisis," and if we were to focus merely on retirement benefits while excluding survivor and disability benefits, we would find that the positions held by the two parties pivot on a fundamental ideological difference.

    Democrats believe that Social Security is a system that should provide benefits to recipients without requiring that the individual assume personal risk. In order to provide such a system, Democrats are willing to accept a certain amount of drag on the economy (in the form of payroll taxes) as well as a fairly modest maximum level of benefit.

    Republicans believe that the systemic drag of payroll taxes and the limited maximum benefit levels are too great a price to pay for the absolution of personal risk. Therefore, in an ideal world, individuals should assume personal risk in exchange for the potential of higher lifetime returns.

    So, beneath all the pomp and circumstance, this is really a debate over the appropriate distribution of risk.

    With that point clarified, a few other truths are revealed.

    Since both systems are theoretically possible to implement, this really isn't a question of pragmatics. Either paradigm yields a functioning system, even if we might debate their respective effectiveness. If that is so, endorsing one system over the other must be an expression of something else. What could it be?

    It is my assertion that the parties are in truth debating whether or not the negative consequences of risk are distributed fairly.

    Democrats tend to view these negative risk consequences as random events. Therefore, it is morally abhorrent to abandon those who experience these negative consequences, as those individuals did nothing to deserve their fate.

    On the other hand, Republicans tend to view the consequences of risk as an expression of merit. And even if they acknowledge some degree of random application in these consequences, they believe that it does not rise to the level that would justify economic drag and benefit limitation.

    This may, in fact, be the central divide between the two parties. Nearly every debate in the economic sphere, from affirmative action to tax policy, turns on this point: do we deserve our fate? Republicans largely say yes, while Democrats, ever the slaves to nuance, say not always -- and not often enough to allow the premise to define public policy.

    So, as you watch Bush this evening, keep this principle in mind. His domestic economic agenda is driven by his unwavering belief in the existence of an American meritocracy. It is no less than an expression of his faith in the Gospel of Wealth. He will, of course, never express this directly, as it would contradict the immediate experience of far too many within the electorate. But, as always, his actions articulate his principles far more clearly than his words ever could.

    Idiots in the End Zone

    If you watch enough NFL football, you will occasionally be witness to an exceptionally bizarre phenomenon.

    Let me set it up.

    It's late in the fourth quarter, with maybe 30 seconds to go in the game. It's fourth down and the team on offense is about 60 yards from the end zone. The team emerges from the huddle and sets at the line of scrimmage. The ball is snapped and the quarterback drops deep into the pocket. Spotting a wideout who has beaten the coverage, the quarterback unleashes a tremendous pass that travels 50 yards, landing in the soft hands of his receiver. With the defense nowhere in sight, the receiver waltzes right into the end zone for a touchdown. He spikes the ball and embarks upon an incredibly flamboyant celebratory display. In moments, he is joined by the entire offensive squad, all of whom participate in the joyous exhibition.

    And thus, the game is transformed from a 42-0 rout to a 42-7 rout.

    So, for all those people out there who are trumpeting the success of the Iraqi elections as a vindication of George Bush's foreign policy, remember that even though you might have just scored a touchdown, you're still getting your ass kicked. Don't forget that the road to this success was paved by a string of outrageous failures. As Publius put it:

    Elections are great – yesterday was a historic achievement that none can deny. It's a notch in Bush's legacy belt to be sure. But it should not distract us from the reality that success will be difficult because of a series of wrong choices based on wrong assumptions by this administration and its cheerleaders. In the immortal words of the Wolf in Pulp Fiction (Harvey Keitel), “Well, let's not start suckin' each other's dicks quite yet.
    Therefore, I would recommend that the administration and its supporters refrain from dancing in the end zone. I mean, when you're this deep in the hole, you just look like a bunch of idiots.

    Smells like Success

    I think that we'll have to wait for a few more days before we can say for certain, but the initial indications are that the Iraqi election was an overwhelming success. Currently, election officials estimate that 60% of eligible voters made an appearance at the polls -- better than you typically find during a US presidential election. While the turnout was lower in the Sunni dominated regions, overall it appears that they voted in significant, if not substantial, numbers despite calls for a boycott. And while 34 died in the sporadic attacks over the course of the day, the violence was not enough to subdue the population's democratic spirit.

    Suddenly, and perhaps unreasonably, I'm filled with a sense of hope for the future.

    As I've said before, the election was only the beginning. But its successful execution reveals much. First, it shows us that there are limits to what the insurgency can accomplish. Despite their threats, they were unable to mount any large-scale attacks, nor were they able to produce enough low-level chaos to significantly affect voter participation. True, their efforts were stymied by massive security operations conducted by Iraqi and coalition forces -- an all-out effort that can only be sustained for a single day. But, if you're going to choose only one day in which to pull out all the stops in order to limit the violence, Sunday was it.

    Beyond their operational deficiencies, it is now clear that the goals of the insurgents are not completely congruent with those of the population. They may share the population's desire to evict the US occupiers, but the agreement appears to stop there. Therefore, it appears increasingly likely that, once the shared irritant is removed (i.e. we leave), the Iraqi citizens will embrace the legitimately elected government rather than anything that arises from the insurgent elements. If this holds, it means that the days of an effective, significant insurgency are numbered.

    Second, the turnout suggests that most of the country is willing to work within the existing system. Even though turnout was relatively low amongst Sunni Muslims, their participation was not insignificant. I suspect that this will affect the strategy ultimately employed by the Sunni leadership. Even before the election they were signaling their intentions to participate in the drafting of the permanent constitution, doing so even as they called for a boycott. They will surely be further emboldened to participate now that it is clear that substantial numbers of Sunni citizens wish it.

    The positives don't stop there, for if the Sunni opt to participate in the drafting of the permanent constitution, the odds of a truly democratic government sprouting in Iraq increase dramatically. One of the most critical features of constitutional democracy is the protection of minority interests (discussed here). The ratification process as currently articulated essentially grants veto power to each ethnic subdivision. Therefore, all parties have a shared interest in constructing exactly such a system. The Sunnis and Kurds, as minorities, will demand protections and the Shi'ites will most likely be willing to concede them rather than restarting the entire process from scratch (which would almost certainly produce the exact same result).

    All of this boils down to one thing: the citizens of Iraq are invested in the system. It may not be perfect and it might be light years from the constitutional democracy imagined by the Bush administration, but the desire for a peaceful solution to the problem of Iraqi autonomy was clearly on display this weekend. As I said above, Iraq isn't out of the woods. Yet, it is apparent that the foundation required for ultimate success exists. The path to a democratic Iraq might still be derailed, but for the first time in the life of this entire ugly episode, I can feel the hope.