David Brooks (New York Times, 5/28/10) informs us that the idea that "government should have more control over industry" is one of the "predictably partisan and often puerile" reactions to the oil spill. The lesson that smart people derive from the spill, Brooks says, is "that humans are not great at measuring and responding to risk when placed in situations too complicated to understand."
What follows is, as Matthew Yglesias pointed out (5/28/10), largely cribbed from a 1996 New Yorker essay by Malcolm Gladwell (1/22/96) that argued that "accidents are not easily preventable" because of various psychological pitfalls to which humans are prone: e.g., in Brooks' paraphrase, "people have trouble imagining how small failings can combine to lead to catastrophic disasters," and "people have a tendency to place elaborate faith in backup systems and safety devices."
In other words, it's all very complicated, and what we need to do is work on "helping people deal with potentially catastrophic complexity" so we can "improve the choice architecture."
But is the story really all that complicated? The New York Times had a story in yesterday's paper (5/27/10), headlined "BP Used Riskier Method to Seal Well Before Blast," about how the oil company chose to use a cheaper casing for the well, even though this could lead to a buildup of explosive gases, as it seems did happen, leading to the catastrophic spill in the Gulf. Did BP make this decision because, as human beings, they have trouble understanding complexity? Or did they make that choice because they are trying to pump oil as cheaply as possible so they can maximize their profits?
Of course, telling the story that way makes it sound like maybe you need to have some outside authority watching over companies engaged in dangerous activities to make sure their corner-cutting doesn't lead to disaster. And that would be partisan, and probably puerile.