Good comments in the Bulletin of the Atomic Scientists on risk, Japan’s accident, and system coupling:
We have now had four grave nuclear reactor accidents: Windscale in Britain in 1957 (the one that is never mentioned), Three Mile Island in the United States in 1979, Chernobyl in the Soviet Union in 1986, and now Fukushima. Each accident was unique, and each was supposed to be impossible. Nuclear engineers have learned from each accident how to improve reactor design so as to diminish the likelihood of that particular accident repeating itself, but, as Donald Rumsfeld famously reminded us, there are always “unknown unknowns,” and so each accident has been succeeded by another, unfolding in a way that was not foreseen. The designers of the reactors at Fukushima did not anticipate that the tsunami generated by an earthquake would disable the backup systems that were supposed to stabilize the reactor after the earthquake.
And presumably there are other complicated technological scenarios that we have not foreseen, earthquake faults that are undetected or underestimated, and terrorists hatching plans for mayhem as yet unknown. Not to mention regulators who place too much trust in those they regulate.
Thus it is hard to resist the conclusion reached by sociologist Charles Perrow in his book Normal Accidents: Living with High-Risk Technologies: Nuclear reactors are such inherently complex, tightly coupled systems that, in rare, emergency situations, cascading interactions will unfold very rapidly in such a way that human operators will be unable to predict and master them. To this anthropologist, then, the lesson of Fukushima is not that we now know what we need to know to design the perfectly safe reactor, but that the perfectly safe reactor is always just around the corner. It is technoscientific hubris to think otherwise.