James Welles’s book The Story of Stupidity (first published 1988) is a sweeping chronicle of Western civilization, with a decided slant, as the title indicates. Welles’s grasp of history is actually quite impressive. But his unrelenting cynicism and pessimism make his book as stupid as anything he discusses.
“The lesson of the twentieth century,” he says, “is that science and technology will not save us from ourselves.”
That’s the kind of trite, pseudo-profound, ultimately stupid posturing that fills the book and makes me want to retch.
It’s easy enough to find examples of stupidity throughout history. But Welles epitomizes an all-too-common misanthropy: a pervasive contempt for Man and all his works. Ironically, he often inveighs against historical victimizations, yet it’s hard to see why he cares about those victims, since he apparently has no use for any of the human race.
Not, at least, as it actually exists. Welles relentlessly condemns people for failing to measure up to an ethereal standard of sagacity, virtue, foresight, selflessness, and perfect action. As though there’s no reason anyone should ever have been more – well – human.
His jaundiced mindset is exemplified by his take on the Chernobyl accident. “Still,” he says, “those who believe the risk to the public is acceptable for the sake of profit to themselves and their cronies arrogantly continue pushing nuclear power while mouthing soothing platitudes about nuclear safety.”
I actually spent much of my professional career battling operators of nuclear power plants. I grew to realize they were not in fact motivated by “profit to themselves and their cronies” but by a desire to provide good service to the public at low rates. To the extent they did seek profits, it was mainly to further that aim by keeping their companies financially sound. And meanwhile, hundreds of nuclear power plants throughout the world have operated quite safely for decades.
That’s what I mean by Welles himself displaying exactly the sort of narrow-minded stupidity he denounces.
In fact, invoking episodes like Chernobyl and the Titanic, he sounds like an extremist technophobe who thinks it’s wrong even to try for progress. Thus he says, “Although those who run our high tech culture are supposed to be alert and generally on the ball, they refuse to recognize that all the safety designs and official regulations in the world will not eliminate the incalculable factor of stupid errors as long as people remain human.”
No – we do not refuse to recognize this. The true point here – the exact opposite of what Welles intends – is that we bravely go forward, knowing there are risks, but that nothing can be gained without risk. That’s why ships did not stop crossing the seas after the Titanic. And planes don’t stop flying because they might crash. And we don’t stop living in houses because sometimes they burn down.
And – I hope – Deepwater Horizon won’t stop offshore oil drilling.
The alternative – implicit in Welles’s book, though he doesn’t face up to it – is to “play it safe” and never take any risks. That would mean not getting out of bed in the morning. In fact, it would mean not even having beds, because we’d still be sleeping on the cold dirt floors of caves.