Archive for the ‘Philosophy’ Category

Ethics of humanitarian and development efforts: problems versus symptoms

July 1, 2020

My daughter Elizabeth, 27, has worked for five years in the Mid-East for humanitarian organizations, currently for a consultancy much involved in Afghanistan. Wonderful, you might say. She herself is less sure — always engaging in critical self-scrutiny.

There’s much literature criticizing the whole foreign aid and development landscape, the road to Hell being paved with good intentions. Much aid has wound up serving to strengthen dictators. Other downsides may be less obvious. Send aid directly to schools and you relieve government of that expense so it can spend more on, say, weapons. Send used clothing and you undermine a nation’s own garment industry. And so forth.

Elizabeth and I have discussed such issues as relating to my own support for a Somaliland education project. Her thing is trying to find what actually works best in the context of a local culture and its idiosyncrasies. She’s troubled that the project was started by a rich white guy who went there with good intentions but scant local knowledge. She pointed me to a sardonic short story in the voice of an African employed by some sappy do-gooder Americans who created a program actually accomplishing nothing. But I was moved by the proven success of the one in Somaliland.

The words “white savior” come up. We’re told to worry instead about problems closer to home. But Africans are no less my fellow humans than those across the street. And their problems tend to be much the greater, with resources to tackle them far smaller. I don’t see myself as a white savior but, I hope, as a human contributor.

That makes me feel good. Is my Somaliland involvement really an attempt to buy myself those feelings? We’re actually programmed by evolution to feel good when doing good, it’s a mechanism to promote such behavior, thereby aiding group survival. So is there any such thing as true selfless altruism? But I’d maintain we are what we do. The doer of a good deed doesn’t delude himself believing he’s altruistic — he is in fact behaving altruistically. And his motivation is immaterial to the other beneficiaries of his action.

Elizabeth recently wrote a blog essay concerning the Oscar-winning film Learning to Skateboard in a Warzone, about an NGO project for Afghan girls — and an Al Jazeera article, Skateboarding Won’t Save Afghan Girls. The latter contends the program just covers up the country’s problems, which it blames on “centuries of ruthless Western military and political intervention.” The skateboarding is likened to “palliative care” that makes dying patients feel better without curing them. The article invokes the “white savior” trope, and says the program and film “decontextualize” the girls’ lives, presenting them as “ideal victims for pity.” While making “Westerners feel good about” the Afghan war “which ‘liberated’ girls and women and gave them opportunities their own society would never have afforded them.”

Why put “liberated” in snide quote marks? America’s intervention did liberate them, did give them opportunities the article actually correctly characterizes. Though obviously Afghanistan’s problems were not all solved. Is that really the bar for judging any project’s worth?

Elizabeth says the real question is whether a program like the skateboarding — which does have real benefits — comes at the cost of other initiatives, which might have larger impacts. “Should we address the problems, or the symptoms of the problems — or both?”

She cites a book, Winners Take All, by Anand Giridharadas, arguing that the business world is too focused on symptoms rather than underlying problems — and indeed those so focused are the very people benefiting from the system that perpetuates the problems. Giridharadas cites the example of a phone app to help people with “unpredictable employment” to even out their incomes. Which he characterizes as a symptom of the real problem, an economic system making unpredictable employment so common — a system he says the app’s developers themselves helped create and benefit from.

Seriously? As if they somehow calculatingly orchestrated the whole global economic structure just so they could profit from the app? And does Giridharadas have a workable solution to the underlying problem he sees? No, he just wants other people to simply forgo their self-interest. Thanks a lot.

Casting the problem as the fault of villains is a kind of scapegoating all too prevalent (particularly in the left-wing economic perspective). But those who profit by hiring people for temporary work enable those employees to earn money by creating goods and services whose buyers value them above what they pay. Seems win-win-win to me. Not rendered villainous because Giridharadas imagines some fantasy world in which people’s earnings are divorced from the economic value their work creates. (I suggest the result would actually be a nightmare world.)

Elizabeth too largely disagrees with Al Jazeera and Giridharadas. She sees nothing wrong with addressing “symptoms” — while also working on “problems” — which may take decades if not centuries. These are not mutually exclusive. No reasonable person could view the skateboard film and think all Afghanistan’s problems are solved. Indeed, she considers it important to spotlight such successes. Whereas moralistic symptoms-versus-problems dichotomizing can make doing what’s merely feasible seem pointless.

Elizabeth’s main concern is with the impact one’s actions can achieve, and thus whether to target “problems” or “symptoms” — the “policy level” versus the “personal level.” But as for what any individual can do, she interestingly invokes the concept of “comparative advantage.” That’s an economics doctrine holding that a nation gains from trade by specializing in what it produces at the lowest opportunity cost — even if other nations can produce everything more efficiently in absolute terms. Applying it here would mean doing what one is best equipped or positioned to do. Better to have a modest success than an over-ambitious failure. But she also suggests a third option: start small and strive to scale up.
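
The comparative advantage idea can be made concrete with a toy calculation. The numbers below are invented purely for illustration: producer A is absolutely better at both goods, yet both goods become more plentiful when each party specializes where its opportunity cost is lowest.

```python
# Toy illustration of comparative advantage (all numbers hypothetical).
# Output per unit of labor; A is absolutely better at BOTH goods.
output = {"A": {"cloth": 10, "wine": 8},
          "B": {"cloth": 4,  "wine": 6}}

def totals(plan):
    """plan maps producer -> labor (2 units each) spent on each good."""
    t = {"cloth": 0.0, "wine": 0.0}
    for who, labor in plan.items():
        for good, units in labor.items():
            t[good] += units * output[who][good]
    return t

# Self-sufficiency: each splits its labor evenly between the goods.
no_trade = totals({"A": {"cloth": 1, "wine": 1},
                   "B": {"cloth": 1, "wine": 1}})

# Specialize by opportunity cost: A gives up 10/8 = 1.25 cloth per wine,
# B gives up only 4/6 ≈ 0.67 — so B makes the wine, despite being worse
# at it in absolute terms, and A shifts labor toward cloth.
with_trade = totals({"A": {"cloth": 1.5, "wine": 0.5},
                     "B": {"cloth": 0,   "wine": 2}})

print(no_trade)    # 14 cloth, 14 wine
print(with_trade)  # 15 cloth, 16 wine -- more of both
```

Swap in any numbers you like; so long as the two parties’ opportunity-cost ratios differ, specialization plus trade beats self-sufficiency.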

I think Al Jazeera’s analogy to palliative care is also fatuous moralizing. One is not usually able to achieve big-picture solutions. But regardless of what level you’re looking at, what matters is quality of life — for the many, or a few, as may be. Every one counts. Every improvement counts. Inability to go big doesn’t negate the value of the small. A cancer patient may not be cured but meantime palliating the pain is worth doing. Likewise for the Afghan skateboarding girls.

No individual can “solve” the kinds of big problems at issue. All one can do is what helps as much as one can. A lot of people doing that helps a lot.

Don’t let the perfect be the enemy of the good.

Big Bang, big questions

June 22, 2020

Our Universe began with the Big Bang about 13.7 billion years ago. It started virtually volumeless, virtually infinitely dense and hot, and then expanded. What came before, and triggered the Big Bang? That’s not a valid question, because Time itself began with the Big Bang.

This is the “standard model” of today’s science. I am a believer in science. But that’s not like a religious belief or faith; instead, a matter of epistemology. Which refers to how we know things.

This doesn’t mean everything in science is “true.” That misunderstands the point. Scientific precepts (unlike religion) are always subject to revision with more information. That can disprove a theory, but none is ever proven with finality. That said, however, the bulk of modern science can be pretty much taken to the bank. The concept of biological evolution, for example, will not be disproven by new information. And the same applies to most of modern physics.

Current cosmology derives from Edwin Hubble’s 1929 discovery that most other galaxies are moving away from us. The farther distant, the faster. This means the Universe is expanding. Run that movie backwards and it contracts. Ending all crunched together: the Big Bang.
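
The run-the-movie-backwards logic even yields a quick age estimate. If (unrealistically) every galaxy’s recession speed had been constant, a galaxy at distance d receding at v = H0 × d was on top of us d/v = 1/H0 ago, whatever its distance. The sketch below uses a conventional rough value of the Hubble constant, H0 ≈ 70 km/s per megaparsec, which is my assumption rather than a figure from this post:

```python
# Naive age estimate: constant expansion implies age = 1/H0 ("Hubble time").
H0 = 70.0                    # Hubble constant, km/s per megaparsec (rough)
KM_PER_MPC = 3.086e19        # kilometers in one megaparsec
SECONDS_PER_YEAR = 3.156e7

hubble_time_s = KM_PER_MPC / H0                      # seconds since "crunch"
hubble_time_gyr = hubble_time_s / SECONDS_PER_YEAR / 1e9

print(f"1/H0 ≈ {hubble_time_gyr:.1f} billion years")  # ≈ 14 billion
```

That this naive figure lands near the accepted 13.7 billion years is something of a happy accident: the real expansion history was not constant, with early deceleration and later acceleration roughly offsetting.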

Note that the expansion doesn’t mean everything is enlarging. Instead it’s space itself that’s expanding, carrying everything along with it. And stuff all moving away from us doesn’t mean Earth is at the center. Picture instead a raisin cake rising; as it expands, each raisin moves away from every other.

Science has figured out the physics of the Universe’s start, back to a very teensy fraction of a second after the Big Bang. But then you get to a point where the extreme conditions of density and heat mean the laws of physics as we know them don’t work. We call this a “singularity.” (The same applies inside a black hole. Some scientists speculate that a black hole’s singularity can give off big bangs; maybe that’s our own origin.)

Inability to parse out just exactly what happened in that very first instant might be considered a problem in the standard model. But there’s a difference between “don’t know” and “can’t know.” While some theorists say “can’t know,” I prefer to suspend judgment on what future science may be able to penetrate. Scientists a century ago could not have imagined today’s knowledge.

Meanwhile, inability to wrap our heads around the notion of Time beginning with the Big Bang might also feel like a problem. Yet hitting that seeming conceptual wall doesn’t stop thinking about explanations for the Big Bang. Some reasonable concepts have been sketched out at least in a general way. We can say they’re not science because we have no way to test such ideas experimentally or with predictions — today. But again, a different story in the future should not be ruled out.

But here’s another problem. The Universe’s diameter is currently estimated at 93 billion light years. (At least that’s what we can see; the whole thing could be larger.) That doesn’t jibe with its age being only 13.7 billion years: even the observable radius, some 46.5 billion light years, far exceeds the 13.7 billion light years that light could have traveled in that time. It implies expansion exceeding light speed.

The explanation is inflation: during an infinitesimally small interval after the Big Bang, the Universe expanded faster than light speed. But didn’t Einstein tell us nothing can travel faster than light? Yes; but that applies only to objects moving through space. In inflation, it was space itself expanding.

And what caused this? It’s theorized that the force of gravity suddenly reversed, pushing stuff apart rather than pulling it together. Then, just as suddenly, it switched back. We have some ideas about why that could have happened.

However, that, and indeed the whole inflation theory, are mainly supported on the basis that it’s the only way we can account for what we observe.

Here’s another problem. We know the law of gravity: proportional to mass and decreasing with the square of the distance between objects. But other galaxies don’t appear to obey it, unless there’s much more mass than we can see. Scientists call that extra stuff “dark matter,” and have debated various ideas for what it might be. We just don’t know.

A possible solution is “Modified Newtonian Dynamics” (MOND). Just as some laws of physics change when it comes to the ultra small (quantum mechanics), the law of gravity might not apply at the ultra large distances associated with galaxies. Given that gravity is by far the weakest of nature’s fundamental forces, and diminishes with the square of the distance between objects, we’re talking about a force of evanescent smallness at galactic distances. A tweak to Newton’s gravity law might explain things without requiring any additional “Dark Matter.” (While I find this idea attractive, it is not orthodox physics.)
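
To illustrate how such a tweak would work: in Milgrom’s MOND proposal, when the Newtonian acceleration GM/r² falls far below a tiny threshold a0, the effective acceleration becomes the geometric mean sqrt(GM a0)/r — which makes galactic rotation speeds flatten out at large radii instead of falling off, mimicking what observers actually see. In the sketch below, the galaxy mass is a made-up round number, and the sharp crossover between regimes is my simplification of MOND’s smooth interpolation function:

```python
import math

G  = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
a0 = 1.2e-10        # MOND acceleration scale, m/s^2 (Milgrom's value)
M  = 1e41           # visible mass of a galaxy, kg (illustrative round number)

def v_newton(r):
    """Circular orbit speed under ordinary inverse-square gravity."""
    return math.sqrt(G * M / r)

def v_mond(r):
    """Simplified MOND: Newtonian when gravity is strong, deep-MOND when weak."""
    g_newton = G * M / r**2
    if g_newton > a0:                 # strong field: ordinary Newton applies
        return v_newton(r)
    # deep-MOND limit: g_eff = sqrt(g_newton * a0), so v is constant in r
    return (G * M * a0) ** 0.25

for r_kpc in (5, 20, 50, 100):
    r = r_kpc * 3.086e19              # kiloparsecs -> meters
    print(f"{r_kpc:>4} kpc: Newton {v_newton(r)/1000:6.0f} km/s,"
          f"  MOND {v_mond(r)/1000:6.0f} km/s")
```

Newtonian speeds decline as 1/sqrt(r), while the MOND speeds go flat beyond the crossover radius — which is just the observed “flat rotation curve” that Dark Matter is otherwise invoked to explain.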

There’s yet another problem. We had assumed that after the Big Bang’s initial energy burst (and the inflation episode), the momentum of the Universe’s expansion would be slowing. There was debate whether it would eventually slow to a stop, with gravity then starting to pull things back together, toward a “big crunch”; or would expand forever, dissipating into virtual cold nothingness; or would do neither, reaching stasis (a “flat universe”). All dependent on exactly how much mass there is. The third option seemed to be winning.

But then a new discovery blew scientists’ minds: after having slowed for some billions of years, the expansion started speeding up! And is still accelerating.

What’s causing that? “Dark Energy.” Meaning, as with Dark Matter, we don’t know. Yet Dark Energy is calculated to comprise some 70% of the entire Universe. (Remember that per Einstein’s famous equation, energy and matter are interchangeable.)

So . . . the singularity; no Time before Time; inflation; Dark Matter; Dark Energy. Science likes beautiful, elegant theories. The standard Big Bang model begins to look like a clunky Rube Goldberg contraption. With a lot of question marks. Might it all be just a huge mistake? What could an alternative possibly look like?

But suppose the Universe’s expansion does ultimately run out of steam and reverse, falling into a Big Crunch. It wouldn’t necessarily have to collapse all the way back to a singularity. Before that point, the extreme conditions could conceivably trigger a new Big Bang. Going back and forth like that forever. This avoids the conundrum of a singularity and also of a “Time before Time.” Though not the mind-bender of the word “forever.”

This is called the “Oscillating (or Cyclic) Universe,” discussed in Brian Clegg’s book, Before the Big Bang. That title hooked me in, but a more accurate one would have been About the Big Bang. Anyhow, Clegg shows there are serious problems with the Oscillating Universe concept too. He says it’s either equivalent to a perpetual motion machine or else must eventually run out of energy and expire.

There are other theories, like “branes.” And multiverses. I won’t go into them. None strikes me as anything more than complete speculation.

Anyhow, one is forced to confront an irreducible mystery. Either the Universe had a beginning, arising out of nothing. Or else something always existed, without ever having had a beginning. No human mind can really grasp either possibility.

And there is an even deeper question: why is there something and not nothing? Scientists and philosophers have grappled with this.* Their efforts are far from satisfying. (Of course religion does no better. Why should there be a god rather than no god? At least we can be sure the universe exists.)

“Why is there something” is a question deep in my consciousness. Why I have one is itself a conundrum; but that’s only one small piece of the far larger mystery of existence itself. Most of us take it for granted, but not me. In fact, it’s my understanding of the clockwork of existence — imperfect though that understanding surely is — that nags me with that final “Why?”

It seems we should more logically expect a Universe of nothingness — a non-universe. That at least would raise no deep questions whatsoever. It would just be. (Or not-be.)

But I remain a believer in humanity’s ability to gain understanding. Someday people will look back with bemusement at us primitives, just as we look back at flat earthers.

* As I’ve discussed in earlier posts.

Margaret Atwood: The Testaments

June 13, 2020

The Testaments is Margaret Atwood’s sequel to her 1985 novel, The Handmaid’s Tale. Set in Gilead, a near-future theocratic dictatorship, in lands between Canada and Mexico.

Gilead is a classic dystopia, whose societal raison d’etre is baby production. Apparently there’s been some kind of infertility epidemic. Many wives are barren; their husbands assigned concubines who aren’t. Those are the handmaids. But all women are medievally subordinated to men. Their schooling limited to things like embroidery; no reading or writing. The system enforced with ruthless brutality. All, supposedly, to serve God.

The Handmaid’s Tale struck a chord at a time when America’s religious right was flexing its political muscles. The book enjoyed a second coming when it seemed they were gaining yet more ground in the Trump era. So the sequel is timely.

For all the fears about America becoming a Gilead, actually no Christian fundamentalists advocate anything like such extremism. And for all their seeming political mojo, they’re doomed. Religious belief is declining with each younger generation. In past conformist times, faith was an unquestioned default, but now that people can see an alternative path, more are taking it. Fundamentalists are already only a small minority, though their power is outsized because they vote so assiduously. But their credibility is undermined by hitching themselves to the most morally corrupt gang in our political history. (In that sense, perhaps, today’s America foreshadows Gilead.)

So I don’t see a Gilead coming through conventional politics. The Handmaid’s Tale didn’t explain how it did come about. The Testaments fills in that story, though not in any detail. Gilead’s founders long plotted their coup, then just used guns, slaughtering Congress. No mention is made of, like, the U.S. Army. To impose their rule would have required really an awful lot of men with guns. Texas, California, and maybe some other states seem to have fought them off. I find the takeover rather implausible. But in a novel one must suspend disbelief.

The “testaments” of the title are first-person accounts, unearthed long after, written by three female participants in the book’s action. One had been a 53-year-old unmarried judge, minding her own business, when the coup brings gunmen to her office building to take all the women away. To a stadium, where they’re segregated by profession and held in sadistic humiliating conditions. Groups of blindfolded women are marched onto the field and shot. This is just the start of her ordeal. Which she surmounts — emerging as “Aunt Lydia,” the new regime’s head enforcer of all things female.

Another character is “Commander Judd,” a top leader with a penchant for barely pubescent wives. One after another. Somehow they keep dying. It’s very typical for men posturing as god’s mouthpieces to be doing it for sex, especially with younger females. (Like Joseph Smith.)

I wondered about Gilead’s economy. It seemed to have none, apart from vague references to “econopeople,” never actually shown in productive work. And it gradually emerges that even the big shots live in very straitened circumstances, with even mundane consumables in short supply. That’s what you get when everything’s about God. God does not provide.

The book has some nice writerly touches. Here’s Aunt Lydia talking about her statue: “At least I look sane. . . . . the elderly sculptress . . . had a tendency to confer bulging eyes on her subjects as a sign of pious fervor. Her bust of Aunt Helena looks rabid, that of Aunt Vidala is hyperthyroid, and that of Aunt Elizabeth appears ready to explode.”

But my enjoyment waned as Lydia’s plot to avenge her torture and bring down Gilead unfolded with tedious convolutions that didn’t make much sense to me. A cache of devastating documents (including about Commander Judd’s crimes) is smuggled into Canada on a “microdot” implanted into the arm of a girl likewise perilously smuggled into Canada (on the “Underground Femaleroad”).

I’d have just mailed the microdot to Canada inside an ordinary letter. But such prosaic thinking doesn’t make for a literary thriller.

The stadium scene too might have seemed ridiculously over the top. Atwood making Gilead’s regime an epitome of evil, with no nuances or shades of grey. But the stadium episode actually reprised quite faithfully what happened in Afghanistan when the Taliban seized power. And while such horrors might seem implausible in America, we are too often reminded that human brutality can have no limits when the guardrails are removed.

I keep saying: America represents the culmination of long human efforts to build societal institutions protecting against such horrors. But their perpetuity is not decreed by God. We kick them down at our peril.

Proof of Heaven?

June 5, 2020

Dr. Eben Alexander is a Harvard Medical School neurosurgeon. He’d never been a religious nut. Then in 2008, aged 54, he suddenly had a strange and severe bacterial meningitis infection, putting him in a coma for a week. During which he visited an alternate reality, a full deluxe tour.

This was a 2012 Newsweek cover story, blazoned “Heaven is Real.” I wrote critically about it here. Then Alexander published a book titled Proof of Heaven.

I’ve read it, though not with any afterlife hopes. Instead I was curious why such a man believes his comatose hallucination was real.

He writes that as a neurosurgeon he was familiar with stories of near-death experiences. “But all of it . . . was pure fantasy.” He says he “did know that they were brain-based. All of consciousness is . . . the brain is the machine that produces consciousness in the first place.” He notes the brain is very temperamental. Reduce just slightly its oxygen feed “and the owner of that brain is going to experience an alteration of their reality. Or more precisely, their personal experience of reality.” Should a patient come back with memories, “those memories are going to be pretty unusual. With a brain affected by a deadly bacterial infection and mind-altering medications, (his emphasis) anything could happen.”

Except when it happened to him! That was, in contrast, “ultra-real.”

Attempting to justify this quite remarkable claim that his case differed from all those others he sensibly debunks, Alexander says that during the coma his own brain was not working at all.* Thus, he “was encountering the reality of a world of consciousness that existed (his emphasis) completely free of the limitations of [his] physical brain.”

There’s a problem here. Alexander relates, in great detail, his coma travels, with his tour guide an ineffably beautiful girl, amid millions of butterflies, giving him a look so deep it was beyond indescribable, and on and on. And his memory of it all was recorded where? In the brain he says was out of commission?

You can’t have it both ways. If this trip was in some other reality outside the consciousness in his brain, then that consciousness could not tell us about it. If instead his brain was, on some level, functioning during the coma, it’s far more plausible that what he experienced was (like in all those other cases) just something weird happening in his brain due to the very abnormal coma conditions. It’s to avoid this logic that Alexander posits his experience as entirely outside brain functioning. Yet how can anything be experienced at all, except via the brain?

What, indeed, does it mean to experience something? Who, or what, does the experiencing? This gets back to what consciousness, and the self, are. Descartes suggested they (a “soul”) could somehow exist separate from the brain, but no serious scientist today accepts such “Cartesian dualism.” There’s no rational alternative to consciousness and self emerging out of brain functioning, though we don’t yet know exactly how. Otherwise the very idea of having an experience is incoherent. Alexander claims to have experienced something outside of brain functioning. Even if he somehow did — how would he (the “he” existing within his brain) know it?

And how could a brain in such a compromised state have recorded such a detailed memory as he relates? He himself writes, “The process of memory takes enormous brainpower.” We know how hard it is to remember dreams after waking — even with brains functioning normally.

Interestingly, Alexander says that for several days after his coma, he experienced “paranoid fantasies” that “were extremely intense, and even outright terrifying while happening.” He recognizes they were “something cooked up by my very beleaguered brain as it was trying to recover its bearings.” Yet he insists that was “very very dissimilar” from “the ultra-reality deep in coma.” He says coming out of it he spouted lots of crazy things to his family. But didn’t mention to them the “ultra-reality.” Very strange.

Here’s a clue. Alexander always knew he’d been adopted; his birth parents unwed high schoolers, who he’d assumed had parted ways. But in 2000 he learned they’d actually married and had other kids. And wanted no contact. Suddenly, he says, his view of himself totally changed to “someone cut off from my source.” And “an ocean of sadness opened up within me.” There followed alcohol abuse; a struggle for sobriety; dysfunction in his professional and family life; depression. With his last hope for some force in the universe beyond the scientific “swept away.”

His coma restored it. Though Alexander doesn’t actually say he saw God, God was somehow in the picture. And for all his rapturous description of it, only obliquely does he imply we go there after death. He mentions glimpsing frolicking people but there’s no indication they previously led earthly lives. Still, he writes of “the reality of realities, the incomprehensibly glorious truth of truths that lives and breathes at the core of everything that exists or ever will exist.” Which is: “You are loved and cherished. You have nothing to fear. There is nothing you can do wrong.” Boiled down to one word: love.

Dreams are sometimes the brain’s way of chewing on deep anxieties. I am not a trained psychiatrist, but this looks like a person who, in the pit of his being, does have fears; does fear doing wrong; does fear a love deficit. In other words, a man who suffers from the human condition. Which his brain, even in coma, was struggling with.

Before the coma, he’d finally reconciled with his birth family. After it, he got a photo of his sister who’d died. He says he recognized his heavenly tour guide. Studies have shown that memories are not stable, but change every time we revisit them. And again, dreams are particularly hard to recall. When he saw the photo, his memory of his comatose hallucination could have been tweaked to match it.

Alexander’s brain had been rocked by extreme trauma. He just barely survived. Such a trauma might well have lasting effects. Like believing fantastical tales of Heaven. Whether or not his brain was functioning during his coma, it was out to lunch when he wrote the book.

So it’s piffle; but not harmless piffle. Rebellions against truth and reality are buckling our society’s foundations. Anything encouraging people to believe nonsense is pernicious.

Alexander quotes Einstein: “there are only two ways to live. One is as though nothing is a miracle. The other is as if everything is.” I don’t believe in miracles, in the sense of contravening natural laws. Yet I’m very much in Einstein’s second camp. To me, all of existence — especially my own — is virtually miraculous. I can easily envision the alternative, a cosmos dark, empty, and bleak. Reality is such a gift that I don’t share Alexander’s ache for a better one.

* A neurosurgeon should know that with no brain functioning, he’d be dead. Actually, Alexander later clarifies that it was the “human” part of his brain knocked out, but the deeper parts, that regulate autonomic processes, still functioned.


Harvey Havel: Dealing with rejection

May 23, 2020

This big fat book showed up unexpectedly in my mailbox. Harvey Havel is a fixture in the local literary scene. Chatting with him at a recent author talk, it had emerged that we’d both published blog essay collections. So he sent me his. He’s a sweet person. Also tormented.

The book starts with political essays (a decade and more old). Perhaps unusual for a professional writer of color, his viewpoint is determinedly centrist. And Olympian, looking past the issues of the day, in a larger perspective, trying to see the tectonic forces shaping our politics. As a professional writer, Havel has a glib command of the relevant lingo. Yet I found his analyses somewhat oversimplified, falling short of profundity. (Sorry, Harvey.)

So after reading some of this I decided to skip ahead to the later sections dealing with more personal matters, and stuff like sexual politics. This was much more engaging. Havel speaks from the heart with unsparing candor.

Like about his alcoholism. It nearly destroyed him; he believes it’s actually necessary to sink that low before one can overcome. He’s apparently been off the stuff for a good long time now, but alcoholism still looms as a big presence in his life.

He was also ruined, he says, by money. Given a big lump sum by his father upon college graduation, he lived the high life, as though it would never run out. Of course it did, while turning him into “a man of low morals and character,” blocking his capacity to grow. Thus he says he remained a child (as of 2005 anyway). He had to learn the meaningfulness of earning what one has. He feels his “relationship with money now is the happiness and satisfaction that I have somehow rid myself of it.”

Here, and elsewhere, he brings in belief in God, crediting that for positive change in his life. I know many people feel the same. But Havel never really analyzes this (as he analyzes so much else). I have no such belief. For me, divorcement from reality cannot be the basis for an authentically meaningful life.

One 2009 piece starts off, “read this poem and then we’ll have a discussion about it.” Titled “Qualm,” it ostensibly debates pushing an airplane alarm button, and Havel does discuss it at length. Finding this in the book was a nice surprise, as the poet is Therese L. Broderick (my wife).

Havel is not one of those many people who write as a sideline or hobby. Instead, he decided out of college to make this his career. Now approaching 50, he’s been at it for decades. With little reward. He has self-published many books (including this one), but his indefatigable efforts with established presses have met with constant rejection. Publishers tend to be very picky; selling printed books that make money is extremely hard; so a stream of rejection letters is inevitably part of any writer’s life.

But, having indeed devoted his life to this, Havel cannot just shrug off the disappointment. He has quite a lot to say about it. Mostly he discusses this as a sociological/cultural phenomenon. But one essay tells of his reading a terrific short story. Bringing on an attack of FAS — “Failed Author Syndrome” — and its corrosive resentment of others’ literary success. (He doesn’t mention dissecting that story to tease out what made it so good.)

I am no stranger to literary rejection myself. I spent years struggling to get my magnum opus (The Case for Rational Optimism) published. Until I finally remembered the press that, over 30 years prior, had reissued my Albany political book. I’ve had ten book publications and made money on all but one. But the loss on that one exceeded all the gains. So I guess I’m no literary success either.

Havel also writes about rejection by women. This too resonated with my own history.

He has a “thing” for white women. Who, he says, generally refuse to view him romantically because of his color. For me it was height (or its lack). One guy’s recent radio essay related how he’d meet women for dates and see their “libido drain away” when he’d stand up, revealing his shortness. I was clueless in my own younger days (part of my problem), but in hindsight being 5’4″ explains a lot.

Back to Havel and his attraction to white women. One entry in the book is actually titled “In Defense of White People.” I was expecting something sardonic. But no. Havel explains that at one time he did share the stew of negative feelings toward whites that some non-whites hold. However, he says, he joined a white family for a time — what he means by this wasn’t clear to me — but anyway, he received acceptance and love, leading him to reject, as simply incorrect, the standard indictment of whiteness.

Of course that doesn’t mean all whites are good. But white people are, mainly, just people, and most people are good. Yet it almost seems as though Havel puts whites on a pedestal.

Perhaps this partly explains his attraction to white women. Then again, a majority of American women are white, so Havel may actually be conflating an attraction to women with one for white women. But he does feel his color is a barrier with white women in particular.

I found this odd. No doubt some racist women would manifest this, but in my observation, many if not most females are sexually receptive to nonwhites, many indeed positively attracted to them. Secondly, while Havel is slightly brownish, his ethnicity is far from evident visually. In fact, being of Indian ancestry, he is Caucasian. Also, while I'm no great judge of this, I would rate him pretty good looking.

So what, really, was the trouble? Relating an actual romantic debacle might have helped elucidate this, but Havel includes none. The book makes it sound as though he never actually had a relationship (despite a lot of sex). However, there are some clues in the book regarding his mindset about women. It smacks of that old stereotype, “objectifying” women. He wants one not just white, but beautiful, well-educated, and affluent; it’s very much a package. He seems to believe the ideal way to get such a woman is to fight for her — literally. Physically fighting other men. His writing so often of “winning” women does make it sound like a competition. And he posits that what a woman most wants from a man is to be protected by him.

How about just relating to a woman as a fellow human being?

Woke Gone Wild

May 14, 2020

(My book review, as published in Skeptic magazine,* slightly revised)

A regime-imposed ideology, tolerating no dissent, enforced by a surveillance state and thought police, with transgressors punished. Welcome to Nineteen Eighty-Four. China? Yes. But increasingly, America's "liberal" universities too. If nothing else, surely liberalism means promoting human liberty, with freedom of thought and expression essential. Yet U.S. campuses have seen the rise of speech codes, speakers disinvited or shouted down, professors offending against the approved catechism forced to apologize, submit to re-education, or even to resign. And an obsession with "diversity" while suppressing the kind that should matter most — diversity of viewpoint.

Robert Boyers has taught in academia for half a century, currently at Skidmore. He’s the longtime editor of Salmagundi, a magazine of politics, culture, literature and the arts, and is very much a man of the left. His 2019 book, The Tyranny of Virtue: Identity, The Academy, and the Hunt for Political Heresies, calls out the perversion of liberal ideals he sees in American universities — political correctness becoming a rigid party line that brooks no dissent, while plunging down rabbit holes of absurdism. The book is full of horror stories from the author’s own experience. Contradictions and ironies abound. The reader enters a hall of mirrors.

The book’s main theme is the suppression of argument, with no discussion allowed. How to justify this? Postmodernism promoted the idea that argument itself is suspect because nothing is really true. And a fetish for nonjudgmentalism strangely transmogrified into a judgmentalism of the harshest sort — against any deviation from the canonical ideology.

Boyers relates how his own younger self once swallowed an apologia by Herbert Marcuse that freedom of speech must yield to an enlightened minority whose virtue entitles it to censor. Fortunately, Boyers himself ultimately gagged on this bilge. Unfortunately, such intellectual arrogance is at the heart of today’s academic culture.

If the PC catechism is really as manifestly correct as its woke minions seem to think, then how is it threatened by debate? Maybe they fear they’ve built a house of cards that cannot withstand scrutiny.

Some European nations ban “hate speech,” which includes anything deemed offensive. Holocaust denier David Irving, for example, was jailed in Austria. In America’s First Amendment culture, freedom of speech trumps any sensitivities of hearers. After all, almost anything can offend someone. Jefferson said the answer to bad ideas is not suppression, but better ideas. But our universities today elevate protection against being offended, or even just being made “uncomfortable,” above freedom of expression.

Thus speech codes, "safe spaces" and "trigger warnings." Like the helicopter parenting aiming to shield children from all life's vicissitudes — leaving them unable to cope with a real world lacking safe zones. It's also antithetical to actually educating students. Boyers wonders: how could you teach any novel, for example? A university's mission used to be moving students out of their comfort zones, opening their horizons, cultivating inquiring minds. Now they're re-education camps enforcing the narrow bounds of a prevailing orthodoxy.

Looming large in today’s PC catechism is the concept of “privilege,” not just “white privilege,” but any sort of power or status. An egalitarian idea: no one’s above anyone else. Fair enough, utopian though it may be; but privilege warriors go further and actually turn the tables. Anyone deemed speaking from a standpoint of “privilege” is delegitimized and to be silenced. Yet aren’t the attackers invoking a privilege of their own — polemical, ideological privilege? The privilege of feeling virtuous as against an evil “privileged” status?

Another key concept is "inclusiveness." Applicable to previously marginalized identities and people who've sometimes been seen as "the other." Yet aren't those tarred with the "privilege" label being marginalized, themselves "the other" now? Further, anything possibly construed as condescending toward some now-coddled group is an unpardonable sin. But isn't the shielding of such groups, in a way that implies their inability to endure even the subtlest affront, itself highly condescending?

And the notion of identity is fundamentally a concept of us vis-a-vis them, if not indeed us versus them. Rather than being liberated to live out self-actualized identities, people are put in identity boxes defined by the prevailing ideology. A trans person not allowed to be anyone beyond trans. No Whitmanesque containing of multitudes!

Boyers grapples with what racial identity entails, quoting James Baldwin about his fraught relationship with European cultural icons like Shakespeare, Rembrandt, and Bach, in the context of his African heritage. I couldn't help thinking that my own Jewish ancestry feels relatively immaterial to who I am as a human being. Shakespeare, Rembrandt, and Bach are part of our common human heritage, as is the experience of Africans who were enslaved. My race is the human race. That statement itself might be labeled "racist" in woke culture.

There's a lot of polemics lately about "whiteness" that seems incoherent, like white people should somehow get over or move beyond their whiteness, whatever that means. "Whiteness studies" has now become an academic subject domain, with those harping on "white privilege" stereotyping all whites as a monolithic bloc. There's a word aptly describing this: racism.

"Micro-aggressions" refers to anything that makes anyone uncomfortable. But no such transgression is ever treated as "micro"; anybody accused of one is subject to aggression that isn't "micro" at all. They're said to create a "toxic environment." Yet what's truly toxic in today's academic environment is a climate of fear lest one blurt out anything crossing the innumerable PC red lines, becoming subject to sanction.

Boyers is really faulting a basic lack of human decency. Seen in the unforgiving condemnations of things that are often, on any sane view, trivial. He cites examples of people denounced for merely confusing a name. Sometimes, he says, a mistake is just a mistake. Which should simply be forgiven — by decent human beings.

Disability is another minefield. Boyers describes a poster incongruously headed KEEP SKIDMORE SAFE, with a catalog of "ableist" language to be avoided (on pain of disciplinary action), including such everyday idioms as "turn a blind eye" or "run to catch a train." So plainly ridiculous that this might have been satirizing the whole offense-taking culture. But no, it was in earnest. Boyers deems it "hard to imagine a better example of a hostile work environment," putting everyone in fear of the thought police.

Then there is the absurd notion of “cultural appropriation,” barring white artists and writers from touching upon minority cultures. A white painter’s take on the famous Emmett Till funeral photo met with demands that it be removed from the Whitney Museum—indeed, that it be destroyed. “Stay in your lane,” voiced one critic. The artist’s intent was of no account. Boyers says a “stay in your lane” norm would limit every writer to memoir only. But it’s no two-way street. Black writers can freely depict whites. Indeed, theater is opening up for blacks to portray white characters. Don’t dare the reverse, of course.

Throughout, the book deploys metaphors from religion, such as the saved versus the damned, a church united by a zeal to persecute heretics. So deranged with self-righteousness, the woke congregation cannot see the contradictions between its preachings and practices. Boyers notes that over 200 U.S. universities now have “bias response teams” that, together with campus police, investigate the speech of professors and students. The University of California system circulated a list of prohibited locutions, including “America is a land of opportunity” or “you speak English very well.”

Yet, Boyers writes, “self-described liberal academics continue to believe that they remain committed to difference and debate, even as they countenance a full-scale assault on diversity of outlook and opinion, enwombed as they are in the certainties enjoined on them by the posture they have adopted, which alone confers on them the sense that they are always in the right.”

Curiously absent from the book is the word “McCarthyism.” Still denounced by the left. People blacklisted and otherwise punished for their politics. Apparently it’s a crime when done by the right, but not by the left.

Is there no hope? In an author talk, Boyers avowed guarded optimism that we may have reached peak PC, with sanity starting to push back. And the craziness in academia has not, to any great extent, yet infected the broader American culture. But as universities continue pumping out more ideological Savonarolas, freedom still needs defending as much as in Jefferson’s time.

(Note, this review was refused by several publications because it was too politically incorrect.)

* Here’s the link:

How old is the world?

April 25, 2020

Is the Earth around 4.5 billion years old? Or, just 6,022 and a few months?

PBS's Independent Lens had a fascinating documentary about Kentucky's "Ark Encounter" — to go with the "Creation Museum" I've written about. The documentary spotlighted some local opposition mainly to the project's millions in tax subsidies, surely an unconstitutional violation of church-state separation.

This ark is a full-size imagining of Noah’s vessel. Really gigantic, costing in nine figures, to illustrate the ark accommodating every “kind” of animal. But apparently these Biblical literalists weren’t bothered by the implausibility of Noah and his three sons alone somehow managing such a huge project, without the modern technology they themselves used — not to mention the funding.

But of course that’s the least thing that might trouble young-earth creationists. They’ve calculated, from the Bible, Earth’s beginning in 4004 BC. October 23, to be exact! Biblical literalism taken to its ultimate, preposterous extreme.

Actually, the planet is roughly a million times older. If its history were condensed to a single year, then 4004 BC would have arrived on December 31 — at about 11:59 PM.
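The scaling behind that comparison is easy to check. A quick sketch (using the round numbers above: a 4.5-billion-year-old Earth, and roughly 6,024 years since 4004 BC):

```python
# Rough "cosmic calendar" arithmetic: compress Earth's history into one year
# and see where 4004 BC lands. Both figures are the round numbers from the text.
EARTH_AGE_YEARS = 4.5e9       # conventional estimate of Earth's age
YOUNG_EARTH_YEARS = 6024      # 4004 BC through 2020 AD, roughly

SECONDS_PER_YEAR = 365.25 * 24 * 3600

# The young-earth timeline's share of Earth's history, mapped onto
# the closing seconds of a single condensed year.
fraction = YOUNG_EARTH_YEARS / EARTH_AGE_YEARS
seconds_before_midnight = fraction * SECONDS_PER_YEAR

print(f"Earth is about {EARTH_AGE_YEARS / YOUNG_EARTH_YEARS:,.0f} times older")
print(f"4004 BC arrives ~{seconds_before_midnight:.0f} seconds before midnight, Dec. 31")
```

The answer comes out to about 42 seconds before midnight — comfortably within "about 11:59 PM" — and a ratio of roughly three quarters of a million to one.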

To swallow that 4004 story, you have to torture a lot of facts. Or just ignore them. One is our seeing other galaxies millions, even billions, of light years distant. A light year is how far light travels in a year. The light from those galaxies took millions or billions of years to reach us. Case closed.*

Likewise, to deny biological evolution you have to work awfully hard waving away practically everything we actually know about life and its history. As geneticist Theodosius Dobzhansky said, nothing in biology makes sense except in light of evolution.

The impresario behind the Creation Museum and Ark Encounter is Ken Ham. The documentary showed what a slick con artist he is. Speaking to a big audience of youngsters, Ham led them in a chant mocking scientists who say the Earth originated billions of years ago: Were you there?

What a killer argument. And if you believe, instead of those scientists, the Biblical story of creation — were YOU there?? And the people who wrote that Bible story — were THEY there??

Also shown was one young woman “scientist,” part of the Ark organization, to give it a patina of “science.” I put those words in quotes because, as one (real) scientist said, one can have the training and capability to do science, but actually doing it is another thing. The young woman “scientist” declared that the Bible is true. How does she know? Because it’s true. It just is. She believes it because she believes it.

As a child I found a price guide to check my Canadian coins. “I’ve got a valuable one!” I exclaimed to my parents. The 1913 dime has two varieties, one rare, one common. My rationalist dad said, “How do you know yours is the rare one?” I said, “I just know it!” I wanted to believe.

In science, facts dictate beliefs. Not the other way around.

Then the show profiled a young man, reared in young-earth creationism. It was very important to him to protect his belief by having all the answers. Which he got from creationist websites arming him with refutations to every fact of mainstream geology and evolution-based science. Refutations which gradually he came to see through as false, misleading bunk.

I’m in awe of someone able to do that, having such intellectual equipment, honesty, and courage. I had it easy; I may have believed in my 1913 Canadian dime, but never in religion. But for people who do, the belief is very powerful. The documentary showed several whose certitude and confidence runs deep. I always remind myself that certain as I am they’re wrong, they’re equally certain I am wrong.

But: what difference does it make, really, whether you think the world is billions of years old, or only a few thousand? If you understand evolution science, or refuse to? It doesn’t exactly affect our daily lives. Or does it? The belief isn’t in a vacuum. It’s integral to a whole way of thinking, to one’s relationship with reality, with existence itself. Indeed, people shape their lives around such beliefs. That’s why they hold them so tenaciously, and why freeing oneself from such false belief is often so traumatic.

Surveys show about 40% of Americans believe the 4004 BC story. These are more or less the same people who don’t believe climate science. Who believe Trump.

* Actually, young-earth creationists answer that God could simply have made that light travel faster. Or created all the stars, and made them visible, all on the first day. Belief in such literal omnipotence is a universal cognitive get-out-of-jail-free card.

The Open Society and its Enemies

April 21, 2020

(A condensed recap of my talk at the Albany Public Library on Nov. 19)

Philosopher Karl Popper (1902-94) wrote The Open Society and Its Enemies in 1945. It seems timely now. The old political categories of right versus left, liberal versus conservative, are breaking down. Today’s true divide is over the open society idea. Meaning openness to change, toward ideas and free debate, individuals following their own paths, immigration, globalism, free trade, and so forth.

Popper attacks what he calls “historicism” — the idea that history has laws we can discover. But concepts of future inevitability inhibit efforts to change it. Popper says “the future depends on ourselves, and we do not depend on any historical necessity.”

Also, he contrasts "piecemeal" social engineering against utopianism aiming to remake society entirely. The former pragmatically targets the most urgent problems, deploying reason rather than passion to achieve its aims democratically. Whereas utopianism pretty much requires coercion. A closed society. And that, says Popper, leads to the Inquisition, the secret police, and a "romanticized gangsterism."

So who are the enemies of the open society, of Popper's title? Plato, Hegel, and Marx. Most of the book is a critique of these three.

We start with Plato's "Theory of Forms." It holds that everything in our world is a pale shadow of a corresponding perfect prototype, its Form. The Forms are more real than our "actual" things, which are doomed to decay. Of course this is nonsense.

But for Plato it had big implications. “The state” too had a corresponding Form — a perfect antecedent, which once existed. Plato saw his contemporary states as necessarily less perfect; indeed, degenerated. To arrest that degeneration, by opposing all change, was the essence of his political program.

He gives lip service to a goal of human happiness, but it’s the happiness of the whole society, not of any individuals. He seems to assume his ideal state is a “good” in itself. Having nothing to do with the well-being of ordinary people, who Plato says exist only to serve the state. As do, indeed, even the rulers.

Those rulers would be a class apart, a race apart — with no mixing allowed. This leads Plato to eugenics; he actually invented the idea. The ruling class must preserve its racial purity through carefully supervised breeding.

We associate Plato with the idea of “philosopher kings.” Supervising the eugenics program required trained philosophers — meaning men indoctrinated with Plato’s ideas, giving them the necessary “wisdom.” And also, the mystical “Platonic Number” they’d need.

Yes, he said there is a magic number, which he didn’t reveal. But without rulers privy to it, racial degeneration is inevitable. (Plato was really angling to be made ruler himself.)

Popper casts Plato’s ideal as a quintessential closed society; a tribal, collectivist, caste society. Collectivism is often held up as a virtue, the idea that the community supersedes the individual, and one must transcend selfishness and valorize something higher — the collective. Whereas in an open society people are free to choose social connections as they please.

The transition from the former condition to the latter is a profound and even wrenching social revolution. It was first seen in Plato’s Athens, with the rise of democratic and individualist ideas. Popper says Plato actually diagnosed the resulting social strains quite acutely. Longing for a return to ancestral virtues — Make Athens Great Again. But that ancient faith was gone forever, and against it was rising a new faith — in reason, freedom, and human brotherhood — the faith of an open society.

Plato’s ideas may not have seemed so extreme in 400 BC as now. Our quest for wisdom was just beginning. Yet other ancient thinkers, like Epicurus, Democritus, and Lucretius, were already capable of deeply humanistic ideas. And before Plato, in his own Athens, there was Pericles, who said the state should serve the many, not the few. That reason and humanitarianism should rule.

And there was one other giant upon whose shoulders Plato might well have stood: Socrates. He’s seemingly a major voice in Plato’s written dialogs. But Socrates was the true soul of humility and wisdom, knowing how little he knew; a believer in individualism, and that it’s reason that makes one human. He died for these beliefs. Plato betrayed him.

Plato was Athenian, but saw its perennial enemy Sparta as more like his perfect state. Today, we see one even closer to it — North Korea.

Hegel was a German philosopher of the early 1800s. Popper quotes Schopenhauer, who knew Hegel personally: “The Certified Great Philosopher was a flat-headed, insipid, nauseating, illiterate charlatan who reached the pinnacle of audacity in scribbling together and dishing up the craziest mystifying nonsense.” I wish Schopenhauer had told us what he really thought.

But for all the gibberish, Hegel did have something to say. Whereas Plato saw civilization’s trajectory as degeneration, Hegel saw progress toward an ideal. But not in a straight line. This is the concept of the “dialectic,” with every thesis having its antithesis, and out of their conflict comes a synthesis, a kind of unity of opposites. Then the process can repeat on a higher level. “The history of the world is none other than the progress of the consciousness of freedom.” But alas, what Hegel meant by “freedom” is nothing we’d recognize. The essence of Hegel was the romanticized worship of the state, the nation, and its historical destiny.

Thus did Hegel propel the rise of German nationalism, appealing to tribal instincts, again putting the collective over the individual. Hegel conceived the nation as united by a spirit that acts historically, with the state as its incarnation, asserting its supremacy through war, which he also idolized. The state is exempt from any moral consideration; only historical success matters. And of course you have the Great Man as leader, the one who expresses the will of his time, embodying the idea of a heroic life, contrasted against shallow mediocrity.

All this Popper deems a surrender of reason and humanity. And German thought is riddled with such tripe, from Fichte to Nietzsche. With, of course, a straight line to Hitler.

Then we come to Marx. (I wonder if history would have been different if he’d had a longer name. “Marxism” is punchy. Would “Schickelgruberism” have such appeal?)

But Marx was not a Marxist, in today's sense; nor a communist. Instead he was mainly the ultimate historicist, promoting the idea of history as a science, with the working class destined to overthrow capitalism.

But Popper says that unlike Plato and Hegel, Marx really applied careful reason in his analysis. He also had a genuine humanitarian impulse, troubled by the lousy conditions suffered by working people in industrialized economies. In Marx's social analysis, economic class interest and class struggle were central. But Popper casts him as an enemy of the open society insofar as Marx sanctioned social change not through democracy but violence. A kind of revolution without moral legitimacy.

And Marx and his followers had no handle on the greatest problem of politics: how to control power. Marxists saw state power as a threat only in the hands of the bourgeoisie or capitalists. How wrong that was. Only democracy, says Popper, can hope to protect human values from the state. And he was writing at a time when Soviet Communism’s horrors were still largely unrevealed.

Marx predicted capitalism’s downfall because competition would force intensifying worker exploitation, their worsening misery making revolution inevitable. He never foresaw a totally different story: how proletarians would make the state work for them, by democratic means, achieving a host of reforms regulating and improving working conditions, while also gaining a substantial share of the rising wealth created by vastly increasing productivity. All this led not to greater misery but mass affluence on a scale Marx never imagined. Invalidating his historicist approach.

Popper says “history” is not really even a thing. Human existence is too complex. What people usually mean by “history” concerns just one aspect, political power; which Popper calls an idolatrous worship of power (recalling Hegel). And he’s particularly scathing toward Christians who see “the hand of god” in this so-called history.

He also discusses rationalism versus irrationalism. Of course nobody explicitly advocates irrationalism. But that's what it actually is when people attack what they call a "soulless" faith in rationalism that supposedly leads to all sorts of excesses, even the Holocaust. Postmodernism, which flourished after Popper, denied there's any such thing as truth. But while truth may not be absolute, we use our reason to get ever closer to it. The Holocaust did not result from over-reliance on reason, but from the kind of Hegelian irrationalist romanticism Popper denounces.

Popper’s final conclusion is that history has no meaning. And while the fact of progress is written large in human annals, no law of history propels such progress. Nor can history tell us what to do. Rather, it’s all a matter of what we choose to do. And that is how, if history has no meaning, we can give it meaning — by our choices, working for right principles — rule of reason, justice, freedom, equality — humanism — the open society.

The American Crisis

April 13, 2020

These are the times that try men’s souls. “Try” meant “test” when Thomas Paine wrote those words.

We’re having an extraordinary economic crisis, entwined with an extraordinary health crisis. While America was already undergoing a crisis of the soul. A political and leadership crisis that was also a moral one, testing the very principles this nation stands for.

All this will end. But the world will be different.

We’re not hearing much now about limited government. I’m no government-loving “progressive,” but even libertarians recognize a need for government to protect us in situations like this, organizing and mobilizing a societal response. But unfortunately we’re also seeing why the big modern bureaucratic state is distrusted. It’s not size that counts so much as how you use the thing.

China’s authoritarian regime sneers at governments hamstrung by democratic accountability. China was indeed unfettered in imposing draconian measures to contain the virus. On the other hand, it wouldn’t have been such a big problem if they hadn’t started out silencing doctors who raised the alarm. China also failed to properly alert the world. Thus its regime is very culpable.

So is ours. Even given China's guilt, the disaster here did not have to happen. Had we done what South Korea, Taiwan, and Singapore did — merely acting competently. Instead, America's government bumbled and fumbled for almost two months because Trump refused to heed experts ringing alarm bells. This tragic fact is now well documented by multiple responsible sources. It cost us many thousands of lives, untold other human suffering, and trillions of dollars.

So a key lesson is the importance of competent, intelligent, responsible, sane leadership. That’s up to voters. So far I don’t see that lesson sinking in.

COVID-19 threatens our national security. Trump fetishizes the military, imagining this conveys strength. Actually the bulk of our giant defense budget is oriented toward re-fighting WWII (all those costly aircraft carriers, etc.), not the real threats of the modern world. Like pandemics. Wasting all those resources on useless “defense” actually weakens us. Spending a tiny fraction of that money on defense against threats like COVID-19 could have made all the difference. We didn’t do it.

This American failure is not invisible to other countries, who are suffering in consequence. They expected better. A real blow to our international standing.

Meantime, big government is getting bigger. The crisis prodded Congress into the kind of bipartisan action that seemed unimaginable just weeks ago, expanding government’s role in both size and scope to support the economy in ways also unimaginable weeks ago. We may think this is just a temporary emergency response. The bipartisanship already is fading. But expansions of government don’t have a tendency to reverse themselves. The idea of government relieving businesses of downside risks, and subsidizing paychecks, may stick around, with large implications. Not socialism, exactly; more like state capitalism. And the bailouts seem more accessible to big businesses than small ones, accelerating a trend toward consolidation, as against the more dynamic small-firm end of the business spectrum.

The government is throwing around trillions of dollars very fast and without preparation or forethought. A massive program like this ought to have been preceded by a careful legislative process with input from divergent viewpoints. Of course this is an emergency situation. But oversight is definitely lacking. In fact, Trump’s already fired the inspector general who’d been tasked with keeping tabs on the handouts. Why? It’s hardly paranoid to foresee massive abuse and corruption. Surely there must be an investigation of where all the money is going. Trump will foam at the mouth screaming “witch hunt.”

This is also changing us as a society. Sociologist Robert Putnam’s 2000 book Bowling Alone pointed up a trend toward atomization. That preceded the smartphone era, which has prompted vast handwringing about growing solipsism. Strangely, on one level, it’s all about human connectedness, with people fixated on their phones mainly for stimuli from others. Yet while our Facebook “friend” rosters grow, real friendships contract. (I’m baffled by people obsessing over online content concerning others they hardly know.)

Now we have "social distancing" — as if that hadn't already been an apt way to describe what was happening. In-person communication being supplanted by virtual communication. If this is a battle between the two, the former has just suffered a devastating strategic reverse. Now it's actually wrong for us to socialize in person, it's bad for public health!

Our society is built upon our webs of human interconnectedness, embodied in the term “social capital.” A key element of that is social trust. It’s the very basic understanding that you can walk down the street with no expectation that a passer-by will bash you on the head and grab your stuff. Or, more prosaically, that when you buy packaged food it won’t be poisoned. Et cetera, et cetera. A vast range of ways we trust that society will work as it should. This can’t be taken for granted, it was built up over thousands of years.

Countries where social trust — and, in particular, trust in government and other institutions — is high (like South Korea, Taiwan, and Singapore) have seen commensurately high levels of citizen cooperation with public health directives.

But polls have shown that Americans’ social trust is eroding. It’s not that people are actually becoming less trustworthy. It’s that more of us believe others are less trustworthy. This can become self-fulfilling if we act in ways that exhibit less trust. The decline in social trust may be partly due to reduced face-to-face interaction. And it’s aggravated by having two political tribes each believing the other consists of bad people who threaten everything that’s good and holy.

And now, we look at other people we encounter in the street, in stores, etc., and view them as literally potential threats to us. “What if that guy has the virus?” What if this kind of distrust becomes ingrained, even after the crisis ends?

Human social virtues in a time of crisis

April 1, 2020

Garrison Keillor once said, if the purpose of one’s life is to serve others, then what purpose is served by the existence of those others? This actually poses a deep philosophical issue. John Donne wrote that no man is an island. Yet each of us experiences existence only within the confines of our own skulls. Experiencing only one’s own feelings, not those of others.

It can be argued that we only ever do seemingly selfless deeds when it rewards us with good feelings. Evolution programmed us to have such feelings — and with empathy for the feelings of others, even if we cannot experience them directly — to make us do things for the common good. Hence even if pure selfishness might seem strictly logical, a degree of selflessness is a fundamental part of our human nature (barring sociopaths who failed to get that software installed). And we measure our virtue largely in terms of our interactions with others. Summed up pretty well by the golden rule. Nobody is perfect but most of us try.

And not just because of our programming. Your rational brain tells you that if you want to live in a society where people treat each other well, it behooves you to behave that way yourself. And if everybody does this, it’s good for everybody. We do what’s right mainly because we know it’s right, and why.

Holding fast to these standards of conduct is especially vital in a crisis like today's, when the temptation toward selfishness is heightened, and so are its ill effects. Social solidarity is more needful than ever. Americans are largely meeting the test.

Acting rightly does make one feel good about oneself. But that may not be enough. We all have egos, greedy for such feelings, and one way to pump them up is through validation from others. This may seem strange because, again, you don't have direct access to what others feel. But you're affected by their behavior, which in turn is affected by their feelings toward you. And our social programming makes our position in society important to us. All this makes us crave the good opinion of others, and makes us suckers for flattery.

Thus if we do good or are successful, we want others to know it. One way is to tell them. But that actually contravenes the golden rule. How so? Well, do you enjoy hearing others’ boasts? Saying “Look how great I am” implicitly tells the hearer, “and you’re not.” Even if unintentionally, self-aggrandizement forces the hearer to ponder the comparison. It’s not nice. That’s why bragging has a negative connotation, and modesty and humility are virtues.* A basic rule of living in society.

Much human behavior seeks to evade that rule. Successful, rich people cannot wear a badge announcing their net worth. But much of what they do (and buy) serves mainly to advertise their success to others. Boastfulness by other means.

But some are boastful by boasting. “I am very rich,” Trump has said. “I am very smart.” He’s even boasted of being the most modest person ever. And he tells us he’s doing a great job. Thus his coronavirus briefings (whose TV ratings he’s bragged about). Recently the word of the day, repeated like a verbal tic, was “tremendous.” Then he switched to “incredible.” Maybe tomorrow it will be “fantastic.” And not content to trumpet his wonderfulness himself, he trots out sycophantic flatterers to bubble about it.

What’s truly incredible is a president using a horrific crisis, with thousands dying, and millions suffering deprivation, as an occasion for sickening orgies of self-congratulation.

And contemptible as such braggadocio is, worse yet if the boasts are lies. It's been factually documented how his failure of leadership delayed forceful action on testing to contain the virus. Doing what other countries did would have saved many thousands of lives and trillions in economic devastation. This reality might have brought forth some humility. A different reality can only be constructed out of lies. Like the simply false claim that we're testing more than any other nation. (Our per-capita testing rate is in fact well below that of many other countries.)

I have pilloried Governor Cuomo in the past, but his coronavirus briefings are models of what Trump’s are not. No self-praise extravaganzas. No bashing the press and other critics, no demanding obsequious flattery. No lying. Cuomo gives us the unvarnished truth. He takes responsibility. He brings the situation home to us in a very human way we can all relate to. He tells us what needs to be done, what we all must do.

Knowing he’s being unfavorably compared to Cuomo infuriates Trump. But, incapable of learning from Cuomo, he resorts to pot-shots at him: “He had a chance to buy, in 2015, 16,000 ventilators at a very low price . . . he shouldn’t be talking about us. He should be buying his own ventilators.” But instead, said Trump, Cuomo goes for “death panels and lotteries.”

Albany Times-Union columnist Chris Churchill has deconstructed exactly how vile this Trump cheap shot is. It came (surprise) from the internet, a right-wing website, based on a 2015 state task force report on pandemic planning. Churchill read it and interviewed the task force leader — concluding that the attack on Cuomo was “blatantly dishonest.” The report discussed strategies for dealing with a ventilator shortage, but did not recommend buying thousands just in case. Let alone somehow present an option to buy 16,000 “at a very low price.”

But Trump’s gross distortion of the facts is kind of beside the point. He’s repeatedly shown he needs no facts at all to slime somebody. And keeping up such divisive dishonesty, even in this time of national trauma, is just ghastly.

Here is the real point, that all this leads up to. I started out talking about our most fundamental human precepts for living among others. How normal people have that software pre-installed, and how crucial it is in a crisis like the one we face now. When the leader we choose is someone who has not had that software installed, we are in very deep trouble as a society.

* Certain commenters will jump to sneer about my own modesty. I was tempted to actually talk about it here. But that would be immodest.