Archive for the ‘Philosophy’ Category

The Unlikely Pilgrimage of Harold Fry

July 3, 2014

A book group can expose one to hidden treasures and unforeseen pleasures. Here’s a great example.

Harold, 65, recently retired, lives with his longtime wife Maureen in a small English town. One day comes a letter from Queenie, an old co-worker and friend whom he hasn’t seen in twenty years. In hospice, dying of cancer, she’s saying goodbye.

Harold pens a short reply and goes out to mail it. But something makes him pass mailbox after mailbox. Stopping for a bite at a garage, he tells the girl there about Queenie; she relates how powerful her faith was when her aunt had cancer. This inspires nonreligious Harold* to continue his walk – to the hospice in Berwick, 500 miles distant – suddenly convinced that walking will keep Queenie alive.

Thus begins Rachel Joyce’s novel, seemingly light and quirky, quasi-comic even. It’s hard to take seriously at first. But, wow, this becomes a profoundly compelling tale.

Harold and Queenie were never lovers. But they do share a secret; and Harold owes her, more than he even realizes.

When he phones Maureen to announce his plan, she responds matter-of-factly, without demur. This might have seemed weird, except that the two had lived for twenty years in a state of deep freeze, interaction kept minimal. (I could relate, having experienced something similar, for a time, with a girl I lived with.)

The marriage’s black hole had to do with their son, David. Harold hadn’t been the greatest dad, though not the worst by far. Maureen blames him for what went wrong. Iconically remembered is an ancient beach episode wherein Harold dithered about diving to David’s rescue until a lifeguard intervened. Maureen fails to consider that she didn’t dive in either.

They haven’t seen David in decades. Harold doesn’t speak to him. Maureen does; indeed, they have quite normal phone conversations. That normality actually seemed bizarre, in the circumstances; until the more startling truth, revealed near the end, explains all. (I won’t spill it here.)

I recently reviewed another book group selection, Cheryl Strayed’s Wild, about a woman not wholly prepared for a big hike, and of course Harold is preposterously unprepared for his, having walked out in “yachting shoes.” Things go downhill fast, and the reader wonders how this can possibly continue. But an angel (in human form) fortuitously appears and gets Harold reasonably fixed up to reboot his pilgrimage (though still in yachting shoes).

The novel grows more broadly comic, yet at the same time rather darker, when Harold’s story becomes a media sensation and he attracts a motley gaggle of fellow “pilgrims” he could do without. He agonizes over extricating himself but is stopped by a sense of responsibility toward them. Eventually, they leave him behind.

And now it gets truly dark. Does Harold reach Queenie? It no longer really matters, with Harold and Maureen both haunted by their fraught memories, their regrets, their demons. Maureen even begins to sense her own responsibility. At one point she actually travels to find Harold, and their strained conversation is heart-breaking. When Harold mildly suggests she join the trek, she cannot stifle her reflexive, acid “I think not.” En route home she ruefully chides herself for those words, and her inability to say the things she wishes she could.

But that isn’t the end.

This is such a deeply affecting, human book. Harold and Maureen are neither heroes nor duds, but Everyman and Everywoman. Sometimes life goes smoothly; sometimes not. Sometimes people sink under their troubles, but sometimes they rise up. Harold and Maureen are limited people; but sometimes we can transcend our limits. Sometimes love dies; but sometimes it rises up again.

I think yes.

*Harold overlooks that the girl never actually said her faith helped her aunt.

I’m Going to Die

June 30, 2014

(A version of this appeared on the Albany Times-Union’s “Faith & Values” page, June 21)

America’s deaths are projected to rise (baby boomers being mortal) from 2.59 million in 2010 to 4.25 million in 2050. That could include you (or, worse, me). And while best-selling books claim to prove Heaven’s reality, even most believers aren’t eager to depart.

I heard a philosopher on the radio recently calling fear of death irrational. Human brains have no way to mentally model nonexistence; and he analogized one’s life to what’s between the covers of a book, saying that Long John Silver doesn’t fear what happens when Treasure Island reaches its final page.

“That makes no sense,” my wife remarked.

I agreed. Philosophers going back to Marcus Aurelius and Lucretius (whom I’ve written about) have similarly struggled to persuade us – or, really, themselves – that death is nothing, basically because one won’t be around to experience being dead. But we understand what ending a life means. The radio philosopher’s analogy was silly because Long John Silver is a fictional construct with no consciousness.

Death is loss – complete and total. That one won’t suffer afterwards – as one grieves the loss of a dollar, or a beloved – may be a small comfort, but very small. Indeed, I think most of us would prefer if posthumousness could somehow be suffered. At least that would be something. Better than nothingness.

My cat, not knowing he’ll die, is unafraid. My knowledge is both a blessing and a curse, but surely more of a blessing. Ignorance may be a sort of bliss, but I prefer an authentic life, grounded in reality. That includes the reality of death. Accepting this is painful, yes, but it’s part of being alive in the fullest sense; looking life squarely in the eye.

Fear is healthy insofar as it alerts us to dangers and motivates preparation and avoidance. But while of course it makes sense to act to postpone death, in the end it comes, and fearing the inevitable is useless. However, our thinking about mortality includes more than simple fear. While the radio philosopher was right at least that we can’t wrap our heads around the concept of nonexistence, what one does fully understand is the loss of everything one values. That anticipatory regret is not at all irrational.

We must figure out how to live with it. And it does have one beneficial aspect: putting other anxieties in perspective. The same radio program also featured a folk singer with acute stage fright. But why obsess about appearing in public (what’s the worst that could happen?) when Death is on your dance card? If you can live with that, no lesser fear should terrify you.

Moreover, its being limited makes life all the more precious. And I don’t allow knowing it will end to subvert my pleasure in living it. Rather than morbid contemplation of what being dead will be like, I prefer to focus instead on what being alive is like (that itself being enough of a puzzle, as I’ve written). Rather than seeing death as a theft, I see my life as a gift. I don’t take my existence for granted; au contraire, there was no cosmic necessity for it, and I consider it almost miraculous.

To crave more of it may be natural, yet foolish if that corrodes what one does have. As Richard Dawkins has said, let go the impossible wish for another life, and live the one you’ve got.

Lawn Fetishism Revisited – “I See Nothing”

June 24, 2014

Behind my property is a small tract stranded between houses, apparently unsuitable for siting another. So, unattended, it grows into a mini-jungle. Behind that is a patch of grass invisible from any house. I didn’t even know it was there until one recent morning when I spotted my neighbor mowing it.

Why mow grass no one can see? I’m not sure whether this is crazy or weirdly admirable. I’m reminded of Steve Jobs fussing over the aesthetics of computer insides. “Nobody will even know about it,” his minions objected. “But I will,” Jobs said.

I wrote a few years ago about “Lawn Fetishism.” My neighbor actually seems to enjoy mowing. Now my wife seems to have the bug. I used to hire a mower, but lately she insists on doing it herself. (Maybe she didn’t think I was having it done often enough.)

Now, Therese is a poet, so of course she has her own poetic approach to landscaping.

She explained to me that the front “is for show,” so she mows it in the conventional way. But the back is her playground, with several rectangular patches allowed to grow wild among the mowed areas. In addition, in between our lawn and the mini-jungle, she has created a – well, I don’t know what to call it. (See picture.) Nor do I quite know what to make of all this. It’s not anything I’d ever have thought of doing. But grass is not one of my preoccupations. And she likes it.

When I was a kid there was a TV comedy, “Hogan’s Heroes,” about Americans in a German POW camp, always into shenanigans. Fat, middle-aged Sergeant Schultz was supposed to be guarding them. But Schultz wanted life easy. So when shenanigans were going down, he’d raise his eyes skyward saying, “I see nothing. I see nothing.”

I find this phrase very useful in my marriage.

Mind, Memory, and Movies

June 15, 2014

The human brain has about 85 billion neurons, most connected to thousands of others, making for trillions of connections – the most complex object known. I’ve written before about what wonders it performs.

Recently in a newspaper I came to a page full of text of no interest, and quickly turned the page. But I said to myself, “Did I see the word breasts?” With scientific curiosity, I went back and searched; sure enough, there it was, buried amid thousands of words. How could my brain have picked it out in that fraction of a second? Why? (Well, one can guess why.)

We imagine memory works like a video camera. Not so. The brain does hold such information, but only briefly, then discards it. What it retains is only a bare thematic outline. When you later “remember,” what the brain does is to refer to that outline and to fill in the details by, basically, making them up. Really! And those confabulations change over time. (This is why “eyewitness testimony” in courts is often specious.)

This was brought home to me when I wrote an autobiographical memoir. I thought my memories were fairly accurate. But checking against diaries written when events were fresh showed how differently I remembered them years later. And when, years later still, I re-read that autobiography, I was surprised yet again to find that my memories had further changed.

And yet the brain does have an uncanny ability to file away information. Recently my wife told me someone said she reminded him of Sheila Miles.

“Sarah Miles?” I said.

“Maybe. Who’s that?”

“Actress; I think she was in a film – something about an Irish girl and a soldier? I can’t recall the title. Must’ve been 1970, since I do remember the girl I saw it with.” (And I could recall just one scene in that movie. Guess what? Breasts again.)

Next morning, while coming awake (a good time for this), the word “daughter” entered my mind. In another moment, I had it: Ryan’s Daughter.

Now, I’m no film buff, and had you asked me, “Who was in Ryan’s Daughter?” I doubt I could have answered. Yet given the name Miles – even with the wrong first name – my brain made the connection. The information was still there, buried, unthought of, for 44 years.

Then there was the time I greeted my wife with, “Good morning, old man.”

She gave me a quizzical look. “What made you call me that?”

“Why, I have no idea! It just popped out of my mouth.” I’d never said it before.

Well, that night we watched The Third Man, having ordered it from Netflix. I had a vague recollection of having seen it on TV as a kid, nearly half a century earlier. If asked, I couldn’t have told you a thing about that film. Maybe that Orson Welles was in it. Maybe. And seeing the movie again now, nothing seemed familiar.

So I was gobsmacked when the Welles character calls the Joseph Cotten character “old man!”

That tiny detail wasn’t even significant in the film, but somehow, my brain had squirreled it away, and half a century later, unconsciously prompted by our Netflix order, put the words into my mouth, without my even realizing why.

Now if only I could remember where I left those keys . . . .

Chris Stedman: Faitheist

June 2, 2014

Chris Stedman’s career is in “interfaith work,” but his book, Faitheist, is addressed mainly to his fellow atheists, urging them to lighten up.

It centers upon his own story. His Minnesota family was nonreligious, but at age 11 he experienced a crisis, when reading “heavy” books exposed him to the world’s injustice and cruelty. Also, his parents divorced. Chris found refuge in his school’s Christian group, which welcomed him and assuaged his social-justice discomforts.

But there was one wee problem. Christianity seemed obsessively homophobic. And Chris was starting to realize this applied to him. His Teen Study Bible labeled him an abomination in God’s eyes, and his resulting inner struggle drove him close to suicide.

At last his mother stumbled upon his personal journal and brought him to a different kind of Christian minister – who took one look at the relevant Teen Study Bible page, drew a big red X across it, and said, “This is dehumanizing garbage.”

So Chris found a different path within Christianity, and went on to a Christian college, studying religion, headed for the ministry.

But there was another wee problem. He no longer believed in God. The book, after many pages chronicling Chris’s agony over faith versus sexuality, has relatively few about faith versus non-faith. That seemed fairly easy for him. But he completed his degree, as the class atheist, and even proceeded to divinity school, winding up as Harvard’s Assistant Humanist Chaplain. (He recently went to Yale.)

His “interfaith work” seeks to bridge religious divides by finding common ground and ways to work together and understand each other better. Stedman classifies the religious as either “totalitarians” or “pluralists,” with the latter actually having more affinities with nonbelievers than with the totalitarians.

But as noted the book is aimed mainly at atheists, who are also divided. Stedman disparages the belligerence of the so-called “New Atheism.” (He singles out PZ Myers, whose book I’ve also reviewed.) With some atheists seeing their goal as eradicating religion, Stedman is unsurprised at the religious push-back. After all, he notes in comparison, the gay rights movement hasn’t sought to end heterosexuality. He doesn’t like a “we’re right, they’re wrong” attitude.

I’m guilty of some of that myself. Obviously if you believe something, you believe people thinking differently are wrong. But I draw the line at “we’re right, they’re insane,” and I’ve criticized writers like Charlie Pierce for that. It might be different if religion were practiced only by an eccentric minority; but in a country where most folks are religious, that must be considered normal and sane. And I’m all for greater mutual understanding, working together, and apple pie; and I do try to avoid personal insults, calling people crazy or stupid. Yet religion should not enjoy some special exemption from critical scrutiny; its ideas should be subjected to vigorous public debate like any others. That’s what the “New Atheism” is about.

Furthermore, it would also be different were this just a matter of personal beliefs, kept personal. But most atheists would like to see the end of religion not only because it’s false but because they consider it harmful. Religion’s defenders can’t deny some very bad things, but of course claim the good outweighs the bad. As I see it, the good works ascribed to faith are things people could, and mostly would, do even without religion, because we are in fact more good than bad (societies like Denmark’s or Norway’s where religion has almost disappeared are some of the world’s nicest); while the bad things (9/11; Boko Haram) are uniquely products of religious belief and would be hard to imagine absent that factor.

Religionists will of course retort that some of the worst crimes have been committed by atheistic regimes (though Hitler’s at least wasn’t atheist). But those crimes were not committed in service to atheism; not motivated by disbelief in God; the concept of God was simply irrelevant. In contrast, many bloody crimes throughout history were of course motivated by religious belief.

Believers will also say such crimes are perversions of proper faith. But the problem is that religion has an unavoidable tendency to inspire absolutism (Stedman’s “totalitarianism”) – the “one truth” so powerful that it can justify almost anything in service to it. Disbelief doesn’t come close to having such inspirational power – a very good thing. In fact nobody kills for atheism.

This is why we would like to see religion disappear. But it bears emphasizing that – so unlike religion throughout most of history – atheists wield the pen, not the sword; words, not violence. And, given its long history of burning people at the stake, it’s a bit rich for religion to be telling atheists to dial it back.

And Chris Stedman, of all people, should know the harm of religion. An inhumane religious dogma drove him to the brink of suicide. Just one more reason why atheists believe the world would be a better place without religion.

Utilitarianism: Is Killing One to Save Five Moral?

May 24, 2014

You are a bystander seeing a runaway trolley, about to hit and kill five people. You can grab a switch and reroute it to a different track where it will kill only one person. Should you? Most people say yes. But suppose you’re on a bridge, and can save the five lives only by pushing a fat man off the bridge into the trolley’s path? Should you? Most say no. Or suppose you’re a doctor with five patients about to die from different organ failures. Should you save them by grabbing someone off the street and harvesting his organs? Aren’t all three cases morally identical?

Our intuitive moral brain treats them differently. Pushing the man off the bridge, or harvesting organs, seem to contravene an ethical taboo against personal violence that the impersonal act of flipping the switch does not.* (This refutes the common idea that humans have a propensity for violence. Ironically, those who believe it may do so because their own built-in anti-violence brain module is set on high.)

Such issues are central to Joshua Greene’s book, Moral Tribes. Our ethical intuitions were acquired through evolution, adaptations that enabled our ancestors to cope and survive in close-knit tribal societies. And our moral reflexes do work pretty well in such environments, where the dilemmas tend to be of the “me” versus “us” sort. But, because our ancestral tribes were effectively competing against other tribes, “us” versus “them” issues are another matter; and different tribes may see moral issues differently too. That’s the problem really concerning Greene.

He argues for a version of utilitarianism (he calls it deep pragmatism). Now, utilitarianism has a bad rep in philosophy circles. Its precept of “the greatest good for the greatest number” is seen as excluding other valid moral considerations; e.g., in the trolley and doctor situations, violating the rights of the one person sacrificed, and Kant’s dictum that people should always be ends, never means.

Greene’s line of argument (identical to mine in The Case for Rational Optimism) starts with what he deems the key question: what really matters? You can posit a whole array of “goods” but upon analysis they all actually resolve down to one thing: the feelings of beings capable of experiencing feelings. Or, in a word, happiness.

Happiness is a slippery concept if you try to pin down its definition. Is it a feeling – that one is happy? That’s circular; also simplistic. As John Stuart Mill famously suggested, it’s better to be Socrates dissatisfied than a pig satisfied.

But in any case, nothing ultimately matters except the feelings of feeling beings, and every other value you could name has meaning only insofar as it affects such feelings. Thus the supreme goal (if not the only goal) of moral philosophy should be to maximize good feelings (or happiness, or pleasure, or satisfaction) and minimize bad ones (pain and suffering).

A common misunderstanding is that such utilitarianism is about maximizing wealth. But while, all else equal, more wealth does confer more happiness, all else is never equal, and happiness versus suffering is much more complex. Some beggars are happier than some billionaires. The “utility” that utilitarianism targets is not wealth; money is only a means to an end; and the end is feelings.

This is what “the greatest good for the greatest number” is about. Jeremy Bentham, utilitarianism’s founding thinker, imagined assigning a point value to every experience. This is not intended literally; but if you could quantify good versus bad feelings, then the higher the score, the greater the “utility” achieved, and the better the world.
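Purely as a toy illustration of that point-scoring idea (my own sketch, not anything from Bentham or Greene – the names `utility` and `better_outcome` are invented for this example), one could imagine comparing outcomes by summed hedonic scores:

```python
# Toy sketch of Bentham-style scoring: each affected being gets a signed
# hedonic score (positive = good feelings, negative = suffering); an
# outcome's "utility" is just the sum, and the higher total is "better."

def utility(scores):
    """Sum the signed hedonic scores for one outcome."""
    return sum(scores)

def better_outcome(a, b):
    """Pick whichever outcome (a list of scores) has the higher total."""
    return a if utility(a) >= utility(b) else b

# Five beings each mildly pleased, versus one being intensely pleased:
many_mild = [1, 1, 1, 1, 1]    # utility 5
one_intense = [4]              # utility 4
assert better_outcome(many_mild, one_intense) == many_mild
```

Again, none of this is meant literally; the point is only that feelings are, in principle, the common currency being compared.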

But doesn’t this still give us the same problematic answer to the trolley and surgery hypotheticals – killing one to save five? In fact that answer flunks the utilitarian test. Because nobody would want to live in the kind of society where people can have their organs taken involuntarily (see this Monty Python sketch). That might be utilitarian from the standpoint of the people saved, but extremely non-utilitarian for everyone else. And while one can concoct bizarre hypotheticals as in trolleyology, the real world doesn’t work that way. In the real world, “utility” can’t actually be maximized by, say, 90% of the population enslaving the other 10% (another typical anti-utilitarian hypothetical).

Utilitarianism doesn’t require narrow-minded calculation of “utility” within the confines of every situation and circumstance. What it tells us instead is to keep our eye on the big picture: that what really matters is feelings; what tends to make them better globally is good; what makes them worse is bad. As Greene puts it, utilitarianism supplies a “common currency,” or filter, for evaluating moral dilemmas among different “tribes.”

Meantime, if X is willing to sacrifice himself for what he thinks is the greater good, that’s fine; but if X is willing to sacrifice Y for what X thinks is the greater good, that’s not fine at all. It’s the road to perdition, and we know of too many societies that actually travelled that road.

Thus, a true real-world utilitarianism incorporates the kind of inviolable human rights that protect people from being exploited for the supposed good of others – because that truly does maximize happiness, pleasure, and human flourishing, while minimizing pain and suffering.

* “Trolleyology” is big in moral philosophy precincts. For another slant on it, see an article in The Economist’s latest issue. 

Ideology and Insanity: What is Mental Illness?

May 16, 2014

I was hooked in by the book’s title, Ideology and Insanity, by psychiatrist Thomas Szasz; ideology can verge on insanity. But Szasz’s focus is instead on the “ideology” of the mental health industry. He says there is no such thing as mental illness. Szasz acknowledges the behaviors we label mental illness, but deems it a mislabeling – actually a metaphor we apply to behaviors outside ethical and social norms. That’s very different from a true illness like, say, chicken pox, with a clear physiological etiology. (But Szasz’s brush is too broad. Some mental illness is physiological: depression, for example, is often a brain chemistry problem.)

Szasz’s argument has a political dimension. The basic political divide is between individualism and collectivism – and perhaps wisdom would steer a middling course, because we all crave autonomy but also social connectedness. However, Szasz importantly notes that the dichotomy isn’t symmetrical: while in an individualistic society people would be perfectly free to also satisfy their communitarian social instincts, a collectivist society would not correspondingly allow free pursuit of individualistic proclivities. Indeed, that can be punishable – an element of coercion that makes all the difference.

The point isn’t merely theoretical. Szasz cites Joseph Brodsky, a poet in the Soviet Union who was sent to a labor camp for, literally, the crime of being a poet. The state judged that poetry was not socially useful and did not fulfill Brodsky’s obligations to the collective. So in the “workers’ paradise” you had to be a worker, with no choice about it.

Szasz’s main argument is that the whole enterprise of modern American mind doctoring aims at making us a more collectivist society. That’s the import of his saying “mental illness” labeling is a guise for enforcing social conformism. Szasz maintains that for most people in mental institutions, being “treated” for “their own good” is basically a fiction for what is really imprisonment. Moreover, since Szasz wrote in 1970, there’s been a huge shift from putting mentally ill people in asylums to literally jailing them. (See this recent article in The Economist.)

While reading all this, I kept thinking, Okay, but schizos really do have something gone wrong, and whether it has a physical cause like chicken pox or not is kind of beside the point. But on the other hand, the problem of stigmatization and the nexus of mental diagnosis with politics is a very real concern. I’ve written before about the plague of “analyses” by those who actually do think the views of people they disagree with – whether on matters of religion, science, or even economic policy – reflect mental disorders. Szasz describes one egregious example: in 1964 a magazine devoted an entire issue to printing psychiatrists’ diagnoses of presidential candidate Goldwater, mostly calling him a paranoid schizophrenic. None had ever even met the man. (Incredibly, the magazine’s name was Fact! Goldwater was one of the sanest politicians I’ve seen.)

While the mental health industry strives mightily to cloak itself as science, especially with its sciency-seeming DSM encyclopedia of diagnosable “mental disorders,” the trouble is that none can be tested for objectively. It’s all just subjective evaluation of a person’s behavior. And if a doctor disapproves of how a person chooses to live and act — or his politics! — it’s all too easy to label him with some “disorder.”

That slipperiness is illustrated by my own diagnosis. To get insurance coverage when a girlfriend and I went for counseling, the therapist said, “I’ll just put down ‘anxiety.’”

And don’t forget that, until quite recently, homosexuality was in the DSM, formally labeled a mental disorder, with gays stigmatized as diseased and defective (rather than just different) vis-à-vis the norms which, via that diagnosis, the mind doctors were indeed seeking to impose societally. And of course homosexuality was furthermore duly criminalized. (Szasz actually doesn’t even mention this because, when he was writing, few people thought twice about it.)

The term “mental illness” itself has no clear boundaries. Indeed, a lot of the “disorders” in the DSM, truth be told, fall within the spectrum of what common sense tells us is normal behavioral variation. I’ve written, for example, about “Attention Deficit Hyperactivity Disorder.” With respect to that personality feature, normality encompasses a range. Maybe extreme outliers merit the word “disorder.” But most people diagnosed with ADHD (a substantial percentage of the population!) are actually within what should be considered a normal range.

Society used to be more rigid about how people had to be. Today we’ve grown more open, tolerant, and accepting of diversity, more willing to allow people the freedom to be the way they are, or want to be. That’s all good. And yet, contradictorily, ever more people are diagnosed (and stigmatized) with “disorders” that really amount to non-conformance with the dictates of the normality police. Thus, while the goal of improving mental health in the abstract is hard to argue with, too much of the mental health industry is geared toward the suppression of individuality. That’s so Twentieth Century. (Or, really, Nineteenth.)

There are of course some true whackos. But, as Szasz argues, most “mentally ill” people don’t actually have minds on the fritz at all but, rather, face what might better be called problems of living. How to live is the salient question in philosophy, and for many people, enmeshed in their webs of trying circumstances, its solution is far from clear. True, counseling may be helpful to them. But their problems are not all in their minds.


How I Got Irreligion

May 11, 2014

At around age six, I was sent to a Jewish “Sunday school,” featuring Bible stories: Daniel and the lions, Noah’s ark, etc. I was fine with them, as stories. But then I realized adults took them seriously; troubled by this, I confided in my mother.

No theologian, she. But I distinctly remember her ending the discussion by saying, “Well, you do believe in God, don’t you?” I said yes. And I knew I was lying.

I was no rebellious kid; in fact, a meek, go-with-the-program, clueless kid. But even at six, I saw right through religion.

Odd, this common locution, “believe in God.” We don’t say we “believe in fire,” or upholsterers, or aardvarks. Few have actually seen that beast, but an aardvark nonbeliever would be pretty weird. For reality, “belief” simply doesn’t enter into it. Talk of belief in God implicitly bespeaks something other than reality.

Anyway, I went on to Hebrew school, Bar-mitzvah lessons, and the Bar-mitzvah itself, on stage in the synagogue, chanting the memorized gobbledygook. It never occurred to me to say no to any of this; again, I was a go-with-the-program kid. I actually did well in Hebrew school, if only to avoid humiliation when called on in class. But I drew the line at anything optional, to the despair of my religious teachers.

Through it all, my disbelief felt like a shameful, guilty secret, a personal failing. Performing at my Bar-mitzvah, I considered myself a fraud. The sanctimony all around me evoked virtue, propriety, right-thinking. It seemed universal – with the sole exclusion of pitiful me. Never, anywhere, was I exposed to a dissenting viewpoint. This was the ’50s, with no Dawkins or Hitchens. Nothing to suggest I was not alone, or to provide any validation for my unbelief. What was wrong with me?

In that sense, I can understand how being gay must have felt – with no validation for that either. (So underground was gayness that not till my twenties did I actually understand what it was.)

Yet I never agonized; never made an effort to get with the program of religion. Notwithstanding how admirable faith might appear, to me it seemed just fundamentally false. The Emperor had no clothes.

Some believers imagine atheists will eventually “see the light,” if only on their deathbeds (or in the proverbial foxholes). Human psychology varies endlessly, so it does happen, but quite rarely in fact. None of the many atheists I’ve known has ever lapsed. My own conviction has only grown stronger over time. What was at first a “simple faith” (or lack thereof) has profoundly deepened as I have learned ever more about the history of religions, the human psychology behind them, and all their spectacular philosophical contradictions. And I long ago stopped wondering “what’s wrong with me?”

My humanist atheism is indeed the essence of what’s right with me. Believers feel their faith is what gives their lives meaning. And if that’s really true for a person, fine. But for all the consolation claimed for religion, many are tortured by doubt. Wrestling with doubt might be portrayed, by intellectualist apologists, as part of a wholesome experience of faith. But I’m not attracted by a hopeless effort to reconcile the irreconcilable. I don’t feel it’s possible to make proper sense of anything while laboring under so basic a mistake about reality.

I have never been afflicted by doubt about my most fundamental perceptions. There’s much about life and the cosmos I don’t yet truly understand (quantum mechanics; why there’s something rather than nothing; the minds of priests who rape children); but my pursuit of such understanding is not hobbled by a need to reconcile it with preconceived dogmas that can never be squared with reality. Being thusly free to see the world as it really is, I feel, enables me to fit properly into that reality, and to make a life of authentic (not illusory) meaning.

Anyhow, that’s me. If it’s not you, I won’t try to get you burned at the stake.

Are We Becoming Less Trustworthy – Or Just Less Trusting?

April 25, 2014

A recent Associated Press-GfK poll finds declining levels of trust among Americans. Lost faith in institutions, like government, or churches, might be no surprise. But we’re also losing faith in each other. Only a third now say most people can be trusted, down from half in 1972. images

But are we becoming less trustworthy – or is it just that more of us believe so? Yet this can become a self-fulfilling prophecy if we relate to others with increasing distrust and circumspection.

The AP report said “social trust” brings benefits: people more willing to compromise, make deals, and work together; whereas distrust diverts energies and encourages corruption. So trust boosts the economy. Indeed, a generally high level of interpersonal trust is one of humanity’s “killer apps,” enabling our species to develop our uniquely elaborate social structures.

A good illustration is auto travel. UnknownIt couldn’t work if we didn’t all observe the rules of the road – and trust that everyone else will too. (Or practically everyone.)

I don’t perceive the trustworthiness of the average person as declining. Sure, some are always eager to take advantage of others (the “free rider” problem in academic discourse); but that’s a small minority. A key curb on such behavior is that it doesn’t usually occur in a vacuum; people generally foresee future interactions, wherein their past conduct will be taken into account. In my own little coin business, I send almost all orders in advance of payment, even to people I don’t know. images-1The nonpayment rate is negligible. Of course, if they hope to order again, they’ll pay. A further incentive is the “deadbeat list” publicized on my website. Such shaming is actually a very ancient method for deterring cheaters.

Interestingly, I got an e-mail recently from a guy in Tanzania I’d never heard of, selling coins. He was smart enough to realize Western buyers probably wouldn’t trust an unknown African; but also that they probably could be trusted. So he too offered sending merchandise before payment. I ordered; he sent it; I paid.

This is in fact how most of the world’s commerce takes place. Without somebody trusting somebody, elaborate and cumbersome safeguards would be needed, inhibiting trade, to everyone’s loss.

I’ve written before how China differs here, its pervasive societal norm being deceit and corruption. If the AP’s survey questions were asked in China, they’d reveal far lower levels of trust than in the West. A reading of Chinese history shows that this factor has, in past epochs, held the country back. More recently China has advanced greatly in spite of it; but this is still a fundamental handicap that cannot but limit the nation’s progress, if they don’t learn to be more transparent, trustworthy, and trusting. (Note to leadership: in traditional Communist party style you can call these “the three tr’s.”)

Getting back to America, why has trust declined? The AP report quotes some professor blaming economic inequality – “more Americans feel shut out,” and have “lost their sense of a shared fate.”  This says more about the professor than about trust, reflecting an obsession with inequality and imputing a resentment most Americans in fact do not feel (even if politicized lefties believe they should).

No – this is not about politics, or economics, this is sociology. As the AP story also does suggest, it has a lot more to do with the “Bowling Alone” phenomenon (from the title of Robert Putnam’s landmark book). Unknown-1Quite simply, we spend less time actually interacting with other people.

I’m not one of those who laments modernity as pathology; its benefits are worth the costs; but one of those costs seems to be reduced face-to-face social intercourse. That impedes building a body of experience validating an assumption of trustworthiness; while perceptions get skewed in the opposite direction by increased exposure, from ubiquitous media, to the underside of human conduct. It’s a cliché that an air crash makes the news but thousands of safe landings do not. Similarly, we are relentlessly informed about people behaving badly while the vastly commoner examples of decent behavior become invisible. I send coin orders before payment because, having done it thousands of times, I know to a fare-thee-well what the payment rate is. images-2But few people nowadays get the benefit of comparable experience with human trustworthiness.

Lucretius, The Swerve, and Understanding Reality

April 20, 2014

imagesStephen Greenblatt’s The Swerve centers on a book-length First Century BC poem by Lucretius, On The Nature Of Things; apparently lost (like so much ancient literature) until book-hunter Poggio Bracciolini unearthed a forgotten copy in a monastery in 1417. Greenblatt casts this as triggering modernity’s emergence (the “swerve” of the title).

I’ve also perused the poem itself, which Greenblatt deems a literary masterpiece. Maybe its poetic virtues didn’t survive W. E. Leonard’s translation from the Latin. It helped greatly to have first read Greenblatt’s lucid bullet-point distillation (further distilled below).

Imaginary portrait of Lucretius. No real one exists

Imaginary portrait of Lucretius. No real one exists

The poem presents a bracingly materialist view of reality and the human condition which, though rooted in the philosophy of Epicurus, even earlier, is indeed very modern, and undermined the reigning Christian thought system. But Greenblatt overstates his case that Lucretius was central to the latter’s retreat. The Renaissance was sparked by a great complex of factors, which actually gathered force gradually over a long interval; intellectual ferment was fizzing all over; Lucretius’s rediscovery fed into this but was hardly, by itself, seminal.Unknown-2 (The scientific revolution did more to change the intellectual climate.)

And if Lucretius still isn’t exactly a household name, nor was he in Roman times. While his book did enjoy some circulation among the cognoscenti, he lived and died in relative obscurity — probably because few contemporaries could have made sense of a work profoundly incompatible with then-conventional ideas.

Someone in my book group mocked things Lucretius got wrong. But I was blown away by how much he got right — considering that he predated any proper science, with human understanding of the world being a mess of clueless superstition. Lucretius could only use his reasoning mind and his observation of reality to intuit its underpinnings:

images-1Invisible particles (what we call atoms), constantly in motion, combine and recombine to make up everything in the universe, from stars to rocks to humans. They are immutable, eternal, and (till the 1940’s!) indivisible. Like the letters of an alphabet, their workings are governed by a code, though not all letters and words can combine with every other. And the code — in principle at least — could be investigated and understood by humans (what we now call chemistry).

The particles don’t move by predetermination in straight lines, but sometimes “swerve,” causing collisions and recombinations; and that indeterminacy is what gives us free will. (I have similarly suggested that at the molecular level brain activity entails quantum mechanical effects, inherently unpredictable, hence true determinism is impossible.)

images-2All living things evolved through a long complex process of trial and error. Nature engenders variations, and those better adapted to live and reproduce proliferate, while failures go extinct. Humans are merely one such resulting animal. (It took nineteen centuries for Darwin to rediscover this idea of evolution by natural selection.)

images-4Human society did not begin in some golden age of tranquility and plenty, but in a primitive struggle for survival. (The myth of a prelapsarian paradise stubbornly persists; see my review of Steve Taylor’s The Fall.) Only gradually did social cooperation evolve; likewise language, arts, agriculture, religion, law (Lucretius anticipated Hobbes and social contract theory) and other elements of culture.

Space and time are unbounded, with no beginnings or ends — and never a creator or designer. Such beings as gods, if they exist (Lucretius doesn’t say otherwise) couldn’t possibly care about you or the minutiae of human affairs.

All religions are superstitious delusions, built on primal fears and longings. They always embody the cruelty of retribution fantasies (Hell) and human sacrifice, symbolic or otherwise. Unknown-3(Lucretius could not have foreseen the mother of all such sacrifice theologies — belief that Christ had to be tortured to death to save humanity.)

There is no cosmic purpose to existence, and no afterlife. (Lucretius spends pages deconstructing the nonsensicality of belief in a “soul.”) But since you won’t be around to experience nonexistence, it shouldn’t faze you. And this life being all we have, there is no higher ethical imperative than maximizing pleasure* and minimizing pain. All others — serving the state, glorification of God, pursuing virtue through self-sacrifice, etc. — are secondary, misguided, or fraudulent. The greatest obstacle to pleasure is not pain, but delusion.

None of this is cause for despair. To the contrary, understanding these realities is crucial for the possibility of happiness. To fantasize some higher reality, to aspire toward, only puts people in a destructive relation with the environment they actually inhabit. But by looking calmly at the true nature of things, we can experience a more genuine awe, and achieve a more genuine fulfillment.images-3

Taking a cynically dim view of humankind is common among intellectuals. But I am proud of my species. And learning about this man who, so long ago, could achieve such insight — it often gave me goosebumps — redoubles that pride.

* By “pleasure” Lucretius, following Epicurus, doesn’t mean hedonism. Rather, it really means the enjoyment derived from living a fulfilling life.



Get every new post delivered to your Inbox.

Join 3,041 other followers