Archive for September, 2017

Human history in a nutshell, Part 1: Evolution

September 28, 2017

It was about six million years ago that we last shared a common ancestor with a biologically distinct gang — chimps and other apes. But our species, Homo sapiens, is only a couple of hundred thousand years old. Between those two chronological markers, a lot of evolution happened.

In fact, over those six million years, quite a large number of more or less “human” or proto-human species came and went. The line of descent that produced us was only one of many. All the others petered out.

As the story unfolded among all these variant creatures, two different basic strategies evolved. Call one vegetarian. Its practitioners relied on a menu much like that of modern apes — fruits, nuts, berries, etc. A pretty reliable diet, but due to low nutritional content, much energy was devoted to eating and digesting — they literally had to consume a lot to get the energy to consume a lot. A big digestive system was required, diverting resources that otherwise could have gone to their brains.

The other group went for brains rather than guts. This required a high energy diet, i.e., including meat. But meat was hard to get, for such weak little critters lacking fangs and claws. Getting meat required brains.

All well and good, except that bigger brains meant bigger heads, a bit of a problem for mothers giving birth. And that was exacerbated by a second evolutionary trajectory. Hunting meat proved to be a lot easier for early humans if, instead of going on all fours, they could efficiently walk upright and even run. Doing that called for changes to pelvic architecture, which had the effect of narrowing the birth canal. So the bigger-headed babies had to fit through a smaller opening. Something had to give.

What gave was the gestation period. If humans functioned otherwise like apes do, babies would spend not nine months in the womb but twenty, and come out ready for action. But their heads by twenty months would be so big they couldn’t come out at all. So we make do with nine months, about the limit mothers can bear, and the least babies can get by with. Consequently they require lengthy attentive nurturing, which of course has had a vast impact upon humans’ way of life.

Earlier birth thus meant longer childhood, with much of a person’s development occurring outside the womb as his or her brain responds to things around it. This in turn is responsible for another huge fact about human life: we are not cookie-cutter products but very different one from another. And that fundamental individualism, with each person having his own perspectives and ideas, played a great role in the evolution of our culture and, ultimately, civilization.

Another key part of the story was fire. We romanticize the mastery of fire (e.g., in the Prometheus myth) as putting us on the road to technology. But that came much later. Fire was our first foray into taking a hand in our own evolution. It began with cooking. Remember that trade-off between gut and brain? Cooking enabled us to get more nutrition out of foods and digest them more easily. That enabled us to get by with a smaller gut — and so we could afford a bigger brain.

This omnivorous big-brain model seemed to work out better than the vegetarian one; the vegetarians died out and the omnivores became us. (This is not intended as a knock on today’s vegetarians.) But notice again how much actually had to be sacrificed in order to produce our supersized brains. And that this was a bizarre one-time fluke of evolutionary adaptation. It happened exactly once. None of the other zillions of creatures that ever existed ever went in this unique evolutionary direction.

In other words, if you think evolution of a species with world-dominating intelligence was somehow inevitable or pre-ordained, consider that it didn’t happen for 99.999+% of Earth’s history. It was an extreme freak in the workings of evolution.

Indeed, it’s a mistake to conceptualize “evolution” as progress upward toward ever greater heights (culminating in Homo sapiens). It’s because of that erroneous connotation of progress that Darwin didn’t even use the word “evolution” in his book. The process has no goal, not even the “selfish-gene” goal of making things good at reproducing themselves. It’s simply that things better at reproducing proliferate, and will grow to outnumber and eclipse those less good at reproducing. Our species happened to stumble upon a set of traits making us very good reproducers. But insects are even better at it, and there are way more of them than us.

(Much of what’s in this essay came from reading Chip Walter’s book, Last Ape Standing.)


The curse of Ham

September 26, 2017

I have written about Kentucky’s Creation Museum. Should be called the Museum of Ignorance, since its exhibits contradict incontestable scientific facts. Like the dinosaurs dying out 65 million years ago. The museum is off by 64.99+ million years. It shows humans living beside them. This might be fine as entertainment, but not for an institution purporting to be educational.

Earth to Creationists: I’m more than 6,000 years old. Around a million times older.

The museum was built by an outfit called Answers in Genesis. Not content with this slap in the face to intelligence, Answers is now building a replica Noah’s Ark. The project has received an $18 million tax break from the State of Kentucky (specifically, a sales tax abatement). How does this not flagrantly flout constitutional separation of church and state?

Ken Ham

The head of Answers in Genesis is a man named Ken Ham. Please linger upon this name.

For one thing, ham is just about the most un-kosher thing in Judaism. Kentucky’s public support for a Ham-centric project is plainly a gross insult to its citizens of the Jewish faith.

But there’s a much bigger issue. The name of Noah’s third son was Ham. Coincidence? Not very likely. This Mister Ken Ham must, beyond any doubt, be a direct descendant of Noah’s third son. He has never denied it; and it certainly explains his ark fetish.

Now, the Bible is very clear about this fact: Ham was cursed, for a grave insult to his father. Scholars differ in their exact interpretations. Some say Ham castrated Noah; others that he buggered Noah. Either way, it wasn’t nice, and so Ham was cursed by God. Ham’s own son Canaan was the progenitor of the Canaanite people, who of course were later wiped out by a God-ordered genocide; and also of all Africans, which is why they’re all cursed too.

But here is the point. In this Kentucky Ark project, Mister Ken Ham must sneakily be aiming to whitewash the above family history, employing lies to mislead the public and undo the curse that God, in his infinite wisdom and justice, laid upon all of his line. This is out-and-out blasphemy.

Some will say it should be left to the Lord to visit his divine justice upon this doubly accursed latter-day Ham. But of course God-fearing people have rarely been content to defer to that ultimate justice, and have instead so often taken matters into their own hands, with fire and sword.

I’d go with the latter.

Norman Dorsen, Colin Bruce, and mortality

September 24, 2017

One thing that happens as you get old is that the world is increasingly populated by ghosts.

I graduated NYU Law School in 1970. It puts out a yearly magazine that’s gotten glitzier over the years as the school has grown in stature. Mainly I’ve enjoyed seeing in it news and photos of people I’d known, sometimes classmates, mostly professors. But gradually they have faded away (presaging my own future); the magazine became full of strangers. Yet one face I could always still count on seeing was the eternal, ubiquitous Norman Dorsen.

He was my constitutional law professor. When I opened the latest magazine, I found a full page photo of Norman Dorsen. Because he had died.

My NYU professors were not faceless anodynes; they included some powerful, dynamic personalities I still remember vividly. But even among them Dorsen was a monumental figure. Though never its actual dean, Dorsen came to be the school’s embodiment, and central to that aforementioned rise in stature over the decades.

Fonda as inmate Gideon, preparing to mail his petition to the Supreme Court

He had co-written the Army’s legal brief relating to the Army-McCarthy hearings. He also wrote a brief in Gideon v. Wainwright (Henry Fonda played Gideon in the movie; it still gooses my emotions); in the Nixon tapes case; and helped write one in Roe v. Wade. He was president of the ACLU; and director of NYU’s Civil Liberties Program for 56 years.

Under Dorsen’s leadership, in 1977, the ACLU took one of its most controversial stances: backing the right of Nazis to march in Skokie, Illinois. Dorsen considered this a civil liberties litmus test. (I too am an absolutist on freedom of expression.)

Norman Dorsen

Norman Dorsen was a man of rigor and seriousness. One episode sticks strangely in my memory. Law school classes were mainly Socratic dialogs analyzing past actual cases. But with grades based solely on the final exam, students were often lax about class discussion. One day Dorsen began the session and quizzed a student, who couldn’t answer. After one or two others weren’t prepared either, Dorsen, visibly pissed off, simply closed his book and walked out.

The only such instance in my law school career. Gosh, almost half a century ago.

The same magazine also has a smaller obit for George Zeitlin, my tax law professor.

Colin Bruce

And the same day’s mail brought World Coin News with Colin Bruce’s. I don’t recall ever meeting him in person, but we corresponded over decades. Colin too was a living landmark. He’d been responsible for creating The Standard Catalog of World Coins in 1972. Non-collectors can’t appreciate this. But previously, evaluating foreign coins was mostly guesswork. What a blessing to have listings for every country in one (large!) book. Thank you, Colin Bruce.

Pricing accuracy was still always problematical. And sadly, after Colin retired, the catalog went downhill, accumulating errors and stupidities that never were corrected. Finally I published a broadside detailing the problems — with no response — and resolved to boycott further annual editions. This will make my coin dealing harder. The only consolation is that it won’t be that much longer.

Time was, my life stretching ahead felt so long it might as well have been forever. Now the end feels so near it might as well be tomorrow.

A different idea about health care

September 22, 2017

As Republicans try one more time to pass a bill to strip millions of their health care, a huge policy crap-shoot without benefit of hearings, public debate, or input from experts, here’s another idea.

We keep hearing that middle class wages have flatlined over a long period. Actually, recent data shows a significant uptick. But in any case, such numbers are misleading because they normally reflect only salaries, not fringe benefits — which comprise a growing part of total employee compensation. The big one is health insurance.

About 150 million Americans get health insurance through their employers, and its value (i.e., its cost) now averages about $18,000 annually. Combining this with salaries tells us that total earnings have not stagnated, but risen substantially.

This also means Americans effectively spend a growing part of their incomes on health care. It’s even more than that $18,000, what with rising deductibles, co-pays, etc. Of course, health care is something of value, improving quality of life, worth paying for. But paying for health insurance is not quite the same thing. Healthy people get little benefit. Indeed, the whole system is set up for them to subsidize the sick; and Obamacare expanded on that.

Overall, Americans spend a lot more on health care/insurance than other advanced countries, without being healthier. This is fundamentally because it’s not a competitive market. There’s really no shopping around for health services; the end-user isn’t usually the one who’s paying. Obamacare didn’t fix this.

Recently, during a medical appointment with one doctor, another stopped in to “consult,” for a few minutes. He neither examined nor treated me. He billed $405. Because he could. This is why health care costs are out of control.

A NY Times op-ed last November (by Professors Regina Herzlinger, Barak Richman, and Richard Boxer) proposed a simple reform that would have a big impact.

The main reason our system evolved the way it did is that employee health benefits aren’t taxed like regular wages are (which, by the way, makes them even more valuable to workers, enlarging the impact on the “wage stagnation” picture). But, as the Times writers point out, workers have little control over this enormous expenditure made on their behalf; they cannot try to economize or shop around for insurance. If they could, they’d opt for a wide variety of different plans.

So the writers propose that, without losing the tax exemption, moneys earmarked for health insurance be given to employees, to purchase it themselves. If you spend the whole $18,000, fine; but if you spend less, you get to pocket the savings. (Even if you’re taxed on that part, it’s still a big benefit.) This would give insurance companies a strong incentive to develop a whole array of varied (and often cheaper) options, to compete for those consumer dollars — an incentive almost wholly lacking in the existing system.

It would also make the market for health care itself more like, well, a market. Competition among insurers would in turn exert pressure on providers to likewise innovate to offer more efficient, cost-conscious care. Meantime many more people would choose to use insurance as it was originally conceived, that is, to cover only big expenses, not routine ones. For the latter they would shop around, again mindful of costs. That would have a huge positive impact on the way health care is provided — and billed.

This reform seems like a no-brainer. And a huge vote winner too. Why has no politician latched onto this? Do the insurance companies (who wouldn’t like breaking open their comfy status quo) really have the whole system locked up?

101 Stumbles in the March of History

September 19, 2017

This 2016 book, by Bill Fawcett, is a compendium of historical might-have-beens. Decisions and choices the author deems mistakes, along with speculation about how differently subsequent history might have unfolded. He’s fond of saying, “It would have been a hundred times better if . . . .”

One could read this and conclude that people — even great personages — are screw-ups. But two things must be kept in mind. First, history encompasses zillions of decisions and choices people made. Finding among them 101 mistakes is all too easy. Especially if (second point) you use 20-20 hindsight. I recall how Gibbon, in The Decline and Fall of the Roman Empire, loved applying the word “rash” to actions that turned out badly. Fawcett, in contrast, especially in military situations, often castigates undue timidity. Dumb rashness versus admirable boldness may be discernible after the fact, when we know how things turned out. It may not have been so clear at the time when the decision had to be made, often on the fly, without a crystal ball. And all too often the outcome hinged not so much on how smart the decider was, how rash or prudent, but how lucky.

For each “mistake,” Fawcett spins a counter-factual history, typically seeing a modern world surprisingly different, and usually better. These stories I found pretty laughable in their details; too facile and pat. History is messy. If one thing comes through from this book, it’s how contingent history is. Change any detail about the past, even a small one (“for want of a horseshoe nail . . .”), and the difference may well cascade through time, an historical “butterfly effect.” (The idea that a butterfly flapping its wings in, say, Brazil, can cause a storm in Canada.) And the law of unintended consequences is powerful. The results from changing something about the past might have confounded our expectations, good or bad, however logical those expectations might seem.

So one can never know what the final outcome of any action will be. Supposedly, Chinese Premier Zhou Enlai, when asked about the impact of the French Revolution, replied “Too soon to tell.”

I’ve always been highly cognizant of contingency in life. I’ve written about this — how different, for example, my own subsequent life would have been, if I hadn’t happened to walk on a particular street at a particular minute on April 1, 1975. Several other people’s lives would be dramatically different too! (I can think of at least five offhand, two of whom wouldn’t even exist.) And that walk was only one link in a complex chain of consequential contingencies.

It’s customary in book reviews to cite at least one fact (usually minor) the author flubbed, to show off the reviewer’s erudition. This book is actually shot through with sloppy mistakes, often dates. Andrew Johnson was sworn in as vice president in 1865, of course, not 1864. Et cetera.

But here is one fascinating historical might-have-been, in the book. Why didn’t the Confederacy make military use of slaves? They had millions! In fact, it was proposed to offer freedom for serving in the army. It could have doubled Southern forces. And it was done, but only at war’s end, too little and too late. The fact was that the rebs were just too racist and contemptuous of blacks to stomach the idea of fighting alongside them. Even if it might have won the war. (Probably not; but you never know, history is messy.)

The last item in the book is something I myself, at the time, did see as a stupendous blunder: disbanding Iraq’s army in 2003. But at least two other recent biggies are inexplicably omitted (mistakes by Fawcett himself):

For 2000, he gives us Blockbuster’s refusal to partner with Netflix. Yet a vastly more consequential error that year was Yasser Arafat’s rejection of a very generous peace deal. It was all too foreseeable that immense evil would flow from this.

In a similar category was Obama’s 2013 decision to punt to Congress on punishing Syria for crossing his chemical weapons red line. Hearing his announcement, I could scarcely believe its stupidity.

Perhaps coming too late for inclusion were two epochal 2016 blunders. One was Britain’s Brexit vote. The resulting mess seems to grow daily. So deeply has Britain’s politics been poisoned that The Economist now sees the unthinkable as almost inevitable: Red Jeremy Corbyn becoming prime minister. Goodbye, mother country.

The other of course was our own 2016 vote — which America’s future Gibbon will surely label “rash.”

Political extremism and moderation

September 15, 2017

This is a time of extremism. Marchers with torches and swastikas chant “Jews will not replace us,” and the president sees there some “very fine people.” Maybe my own condemnatory blog posts seem extreme. Where today is the space for moderation?

The ancient Greeks deemed moderation in all things a virtue. Yet they valorized some pretty extreme doings — like the Trojan War — perhaps a wee overreaction, that?

American political extremism came to the fore in 1964, with Barry Goldwater labeled an extremist (or extremist-backed) candidate. He pushed back by declaring that “extremism in defense of liberty is no vice, and . . . moderation in pursuit of justice is no virtue.”

He had a point; yet this sidestepped the real issue. As an old song said, “it ain’t what you do, it’s the way how you do it.” It’s not whether one’s views on issues are consonant with liberty and justice (and who doesn’t think so?) or are closer to the political center or the fringes. Either can equally inspire zealotry. The moderation to be sought is not moderation of ideas but of approach. It’s the mentality you bring to the political arena.

David Brooks

A recent David Brooks column is illuminating. “Moderates do not see politics as warfare,” he writes. “Instead, national politics is a voyage with a fractious fleet. Moderation is a way of coping with the complexity of the world.” Here, with my own take, are the aspects of moderation Brooks identifies:

“The truth is plural.” When it comes to big public questions, there usually isn’t a single simple answer. Competing viewpoints may each be at least partially right. Hence “creativity is syncretistic” — combining pieces from varied viewpoints to produce a way forward which, while imperfect from the standpoint of any one of them, is pragmatically workable, given all the political and situational constraints. Again — don’t let the perfect be the enemy of the good. (Examples included Simpson-Bowles and, yes, Obamacare.)

“In politics, the lows are lower than the highs are high.” The potential for doing harm, particularly by government, exceeds the potential for doing good. Especially given the law of unintended consequences. This suggests restraint when looking to address any problem through politics.

“Truth before justice.” No cause is well served by rejecting or suppressing inconvenient facts. And “partisans tend to justify their own sins by pointing to the other side’s sins.” It’s the “what about” syndrome, as when any derogation of Trump is answered with “what about Hillary this” and “what about Hillary that.” Another refusal to confront truth. Two wrongs don’t make a right.

“Humility is the fundamental virtue.” The world’s complexities defy our understanding. And for all the certainty I feel about some beliefs — evolution, for example — I recognize that people hold opposite beliefs with equal moral certainty. If I think they’re nuts, they think I am, and there’s no intellectual Supreme Court to resolve it. I recall Cromwell saying, “I beseech you, in the bowels of Christ, think it possible you may be mistaken;” and apply it to myself.

Voltaire

I’d like to add here, “So respect others and their views.” However, I cannot; not when marchers with swastikas chant about Jews. But what I will do yet again is to quote the words famously attributed to Voltaire: “I disagree with what you say but will defend to the death your right to say it.” This is indeed a key principle that is succumbing as American politics polarizes into extremism. There are many reasons why that’s happening. Brooks has elsewhere suggested one: in an age of so much moral uncertainty, some embrace absolutism in an effort to find solid ground. Thus we get the Savonarolas who want to punish and stamp out anyone not embracing their version of truth — as in the recent case of the engineer fired from Google for writing what some read as a politically incorrect memo.

Well, you do not have to respect those you disagree with — like those marching neo-Nazis. You can call them what they are, and condemn their ideas. But what you do have to do is accept their humanity and their right to be who they are. Not fire them from their jobs or jail them — or plow your car into them.

Wage war, if you must, against ideas — not against people. That is the moderation I advocate.

Never forget that if those neo-Nazis can be fired, punished, or repressed, the same principle can be turned around one day and applied to you.

“First they came for the Jews . . . .”

Moving pictures, Myanmar, and Rohingyas

September 12, 2017

My masthead declares me an optimist but a rationalist. Humanity is on an upward path, but nothing is ever simple, it’s strewn with pitfalls. Seeming triumphs often sour.

I keep an imaginary “rogues gallery” — pictures of the world’s worst villains. Whenever I can draw a big black “X” across one of those faces, it gives me great satisfaction. But unfortunately those seem outnumbered by newly added faces.

And alas my gallery of heroes* is much the smaller one. Villainy is far easier than heroism. The latter, of course, requires courage, a willingness to do right at personal risk or cost. That’s rare. (I don’t know how courageous I’d be if really tested.)

But especially rare — and sad — is moving a picture from that gallery to the other.

Aung San Suu Kyi has certainly been heroic. Read my 2012 blog post about her. Myanmar’s (Burma’s) vile military regime long kept her under house arrest. When finally allowing free elections, the generals first stipulated, in the constitution, that no one married to a foreigner could be president. Suu Kyi’s late husband was British. But after her party swept the elections, she installed a placeholder president and created for herself a new position from which to run things.

So nominally at least Aung San Suu Kyi is now, at long last, Myanmar’s leader. O frabjous day! Callooh! Callay! He chortled in his joy.

But I say “nominally” because Myanmar’s military was unwilling to cede all power. A familiar story: not only do those in power enrich themselves by it, they dare not relinquish it and expose themselves to comeuppance for their crimes. So Myanmar’s military-written constitution leaves the army with great power, outside civilian control.

The Rohingyas are a despised Muslim minority in mostly Buddhist Myanmar, concentrated in remote Rakhine state. Most Burmese see them as illegal immigrants, despite their living there for generations. They’ve been persecuted since the ’80s. Now it’s become a genocidal pogrom, the army using insurgent attacks as a pretext for a mass rampage of rape, burning, and killing, apparently aiming to eliminate the Rohingyas from Myanmar. Local Rakhine Buddhists have joined in the violence (and you thought Buddhists were peaceful). At least a couple hundred thousand Rohingyas have fled, under appalling conditions, to nearby Bangladesh.

And where is Suu Kyi in all this? Nowhere.

Before the election, her silence was understandable, even defensible, so the explosive Rohingya issue would not derail the transition to democracy. And even now, she doesn’t really call the shots, governing only on the army’s sufferance. She does not command it. It’s perhaps even conceivable that a clash with the army over its Rohingya atrocities could provoke a coup, ending Myanmar’s new hard-won (quasi) democracy. One can’t be heroic all the time. Maybe she’s acting prudently; “discretion is the better part of valor.”

But “[t]he time for such delicacy is past,” The Economist writes. “Democracy is of little worth if it entails mass displacement and slaughter.”

That’s happened too many times. We say “never again,” but somehow always let it happen again. When the 1994 Rwanda genocide erupted, Bill Clinton worked mightily at the UN — to block any response. It would have been just too hard, messy, and politically hazardous. So is it always.

So it may be for Suu Kyi. But this is her greatest test. The Economist notes that even if lacking legal authority, she “retains immense moral authority.” If her life has true meaning, she must act now. Come what may.

I hate to move pictures. This one would be especially painful.

*When I was a teenager, besotted with politics, that gallery was literal, with framed signed photos of my idols. I cringe recalling some of them now. One, in more mature perspective, certainly belonged in the other gallery . . . . We grow up.

My 70th birthday speech

September 9, 2017

Holding one of my wife’s gifts: my paternal grandparents’ 1910 marriage certificate. It shows their parents’ names, which I’d never known.

My wife threw a lovely party for my 70th birthday, September 7, catered at the State Museum. Everybody was there. Here is the speech I gave:*

Today I consider myself the luckiest man on the face of the Earth. (And I don’t even have what Lou Gehrig had.) I literally wrote the book on optimism. And I’ve read a lot of the literature on happiness. Philosophers have endlessly wrestled with the concept. John Stuart Mill famously queried whether it’s better to be a pig satisfied or Socrates dissatisfied.

But one thing I’ve learned is the importance of gratitude. Let me mention two books that greatly influenced me. One was Daniel Gilbert’s Stumbling on Happiness. Its key takeaway is that people are very bad at knowing what will actually make them happy. The other was The Paradox of Choice by Barry Schwartz. He wrote about what’s called the “adaptation effect.” Whenever you get something you’ve desired, or rise in life, you adapt to that as the new normal. It no longer surprises and delights you. You take it for granted. Your happiness level doesn’t improve.

Well, I’m very grateful for having the kind of personality that makes me grateful for what I have. And I’ve always been steeped in history and world affairs, which especially makes me appreciate by comparison what blessings modern American society bestows. I don’t take any of it for granted.

People complain about air travel. We travel sometimes to California. And flying over the Rockies, I always look down at that forbidding terrain. And do you know what I see? I see a wagon train. We get to California in a morning. Gratitude.

The one thing I’m most grateful for is my marriage to Therese, who made this wonderful party. You know, the adaptation effect often applies to marriages. Newlyweds report feeling surprised and delighted; but it usually wears off. However, not in my case. After 29 years, I’m still surprised and delighted, in fact more than ever. Thank you, Therese.

And thank you all for coming to share with me.

* The official text. The remarks as actually delivered from the teleprompter varied in minor ways.

Another day, another disgrace

September 7, 2017

Trump has killed the DACA program — “Deferred Action for Childhood Arrivals” — allowing people brought to America before age 16 to stay.

Most of these 800,000 are educated and employed, pay taxes, and contribute to society. They cannot legally work absent DACA. Some have served in our military. A majority have siblings who are U.S. citizens. A quarter have kids who are. They were induced to come forward and register with the government on the promise that the information wouldn’t be used against them. To break that promise, breaking up American families, is indefensibly cruel and base.

Trump claims to love these kids — shedding buckets of crocodile tears for them. He says Congress should fix this. So has he proposed legislation? Of course not. The idea that Congress will, within his 6-month deadline, pass a law it could never pass before (remember the DREAM Act?) insults our intelligence. Yet another huge Trump lie.

Trump also claims this is simply about enforcing the law. Obama is condemned for promulgating DACA by executive order. Yet Trump did exactly the same, getting around existing immigration law by executive order, with his Muslim ban. Anyhow his newfound reverence for law is piquant right after he pardoned Joe Arpaio, convicted of defying court orders. But Arpaio was a poster boy for the war on immigrants; especially brown-skinned ones. These actions cater to Trump’s most xenophobic racist fans. America used to be governed toward its highest aspirations; now, the lowest.

I heard Alternative Radio the other night; it’s a left-wing program but helps sharpen my thinking. Thomas Frank was discussing the political landscape. I previously reviewed one of his books quite negatively. But he’s an engaging speaker I enjoyed hearing. And nowadays I’m weirdly sympathetic toward people like him. I particularly relished Frank calling Trump a “mountebank.” A lovely archaic word, and deliciously apt.

My local paper has been filled with anti-Trump reader letters. But one on Tuesday caught my attention — by David Hauber of Troy — who voted for Trump. “I believed that Trump would be good for America,” Hauber writes. “I thought our government needed a shakeup, and that the ‘swamp’ was spiraling out of control. How could we go wrong with a successful businessman* who claimed he would make America great again?”

He found out. “I was wrong,” says Hauber. “Failure to protect Americans, uphold our laws, and understand the difference between facts and lies has made America the laughingstock of the world and endangered us all. This is the opposite of making America great again.”

His final words: “I am sorry.”

It takes a big person to admit they were wrong and apologize (which the mountebank never does). So far it’s been disheartening that so many Trump voters won’t either. But thank you, David Hauber, for a glimmer of hope.

I too regret my last presidential vote (for Libertarian Gary Johnson). I did agonize over it; I didn’t like Clinton’s politics, character, or personality. Yet compared with Trump . . . ! Not a day passes without my reflecting how much better off we’d be if she’d won.

* Successful? At defrauding customers (Trump University) and screwing anyone who invested in, or did work on, his projects.

Was Jesus Christ a real person?

September 4, 2017

Long ago a customer, hoping to convert me, sent me a book titled Who Moved the Stone? I read it. The whole book sought to prove by logic that Christ must have risen from the tomb, and so forth, because that’s the only possible explanation for the events chronicled in the New Testament. My friend was confounded when I told him those events simply never happened. The Bible saying so doesn’t make them true. He’d never considered that possibility.

So how did the story actually originate? How did a minor Jewish sect become, within a few decades, a separate and fairly significant religion? Was Jesus even a real person?


Recently I read various articles (in Free Inquiry magazine) that help clarify the history. I also read Zealot, by Reza Aslan (an Iranian-born scholar who converted to Christianity in his youth before returning to Islam), purportedly attempting to chronicle the “historical” Jesus as distinguished from the Biblical one.

Firstly, if Jesus did exist, he didn’t make much of a splash at the time, didn’t even make the “newspapers.” Of course there weren’t newspapers, but tons of stuff was being written, and nothing contemporaneous even mentions Jesus. One of the articles I read counts (and actually names) 126 writers of the time who, if Jesus had existed, would have been expected to mention him. They did not. Only decades later does the name first surface.

Believers point to a passage in the historian Josephus’s writings that briefly recaps the familiar Jesus story. Josephus wrote in the later decades of the First Century. However, the passage in question appears only in copies of Josephus’s work made centuries later, not in early copies. It seems obvious it was inserted long after Josephus’s death, to give Jesus false historicity.

Even Aslan begins by conceding “there are only two hard historical facts about Jesus” — he was a Jew who led a popular movement in First Century Palestine; and he was crucified. Yet Aslan’s confidence in even these limited “facts” seems misplaced. He cites only a tangential reference to “Jesus, who they call Messiah” in Josephus, apparently a different passage from the one discussed above, which Aslan does not mention. Plausibly both suffer the same problem. Meanwhile, interestingly, Aslan does cite a whole long list of Jesus-like First Century Jewish rabble-rousers who did certifiably get into the historical record. How did one so initially obscure become so important later?

Despite the lack of documentation for Jesus, Aslan’s book is a quite detailed biography. Where do all his “facts” come from? Directly contradicting his claim to be separating historical Jesus from the Bible, his only source is — guess what — the Bible. That is, the New Testament Gospels. He just blends their stories into one coherent narrative (trying to reconcile their inconsistencies, fleshed out novelistically with loads of made-up detail, including mind-reading of the characters). It should be titled The Gospel According to Aslan.

He does reject as implausible some of what’s in the Gospels — notably their blaming the Jews, rather than the Roman Pilate (who killed thousands), for Christ’s execution. But Aslan has some trouble with Christ’s supposed miracles. He acknowledges that an objective modern reader will laugh at things like walking on water and raising the dead. He struggles to evade the issue of whether Jesus actually did these things or not (or used stagecraft). Instead Aslan confines himself to saying that a miracle-working Jesus would not have been unique in First Century Palestine — lots of guys were running around doing such magic — though normally for a fee — and also that “everybody” in Christianity’s early days (according to Christian writers) accepted Christ’s miracles as fact. Well, that settles it.

Aslan does call the resurrection a matter of faith, not history, and notes that its concept was wholly alien to Judaism. Yet he is impressed that so many of Christ’s contemporary followers are said to have testified to their personal experience of the resurrection. But here again nobody wrote anything at the time; those purported testimonies are all hearsay in the Gospels written by others much later. Meantime, as Aslan himself suggests, the resurrection story was concocted to solve a very big problem: otherwise any idea of Jesus as messiah or divine would seem to be contradicted by an ignominious death. And, most strikingly, Aslan says not even the Gospels talk about resurrection until the nineties CE.

In Aslan’s telling it would appear the Jesus cult originated in his lifetime and was sustained among his followers in the decades after, led by his brother James. This comports with most Christian histories. But this too is based on no evidence save the Gospels. Yet wasn’t Nero blaming Christians and persecuting them for Rome’s great fire as early as 64 CE? Or was he? Upon examination, the actual historical documentation for that too is dubious. Maybe more Christian mythology.

So when and how Christianity became a thing remains a puzzle shrouded in mystery. But it’s impossible that events as public and dramatic as the Gospels story could have occurred without being recorded and commented upon by numerous other contemporary chroniclers. Most likely Christ was a later fictional creation modeled upon that gaggle of familiar Judaean troublemakers Aslan describes.

His first incontestable appearance doesn’t come until Paul’s Epistles, written a couple of decades after the supposed crucifixion. While Paul’s writings apparently did exist in the 50s, there’s again the problem that we don’t have originals, and the texts in the Bible may well differ greatly. But anyhow, here’s the interesting thing: Paul says almost nothing of Christ’s life. He does not seem to be discussing an actual personage. Instead, his “Christ” was an idea, the martyr’s crucifixion as atoning sacrifice presented as a myth that Paul could internalize — “It is no longer I who live but Christ who lives in me,” and “I have been crucified with Christ.” (Note that crucifixion was actually a common punishment, meted out to thousands.) Other very early Christian writers wrote in a similar vein, with no mention of Christ’s earthly story.

Meantime there’s “Gospel Q” from about the same time as Paul and thus also predating the New Testament Gospels. No actual copy of “Q” has survived, but scholars have reconstructed it from later writings that relied on it, mainly the Biblical Gospels. Q — unlike Paul — does talk about the life of a Galilean named Jesus, and his supposed preachings. But here’s the interesting thing: Q says nothing of Jesus’s death! He doesn’t seem to be the same guy Paul was writing about.

And now we come to the Gospel of Mark, still later, dating from around 70 CE. Mark was probably writing down a story that had started as a meme a little earlier. But here for the first time Paul’s “Christ” is melded with Q’s Galilean “Jesus” — by some religious genius (we don’t know the true author; the type recurs, think Mohammad or Joseph Smith). The teachings are put together with the idea of crucifixion and all the mysticism Paul spun around that. And voilà, a compelling story of “Jesus Christ” is created.

To see why that story turned out to have such legs, Aslan does provide the historical context. In 66-70 CE there was a failed Jewish revolt against Rome. It was a national catastrophe. Jerusalem was destroyed, its inhabitants slaughtered, the survivors carried off in chains to slavery. That put paid to the cults of messianic Judaism that had been so rampant. Nobody wanted to be tarred with that stuff now; it was a dead end.

Jesus was, to be sure, yet another of those messiah figures. But with a difference: his cult could distance itself from Jewish pariahdom. His teachings differed radically from those of his insurrectionist predecessors. “Hey, lookit, this is not Judaism here,” its followers were saying. No challenge to Rome. And blaming the Jews, not Romans, for Christ’s death was politic when trying to sell their story in the Roman Empire. The Romans may indeed have tolerated them better than Jews, for a while at least. But nevertheless, it’s easy to see why in Palestine the Jewish revolt’s horrible trauma provided fertile ground to plant the seeds of a new religion. With a new divinity — Jesus Christ.