America’s partisan divide trumps all other tribalisms

October 17, 2017

“Humans are extremely loyal to members of their own group. They are even prepared to give up their own lives in defense of those with whom they identify. In sharp contrast, they can behave with lethal aggressiveness toward those who are unfamiliar to them.”

I read that in a book of science essays,* just as I was drafting this piece about America’s partisan divide. I’ve addressed this before, because it’s hugely portentous.

Humanity as a whole has been gradually overcoming the instinctive tribalism of the quote. Over time our “tribes” have become more inclusive; some people now consider all humankind their tribe. That’s an important factor reducing conflict and violence (as Steven Pinker explained in The Better Angels of Our Nature). Yet American tribalism remains strong.

Standard tribalistic divisions include race/ethnicity, religion, educational level, socio-economic status, language, gender, and age. All these continue to operate in American culture. But another one now trumps them all: political party.

That’s the key takeaway from a recent Pew survey, as well as a Stanford study. It’s not exactly news that partisan divisions have increased, with a vanishing moderate middle (or one that’s out-shouted). Further, like-minded people have tended to cluster together, making red areas more red and blue areas more blue. That’s exacerbated by peer pressure and conformism, as well as partisan gerrymandering. But what’s really striking now is that for a lot of people, political tribalism has become the most important one in shaping felt personal identity.

This may seem strange when the percentages supporting either party have fallen. But for those still loyal to a party, the attachment has grown more intense.

Look at Pew’s graph, below. In 1994, party affiliation was already the top factor governing attitudes, though neck-and-neck with all the other tribalisms. Since then, party has gained a runaway lead, becoming totally pre-eminent.

In a sense, it’s a logical development. We sacralize the concept of individualism. And as Stanford researcher Iyengar points out, political opinion is deeply personal, seemingly a conscious voluntary choice — unlike those other tribal markers.** Thus people would tend to see their political stance as more reflective of “who they really are.”

Also, politics is a much better outlet for tribal instincts, in today’s culture, where hostility toward others on the basis of ethnicity or religion, etc., is very much frowned upon. In contrast, flaming someone over politics seems acceptable.

And while messages encouraging tribal solidarity based on race or faith or the like are muted, we’re bombarded with messages from numerous sources (not only Russian) aiming to inflame partisan passions. The internet/smartphone/social media revolution is a big factor.

The Stanford researchers found similar trends in other democracies. However, Americans are distinctive in our outward display of partisan identities — bumper stickers, lawn signs, etc. Those aren’t seen in Europe, where citizen political engagement is lower. (Their campaigns are much shorter too, whereas America’s political season is year-round.)

Interestingly (in light of my opening quote), Stanford also found that while Americans’ in-group favoritism toward fellow partisans is strong, their animosity toward folks in the opposing party is stronger. The two sides inhabit different worlds — they’re strangers to each other, a recipe for hostility.

I used to find Democrats and the left more guilty of demonizing opponents and their motives; but Republicans have leapfrogged over them. I was no fan of Obama or Hillary, but the intensity of their demonization by Republicans and the right exceeds the bounds of sanity.

Party even trumps ideology. You’d think the two go together. But the meaning of “conservatism” has changed radically, while most Republican “conservatives” seem not to have noticed the bait-and-switch. They stick with their label regardless of its altered content. For some time, Republicanism in the South has really stood for guns, God, and whiteness (look at Alabama’s senate race), but now that has spread to the entire national party, submerging its traditional ideology and values. Religious “conservatives” no longer even care about lying, cheating, or pussy-grabbing. Sticking with the team, the tribe, is the guiding principle.

Is all this just a phase, that will pass? I doubt it. Our body politic is thoroughly poisoned. We no longer have political debates, we have shouting matches, with neither side interested in considering what the other has to say. And partisan tribalism really does seem to trump everything else, including truth and reason — again, especially among my former Republican comrades.

Political partisans also seem to embody the Vince Lombardi ethos: “Winning isn’t everything; it’s the only thing.” Yet while battles can be won, the war is unwinnable. Neither tribe can convince their opponents, nor bludgeon them into surrender. With a political cycle where one party or the other holds power only thanks to the thinnest of electoral margins, its legitimacy always under assault, neither can accomplish much in our system of checks-and-balances.

Rational optimism? I continue to believe our species has a bright future. Our country, not so much.

* “Subverting Biology” by Patrick Bateson, in This Explains Everything, John Brockman, ed.

** Religion should similarly be a choice, but in practice it’s less so.


A cute puzzler

October 16, 2017

This came not from Car Talk but an essay I read. Imagine a ribbon girdling the Earth’s circumference. Then add just one meter to the ribbon’s length. So there’d be a little slack. How big is the gap between the ribbon and the Earth’s surface?

Most people would guess it’s extremely tiny — that was my intuitive answer — a mere meter being nugatory over such a huge distance. But the surprising answer is about 16 centimeters. If the ribbon were snug around the Equator before, how could an added meter make it that much less snug?

My wife and I puzzled over this and soon figured out the simple solution, without even using pencil and paper:

A circle’s diameter is the circumference divided by Pi (3.14+) — i.e., a bit less than a third. If two circumferences differ by a meter, then their diameters differ by almost 1/3 of a meter — say about 32 centimeters — or about 16 centimeters at each end.

The essay said only mathematicians and dressmakers get this right.

The Iran Deal — more Trump destructivism

October 13, 2017

Must I address every Trump atrocity? (Actually I don’t, it’s impossible; I haven’t discussed the NFL nonsense.) But I feel a civic duty to call out truly bad stuff.

Even The Great Liar can’t say Iran actually violated the nuclear deal, to justify trashing it. Instead his fig leaf is to claim it somehow harms U.S. security interests. But that too is a great lie. What does harm our security interests is trashing the deal.

Iran is a bad actor in many ways, yes. But the deal at issue is limited to just the nuclear program. Will undoing it make Iran a better global citizen? Certainly not; to the contrary, it will remove any leverage we have over Iran, and thus any constraints on its behavior. Is that in our national security interest?

It’s argued the deal was a bad one because it won’t stop Iran’s nuclear program. But the whole point was instead to slow the program, subject it to international inspection, and buy time. That was the best we could achieve; Iran would never agree to give up its nuclear ambitions entirely. Those who criticize the deal offer no plausible path to a better one. While undoing the deal will free Iran to go full speed ahead to a bomb, with no international inspections or other restraints. Is that in our national security interest?

But it gets worse. The Iran deal represented the kind of American leadership Trump refuses to understand. We led the international coalition of nations joining in this effort. Those others strongly support the deal. What will they think if we wreck it, reneging on the deal we had committed to? Will they look to us for leadership in the future? Will anyone make any deals with a nation that can’t be trusted to fulfill them? Or was that not covered in The Art of the Deal?

Remember the term “rogue nation”? I used to bristle when anti-Americans turned it against us. It was untrue before; but now, alas, it is true.

Abdicating America’s global leadership role leaves a void that Russia and China are all too eager to fill.

Is that in our national security interest?

Trump’s action quite simply makes no sense (except, of course, pandering to his know-nothing base). However, in typical Trump fashion, there’s less here than meets the eye — yet another in the unending saga of Trump flim-flams. It doesn’t actually tear up the Iran deal. Instead it bucks the issue over to Congress, to restore sanctions. But even if Congress does nothing, that won’t repair the grave damage to America’s international credibility and standing.


October 8, 2017

My wife applied for a writer-in-residence grant at Gettysburg. She didn’t get it, but meantime became interested in the history, and we decided to go there. It seemed timely to visit a symbol of when America was even more divided than now.

Gettysburg was the Civil War’s biggest battle; the Confederates’ lone northern invasion was repulsed. It seemed logical to start at the Visitor’s Center. Finding the parking lot was easy. But the route from there to the building was unobvious. Another jolly couple was having the same problem. We flagged down a guy in a motorized cart. He immediately guessed, “You folks want to know how to get to the Visitor’s Center.”

Once there, we began with a movie, narrated by — who else? — Morgan Freeman (a Mississippi-born descendant of slaves). It was a work of art, beautiful and moving, putting the battle in historical context; making clear that the war was about slavery, and nothing else.

Actually, what blew things up was the question of allowing slavery in the vast new plains and western territories. Previously, we’d managed to keep an equal balance between slave and free states, but northerners feared the eventual implications of ending that balance by adding new slave states. That was the issue that birthed the Republican Party. Southerners meantime feared the consequences of slavery being confined within its existing borders. They were terrified lest their slave population outgrow their ability to control it, and saw the new territories as a potential safety valve. (This I’d gleaned from a book I’ve previously reviewed.)

The film did note that most northerners were not keen to abolish slavery altogether — immigrants and northern working men feared job and wage competition from ex-slaves flooding northward. (This has current-day resonance. As it turned out, most freed slaves stayed put.)

We then toured the battlefield — a vast expanse filled with monuments and memorials. We started with the Confederate ones before they could be removed, our guide quipped. And then the museum, also beautiful and sobering, chronicling the whole war. One had to be struck by just how much human suffering slavery caused — not only to the slaves, but now too in this climactic apocalypse of violent conflict on account of slavery.

Not yet sufficiently battle-hardened, we also viewed the “cyclorama,” a painting of the battle measuring 15,834 square feet (not a typo). This was 1884’s version of an epic movie. Then, in town, a diorama show (quite good) together with another exhibit that put us (quite loudly!) in the middle of Pickett’s charge. And, before we left, my wife deemed it a duty to visit the cemetery and the site there of Lincoln’s Gettysburg Address.

Me and Abe

Meantime we also visited Eisenhower’s farm, and numerous shops containing more relics, souvenirs, Civil War books, toy soldiers, and other memorabilia and militaria, than could ever conceivably be sold.

I wrote recently about history’s blunders, and its contingent character. Gettysburg surely epitomized that. It could have gone either way — a very close fight. The South seemed to be winning till the last afternoon of the three-day battle. The “fog of war” came frequently to mind, with the impossibility of commanders having the full picture of what was happening, nor having timely communications.

Meantime, that book I reviewed about history’s mistakes ought to have included southern secession. Most analysts think the South’s chances of victory were slim, given the North’s economic strength. Secession was a giant gamble that turned out very badly; the South was devastated. But again this is hindsight. It might not have been clear in 1861 that the North would fight at all, let alone so fiercely. Perhaps secessionists anticipated a fait accompli, the North backing off from the extreme step of all-out war.

A southern victory at Gettysburg might not have won the war. But change any detail of history and we cannot know how the subsequent story would have unfolded. Lee winning at Gettysburg might have tipped the 1864 election to a candidate who’d negotiate a peace without restoring the union or freeing the slaves. Today’s world would be very different!

However, the nation “conceived in liberty and dedicated to the proposition that all men are created equal” did endure. And not only freed the slaves — a more abject and despised “other” could scarcely be imagined — but made them citizens and voters. The nobility of these enactments, 150 years ago, takes one’s breath away.

It was a different Republican party in charge.

Human history in a nutshell, Part 2: Civilization

October 4, 2017

Previously I recapped our biological evolution. Now for civilization.

It’s a tale of two revolutions: agricultural and industrial. For around 95% of our existence we were hunter-gatherers. And for around 98% of the remaining time, mostly farmers.
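Those round percentages check out, under some equally round assumptions — that our species is about 200,000 years old, that farming began about 10,000 years ago, and that the industrial revolution began about 250 years ago (all figures assumed here just for the arithmetic):

```python
# Rough check of the 95% / 98% figures, using assumed round numbers.
SPECIES_AGE = 200_000      # years since Homo sapiens appeared (assumed)
FARMING_START = 10_000     # years ago agriculture began (assumed)
INDUSTRIAL_START = 250     # years ago industry began (assumed)

hunter_gatherer_share = (SPECIES_AGE - FARMING_START) / SPECIES_AGE
farming_share = (FARMING_START - INDUSTRIAL_START) / FARMING_START

# roughly 95% and 97.5% — the essay's "around 95%" and "around 98%"
print(f"{hunter_gatherer_share:.1%}, {farming_share:.1%}")
```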

About 10,000 years ago, butting up against the limits of a hunter-gatherer lifestyle, we invented something different: growing food and domesticating animals. This was truly a revolution, humanity’s first declaration of independence — from nature’s cold mercies. Now we could exert some control, some mastery, over our conditions of existence.

Yet some view this negatively, as our “fall” from a prior paradise of harmony with nature, presaging our hubristic “rape” of the planet. Yuval Noah Harari’s book Sapiens sees agriculture as a curse that actually made life worse in many ways: harder, less healthy, less fair and egalitarian.

Some of this could not be evident to people at the time, or else they would not have embraced agriculture. Life’s key problem was sustenance, and agriculture (for all its vagaries) seemed to enhance food security.

Harari may be right that this was a mistake — during most of human history. Only in the last century or two have we really gotten the payoff (because we finally truly mastered the thing).

But meantime agriculture made possible civilization. Now we could settle down, and stockpile production surpluses. This soon led to cities, and division of labor, with some people able to become artisans and take on diverse societal roles. Hierarchies formed, with bureaucratic governments to administer things (and fight other societies). That too Harari deems a curse, and maybe again it was, for most of history. But the rich, complex civilization (with all its material comforts) we enjoy today could never have evolved without agriculture starting it.

Which brings us to the second revolution — the industrial one. It was mainly an energy revolution. At first limited by just our own muscle power, we then exploited animals (horses, oxen) to do much more work. That was pretty good. But it was nothing compared to the energy and work a steam engine delivers. And a single gallon of gasoline contains energy equal to about 49 horsepower-hours, or 500 hours of human work (and I don’t mean paper pushing).
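As a rough sanity check on those numbers — assuming a gallon of gasoline holds about 36 kWh of chemical energy and a laborer sustains about 75 watts of useful work (both assumed round figures, so the results land near, not exactly on, the essay’s 49 and 500):

```python
# Rough energy comparison, using assumed round figures.
GALLON_KWH = 36.0    # energy in one US gallon of gasoline (assumed)
HP_KW = 0.7457       # one horsepower expressed in kilowatts
HUMAN_KW = 0.075     # sustained human work output (assumed)

hp_hours = GALLON_KWH / HP_KW        # horsepower-hours per gallon
human_hours = GALLON_KWH / HUMAN_KW  # hours of human labor per gallon

print(round(hp_hours), round(human_hours))  # prints 48 480
```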

This obviously was propelled by a scientific revolution, as we grew in understanding nature and how to harness its forces. But a book by historian Joyce Appleby, The Relentless Revolution: a History of Capitalism, illuminates something else little understood: how the industrial revolution also depended upon what was really a second agricultural one.

The first introduced food growing, but it was still an economy of scarcity. It took almost everybody working in the fields to feed a population, and even then it sometimes failed, with devastating effects. This was captured by Thomas Malthus, writing in 1798 of the interplay between population and food supply. In fat times, population would grow (fewer children dying), but that would inexorably outstrip the ability to raise food production. Come a lean period and population would fall (more children dying).

That cycle had indeed long been repeating; as Appleby puts it, Malthus was an excellent prophet of the past. But he wrote just when the picture was changing.

The industrial revolution’s factories were something new, and great numbers of people flocked to work in them. Yet how was that possible if almost everyone was needed on the farm just to raise enough food? And if most people were thus barely subsisting, whence came the means to buy all the new goods spewed out by factories?

The answer lies in that second agricultural revolution. It was not a thunderclap, but a gradual process, beginning around the early seventeenth century, in the Netherlands and Britain. A key factor limiting food production was soil depletion, requiring fields to “rest” every third year or so. But by fiddling with different crops, the Dutch and Brits found ways around this. Other innovations too emerged, so that, over a couple of centuries, food output (relatively speaking) soared.

So no longer did we need everybody in the fields (today it’s down to almost nil). Many could work instead in factories. Of course, that was hellish — but not as awful as life stuck on the farm. Anyway, now we could produce sufficient food plus factory goods. That made society wealthier overall — with enough money in people’s pockets to buy the added stuff.

This did not unfold without a concomitant revolution in thinking. Previously fatalism was the prevailing mindset. People did believe history went in cycles. But now they could see not only change, but positive change. The word “progress” came into use. And as Appleby writes, it was more important to believe governing arrangements could be changed than that they should be.

Likewise, we take for granted the idea of striving to improve one’s life. But there was no such thing in feudal society. People worked just to exist, nothing more. They could not have imagined the kind of life we have today. Only with the revolutions into modernity did we begin to grasp the concept of proactive self-betterment. Observers were actually surprised to see ordinary folks attracted to unaccustomed consumer goodies. This sparked a virtuous circle, energizing people into the kind of industrious striving that, in turn, turbocharged our continuing agricultural and industrial revolutions.

Of course this too meets with censure. The market economy and “consumerism” that fuels it are condemned. Yet this was also a social revolution, creating a bold new idea of the individual. The word “individual” was never even applied to human beings before the seventeenth century. Now the social chasm between oligarchs and commoners was bridgeable. The egalitarian ideal that many today put in opposition to capitalism is in fact a product of that very thing.

Today we’re at the start of another great revolution. Just as increased agricultural productivity freed people to work in factories, now more efficient manufacturing frees us to create wealth in yet newer ways. How it works out, we’ll see.

Las Vegas: It’s not about “guns”

October 2, 2017

After every mass shooting, the NRA says it’s not about guns. They’re right.

It’s about military style assault weapons.

The Supreme Court, in District of Columbia v. Heller, finally ruled that the Second Amendment’s “right to keep and bear arms” is an individual right. But no rights are ever absolute. Your free speech doesn’t allow libel. Your right to drive doesn’t include driving drunk. All rights are subject to reasonable laws to protect society, and people against harm by others (the basis for all law). Surely that applies — indeed, especially — to weapons. Does the “right to keep and bear arms” extend to howitzers? Nuclear weapons? I don’t think so.

And what about military style assault weapons? The kind used in Newtown, San Bernardino, Orlando, and now Las Vegas, to rapidly spray bullets and kill a lot of people very fast? These weapons have no legitimate sport shooting or hunting use, nor are they suited to protecting oneself against intruders. Their only function is to kill a lot of people fast.

Allowing anyone to buy such weapons is simply insane.

Stop talking about “guns” and “gun control.” It just raises the false concern that people’s guns will be taken away (which could never happen). Let’s talk instead just about military assault weapons. Maybe if gun evangelists understood that their right to ordinary guns is not at issue, they’d see the reasonableness of banning private ownership of military assault rifles, howitzers, and nuclear weapons.


Human history in a nutshell, Part 1: Evolution

September 28, 2017

It was about six million years ago that we last shared a common ancestor with a biologically distinct gang — chimps and other apes. But our species, Homo sapiens, is only a couple of hundred thousand years old. Between those two chronological markers, a lot of evolution happened.

In fact, over those six million years, quite a large number of more or less “human” or proto-human species came and went. The line of descent that produced us was only one of many. All the others petered out.

As the story unfolded among all these variant creatures, two different basic strategies evolved. Call one vegetarian. Its practitioners relied on a menu much like that of modern apes — fruits, nuts, berries, etc. A pretty reliable diet, but due to low nutritional content, much energy was devoted to eating and digesting — they literally had to consume a lot to get the energy to consume a lot. A big digestive system was required, diverting resources that otherwise could have gone to their brains.

The other group went for brains rather than guts. This required a high energy diet, i.e., including meat. But meat was hard to get, for such weak little critters lacking fangs and claws. Getting meat required brains.

All well and good, except that bigger brains meant bigger heads, a bit of a problem for mothers giving birth. And that was exacerbated by a second evolutionary trajectory. Hunting meat proved to be a lot easier for early humans if, instead of going on all fours, they could efficiently walk upright and even run. Doing that called for changes to pelvic architecture, which had the effect of narrowing the birth canal. So the bigger-headed babies had to fit through a smaller opening. Something had to give.

What gave was the gestation period. If humans functioned otherwise like apes do, babies would spend not nine months in the womb but twenty, and come out ready for action. But their heads by twenty months would be so big they couldn’t come out at all. So we make do with nine months, about the limit mothers can bear, and the least babies can get by with. Consequently they require lengthy attentive nurturing, which of course has had a vast impact upon humans’ way of life.

Earlier birth thus meant longer childhood, and a lot of a person’s development outside the womb as his or her brain responds to things around it. This in turn is responsible for another huge fact about human life: we are not cookie-cutter products but very different one from another. And that fundamental individualism, with each person having his own perspectives and ideas, played a great role in the evolution of our culture and, ultimately, civilization.

Another key part of the story was fire. We romanticize the mastery of fire (e.g., in the Prometheus myth) as putting us on the road to technology. But that came much later. Fire was our first foray into taking a hand in our own evolution. It began with cooking. Remember that trade-off between gut and brain? Cooking enabled us to get more nutrition out of foods and digest them more easily. That enabled us to get by with a smaller gut — and so we could afford a bigger brain.

This omnivorous big-brain model seemed to work out better than the vegetarian one; the vegetarians died out and the omnivores became us. (This is not intended as a knock on today’s vegetarians.) But notice again how much actually had to be sacrificed in order to produce our supersized brains. And that this was a bizarre one-time fluke of evolutionary adaptation. It happened exactly once. None of the other zillions of creatures that ever existed ever went in this unique evolutionary direction.

In other words, if you think evolution of a species with world-dominating intelligence was somehow inevitable or pre-ordained, consider that it didn’t happen for 99.999+% of Earth’s history. It was an extreme freak in the workings of evolution.

Indeed, it’s a mistake to conceptualize “evolution” as progress upward toward ever greater heights (culminating in Homo sapiens). It’s because of that erroneous connotation of progress that Darwin didn’t even use the word “evolution” in his book. The process has no goal, not even the “selfish-gene” goal of making things good at reproducing themselves. It’s simply that things better at reproducing proliferate, and will grow to outnumber and eclipse those less good at reproducing. Our species happened to stumble upon a set of traits making us very good reproducers. But insects are even better at it, and there are way more of them than us.

(Much of what’s in this essay came from reading Chip Walter’s book, Last Ape Standing.)

The curse of Ham

September 26, 2017

I have written about Kentucky’s Creation Museum. Should be called the Museum of Ignorance, since its exhibits contradict incontestable scientific facts. Like the dinosaurs dying out 65 million years ago. The museum is off by 64.99+ million years. It shows humans living beside them. This might be fine as entertainment, but not for an institution purporting to be educational.

Earth to Creationists: I’m more than 6,000 years old. Around a million times older.

The museum was built by an outfit called Answers in Genesis. Not content with this slap in the face to intelligence, Answers is now building a replica Noah’s Ark. The project has received an $18 million tax break from the State of Kentucky (specifically, a sales tax abatement). How does this not flagrantly flout constitutional separation of church and state?

Ken Ham

The head of Answers in Genesis is a man named Ken Ham. Please linger upon this name.

For one thing, ham is just about the most un-kosher thing in Judaism. Kentucky’s public support for a Ham-centric project is plainly a gross insult to its citizens of the Jewish faith.

But there’s a much bigger issue. The name of Noah’s third son was Ham. Coincidence? Not very likely. This Mister Ken Ham must, beyond any doubt, be a direct descendant of Noah’s third son. He has never denied it; and it certainly explains his ark fetish.

Now, the Bible is very clear about this fact: Ham was cursed, for a grave insult to his father. Scholars differ in their exact interpretations. Some say Ham castrated Noah; others that he buggered Noah. Either way, it wasn’t nice, and so Ham was cursed by God. Ham’s own son Canaan was the progenitor of the Canaanite people, who of course were later wiped out by a God-ordered genocide; and also of all Africans, which is why they’re all cursed too.

But here is the point. In this Kentucky Ark project, Mister Ken Ham must sneakily be aiming to whitewash the above family history, employing lies to mislead the public and undo the curse that God, in his infinite wisdom and justice, laid upon all of his line. This is out-and-out blasphemy.

Some will say it should be left to the Lord to visit his divine justice upon this doubly accursed latter-day Ham. But of course God-fearing people have rarely been content to defer to that ultimate justice, and have instead so often taken matters into their own hands, with fire and sword.

I’d go with the latter.

Norman Dorsen, Colin Bruce, and mortality

September 24, 2017

One thing that happens as you get old is that the world is increasingly populated by ghosts.

I graduated from NYU Law School in 1970. It puts out a yearly magazine that’s gotten glitzier over the years as the school has grown in stature. Mainly I’ve enjoyed seeing in it news and photos of people I’d known, sometimes classmates, mostly professors. But gradually they have faded away (presaging my own future); the magazine became full of strangers. Yet one face I could always still count on seeing was the eternal, ubiquitous Norman Dorsen.

He was my constitutional law professor. When I opened the latest magazine, I found a full page photo of Norman Dorsen. Because he had died.

My NYU professors were not faceless anodynes; they included some powerful, dynamic personalities I still remember vividly. But even among them Dorsen was a monumental figure. Though never its actual dean, Dorsen came to be the school’s embodiment, and central to its aforementioned rise in stature over the decades.

Fonda as inmate Gideon, preparing to mail his petition to the Supreme Court

He had co-written the Army’s legal brief relating to the Army-McCarthy hearings. He also wrote a brief in Gideon v. Wainwright (Henry Fonda played Gideon in the movie; it still gooses my emotions); in the Nixon tapes case; and helped write one in Roe v. Wade. He was president of the ACLU; and director of NYU’s Civil Liberties Program for 56 years.

Under Dorsen’s leadership, in 1977, the ACLU took one of its most controversial stances: backing the right of Nazis to march in Skokie, Illinois. Dorsen considered this a civil liberties litmus test. (I too am an absolutist on freedom of expression.)

Norman Dorsen

Norman Dorsen was a man of rigor and seriousness. One episode sticks strangely in my memory. Law school classes were mainly socratic dialogs analyzing past actual cases. But with grades based solely on the final exam, students were often lax about class discussion. One day Dorsen began the session and quizzed a student, who couldn’t answer. After one or two others weren’t prepared either, Dorsen, visibly pissed off, simply closed his book and walked out.

The only such instance in my law school career. Gosh, almost half a century ago.

The same magazine also has a smaller obit for George Zeitlin, my tax law professor.

Colin Bruce

And the same day’s mail brought World Coin News with Colin Bruce’s obituary. I don’t recall ever meeting him in person, but we corresponded for decades. Colin too was a living landmark. He’d been responsible for creating The Standard Catalog of World Coins in 1972. Non-collectors can’t appreciate this. But previously, evaluating foreign coins was mostly guesswork. What a blessing to have listings for every country in one (large!) book. Thank you, Colin Bruce.

Pricing accuracy was still always problematical. And sadly, after Colin retired, the catalog went downhill, accumulating errors and stupidities that never were corrected. Finally I published a broadside detailing the problems — with no response — and resolved to boycott further annual editions. This will make my coin dealing harder. The only consolation is that it won’t be that much longer.

Time was, my life stretching ahead felt so long it might as well have been forever. Now the end feels so near it might as well be tomorrow.

A different idea about health care

September 22, 2017

As Republicans try one more time to pass a bill to strip millions of their health care, a huge policy crap-shoot without benefit of hearings, public debate, or input from experts, here’s another idea.

We keep hearing that middle class wages have flatlined over a long period. Actually, recent data shows a significant uptick. But anyway, such numbers are misleading because they normally reflect only salaries — and not fringe benefits — which comprise a growing part of total employee compensation. The big one is health insurance.

About 150 million Americans get health insurance through their employers, and its value (i.e., its cost) now averages about $18,000 annually. Combining this with salaries tells us that total earnings have not stagnated, but risen substantially.

This also means Americans effectively spend a growing part of their incomes on health care. It’s even more than that $18,000, what with rising deductibles, co-pays, etc. Of course, health care is something of value, improving quality of life, worth paying for. But paying for health insurance is not quite the same thing. Healthy people get little benefit. Indeed, the whole system is set up for them to subsidize the sick; and Obamacare expanded on that.

Overall, Americans spend a lot more on health care/insurance than other advanced countries, without being healthier. This is fundamentally because it’s not a competitive market. There’s really no shopping around for health services; the end-user isn’t usually the one who’s paying. Obamacare didn’t fix this.

Recently, during a medical appointment with one doctor, another stopped in to “consult,” for a few minutes. He neither examined nor treated me. He billed $405. Because he could. This is why health care costs are out of control.

A NY Times op-ed last November (by Professors Regina Herzlinger, Barak Richman, and Richard Boxer) proposed a simple reform that would have a big impact.

The main reason our system evolved the way it did is that employee health benefits aren’t taxed like regular wages (which, by the way, makes them even more valuable to workers, enlarging the impact on the “wage stagnation” picture). But, as the Times writers point out, workers have little control over this enormous expenditure made on their behalf; they cannot try to economize or shop around for insurance. If they could, they’d opt for a wide variety of different plans.

So the writers propose that, without losing the tax exemption, moneys earmarked for health insurance be given to employees to purchase it themselves. If you spend the whole $18,000, fine; but if you spend less, you get to pocket the savings. (Even if you’re taxed on that part, it’s still a big benefit.) This would give insurance companies a strong incentive to develop a whole array of varied (and often cheaper) options, to compete for those consumer dollars — an incentive almost wholly lacking in the existing system.

It would also make the market for health care itself more like, well, a market. Competition among insurers would in turn exert pressure on providers to likewise innovate to offer more efficient, cost-conscious care. Meantime many more people would choose to use insurance as it was originally conceived, that is, to cover only big expenses, not routine ones. For the latter they would shop around, again mindful of costs. That would have a huge positive impact on the way health care is provided — and billed.

This reform seems like a no-brainer. And a huge vote winner too. Why has no politician latched onto this? Do the insurance companies (who wouldn’t like breaking open their comfy status quo) really have the whole system locked up?