Archive for the ‘history’ Category

The Soul of the First Amendment

November 27, 2017

How far should free speech go?

Floyd Abrams is the country’s leading First Amendment lawyer. I bought his book, The Soul of the First Amendment, at the recent symposium on the post-truth culture (mainly for the opportunity to shake his hand).

The book’s introduction discusses my favorite painting: Norman Rockwell’s Freedom of Speech (in his “Four Freedoms” series). If not an artistic masterpiece, it’s a gem at conveying an idea that’s very dear to me. Abrams explains that it illustrates an actual event Rockwell witnessed, at a Vermont town meeting. The speaker was a lone dissenter against a popular proposal. He’s an ordinary working-class Joe. A telling detail is the paper protruding from his pocket. It suggests he’s not talking through his hat, but has gathered some information — a point of particular resonance today. Even more resonant is the painting’s other key feature — the respectful listening by the man’s fellow citizens. For me this painting captures America — and civilization — at its best.

Freedom of speech in America is enshrined by the First Amendment — “Congress shall make no law . . . abridging the freedom of speech or of the press . . . .” (The Fourteenth Amendment made it applicable against state governments too.) A key point of the book is how unique this actually is, not only in history, but in today’s world. In fact, no other country so exalts the inviolability of free speech. All others subject it to varying restrictions. And mostly they involve what are basically political concerns — the very sphere wherein freedom of expression is actually the most consequential.

People have been jailed in Europe for the crime of Holocaust denial. That is, advocating a certain interpretation of history. Europe also has many laws against “hate speech,” quite broadly (if vaguely) defined. Abrams cites a Belgian member of parliament prosecuted for distributing leaflets calling for a “Belgians and European First” policy, sending asylum seekers home, and opposing “Islamification.” His sentence included a ten-year disqualification from holding office. It was upheld by the European Court of Human Rights! And such a case is not unusual in Europe. Actress Brigitte Bardot was fined 15,000 euros for writing a letter objecting to how French Muslims ritually slaughter sheep.

America is a free speech paradise in comparison not only to such other places, but to our own past. The First Amendment actually played almost no role in our law and culture until around the mid-20th century. Abrams cites a 1907 Colorado episode. A lame-duck governor, defeated for re-election, exploited a newly passed law to pack the state supreme court with judges who thereupon ruled that he’d actually won the election. A newspaper published an editorial criticizing this ruling. The Colorado court held the editor in contempt. And that was upheld by the U.S. Supreme Court, in an opinion written by the famed Oliver Wendell Holmes.

The idea underlying all these cases is that rights are never absolute, being always subject to a balancing against the public interest. I myself have written that the Second Amendment’s “right to keep and bear arms” does not mean you can possess howitzers or nuclear weapons. And freedom of religion doesn’t cover human sacrifice. So it’s similarly argued that freedom of speech and press must be balanced against other public goods, and may sometimes be required to give way.

Abrams argues, however, that the First Amendment’s language, absolute on its face, reflects its authors having already performed such a balancing. The benefits to society, to the kind of polity they aspired to create, of unfettered freedom of expression were balanced against what public good might otherwise be promoted. And in that balancing, freedom of expression won out, being found the weightier. It’s more important to have a society with such freedom than, for example, one where religious sensibilities are protected from insult — or where judges are shielded from editorial criticism. That’s why we have the First Amendment, and why it actually does not permit the kind of balancing underlying that 1907 Colorado case. Justice Holmes himself came to repent his decision there, dissenting in similar future cases, and eventually the Court overturned its Colorado ruling.*

As Abrams stresses, the issues raised by the Belgian and Colorado cases go to the heart of the matter: free expression with regard to issues of public concern. This is crucial for meaningful democracy, which requires open debate and dissemination of information, with contesting advocates each subjecting the other’s views to critical scrutiny. Without that, voting itself is meaningless.

The exact same considerations were central to a case Abrams argued before the Supreme Court, which he discusses. He there contended that the government, because of the First Amendment, may not criminalize distribution of a film critical of a presidential candidate. (I quoted Abrams about it on this blog.) He won the case. And given our common understanding of free speech in America, that might seem a no-brainer.

The case was Citizens United, where the movie in question had corporate funding. Abrams is unrepentant and defends the Court’s decision, which has been ferociously assailed for affirming that businesses have the same rights to free speech and public advocacy that individual citizens have, and for allowing them to spend money in such endeavors. Abrams rejects the effort to make a distinction between money and speech, arguing that no right can be meaningful without the concomitant right to spend your money in its exercise. And he insists that businesses, being part of society, must have the right to participate in public debate.

Abrams cites here a case in which Nike was accused of corporate misdeeds and sought to rebut the charges with press releases and publications. For that, the company was sued in California state court under a consumer protection law barring false advertising and the like. The real issue was whether the First Amendment protects Nike’s freedom of speech. When the case went to the U.S. Supreme Court, the New York Times submitted a brief which Abrams quotes: “businesses and their representatives have just as much a right to speak out on any public issue as do interest groups and politicians . . . .” And because issues concerning businesses “are increasingly fundamental to the world’s social and political landscape, the withdrawal of corporate voices on those issues from the media would deprive the public of vital information.” Abrams deems the newspaper’s stance there starkly at odds with the position it later took on Citizens United, where the issue was really the same. Issue advocacy, and backing candidates for office, stand on identical ground as far as the First Amendment is concerned.

For me personally, all this is not abstract, but essential to my being. Abrams discusses the landmark case of Times v. Sullivan, which particularly protects criticism of public officials. That saved my butt in 1973 when I was sued for millions by guys whose misconduct I mentioned in a book on local politics. I love the freedom to express myself like that, and in this blog. I’ve been called fearless but the fact is, in America, there’s nothing to fear. In most other places blogging like mine requires a courage I probably don’t have. People literally risk their lives, and some have been killed.

Abrams notes Europe’s “right to be forgotten,” with search engines being required to erase true information about people when requested, such as reports on criminal convictions. I blogged about this in 2009 (again quoting Abrams), when two convicted German murderers, Wolfgang Werle and Manfred Lauber, sued to erase their names from Wikipedia. In defiance of that affront to freedom of information, I made a point of putting their names in my blog post, and do so again here. God bless America and the First Amendment!

* Yet even this right isn’t actually absolute. The First Amendment doesn’t protect libel or slander, child pornography, or shouting “fire!” in a crowded theater (as the same Justice Holmes famously explained).


The Jones Act — How protectionism sank our fleet

October 28, 2017

Remember Trump ordering a temporary waiver of the Jones Act, to get help to Puerto Rico? What was that all about?

The Jones Act, passed in 1920, limits shipping between U.S. ports to American-built, American-owned, and American-crewed vessels. This was to shield the U.S. shipping industry from foreign competition. A textbook example of protectionism. Though usually protectionism isn’t so blatant, telling foreign business to get lost altogether.

Railroads also lobbied for the Jones Act, fearing that foreign ships would undercut them too in the business of transporting goods. And railroads did benefit, because ships built and crewed by Americans are so much costlier that all other forms of transport are cheaper in comparison. Thus, whereas 40% of Europe’s domestic freight goes by sea, just 2% does in America (despite our 12,383-mile coastline).

The Jones Act not only inflates the cost of U.S. sea transport, above what it would be with open competition; it inflates land transport costs too, by eliminating some of its competition. All those higher costs go into the prices for things we buy. Protectionism protects businesses — well, certain favored ones — at the cost of screwing consumers — and other businesses — here, ones that ship their products. Competition always benefits consumers, and the economy as a whole.

And protectionism doesn’t save jobs — because a business that isn’t competitive without it isn’t a good long term bet anyway. The Jones Act shows this. It could protect U.S. ships against foreign ones, but not against trains, trucks, and planes. In fact, the Act sank the U.S. shipping fleet. As recently as 1960 it was 17% of the world total; today just 0.4%.

That’s why the Jones Act had to be waived for Puerto Rico — there just weren’t enough U.S. ships for the job. Indeed, while the collapse of merchant shipping leaves most of the country with reasonable non-water alternatives, that of course is not true of places like Puerto Rico, Hawaii, or Alaska. (Hawaiian cattle ranchers regularly fly animals to the mainland!) In such places the impact on consumer prices and the cost of living is severe — yet one more reason why Puerto Rico’s economy was so dire even before the storm.

The Jones Act should surely be repealed — but lobbyists from the sailors’ unions and ship owners — the few that are left — are probably still politically powerful enough to prevent it.

Theresa Cooke: Joan of Arc

October 26, 2017

She gave me this photo for my book

Theresa Cooke (like me) came to Albany in 1970. She was shocked by the misfeasance and non-transparency of local government, controlled for 50 years by the storied O’Connell Democratic machine. As an engaged citizen, she would take it on.

I first encountered her, must have been in ’71, at some civic meeting at Chancellor’s Hall, and vividly recall her dynamic speech on her fight to open Albany’s books. I too was battling the machine, in the trenches, as a Republican ward leader (I’ve written about that), and published a book dissecting the machine. This was when the local GOP was on the side of the angels, under a combative county chairman, Joe Frangella. We stood for truth, justice, reform, and the American way.

Theresa Cooke became a key figure in our moral crusade. Fiercely intelligent and committed, indefatigable, undeterrable, she seemed to me our Joan of Arc on a white horse. How thrilling it felt to join in a standing ovation for Theresa Cooke at a Republican dinner.

After narrowly losing a city election in 1973, the following year Cooke won a squeaker, after a long recount, as County Treasurer. In ’75 the county government was being reorganized, with our first county executive, and she was running. But the GOP, with Frangella now gone, balked at backing her and nominated a third candidate. That split the anti-machine vote, enabling the Democrat, Jim Coyne, to get in. (He wound up in prison.)

That was the end of the Albany Republican party as a moral force. At the following year’s county meeting, they wanted to install as city chairman a guy I considered a creep. I spoke in opposition. When I mentioned Theresa Cooke’s name, it was booed. That was when I knew I had to quit. (The creep wound up in prison too.)

Theresa Cooke likewise exited the political scene. Thirty-odd years later, at a music festival, I spotted an elderly woman. Not sure I recognized her, I had to ask, “Are you Theresa?” But she still had that sparkle in her eye. We had a nice chat.

When I saw on Tuesday’s local front page a piece by ace columnist Chris Churchill about Theresa Cooke, I realized it must be because she’d died. On Saturday, at 82.

I recently wrote that as I age, the world seems populated by ghosts. During research for my O’Connell book, I interviewed a very old man, John Boos, who’d opposed the machine at its beginnings. It seemed like hoary ancient history, with Boos a living mummy. My own political career, I soberingly realize, is now as far in the past as his was then.

“Media in the Age of New Technology: Fake News, Information Overload & Media Literacy”

October 21, 2017


(Panelists: Tim Wu, originator of “net neutrality”; Franklin Foer; Maria Hinojosa; David Goodman. Moderator: news legend Bob Schieffer)

“Satan has come back to Earth disguised as a smart phone.” The communications revolution has profoundly affected our culture, especially how we get our news. Most now get some or all from social media — but it’s not vetted.

Facebook and Google until recently saw themselves as tech companies, but they’ve really become media gatekeepers (the most powerful in history). They profit from attracting eyeballs. And having a ton of data which clues them in to what’s in your head, their algorithms try to show you things you’re apt to click on.

In the 2016 campaign, Trump seemed to understand it was similarly a battle for attention. His campaign was tailored to getting it, and the media played along, giving him around $5 billion worth of free air time, far more than other candidates. It made the election into a circus; but people like circuses. (Clinton in contrast didn’t even try playing that game, instead being wary of any unscripted TV moments.)

In the past, mainstream TV and print media spoke with authority, but the internet has “democratized” the news landscape, and sources of news no longer seem to matter much. Thus we now lack a common basis of facts in our political discourse.

Indeed, it’s a golden age of propaganda, whose essence is the “big lie” and creating a seamless version of truth. Facebook is a hothouse where such own-realities can flourish. Its content, moreover, is vulnerable to cynical manipulation, as the Russians apparently exploited. But the problem is how to combat that without a kind of censorship that impedes political discourse and violates our norms of free expression.

David Goodman is the brother of Amy Goodman (of radio’s Democracy Now!), who was on the next panel. Both trotted out the old canard that the Iraq war was based on lies, and whined that the anti-war side wasn’t given enough press coverage. Amy Goodman harped on the same claim regarding the Bernie Sanders campaign; climate change; and the Dakota pipeline protests. Such complaints are a staple of lefty grievance polemics. In fact all four stories received ample coverage. And the “Bush lied” trope is itself a lie; almost everyone believed Iraq had weapons of mass destruction.

“Presidents and the Press: Trump, Nixon & More”

October 21, 2017

(Schieffer; Amy Goodman; historian Douglas Brinkley; Harry Rosenfeld, who was Woodward and Bernstein’s editor; and Shane Goldmacher)

Rosenfeld: in the Watergate story, “we escaped by a hair,” thanks to the Nixon tapes; but there’s no clear path to escaping our current predicament. He saw much resemblance to Nazi Germany (where he grew up). Hitler similarly attacked the press — a key institution for holding government accountable.

Brinkley said historians’ evaluation of Trump’s presidency already rates him the worst ever. Well, duh!

Again explaining Trump’s win was a big topic. The celebrity factor was important. Trump drew surprisingly huge crowds in the hinterlands, outside population centers. And his basic message resonated. Goldmacher noted that when the two candidates’ convention speeches were polled, without revealing their names, Trump’s outpolled Clinton’s. Schieffer said many voters knew Trump was a flawed candidate but decided, “Nothing is working, I’ll take a flyer on this guy.”

“Race, Class & the Future of Democracy”

October 21, 2017

(Panelists Carol Anderson, Jose Cruz, Juan Gonzalez, Adrian Nicole LeBlanc; moderator Gilbert King)

“Make America Great Again” — and when was it great before? When blacks and other minorities were repressed. That’s the slogan’s subtext. Trump’s announcement speech, calling Mexicans “rapists” and so forth, set the tone — of stopping the “brown tide.” But he cannot. Trumpism is really that mindset’s “last gasp.”

In 1896, Plessy v. Ferguson legalized Jim Crow. But Justice Harlan — a Kentuckian who had once supported slavery! — dissented, calling the constitution color-blind. His grandson joined the Court the year after its unanimous 1954 Brown v. Board decision overturned Plessy. That sparked a big backlash of southern resistance and the disappearance of moderates. But it also gave rise to the civil rights movement.

The 2008 election drew 15 million new voters who believed they had a stake in this democracy. Most voted for Obama. But his election provoked another white backlash, which indeed Trump has empowered. And he’s delivering the goods to those supporters — like “throwing chum to sharks.”

This includes policy reversals like militarization of police; backtracking on post-Ferguson initiatives; attacking affirmative action; and reigniting the drug war. Previous drug epidemics (heroin, crack) affected mainly minority communities, so the drug war really amounted to a war on those communities. However, today’s opioid crisis is largely white, so the response is different — many people (though not Jeff Sessions) realize that the drug war and criminalization approach is not the right one.

Anderson said the white working class must understand that their interest is not in whiteness but in having a better society overall. Gonzalez meantime suggested that today’s young people are not carrying the ethnic baggage of prior generations.


October 8, 2017

My wife applied for a writer-in-residence grant at Gettysburg. She didn’t get it, but meantime became interested in the history, and we decided to go there. It seemed timely to visit a symbol of when America was even more divided than now.

Gettysburg was the Civil War’s biggest battle; the Confederates’ lone northern invasion was repulsed. It seemed logical to start at the Visitor’s Center. Finding the parking lot was easy. But the route from there to the building was unobvious. Another jolly couple was having the same problem. We flagged down a guy in a motorized cart. He immediately guessed, “You folks want to know how to get to the Visitor’s Center.”

Once there, we began with a movie, narrated by — who else? — Morgan Freeman (a Mississippi-raised descendant of slaves). It was a work of art, beautiful and moving, putting the battle in historical context; making clear that the war was about slavery, and nothing else.

Actually, what blew things up was the question of allowing slavery in the vast new plains and western territories. Previously, we’d managed to keep an equal balance between slave and free states, but northerners feared the eventual implications of ending that balance by adding new slave states. That was the issue that birthed the Republican Party. Southerners meantime feared the consequences of slavery being confined within its existing borders. They were terrified lest their slave population outgrow their ability to control it, and saw the new territories as a potential safety valve. (This I’d gleaned from a book I’ve previously reviewed.)

The film did note that most northerners were not keen to abolish slavery altogether — immigrants and northern working men feared job and wage competition from ex-slaves flooding northward. (This has current-day resonance. As it turned out, most freed slaves stayed put.)

We then toured the battlefield — a vast expanse filled with monuments and memorials. We started with the Confederate ones before they could be removed, our guide quipped. And then the museum, also beautiful and sobering, chronicling the whole war. One had to be struck by just how much human suffering slavery caused — not only to the slaves, but now too in this climactic apocalypse of violent conflict on account of slavery.

Not yet sufficiently battle-hardened, we also viewed the “cyclorama,” a painting of the battle measuring 15,834 square feet (not a typo). This was 1884’s version of an epic movie. Then, in town, a diorama show (quite good) together with another exhibit that put us (quite loudly!) in the middle of Pickett’s charge. And, before we left, my wife deemed it a duty to visit the cemetery and the site there of Lincoln’s Gettysburg Address.

Me and Abe

Meantime we also visited Eisenhower’s farm, and numerous shops containing more relics, souvenirs, Civil War books, toy soldiers, and other memorabilia and militaria than could ever conceivably be sold.

I wrote recently about history’s blunders, and its contingent character. Gettysburg surely epitomized that. It could have gone either way — a very close fight. The South seemed to be winning till the last afternoon of the three-day battle. The “fog of war” came frequently to mind, with the impossibility of commanders having the full picture of what was happening, nor having timely communications.

Meantime, that book I reviewed about history’s mistakes ought to have included southern secession. Most analysts think the South’s chances of victory were slim, given the North’s economic strength. Secession was a giant gamble that turned out very badly; the South was devastated. But again this is hindsight. It might not have been clear in 1861 that the North would fight at all, let alone so fiercely. Perhaps secessionists anticipated a fait accompli, the North backing off from the extreme step of all-out war.

A southern victory at Gettysburg might not have won the war. But change any detail of history and we cannot know how the subsequent story would have unfolded. Lee winning at Gettysburg might have tipped the 1864 election to a candidate who’d negotiate a peace without restoring the union or freeing the slaves. Today’s world would be very different!

However, the nation “conceived in liberty and dedicated to the proposition that all men are created equal” did endure. And not only freed the slaves — a more abject and despised “other” could scarcely be imagined — but made them citizens and voters. The nobility of these enactments, 150 years ago, takes one’s breath away.

It was a different Republican party in charge.

Human history in a nutshell, Part 2: Civilization

October 4, 2017

Previously I recapped our biological evolution. Now for civilization.

It’s a tale of two revolutions: agricultural and industrial. For around 95% of our existence we were hunter-gatherers. And for around 98% of the remaining time, mostly farmers.
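Those round percentages check out under some commonly assumed dates (not stated in the text): a species roughly 200,000 years old, agriculture beginning about 10,000 years ago, and an industrial era covering roughly the last 200 years. A quick arithmetic sketch:

```python
# Sanity-check the "95% hunter-gatherers, 98% farmers" timeline,
# using assumed round numbers: species age ~200,000 years,
# agriculture ~10,000 years ago, industrial era ~200 years.
species_age = 200_000
farming_era = 10_000
industrial_era = 200

hunter_gatherer_share = (species_age - farming_era) / species_age
farming_share = (farming_era - industrial_era) / farming_era

print(f"hunter-gatherers: {hunter_gatherer_share:.0%} of our existence")
print(f"farmers: {farming_share:.0%} of the time since agriculture")
# → hunter-gatherers: 95% of our existence
# → farmers: 98% of the time since agriculture
```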

About 10,000 years ago, butting up against the limits of a hunter-gatherer lifestyle, we invented something different: growing food and domesticating animals. This was truly a revolution, humanity’s first declaration of independence — from nature’s cold mercies. Now we could exert some control, some mastery, over our conditions of existence.

Yet some view this negatively, as our “fall” from a prior paradise of harmony with nature, presaging our hubristic “rape” of the planet. Yuval Noah Harari’s book Sapiens sees agriculture as a curse that actually made life worse in many ways: harder, less healthy, less fair and egalitarian.

Some of this could not be evident to people at the time, or else they would not have embraced agriculture. Life’s key problem was sustenance, and agriculture (for all its vagaries) seemed to enhance food security.

Harari may be right that this was a mistake — during most of human history. Only in the last century or two have we really gotten the payoff (because we finally truly mastered the thing).

But meantime agriculture made possible civilization. Now we could settle down, and stockpile production surpluses. This soon led to cities, and division of labor, with some people able to become artisans and take on diverse societal roles. Hierarchies formed, with bureaucratic governments to administer things (and fight other societies). That too Harari deems a curse, and maybe again it was, for most of history. But the rich, complex civilization (with all its material comforts) we enjoy today could never have evolved without agriculture starting it.

Which brings us to the second revolution — the industrial one. It was mainly an energy revolution. At first limited by just our own muscle power, we then exploited animals (horses, oxen) to do much more work. That was pretty good. But nothing compared to how much energy and work is gotten out of a steam engine. And a single gallon of gasoline contains energy equal to about 49 horsepower hours, or 500 hours of human work (and I don’t mean paper pushing).
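Those gasoline figures can be roughly reproduced with back-of-envelope numbers (the energy content and human-output values below are common estimates, not from the text; published figures vary):

```python
# Rough check of the gasoline comparison. Assumptions: ~36.6 kWh
# of energy per US gallon of gasoline (a commonly cited value),
# 1 mechanical horsepower = 0.7457 kW, and ~75 W of sustained
# human work output.
KWH_PER_GALLON = 36.6
KW_PER_HORSEPOWER = 0.7457
HUMAN_OUTPUT_KW = 0.075

horsepower_hours = KWH_PER_GALLON / KW_PER_HORSEPOWER
human_work_hours = KWH_PER_GALLON / HUMAN_OUTPUT_KW

print(f"~{horsepower_hours:.0f} horsepower-hours per gallon")
print(f"~{human_work_hours:.0f} hours of human work per gallon")
# → ~49 horsepower-hours per gallon
# → ~488 hours of human work per gallon
```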

This obviously was propelled by a scientific revolution, as we grew in understanding nature and how to harness its forces. But a book by historian Joyce Appleby, The Relentless Revolution: A History of Capitalism, illuminates something else little understood: how the industrial revolution also depended upon what was really a second agricultural one.

The first introduced food growing, but it was still an economy of scarcity. It took almost everybody working in the fields to feed a population, and even then it sometimes failed, with devastating effects. This was captured by Thomas Malthus, writing in 1798 of the interplay between population and food supply. In fat times, population would grow (fewer children dying), but that would inexorably outstrip the ability to raise food production. Come a lean period and population would fall (more children dying).

That cycle had indeed long been repeating; as Appleby puts it, Malthus was an excellent prophet of the past. But he wrote just when the picture was changing.
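The cycle Malthus described can be sketched as a toy simulation (every number below is invented purely for illustration): population creeps up when food per head is ample, falls when it is scarce, and so ends up tracking whatever the land can feed.

```python
# Toy Malthusian cycle: population grows in fat times, shrinks in
# lean ones, while food output improves only glacially. All
# parameters are invented for illustration.
food_output = 100.0   # units of food per year
population = 90.0     # mouths to feed

for year in range(200):
    food_per_head = food_output / population
    if food_per_head > 1.0:     # fat times: fewer children dying
        population *= 1.02
    else:                       # lean times: more children dying
        population *= 0.98
    food_output *= 1.001        # food production barely improves

# Population oscillates around the food supply, never escaping it.
print(f"after 200 years: food ~{food_output:.0f}, population ~{population:.0f}")
```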

The industrial revolution’s factories were something new, and great numbers of people flocked to work in them. Yet how was that possible if almost everyone was needed on the farm just to raise enough food? And if most people were thus barely subsisting, whence came the means to buy all the new goods spewed out by factories?

The answer lies in that second agricultural revolution. It was not a thunderclap, but a gradual process, beginning around the early seventeenth century, in the Netherlands and Britain. A key factor limiting food production was soil depletion, requiring fields to “rest” every third year or so. But by fiddling with different crops, the Dutch and Brits found ways around this. Other innovations too emerged, so that, over a couple of centuries, food output (relatively speaking) soared.

So no longer did we need everybody in the fields (today it’s down to almost nil). Many could work instead in factories. Of course, that was hellish — but not as awful as life stuck on the farm. Anyway, now we could produce sufficient food plus factory goods. That made society wealthier overall — with enough money in people’s pockets to buy the added stuff.

This did not unfold without a concomitant revolution in thinking. Previously fatalism was the prevailing mindset. People did believe history went in cycles. But now they could see not only change, but positive change. The word “progress” came into use. And as Appleby writes, it was more important to believe governing arrangements could be changed than that they should be.

Likewise, we take for granted the idea of striving to improve one’s life. But there was no such thing in feudal society. People worked just to exist, nothing more. They could not have imagined the kind of life we have today. Only with the revolutions into modernity did we begin to grasp the concept of proactive self-betterment. Observers were actually surprised to see ordinary folks attracted to unaccustomed consumer goodies. This sparked a virtuous circle, energizing people into the kind of industrious striving that, in turn, turbocharged our continuing agricultural and industrial revolutions.

Of course this too meets with censure. The market economy and “consumerism” that fuels it are condemned. Yet this was also a social revolution, creating a bold new idea of the individual. The word “individual” was never even applied to human beings before the seventeenth century. Now the social chasm between oligarchs and commoners was bridgeable. The egalitarian ideal that many today put in opposition to capitalism is in fact a product of that very thing.

Today we’re at the start of another great revolution. Just as increased agricultural productivity freed people to work in factories, now more efficient manufacturing frees us to create wealth in yet newer ways. How it works out, we’ll see.

Human history in a nutshell, Part 1: Evolution

September 28, 2017

It was about six million years ago that we last shared a common ancestor with a biologically distinct gang — chimps and other apes. But our species, Homo sapiens, is only a couple of hundred thousand years old. Between those two chronological markers, a lot of evolution happened.

In fact, over those six million years, quite a large number of more or less “human” or proto-human species came and went. The line of descent that produced us was only one of many. All the others petered out.

As the story unfolded among all these variant creatures, two different basic strategies evolved. Call one vegetarian. Its practitioners relied on a menu much like that of modern apes — fruits, nuts, berries, etc. A pretty reliable diet, but due to low nutritional content, much energy was devoted to eating and digesting — they literally had to consume a lot to get the energy to consume a lot. A big digestive system was required, diverting resources that otherwise could have gone to their brains.

The other group went for brains rather than guts. This required a high energy diet, i.e., including meat. But meat was hard to get, for such weak little critters lacking fangs and claws. Getting meat required brains.

All well and good, except that bigger brains meant bigger heads, a bit of a problem for mothers giving birth. And that was exacerbated by a second evolutionary trajectory. Hunting meat proved to be a lot easier for early humans if, instead of going on all fours, they could efficiently walk upright and even run. Doing that called for changes to pelvic architecture, which had the effect of narrowing the birth canal. So the bigger-headed babies had to fit through a smaller opening. Something had to give.

What gave was the gestation period. If humans functioned otherwise like apes do, babies would spend not nine months in the womb but twenty, and come out ready for action. But their heads by twenty months would be so big they couldn’t come out at all. So we make do with nine months, about the limit mothers can bear, and the least babies can get by with. Consequently they require lengthy attentive nurturing, which of course has had a vast impact upon humans’ way of life.

Earlier birth thus meant longer childhood, and a lot of a person’s development outside the womb as his or her brain responds to things around it. This in turn is responsible for another huge fact about human life: we are not cookie-cutter products but very different one from another. And that fundamental individualism, with each person having his own perspectives and ideas, played a great role in the evolution of our culture and, ultimately, civilization.

Another key part of the story was fire. We romanticize the mastery of fire (e.g., in the Prometheus myth) as putting us on the road to technology. But that came much later. Fire was our first foray into taking a hand in our own evolution. It began with cooking. Remember that trade-off between gut and brain? Cooking enabled us to get more nutrition out of foods and digest them more easily. That enabled us to get by with a smaller gut — and so we could afford a bigger brain.

This omnivorous big-brain model seemed to work out better than the vegetarian one; the vegetarians died out and the omnivores became us. (This is not intended as a knock on today’s vegetarians.) But notice again how much actually had to be sacrificed in order to produce our supersized brains. And that this was a bizarre one-time fluke of evolutionary adaptation. It happened exactly once. None of the other zillions of creatures that ever existed ever went in this unique evolutionary direction.

In other words, if you think evolution of a species with world-dominating intelligence was somehow inevitable or pre-ordained, consider that it didn’t happen for 99.999+% of Earth’s history. It was an extreme freak in the workings of evolution.

Indeed, it’s a mistake to conceptualize “evolution” as progress upward toward ever greater heights (culminating in Homo sapiens). It’s because of that erroneous connotation of progress that Darwin didn’t even use the word “evolution” in his book. The process has no goal, not even the “selfish-gene” goal of making things good at reproducing themselves. It’s simply that things better at reproducing proliferate, and will grow to outnumber and eclipse those less good at reproducing. Our species happened to stumble upon a set of traits making us very good reproducers. But insects are even better at it, and there are way more of them than us.

(Much of what’s in this essay came from reading Chip Walter’s book, Last Ape Standing.)

101 Stumbles in the March of History

September 19, 2017

This 2016 book, by Bill Fawcett, is a compendium of historical might-have-beens. Decisions and choices the author deems mistakes, along with speculation about how differently subsequent history might have unfolded. He’s fond of saying, “It would have been a hundred times better if . . . .”

One could read this and conclude that people — even great personages — are screw-ups. But two things must be kept in mind. First, history encompasses zillions of decisions and choices people made. Finding among them 101 mistakes is all too easy. Especially if (second point) you use 20/20 hindsight. I recall how Gibbon, in The Decline and Fall of the Roman Empire, loved applying the word “rash” to actions that turned out badly. Fawcett, in contrast, especially in military situations, often castigates undue timidity. Dumb rashness versus admirable boldness may be discernible after the fact, when we know how things turned out. It may not have been so clear at the time the decision had to be made, often on the fly, without a crystal ball. And all too often the outcome hinged not so much on how smart the decider was, how rash or prudent, but how lucky.

For each “mistake,” Fawcett spins a counter-factual history, typically seeing a modern world surprisingly different, and usually better. These stories I found pretty laughable in their details; too facile and pat. History is messy. If one thing comes through from this book, it’s how contingent history is. Change any detail about the past, even a small one (“for want of a horseshoe nail . . .”), and the difference may well cascade through time, an historical “butterfly effect.” (The idea that a butterfly flapping its wings in, say, Brazil, can cause a storm in Canada.) And the law of unintended consequences is powerful. The results from changing something about the past might have confounded our expectations, good or bad, however logical those expectations might seem.

So one can never know what the final outcome of any action will be. Supposedly, Chinese Premier Zhou Enlai, when asked about the impact of the French Revolution, replied “Too soon to tell.”

I’ve always been highly cognizant of contingency in life. I’ve written about this — how different, for example, my own subsequent life would have been, if I hadn’t happened to walk on a particular street at a particular minute on April 1, 1975. Several other people’s lives would be dramatically different too! (I can think of at least five offhand, two of whom wouldn’t even exist.) And that walk was only one link in a complex chain of consequential contingencies.

It’s customary in book reviews to cite at least one fact (usually minor) the author flubbed, to show off the reviewer’s erudition. This book is actually shot through with sloppy mistakes, often dates. Andrew Johnson was sworn in as vice president in 1865, of course, not 1864. Et cetera.

But here is one fascinating historical might-have-been, in the book. Why didn’t the Confederacy make military use of slaves? They had millions! In fact, it was proposed to offer freedom for serving in the army. It could have doubled Southern forces. And it was done, but only at war’s end, too little and too late. The fact was that the rebs were just too racist and contemptuous of blacks to stomach the idea of fighting alongside them. Even if it might have won the war. (Probably not; but you never know, history is messy.)

The last item in the book is something I myself, at the time, did see as a stupendous blunder: disbanding Iraq’s army in 2003. But at least two other recent biggies are inexplicably omitted (mistakes by Fawcett himself):

For 2000, he gives us Blockbuster’s refusal to partner with Netflix. Yet a vastly more consequential error that year was Yasser Arafat’s rejection of a very generous peace deal. It was all too foreseeable that immense evil would flow from this.

In a similar category was Obama’s 2013 decision to punt to Congress on punishing Syria for crossing his chemical weapons red line. Hearing his announcement, I could scarcely believe its stupidity.

Perhaps coming too late for inclusion were two epochal 2016 blunders. One was Britain’s Brexit vote. The resulting mess seems to grow daily. So deeply has Britain’s politics been poisoned that The Economist now sees the unthinkable as almost inevitable: Red Jeremy Corbyn becoming prime minister. Goodbye, mother country.

The other of course was our own 2016 vote — which America’s future Gibbon will surely label “rash.”