Archive for the ‘Science’ Category

The book that changed America: Darwin, Slavery, and God

February 27, 2018


The Book That Changed America is the title of a book by Randall Fuller. It’s about Darwin’s On the Origin of Species, looking particularly at its impact in Concord, Massachusetts.

That wasn’t just Anytown, U.S.A. Concord was the center of America’s intellectual ferment. The protagonists in Fuller’s book include Emerson, Thoreau, Hawthorne, Bronson and Louisa May Alcott, Franklin Sanborn, Louis Agassiz, and Asa Gray — all living in or near Concord and interacting with each other and with Darwin’s bombshell book.


It hit Concord almost simultaneously with another bombshell in late 1859: John Brown’s attack on the Harper’s Ferry arsenal and his subsequent execution. Brown was not, as often portrayed, a madman. He considered slavery a great sin that could be undone only through war, which he aimed to start. He was just about a year early.

America was already, of course, hotly divided over slavery, and Harper’s Ferry raised the temperature further. So did Darwin’s book.

How so? The only possible excuse for slavery was the idea of blacks’ racial inferiority. Thus their constant denigration as a degenerate, brutish species. And slavery apologists, being besotted with religion, had to believe God intentionally made blacks separately and enslavement-worthy. Efforts to prove their inferiority litter nineteenth-century science. (See Stephen Jay Gould’s The Mismeasure of Man.)

(Even most abolitionists thought blacks inferior. But they opposed slavery nonetheless because it was cruel and unjust. This applies to every pogrom, genocide, or other ethnically based abuse or exploitation. Even if its victims were lesser, degraded creatures — it’s never true, but even if it were — their mistreatment would still be cruel and unjust. The creatures proven inferior and degraded are the perpetrators.)

Anyhow, the races’ biological separateness continued to be a matter of intense science-oriented debate.* That’s where Darwin came in.

His book prudently refrained from specifically addressing human origins. (Darwin bit that bullet later in The Descent of Man.) Origin discussed living things in general, and all its numerous examples and case studies concerned non-human life. Many at the time imagined humans were something apart from all that. Yet many others were not so deluded, and they realized that if Darwin’s varied finches and so forth were all close cousins, branches of the same tree, obviously then so were whites and blacks. (We now know that blacks came first, and whites descended from them.)

Thus did Origin explode the moral underpinnings of slavery. And Darwin was not just another polemicist with an axe to grind. Not only was his a science book, it was powerfully supported and argued, hence a devastating blow.

Yet still it was disputed. Inevitably, for a book that gored cherished oxen. And slavery was not the only ox. The other was God himself.

Gods have always been the answer for natural and cosmic mysteries people couldn’t otherwise penetrate. That territory used to be huge. But science has progressively answered those mysteries, inexorably shrinking godly territory.

To naive eyes, the world might look designed, the only possible way to explain life’s diversity and complexity. Literature is filled with rhapsodizing on this theme. Though would any intelligent designer have so filled creation with pain and suffering? Calling this a mystery is no answer.

Thoreau had studied nature intensively, and likewise studied Darwin’s book. He got it, completely; it explained so much of what he’d actually observed. Fuller casts Thoreau as holding that the world is indeed filled with magic and mystery — just not the kind religion postulates.

But Darwin greatly demystified life. His theory was a revelation, a revolution. He called it “natural selection” and “descent with modification;” for short, evolution. His book explained it thoroughly and cogently; there’s hardly a word in it that doesn’t still hold up. A stupendous achievement of human intellect.

And once Darwin unveiled it, the idea of evolution was actually obvious. (I recall Richard Milner’s song, wherein other scientists of the time moan, “Why didn’t I think of that?!”) As Thoreau found, evolution instantly made sense of everything observable about the natural world, everything previously so puzzling. The great geneticist Theodosius Dobzhansky put it thusly: “Nothing in biology makes sense except in the light of evolution.”

Yet, to this day, half of Americans reject it. Fuller’s book recaps the opposition to evolution as it played out at its advent, with famed scientist Louis Agassiz in the attack’s vanguard. Its essence remains unchanged. Evolution shrinks God almost to irrelevance. And not just in biology. If life is attributable to natural, not supernatural causes, couldn’t the same be true of the entire cosmos? To Agassiz, all this was something literally unthinkable.** As it is for his modern counterparts.

Likewise that we “come from monkeys” (or even lesser creatures). Some believe that degrades us. But “there is grandeur in this view of life,” connecting us to every other living thing. And our animal antecedents make us all the more remarkable. It’s sublime that a Darwin, descended from apes, could have the insight to see it. All we’ve achieved we’ve done ourselves, with no help from any god.

A reader of Fuller’s book must be struck by how one key mistake — belief in a god — traps you in a carnival house of mirrors, distorting everything about life and the world. Escape it and all becomes clear. This is the main reason why Agassiz and other scientists of the time failed to see what Darwin saw. Religion blinded them. And even when shown the light, such believers hold tight to their blindfolds. They torture facts, evidence, and logic, struggling to hammer the square peg of their belief into the round hole of reality.

I find it far better to just accept reality.

* Some even argued that the races were separate species, on the ground (by analogy to mules) that mixed-race people tend to be sterile — simply untrue. Furthermore, the vast genre of argument that race mixing somehow “pollutes” and degrades the quality of the white race likewise contradicts manifest biological fact: mixing different gene pools improves strength and quality. It’s called hybrid vigor.

** Scientist Asa Gray entered the fray on Darwin’s side, but even he was unmoored by God’s banishment, coming up with the fallback idea that evolution is God’s method for managing life’s pageant. And even Darwin himself seemed queasy about a purely mechanistic view of creation.


Being and nothingness: How the brain creates mind and self

February 14, 2018

Phineas Gage was a big name in brain science. Not a scientist — but a railroad construction foreman. Until in 1848 an accidental explosion rammed a three-foot iron rod through his cheek and out the top of his head.

Gage actually recovered, with little outward impairment. But his character and personality were transformed. Previously admirable, he became an irresponsible jerk. A part of his brain governing temperament was destroyed.

This famous case opens Antonio Damasio’s landmark 1994 book, Descartes’ Error: Emotion, Reason, and the Human Brain. Though it’s not the latest word in neuroscience, I felt it was worth reading, in my eternal quest to understand the most important thing in the world — my self. What, in that sentence, “I” and “felt” really mean.

I’ve written about this before; here are links: (1), (2), (3), (4), (5), (6), (7).

Of course, like everyone, I know perfectly well what being me feels like. But why does it feel that way? Why does anything feel like anything? By what mechanism?

Obviously, it has to do with the workings of the brain. I say “obviously,” but some might disagree. In fact, that was “Descartes’ error” of the book title — the famous philosopher posited that the mind is something apart from anything physical. Like the idea of a soul. But these are nonsensical concepts. Not only is there no evidence for them, there’s no possible coherent explanation for how they could be true. There’s no plausible alternative to our minds being rooted in the workings of our brains.

Yet it’s difficult to come up with a coherent explanation for that too (so far, anyway). Brains have been analogized to computers, but computers aren’t conscious (so far, anyway). It’s been suggested that the difference is complexity — the brain’s trillions of synapses vastly dwarf what’s in any computer. Still, this seems more like a label than an explanation.

Some common-sense ideas don’t work. Like there’s somebody in charge in there, a captain at your helm. That’s certainly an illusion — the mind is bottom-up, not top-down. That is, whatever you think you are thinking, it’s not the work of some central command, but a product of a lot of undirected neuronal signaling, actually distributed among various brain modules, that somehow comes together. Similarly, we imagine seeing as a “Cartesian theater” (named for the same Descartes), i.e., as if a signal coming in from the eyes gets projected onto a screen in the brain, viewed by a little person (“homunculus”) in there. But does the homunculus have a Cartesian theater — and a smaller homunculus — in its brain? And so forth? The idea falls apart.

Further, not only is the mind not somehow separate from the brain, it’s not even separate from the whole rest of the body. Another point Damasio makes clear. “Keeping body and soul together” is a paradoxically apt expression here, because the brain evolved, after all, as a device to keep the body going, for its ultimate purpose (to the genes) of reproducing. So the body is the brain’s primary focus, and monitoring and regulating the body, and responding to its cues, is most of what the brain is doing at any given moment. (Thus the sci-fi notion of a disembodied brain in a vat, having normal consciousness, is probably absurd.)

To understand how the mind works, the concept of representation seems crucial. (No mentation without representation!) Start with the idea of reality. There is a reality that obtains within your body; also a reality outside it, that you interact with. But how does the mind perceive these realities? Through senses, yes; but they can’t give the brain direct contact with reality. The reality outside — it’s raining, say — cannot itself get inside your head. It can’t be raining in there. It’s even true of your inner bodily reality. If your stomach hurts, you can’t have a stomachache in your brain. But what your brain can do is construct a representation of a stomachache, or rain shower. Like an artist creates a representation of a still life on his canvas.

Of course the brain doesn’t use paints; it only has neurons and their signaling. Somehow the brain takes the incoming sensory information — you see it raining — and translates it into a representation constructed with neuronal signaling. A mental picture of the raining. And notice this can’t merely be like snapping a photo. The representation has to be sustained — continually refreshed, over some length of time.

This is starting to be complicated. But more: how do “you” (without a homunculus) “see” the representation? Why, of course, by means of a further representation: of yourself perceiving and responding to the first one.

But even this is not the end of it. It’s actually three balls the brain must keep in the air simultaneously: first, the representation of the reality (the rain); second, the representation of the self reacting to it; and, finally, a third-order representation of your self in the act of coordinating the prior two, creating a bridge between them. Only now do “you” decide you need an umbrella.

This at least is Damasio’s theory, insofar as I could understand it. Frankly that third part is the hard one. I’m a little queasy that we might have here another endless homuncular recursion: the representation of the self perceiving the representation of the self perceiving . . . . Yet we know the buck must stop somewhere, because we do have selves that somehow know when it’s raining, and know they know it, and grab umbrellas. And one can see that the first two representation levels don’t quite get us there. So there must be the third.

Pain too is a representation. When the body signals the brain that something’s amiss, it could register the fact without suffering. The suffering is an emotion, triggered by the brain creating a representation of “you” experiencing that feeling. That’s why it hurts. Of course, we evolved this way to make us respond to bodily problems. Rare individuals who can’t feel pain damage themselves — very non-adaptive. And Damasio tells of one patient with an extremely painful condition. After an operation snipping out a bit of brain, he was thoroughly cheerful. Asked about the pain, he said, “Oh, the pains are the same, but I feel fine now.” His brain was no longer representing pain as suffering.

Meantime, while the mind is doing all that representation stuff — continually, as new signals keep arriving — keeping “you” in touch with what’s going on — there’s yet another ball it must keep aloft: who “you” are. Part of it again is the bodily aspect. But you’re not an empty vessel. Damasio likens the representation of your self to the kind of file J. Edgar Hoover’s FBI might have kept on you. Though it’s not all in one file, or file cabinet, but distributed among many different brain modules. It includes data like what you do, where you live, other people important to your life, knowledge of your entire past, and your ideas looking ahead to your future. Everything that makes you you. And it’s not just filed away; all of it the mind must constantly refresh and update. To keep in being the “you” in its representations of “you” interacting with realities like rain or pain.

Of course all the foregoing is merely schematic. We know how painters paint pictures, but how, exactly, neuronal signaling does it remains a very hard problem. But yet again we know it must. There’s no alternative.

And for humans at least, we do know at least part of the answer. We know how to paint word pictures. And they entail a lot of metaphors — another form of representation. In fact, thinking this way is so second-nature that most of us would struggle to imagine what thinking without words could be like. Of course, other animals do it, and have consciousness, without language. But having language undoubtedly greatly enhances the three-stage representational model I’ve described. I think it gives humans a much deeper, higher-level self-awareness than other animals enjoy. (Damasio, somewhat enigmatically, says this: “Language may not be the source of the self, but it certainly is the source of the ‘I.'”)

What Damasio’s book is really famous for is his take on reason and emotion. Phineas Gage’s iron rod opened not only a hole in his head but a window on the subject. Damasio also discusses the similar case of “Elliot,” a normal, smart, successful man until a lesion destroyed a bit of his brain. He was still perfectly rational. But like Gage’s, his life fell apart, because he could not behave as reason dictated. The explanation turned out to be a loss of emotional capacity. Emotions give us the reasons to utilize our reason! Elliot no longer cared about anything; not even his life falling apart. The lesson is that emotion and reason are not, as many people imagine, separate or even at odds with one another. They are bound together. Moreover, emotion on its own terms isn’t unreasonable. There are always reasons for the emotions we feel (or if not, that’s insanity).

A final point. While Damasio’s book helped a bit, I still can’t say I have a good handle on what accounts for this phenomenon I experience as being me. It still feels like a will-o’-the-wisp that slithers away whenever I try to grasp it. And as difficult as it is to grasp being in existence, it is likewise difficult to grasp the idea of nonexistence.

Upgrading to Humanity 2.0

February 4, 2018

Tech guru Ray Kurzweil called it “The Singularity”: the point when artificial intelligence outstrips human intelligence and starts operating on its own. Then everything changes. Some, like Stephen Hawking, fear those super-intelligent machines could enslave or even dispense with us.

But in my famous 2013 Humanist magazine article, The Human Future: Upgrade or Replacement, I foresaw a different trajectory – not conflict between people and machines, or human versus artificial intelligence, but rather convergence, as we increasingly replace our biological systems with technologically better ones. The end result may resemble those cyborg superbeings that some fear will supplant us. Yet they will be us. The new version, Humanity 2.0.

I call this debiologizing, not roboticizing. We may be made mostly if not wholly of artificial parts, but won’t be “robots,” which connotes acting mechanically. Humanity 2.0 will be no less conscious, thinking, and feeling than the current version. Indeed, the whole point is to upgrade the species. Two-point-zero will think and feel more deeply than we can. Or, perhaps, can even imagine.

This transformation’s early stages fall under the rubric of “enhancement,” referring, generally, to improving individual capabilities, via pharmacology, hardware, or genetic tinkering. This gives some people the heebie-jeebies. But every technological advance evokes dystopian fears. The first railroads were denounced as inhuman and as dangerously messing with the natural order of things. A more pertinent example was organ transplants, seen as crossing a line, somehow profoundly wrong. Likewise in-vitro fertilization. The old “playing god” thing.

The fact is that we have always messed with the natural order, in countless ways, to improve our lives. It’s the very essence of humanity. And the “enhancement” concept is not new. It began with Erg, the first human who made a crutch so he could walk. (No doubt Glorg scolded, “if God meant you to walk . . . .”) Today people have prosthetics controlled by brain signaling.

A lot of it is to counter aging. Euphemisms like “golden years” can’t hide the reality of decline, always physical, and usually (to some degree) mental. We’ve already extended life far longer than nature intended, and we keep people healthier longer too. If all that’s good, why not strive to delay decrepitude further still – or reverse it?

And why not other interventions to improve human functionality? If we can enable the disabled, why not super-able others? If we use medicines like Ritalin to improve mental function for people with problems, why not extend the concept to improving everyone’s abilities? Through all the mentioned means – pharmacology, hardware, genetics – we can make people stronger, healthier, and smarter.

Yet some viscerally oppose all this, as a corruption of our (god-given?) human nature. Paradoxically, some of the same people are cynical pessimists about that human nature, vilifying it as a fount of evil. Is it nevertheless sacred, that we shouldn’t tamper with it? Steven Pinker argued persuasively, in The Better Angels of Our Nature: Why Violence Has Declined, that humanity has in fact progressed, gotten better, and better behaved, mainly because in many ways we’ve gotten smarter. If we can make people smarter still, through all those kinds of technological enhancements, won’t that likely make us better yet, kissing off the ugliest parts of our (god-given) nature?

The idea of people being able to choose enhancements for themselves also irks misanthropes who see in it everything they dislike about their fellow humans. It’s the ultimate in sinful consumerism. An illegitimate “shortcut” to self-improvement without the hard work that it should rightly entail, thus cheapening and trivializing achievement. Life, these critics seem to say, should be hard. By this logic, we should give up washing machines, microwaves, airplanes, all those “shortcuts” we’ve invented to make life easier. And go back to living in caves.

A perhaps more serious version of their argument is that enhancement, taken sufficiently far, would strip human life of much of what gives it meaning. Much as we’ve progressed, with washing machines and microwaves, etc., and with health and longevity, still a great deal of what invests life with meaning and purpose is the struggle against the limitations and frailties and challenges we continue to face. Remove those and would we become a race of lotus-eaters, with an empty existence?

But consider that early peoples faced challenges of a wholly different order from ours. Getting food was critical, so they sacralized the hunt, and the animals hunted, which loomed large in their systems of meaning. Now we just saunter to the grocery, and that ancient source of meaning is gone. Does that make us shallower? Hardly. Instead it liberates us to focus upon other things. Maybe higher things.

The fundamental mistake of enhancement’s critics is to imagine life for a Human 2.0 by reference to life for a Human 1.0, when they will be as different as we are from our stone age ancestors. Or more so. Our future descendants, relieved of so many concerns that preoccupy us (and not detoured by supernatural beliefs), will find life richer than we can dream.

Of course there will be profound impacts – economic, environmental, cultural, social. Not only will 2.0 be very different, their world itself will be transformed by that difference. But with greater smarts and wisdom they should be able to deal with the challenges.

Our species is only a couple hundred thousand years old; civilization, ten thousand. Billions of years lie ahead. Thus we are in humanity’s infancy. Adulthood will be really something.



Idiocracy: The Death of Expertise

January 9, 2018

Our pockets hold a device to access all the information in the world. We use it to view cat videos. And while the sum of human knowledge grows hyperbolically, and we’re getting more education than ever, the average American’s ignorance is rising.

This is the nub of Tom Nichols’s 2017 book, The Death of Expertise. (Nichols is a professor and foreign affairs wonk.) Expertise itself isn’t dying — it’s being rejected.

Take vaccination. Expert, knowledgeable, responsible opinion is clear about its benefits, and the baselessness of fears about it. They began with a fraudulent “study” by a British doctor, Andrew Wakefield, that was authoritatively debunked. Wakefield’s medical license was even revoked. That hasn’t stopped the nonsense, still spewed by irresponsible people like former Playboy pin-up Jenny McCarthy. Too many listen to her rather than the medical establishment, and refuse vaccination. Result: children dying of illnesses previously almost eliminated. (See my commentary on a previous book, Denialism; it also discusses the similarly misguided (and likewise deadly) campaign against GM foods.)

Civilization is grounded upon division of labor and specialization of function. We have doctors who doctor and plumbers who plumb, with arcane expertise not possessed by the mass of others. This is how airplanes are engineered and flown. We trust such experts to do these things. Nobody would imagine they could build and fly a plane equally well. Yet plenty do somehow imagine they know better about vaccination than the experts.

“A little knowledge is a dangerous thing.” That old saw is weaponized by the internet, spreading what might appear to be “knowledge” but actually isn’t. While previously, discourse about matters like science or public policy was largely confined within intellectual ghettoes, those walls have been blown down.

Anti-intellectualism and magical thinking have long afflicted American culture. Worse now, many people, fortified by a college degree, deem themselves their own intellectual experts. But Nichols, who delves deeply into the subject, says going to college is not the same as getting a college education. Students arrive there already spoiled by the coddling of helicopter parents, giving them an arrogant attitude of entitlement. (I can’t count how often I’ve heard that word, “entitlement,” spoken by professionals discussing interactions with young people.)

Schools find themselves forced to surrender to this ethos, with fluff courses, “safe spaces” against intellectual challenge, feelings allowed to trump facts, and grade inflation to flatter fragile egos. “When college is a business, you can’t flunk the customers,” Nichols says. Critical thinking? Rational discourse? Forget it.

The more I learn, the more I realize how little I know. Stepping back like that to see oneself objectively is metacognition. Such humility is fading from America’s narcissistic culture, where college makes people imagine they’re smart without giving them the tools to recognize their own deficiencies (or deficiencies in the “information” they imbibe). Anti-vaccine madness is more rampant among the college-educated than the uneducated.

Social science actually has a name for this phenomenon, the Dunning-Kruger effect. Most people think they are (like all Lake Wobegon children) above average. Those who don’t understand logic don’t recognize their own illogicality. They don’t actually understand the concepts of knowledge and expertise.

It’s an irony that in the past, with far fewer people “educated,” there was more respect for education, expertise, seriousness, and indeed facts. A less educated past America would never have tolerated the lies and vulgarities of a Trump.

But there’s also the cynical feeling that experts and elites have their own self-serving agendas and have led us astray. Look at the Vietnam War; the 2008 financial crisis. Nichols addresses at length the problem of expert error. But he invokes the old conundrum of a plane crash getting headlines while thousands of daily safe flights are just taken for granted. In fact, everything about modernity and its benefits — medical science, air travel, and so much else — is the work of experts. If they weren’t generally very good at it, planes wouldn’t fly. You would not board one staffed by a bunch of Joe Sixpacks. Experts can be wrong, but is it likelier that Jenny McCarthy is right about vaccines?

“Rocket science” is used as a metaphor for extreme expertise. I recently saw a TV documentary about the Hubble Space Telescope* — which, after launch, didn’t work, a huge bungle by experts. But even more striking was how, against all odds, NASA people managed to figure out, and execute, a fix. Expertise more than redeemed itself.

Another factor in the shunning of expertise is a rising ethos of individualism and egalitarianism. It’s the idea that you — and your opinions (however derived) — are as good as anyone else and their opinions (expertise be damned). Nichols thinks Americans misunderstand democracy, confusing the concept of equality of rights with actual equality, and equal validity of all opinions. Yet at the same time there’s a refusal to engage in a serious way with the public sphere — “a collapse of functional citizenship.” Democracy is corrupted if voting isn’t based on a grasp of facts that are actually facts.

I keep mentioning confirmation bias because it’s such a big factor. We welcome information that seemingly validates our pre-existing beliefs, and insulate ourselves against anything contrary. Smarter, educated people are actually better at constructing such rationalizations. And modern media facilitates this cherry-picking; we embed ourselves in comfortable cocoons of confirmation.

Declining trust in experts is part of a larger trend of declining social trust generally. Polls show a belief that other people are getting less trustworthy (for which there’s no evidence). Mainstream news has been a victim of this. Many Americans don’t know whom to believe. Or, worse, their cynical lapse of confidence in conventional repositories of trust paradoxically leads them to swallow what should be trusted least. Like all that garbage from the internet — and the White House.

So rejecting input from real experts opens a field day for phony ones. The Jenny McCarthys, conspiracy freaks like Alex Jones, not to mention legions of religious and spiritualist frauds. Nichols cites Sturgeon’s law (Theodore Sturgeon was a sci-fi writer): 90% of everything is crap.

The ironies multiply. Trump’s election, and the Brexit vote too, were revolts against experts and elites, seen as lording over common folk. Yet those voters have delivered themselves, gift-wrapped, to the not-so-tender mercies of a different gang that exploits their ignorance and credulity for its own bad ends.

Americans are losing their grasp of the nation’s founding ideals and values (no longer taught in schools). Without such understanding, those principles cannot be sustained. Nichols sees a “toxic confluence of arrogance, narcissism, and cynicism that Americans now wear like [a] full suit of armor against . . . experts and professionals.” This, he says, puts democracy in a “death spiral” as disregard for informed expert viewpoints (and, one might add, just plain reality) produces ever worse results in the public sphere. This embitters citizens even more.

I’ve always seen a dichotomy between the smartest people, who really understand and know things, and the rest of humanity. And it’s only the former — maybe 1% of the species — at the far end of the bell curve of cognitive ability — who actually run things. Who are indeed responsible for all we’ve achieved. Literally all. Without that 1%, we’d still be in caves.

* A picture in that documentary included someone who was at my last birthday party!

Statistical wisdom and Weldon’s dice

November 13, 2017


I went to a library talk, by my friend Jonathan Skinner, reviewing a book, The Seven Pillars of Statistical Wisdom, by Stephen Stigler. Jonathan was a professional statistician. One thing I enjoyed was his quoting Christopher Hitchens: “What can be asserted without evidence can also be dismissed without evidence.”

I also learned how the Arctic and Antarctic got their names. Skinner said Aristotle named them, based on the Greek word for “bear.” That surprised me; could Aristotle have been aware of the poles’ existence? And how could he have known about polar bears? But when I mentioned this to my (smarter) wife, she suggested the “bear” reference was to a constellation. I checked Wikipedia and while the Greek origin is correct, there was no mention of Aristotle. And of course my wife was right.

Skinner discussed a basic statistical concept: probability. He talked about it in connection with dice as an example. This reminded me of the Tom Stoppard play, Rosencrantz and Guildenstern Are Dead, and Guildenstern’s repeated coin flips. They come up heads every time. Not statistically impossible, but increasingly unlikely as the number of flips mounts. Stoppard is jesting with the laws of probability.

Of course they tell us heads and tails should be 50-50. But I also remembered a guy who wrote in to Numismatic News, doubting that theory, and reporting his own test. He flipped a coin 600 times and got 496 heads! Of course, the probability of that result is not zero. But I actually calculated it, and the answer is one divided by 6.672 times 10 to the 61st power. For readers not mathematically inclined, that’s an exceedingly tiny probability. Ten to the 61st power means 1 followed by 61 zeroes.
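For readers who want to check that arithmetic, here’s a minimal sketch of the calculation (assuming a fair coin and asking for exactly 496 heads):

```python
# Probability of exactly 496 heads in 600 flips of a fair coin:
# C(600, 496) / 2**600
from math import comb

p = comb(600, 496) / 2**600
print(p)       # roughly 1.5e-62
print(1 / p)   # roughly 6.7e61, in line with the figure quoted above
```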

However, that guy, as if to flaunt his scientific rigor, explained his procedure: on each of his 600 tosses, he methodically started with the coin in the heads-up position, and then . . . well, enough said.

But Skinner related a similar tale, of Frank Weldon who (in 1894) really did try to put the theory to a rigorous test. He rolled a dozen dice 26,306 times, and recorded the results. That huge effort would make him either a martyr to science, or a fool (like the Numismatic News guy) because, after all, what is there to test? Is there any sane reason to doubt what such a simple probability calculation dictates?

However, Skinner quoted Yogi Berra: “In theory, theory and practice are the same. In practice they are not.”

Well, guess what. Weldon found the numbers five and six over-represented. With six faces to each die, you should expect any two numbers to come up one-third of the time, or 33.33%. But Weldon got 33.77%. You might think that’s a minor deviation, down to random chance. But statisticians have mathematical tools to test for that, i.e., whether a result is “statistically significant.” And the odds against Weldon’s result were calculated to be 64,499 to one.
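For the curious, here’s what such a significance test looks like in practice: a minimal sketch of a chi-square goodness-of-fit test, using made-up counts for a single die. (The 64,499-to-one figure above came from an analysis of Weldon’s full tabulated data, which I haven’t reproduced here.)

```python
# Minimal sketch of a chi-square goodness-of-fit test for die fairness.
# The counts below are hypothetical, for illustration only; they are NOT
# Weldon's or Labby's data.
from scipy.stats import chisquare

observed = [4870, 4932, 5011, 5050, 5122, 5015]   # hypothetical counts per face
expected = [sum(observed) / 6] * 6                # a fair die: equal expected counts

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.4f}")
# A small p-value (say, below 0.05) would mean the deviation from
# fairness is unlikely to be mere chance.
```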

So another fool (er, researcher), Zacariah Labby, decided to repeat Weldon’s experiment, but this time using machinery to roll the dice, and a computer to tabulate the results. He got 33.43%, a smaller deviation, but still statistically significant.

How can this be explained? It had been suggested that the small concave “pips” denoting the numbers on the die faces might affect the results. And indeed, when Labby measured his die faces with highly accurate equipment, he found the dice were not absolutely perfect cubes.

But don’t rush out to a casino to try to capitalize on the Weldon/Labby deviation. Labby concluded his paper by noting that casinos use dice that lack concave pips and are more precisely engineered, to scotch any such bias.

A cute puzzler

October 16, 2017

This came not from Car Talk but from an essay I read. Imagine a ribbon girdling the Earth’s circumference. Then add just one meter to the ribbon’s length. So there’d be a little slack. How big is the gap between the ribbon and the Earth’s surface?

Most people would guess it’s extremely tiny — that was my intuitive answer — a mere meter being nugatory over such a huge distance. But the surprising answer is about 16 centimeters. If the ribbon was snug around the Equator before, how could an added meter make it that much less snug?

My wife and I puzzled over this and soon figured out the simple solution, without even using pencil and paper:

A circle’s diameter is the circumference divided by Pi (3.14+) — i.e., a bit less than a third. If two circumferences differ by a meter, then their diameters differ by almost 1/3 of a meter — say about 32 centimeters — or about 16 centimeters at each end.
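A quick numerical check of that reasoning (a sketch; the gap is just the increase in radius, one meter divided by 2π, regardless of the circle’s size):

```python
# Gap created by adding 1 meter to a snug ribbon around any circle:
# new_radius - old_radius = (C + 1)/(2*pi) - C/(2*pi) = 1/(2*pi)
from math import pi

gap_m = 1 / (2 * pi)
print(f"{gap_m * 100:.1f} cm")   # about 15.9 cm, whether it's the Earth or a basketball
```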

The essay said only mathematicians and dressmakers get this right.

Human history in a nutshell, Part 1: Evolution

September 28, 2017

It was about six million years ago that we last shared a common ancestor with a biologically distinct gang — chimps and other apes. But our species, Homo sapiens, is only a couple of hundred thousand years old. Between those two chronological markers, a lot of evolution happened.

In fact, over those six million years, quite a large number of more or less “human” or proto-human species came and went. The line of descent that produced us was only one of many. All the others petered out.

As the story unfolded among all these variant creatures, two different basic strategies evolved. Call one vegetarian. Its practitioners relied on a menu much like that of modern apes — fruits, nuts, berries, etc. A pretty reliable diet, but due to low nutritional content, much energy was devoted to eating and digesting — they literally had to consume a lot to get the energy to consume a lot. A big digestive system was required, diverting resources that otherwise could have gone to their brains.

The other group went for brains rather than guts. This required a high energy diet, i.e., including meat. But meat was hard to get, for such weak little critters lacking fangs and claws. Getting meat required brains.

All well and good, except that bigger brains meant bigger heads, a bit of a problem for mothers giving birth. And that was exacerbated by a second evolutionary trajectory. Hunting meat proved to be a lot easier for early humans if, instead of going on all fours, they could efficiently walk upright and even run. Doing that called for changes to pelvic architecture, which had the effect of narrowing the birth canal. So the bigger-headed babies had to fit through a smaller opening. Something had to give.

What gave was the gestation period. If humans functioned otherwise like apes do, babies would spend not nine months in the womb but twenty, and come out ready for action. But their heads by twenty months would be so big they couldn’t come out at all. So we make do with nine months, about the limit mothers can bear, and the least babies can get by with. Consequently they require lengthy attentive nurturing, which of course has had a vast impact upon humans’ way of life.

Earlier birth thus meant longer childhood, and a lot of a person’s development outside the womb as his or her brain responds to things around it. This in turn is responsible for another huge fact about human life: we are not cookie-cutter products but very different one from another. And that fundamental individualism, with each person having his own perspectives and ideas, played a great role in the evolution of our culture and, ultimately, civilization.

Another key part of the story was fire. We romanticize the mastery of fire (e.g., in the Prometheus myth) as putting us on the road to technology. But that came much later. Fire was our first foray into taking a hand in our own evolution. It began with cooking. Remember that trade-off between gut and brain? Cooking enabled us to get more nutrition out of foods and digest them more easily. That enabled us to get by with a smaller gut — and so we could afford a bigger brain.

This omnivorous big-brain model seemed to work out better than the vegetarian one; the vegetarians died out and the omnivores became us. (This is not intended as a knock on today’s vegetarians.) But notice again how much actually had to be sacrificed in order to produce our supersized brains. And that this was a bizarre one-time fluke of evolutionary adaptation. It happened exactly once. None of the other zillions of creatures that ever existed ever went in this unique evolutionary direction.

In other words, if you think evolution of a species with world-dominating intelligence was somehow inevitable or pre-ordained, consider that it didn’t happen for 99.999+% of Earth’s history. It was an extreme freak in the workings of evolution.

Indeed, it’s a mistake to conceptualize “evolution” as progress upward toward ever greater heights (culminating in Homo sapiens). It’s because of that erroneous connotation of progress that Darwin didn’t even use the word “evolution” in his book. The process has no goal, not even the “selfish-gene” goal of making things good at reproducing themselves. It’s simply that things better at reproducing proliferate, and will grow to outnumber and eclipse those less good at reproducing. Our species happened to stumble upon a set of traits making us very good reproducers. But insects are even better at it, and there are way more of them than us.

(Much of what’s in this essay came from reading Chip Walter’s book, Last Ape Standing.)

The curse of Ham

September 26, 2017

I have written about Kentucky’s Creation Museum. Should be called the Museum of Ignorance, since its exhibits contradict incontestable scientific facts. Like the dinosaurs dying out 65 million years ago. The museum is off by 64.99+ million years. It shows humans living beside them. This might be fine as entertainment, but not for an institution purporting to be educational.

Earth to Creationists: I’m more than 6,000 years old. Around a million times older.

The museum was built by an outfit called Answers in Genesis. Not content with this slap in the face to intelligence, Answers is now building a replica Noah’s Ark. The project has received an $18 million tax break from the State of Kentucky (specifically, a sales tax abatement). How does this not flagrantly flout constitutional separation of church and state?

[Photo: Ken Ham]

The head of Answers in Genesis is a man named Ken Ham. Please linger upon this name.

For one thing, ham is just about the most un-kosher thing in Judaism. Kentucky’s public support for a Ham-centric project is plainly a gross insult to its citizens of the Jewish faith.

But there’s a much bigger issue. The name of Noah’s third son was Ham. Coincidence? Not very likely. This Mister Ken Ham must, beyond any doubt, be a direct descendant of Noah’s third son. He has never denied it; and it certainly explains his ark fetish.

Now, the Bible is very clear about this fact: Ham was cursed, for a grave insult to his father. Scholars differ in their exact interpretations. Some say Ham castrated Noah; others that he buggered Noah. Either way, it wasn’t nice, and so Ham was cursed by God. Ham’s own son Canaan was the progenitor of the Canaanite people, who of course were later wiped out by a God-ordered genocide; and also of all Africans, which is why they’re all cursed too.

But here is the point. In this Kentucky Ark project, Mister Ken Ham must sneakily be aiming to whitewash the above family history, employing lies to mislead the public and undo the curse that God, in his infinite wisdom and justice, laid upon all of his line. This is out-and-out blasphemy.

Some will say it should be left to the Lord to visit his divine justice upon this doubly accursed latter-day Ham. But of course God-fearing people have rarely been content to defer to that ultimate justice, and have instead so often taken matters into their own hands, with fire and sword.

I’d go with the latter.


August 22, 2017

[Photo: Me, viewing eclipse]

With all the pre-eclipse coverage, I somehow initially didn’t grasp there’d be much to see here in Albany, NY, quite far from the totality band. But the map in The Economist Saturday showed we’d get about 2/3 of it. So then my wife and I started scrambling for viewing options. By now it was too late to obtain the needed glasses (one risks eye damage looking directly at an eclipse without special protection). And most venues with eclipse activities were already fully booked.

We decided to try our luck at Rensselaer Polytechnic Institute, across the river in Troy, offering free glasses. The hoo-ha was scheduled to begin at 1:22 PM; we arrived about 40 minutes before, and there was already a huge line snaking around the building. Within a short time it was twice as long. They had 400 pairs of glasses, and we did get one. Our neighbor and acquaintance Heidi Newberg (an RPI astrophysicist) was there, helping to instruct the crowd. (I think the last time our paths crossed was actually in the Beijing subway.)

[Photo: With my wife’s pinhole box]

The weather report called for cloudless skies, and it started that way, so we got some good looks. My wife had also made a pinhole camera for viewing, which worked pretty well; she had decorated it with relevant poems. Unfortunately, it clouded over during the time of maximum eclipse, so we had only glimpses of that. Of course, we didn’t get real darkness, but during the maximum it did seem eerily dimmer than it should have been on a cloudy afternoon. The whole experience was pretty cool.

We get solar eclipses like this only due to a freakish confluence of facts: the Moon is vastly smaller than the Sun, but it’s far closer to Earth, so when the two line up, it just happens that their apparent sizes almost exactly match, producing the dramatic effect.
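To put rough numbers on that coincidence, here’s a small sketch using approximate mean diameters and distances (actual values vary over the orbits); both bodies subtend about half a degree of sky:

```python
# Approximate angular diameters of the Sun and Moon as seen from Earth.
# Mean figures; both come out to roughly half a degree, which is why
# total solar eclipses are possible at all.
from math import atan, degrees

bodies = {
    "Sun":  (1_391_000, 149_600_000),   # diameter (km), distance (km)
    "Moon": (3_474, 384_400),
}
for name, (diameter, distance) in bodies.items():
    angle = degrees(2 * atan(diameter / (2 * distance)))
    print(f"{name}: {angle:.2f} degrees")
```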

[Photo: My wife Therese]

Well. The next total solar eclipse over the U.S. will occur in April 2024, and with that one, we’ll get the Full Monty quite near us. I’m already praying for clear skies 😉

The Bonobo and the Atheist

August 20, 2017

Our closest biological relatives are chimpanzees. They’re not as cute as you might think; often nasty and violent. How nice then to have discovered the bonobo — an equally close cousin, but a much better role model. Anatomically chimplike, bonobos behave very differently: very social, peaceable, and sex fiends. A lot of humans are in love with the idea of the bonobo, seeing them as living in a prelapsarian paradise of free love, undarkened by sin. They’re even matriarchal. How politically correct can an animal get?

This evokes Rousseau’s “noble savage” and Margaret Mead’s idealization of Samoan sexual promiscuity (which turned out to be fake news).

[Photo: De Waal (at right)]

The book, The Bonobo and the Atheist, seems to have been written by the bonobo. Actually by primatologist Frans de Waal, who’s studied them. He likes them. Atheists, not so much. Even though he is one himself.

A self-hating atheist, then? No, he sets himself apart from atheists who make a big deal of it. His own attitude is nonchalant — “I don’t believe that stuff, but if others do, so what?” Too many atheists, he feels, are overly obsessed with the question of truth, which he deems “uninteresting.”

De Waal’s critique of assertive “new atheists” (like Dawkins, Harris, Hitchens) has become familiar. We’re told they do the cause no favor by insulting religious believers. I’ll make three points.

First, through most of history, religious dissent was not only taboo but cowed into silence by the threat of fire. Subjecting religious ideas to serious intellectual challenge is long overdue.

Second, about those fires: many atheists believe religion has done great harm, being a wellspring of violence, and we’d be better off without it. (I recently reviewed a book arguing the contrary.) This too is a debate we need to have.

And third, when billions do believe in religious dogmas (with vast impacts upon human society), their truth is hardly an “uninteresting” matter. Even leaving aside the violence, such beliefs dominate one’s entire engagement with the world. You cannot have a sound conception about the human condition and the issues facing us while being fundamentally mistaken about the essential nature of reality. That truth matters.*

But back to bonobos. For de Waal, they’re Exhibit A for the book’s main point — that morality and altruism do not come from religion. They long antedate religion’s beginnings and in fact are seen among other animals. The bonobo “too, strives to fit in, obeys social rules, empathizes with others, amends broken relationships, and objects to unfair arrangements.” De Waal relates an observation of two young chimps quarreling over a leafy branch. An older one intervenes, breaks it in two, and hands a piece to each youngster! And in a famous experiment, chimps would happily perform a task for cucumber slices, until seeing other chimps getting grapes, a more coveted reward. Then, offended by the unfairness, they spurn the cucumber and go on strike. (Some grape receivers even joined them in solidarity.) The Occupy movement sprang from the same primordial feelings.

Altruism evolved because it was beneficial within the groups that practiced it. De Waal reminds us that the most conspicuous form of altruism throughout nature is often overlooked: parental nurturing and even self-sacrifice. Not surprisingly, the basic trait extends beyond just one’s own progeny.

Altruism is commonly defined as doing something for another at cost to oneself. Yet if that makes you feel good, is it really costing you? And why are we programmed to feel good when acting altruistically? De Waal points out that, logically enough, nature makes it pleasurable to do things we need to do — like eating and copulating. Altruism falls in the same category.

The idea that humans need religion for morality is actually insulting to us. And ridiculous. Religionists say that without God anything goes: we could all rape, steal, and murder. But nobody wants to live in such a world, and most of us recognize that, so we don’t rape, steal, and murder. We wouldn’t anyway, because of our nature-given moral instincts. God is irrelevant.

De Waal doesn’t join those who wish we could be more like our bonobo cousins about sex. He explains that their promiscuity makes it impossible to know who anyone’s father is. That diffuse paternity creates a certain kind of societal structure. We humans went down a different path, with pair bonding and clear paternity, so fathers are invested in protecting and raising their offspring. Emulating bonobos would wreak havoc in human society. Indeed, to the extent some people do emulate them, it does cause social havoc.

De Waal also discusses the religion-versus-science thing. No contest, really; religion comes much more naturally to us, fulfilling deep needs. Science does not, and is a far more recent and fragile invention. He says a colony of children left alone would not descend into the barbarism of Golding’s Lord of the Flies, but would develop a hierarchical society as apes do — and likely some sort of religion — but not science.

De Waal suggests that when humans lived in small bands, moral instincts could serve their function effortlessly because everybody knew what everyone else was doing.** But not when societies grew much larger. Thus were gods invented to keep “sin” (i.e., antisocial behavior) in check.

Religion serves other needs too. Some go to church for the donuts. That’s shorthand for all the social togetherness religion entails. For many it’s a matter of finding meaning in an otherwise cold cosmos, and in their own lives. And of course palliating fear of death.

And what’s truth got to do with it? It turns out truth and reality actually rank pretty low on many people’s priority lists. Indeed, many seem to have a fuzzy grasp on the concept. We see this in the political realm, where tolerance for lies is far greater than I once imagined. In religion, people believe things mainly because they want to; and this extends to other aspects of life.

But I’ll repeat: you cannot live an authentically meaningful life if its foundation is lies. And as de Waal recognizes, humanism does enable us to find meaning in life while embracing its reality rather than cocooning ourselves in fairy tales. The essence of humanism is the recognition that life is intrinsically valuable for its own sake, that our purpose is to live it as well as we can, and to make it as good as we can for everyone.

De Waal argues that religion is deeply embedded because of its roots in our biology. But we have overcome innumerable constraints imposed by nature. He does acknowledge a “giant experiment” in Northern Europe’s recent and remarkably rapid turning away from conventional religion. And these societies have seen nothing whatsoever of the negative consequences that religious apologists have warned about for eons. Those Europeans who have largely freed themselves from religion are not going to Hell — neither figuratively nor literally.

* An example of how this messes up thinking is strong support for a moral creep like Trump among the devout, who forget, among much else, the commandment against lying.

** Note the importance of language. If one chimp mistreats another, no one else may know. But in human society, with talking, word gets around. This raises the stakes for violations of social norms.