Archive for the ‘Science’ Category

Truth, beauty, and goodness

September 20, 2018

Which among the three would you choose?

I read Howard Gardner’s 2011 book, Truth, Beauty, and Goodness Reframed: Educating for the Virtues in the Twenty-first Century. (Frankly I’d picked it up because I confused him with Martin Gardner; but never mind.)

Beauty I won’t discuss. But truth and goodness seem more important topics today than ever.

Many people might feel their heads spin as seemingly age-old truths fall. “Eternal verities” and folk wisdom have been progressively undermined by science. Falling hardest, of course, is God, previously the explanation for everything we didn’t understand. He clings on as the “God of the gaps,” the gaps in our knowledge, that is, but those continue to shrink.

Darwin was a big gap-filler. One might still imagine a god setting in motion the natural processes Darwin elucidated, but that’s a far cry from his (God’s) former omnicompetence.

While for me such scientific advancements illuminate truth, others are disconcerted by them, often refusing to accept them, thus placing themselves in an intellectually fraught position with respect to the whole concept of truth. If one can eschew so obvious a fact as evolution, then everything stands upon quicksand.

Muddying the waters even more is postmodernist relativism. This is the idea that truth itself is a faulty concept; there really is no such thing as truth; and science is just one way of looking at the world, no better than any other. What nonsense. Astronomy and astrology do not stand equally vis-a-vis truth. (And if all truth is relative, that statement applies to itself.)

Though postmodernism did enjoy a vogue in academic circles, as a provocatively puckish stance against common sense by people who fancied themselves more clever than the rest of us, it never much infected the wider culture, and even its allure in academia deservedly faded. And yet postmodernism did not sink without leaving behind a cultural scum. While it failed to topple the concept of truth, postmodernism did inflict some lasting damage on it, opening the door to its abuse in all sorts of other ways.

All this background helped set the stage for what’s happening in today’s American public square. The pathology might have developed more gradually, but Trump greatly accelerated it by testing the limits and finding they’d fallen away. Once, a clear lie would have been pretty much fatal for a politician. Now one who lies continuously and extravagantly encounters almost no consequences.

It’s no coincidence that many climate change deniers and believers in Biblical inerrancy, young Earth creationism, Heaven, and Hell, are similarly vulnerable to Trump’s whoppers. Their mental lie detector fails here because it’s already so compromised by the mind contortions needed to sustain those other counterfactual beliefs.

But of course there’s also simple mental laziness — people believing things with no attempt at critical evaluation.

A long-ago episode in my legal career sticks with me. I was counsel for the staff experts in PSC regulatory proceedings. We had submitted some prepared testimony; the utility filed its rebuttal. I read their document with a horrible sinking feeling. They’d demolished our case! But then we went to work carefully analyzing their submittal, its chains of logic, evidence, and inferences. In the end, we shot it as full of holes as it had initially seemed to shoot ours.

The point is that the truth can take work. Mark Twain supposedly said a lie can race around the globe while the truth is putting its shoes on. Anyone reading that utility rebuttal, and stopping there, would likely have fallen for it. And indeed, that’s how things usually do go. Worse yet, polemical assertions are often met not with critical thinking but, on the contrary, with receptivity. That’s the “confirmation bias” I keep stressing. People tend to believe things that fit with their preconceived opinions — seeking them out, and saying, “Yeah, that’s right” — while closing eyes and ears to anything at odds with those beliefs.

A further aspect of postmodernism was moral relativism. Rejection of empirical truth as a concept was extended to questions of right and wrong — if there’s no such thing as truth, neither are right and wrong valid concepts. The upshot is nonjudgmentalism.

Here we see a divergence between young and old. Nonjudgmentalism is a modern tendency. Insofar as it engenders an ethos of tolerance toward human differences, that’s a good thing. It has certainly hastened the decline of prejudice toward LGBTQ people.

Yet tolerance and nonjudgmentalism are not the same. Tolerance reflects the fundamental idea of libertarianism/classical liberalism — that your right to swing your fist stops at my nose — but otherwise I have no right to stop your swinging it. Nor to stop you from, for example, sticking your penis in a willing orifice. Nonjudgmentalism is, however, a much broader concept, embodying again the postmodernist rejection of any moral truths. Thus applied in full force it would wipe out even the fist/nose rule.

That is not as absurd a concern as it might seem. Howard Gardner’s book speaks to it. He teaches at Harvard and expresses surprise at the extent to which full-bore nonjudgmentalism reigns among students. They are very reluctant to judge anything wrong — such as cheating on exams, shoplifting, and other behaviors all too common among students. A situational ethic of sorts is invoked to excuse and exculpate, and thereby avoid the taboo of judgment.

Presumably they’d still recognize the clearest moral lines, such as the one about murder? Not so fast. Gardner reports on conducting “numerous informal ‘reflection’ sessions with young people at various secondary schools and colleges in the United States.” Asked to list people they admire, students tend to demur, or confine themselves only to ones they know personally. And they’re “strangely reluctant” to identify anyone they don’t admire. “Indeed,” Gardner writes, “in one session [he] could not even get students to state that Hitler should be featured on a ‘not-to-be-admired’ list.”

Well, ignorance about history also seems lamentably endemic today. But what Gardner reports is actually stranger than might first appear. As I have argued, we evolved in groups wherein social cooperation was vital to survival, hence we developed a harsh inborn judgmentalism against anything appearing to be anti-social behavior. That (not religion) is the bedrock of human morality. And if that deep biological impulse is being overridden and neutered by a postmodernist ethos of nonjudgmentalism, that is a new day indeed for humankind, with the profoundest implications.


Was America founded as a “Christian nation?”

August 13, 2018

We’re often told that it was. The aim is to cast secularism as somehow un-American, and override the Constitution’s separation of church and state. But it’s the latter idea that’s un-American; and it’s historical nonsense. Just one more way in which the religious right is steeped in lies (forgetting the Ninth Commandment).


They assault what is in fact one of the greatest things about America’s birth. It’s made clear in Susan Jacoby’s book, Freethinkers: A History of American Secularism.

Firstly, it tortures historical truth to paint the founding fathers as devout Christians. They were not; they were men of the Enlightenment. While “atheism” wasn’t even a thing at the time, most of them were as close to it as an Eighteenth Century person could be. Franklin was surely one of the century’s most irreverent. Washington never in his life penned the name “Christ.” Jefferson cut-and-pasted his own New Testament, leaving out everything supernatural, including Christ’s divinity. In one letter he called Christian doctrine “metaphysical insanity.”

The secularism issue was arguably joined in 1784 (before the Constitution) when Patrick Henry introduced a bill in Virginia’s legislature to tax all citizens to fund “teachers of the Christian religion.” Most states still had quasi-official established churches. But James Madison and others mobilized public opinion onto an opposite path. The upshot was Virginia passing not Henry’s bill but, instead, one Jefferson had proposed years earlier: the Virginia Statute for Religious Freedom.

It was one of three achievements Jefferson had engraved on his tombstone.

The law promulgated total separation of church and state. Nobody could be required to support any religion, nor be penalized or disadvantaged because of religious beliefs or opinions. In the world of the Eighteenth Century, this was revolutionary. News of it spread overseas and created an international sensation. After all, this was a world still bathed in blood from religious believers persecuting other religious believers. It was not so long since people were burned at the stake over religion, and since a third of Germany’s population had perished in wars of faith. Enough, cried Virginia, slashing the Gordian knot entangling governmental power with religion.

Soon thereafter delegates met in Philadelphia to create our Constitution. It too was revolutionary; in part for what it did not say. The word “God” nowhere appears, let alone the word “Christian.” Instead of starting with a nod to the deity, which would have seemed almost obligatory, the Constitution begins “We the people of the United States . . . .” We people did this, ourselves, with no god in the picture.

This feature did not pass unnoticed at the time; to the contrary, it was widely denounced, and became an important argument against ratifying the Constitution. But those views were outvoted, and every state ratified.

It gets better. Article VI, Clause 3 says “no religious test shall ever be required” for holding any public office or trust. This too was highly controversial, contradicting what was still the practice in most states, and with opponents warning that it could allow a Muslim (!) president. But the “no religious test” provision shows the Constitution’s framers were rejecting all that, and totally embracing, instead, the religious freedom stance of Virginia’s then-recent enactment. And that too was ratified.

Indeed, it still wasn’t even good enough. In the debates over ratification, many felt the Constitution didn’t sufficiently safeguard freedoms, including religious freedom, and they insisted on amendments, which were duly adopted in 1791. That was the Bill of Rights. And the very first amendment guaranteed freedom of both speech and religion — which go hand-in-hand. This made clear that all Americans have a right to their opinions, and to voice those opinions, including ideas about religion, and that government could not interfere. Thus would Jefferson later write of “the wall of separation” between church and state.

All this was, again, revolutionary. The founders, people of great knowledge and wisdom, understood exactly what they were doing, having well in mind all the harm that had historically been done by government entanglement with religion. What they created was something new in the world, and something very good indeed.

Interestingly, as Jacoby’s book explains, much early U.S. anti-Catholic prejudice stemmed from Protestants’ fear that Catholics, if they got the chance, would undermine our hard-won church-state separation, repeating the horrors Europe had endured.

A final point by Jacoby: the religious attack on science (mainly, evolution science) does not show religion and science are necessarily incompatible. Rather, it shows that a religion claiming “the one true answer to the origins and ultimate purpose of human life” is “incompatible not only with science but with democracy.” Because such a religion really says that issues like abortion, capital punishment, or biomedical research can never be resolved by imperfect human opinion, but only by God’s word. This echoes the view of Islamic fundamentalists that democracy itself, with humans presuming to govern themselves, is offensive to God. What that means in practice, of course, is not rule by (a nonexistent) God but by pious frauds who pretend to speak for him.

I’m proud to be a citizen of a nation founded as a free one* — not a Christian one.

* What about slaves? What about women? Sorry, I have no truck with those who blacken America’s founding because it was not a perfect utopia from Day One. Rome wasn’t built in a day. The degree of democracy and freedom we did establish was virtually without precedent in the world of the time. And the founders were believers in human progress, who created a system open to positive change; and in the centuries since, we have indeed achieved much progress.

How to become a Nazi

July 9, 2018

You’re a nurse, and a doctor instructs you, by phone, to give his patient 20 mg of a certain drug. The bottle clearly says 10 mg is the maximum allowable daily dose. Would you administer the 20 mg? Asked this hypothetical question, nearly all nurses said no. But when the experiment was actually run, 21 out of 22 nurses followed the doctor’s orders, despite knowing it was wrong.

Then there was the famous Milgram experiment. Participants were directed to administer escalating electric shocks to other test subjects for incorrect answers. Most people did as instructed, even when the shocks elicited screams of pain; even when the victims apparently lost consciousness. (The “victims” were actors and were not actually shocked.)

These experiments are noted in Michael Shermer’s book, The Moral Arc, in a chapter about the Nazis. Shermer argues that in the big picture we are morally progressing. But here he examines how it can go wrong, trying to understand how people became Nazis.

Normal people have strong, deeply embedded moral scruples. But they are very situation-oriented. Look at the famous “runaway trolley” hypothetical. Most people express willingness to pull a switch to detour the trolley to kill one person to prevent its killing five. But if you have to physically push the one to his death — even though the moral calculus would seem equivalent — most people balk.

So it always depends on the circumstances. In the nurse experiment, when it came down to it, the nurses were unwilling to go against the doctor. Likewise in Milgram’s experiment, it was the authority of the white-coated supervisor that made people obey his order to give shocks, even while most felt very queasy about it.

Nazis, too, often explained themselves by saying, “I was only following orders.” And, to be fair, the penalty for disobeying was often severe. But that was hardly the whole story. In fact, the main thing was the societal normalization of Nazism. When your entire community, from top to bottom, is besotted with an idea, it’s very hard not to be sucked in.

Even if it is, well, crazy. Nazi swaggering might actually not have been delusional if confined to the European theatre. They overran a lot of countries. But then unbridled megalomania led them to take on, as well, Russia — and America. This doomed insanity they pursued to the bitter end.

Yet they didn’t see it that way. The power of groupthink.

And what about the idea of exterminating Jews? They didn’t come to it all at once, but in incremental steps. They actually started by killing “substandard” Germans — the mentally or physically handicapped, the blind, the deaf — tens of thousands of them. With the Jews they began with social ostracism and increasing curtailment of rights.

This was accompanied by dehumanization and demonization. Jews were not just called inferior, genetically and morally, but blamed for a host of ills, including causing WWI, and causing Germany’s defeat. Thus Germans convinced themselves the Jews deserved whatever they got, had “brought it on themselves.” These ideas were in the very air Germans breathed.

Part of this was what Shermer calls “pluralistic ignorance” — taking on false beliefs because you imagine everyone holds them. Like college students who’ve been shown to have very exaggerated ideas of their peers’ sexual promiscuity and alcohol abuse, causing them to conform to those supposed norms. Germans similarly believed negative stereotypes about Jews because they thought most fellow Germans held such views. Actually many did not, but kept that hidden, for obvious reasons. There was no debate about it.

Of course it was all factually nonsense. An insult to intelligence, to anyone who knew anything about anything. Yet Germany — seemingly the most culturally advanced society on Earth, the epicenter of learning, philosophy, the arts — fell completely for this nonsense and wound up murdering six million in its name.*

Which brings me to Trumpism. (You knew it would.) Am I equating it with Nazism? No. Not yet. But the pathology has disturbing parallels. The tribalism, the groupthink, the us-versus-them, nationalism, racism, and contempt for other peoples. The demonization of immigrants, falsely blaming them for all sorts of ills, to justify horrible mistreatment like taking children from parents — even saying, “they brought it on themselves.” And especially the suspension of critical faculties to follow blindly a very bad leader and swallow bushels of lies.

I might once have said “it can’t happen here” because of our strong democratic culture. Today I’m not so sure. Culture can change. That within the Republican party certainly has. Not so long ago the prevailing national attitude toward politicians was “I’m from Missouri,” and “they’re all crooks and liars.” Too cynical perhaps but the skepticism was healthy, and it meant that being caught in a lie (or even flip-flopping) was devastating for a politician. Contrast Republicans’ attitude toward Trump (a politician after all). Not only a real crook and constant flip-flopper, but a Brobdingnagian liar. That 40% of Americans line up in lockstep behind this is frightening. And as for our democratic culture, the sad truth is that too few still understand its principles and values. Germans in their time were the apogee of civilization, and then they became Nazis.

Shermer quotes Hitler saying, “Give me five years and you will not recognize Germany again.” Fortunately Trump will have only four — let’s hope. But America is already becoming unrecognizable.

* My grandfather was a good patriotic German who’d even taken a bullet for his country in WWI. But that didn’t matter; he was Jewish. Fortunately he, with wife and daughter, got out alive. His mother did not.

Are humans smarter than (other) animals?

June 27, 2018

Around 1900, “Clever Hans” was a famous German horse with seeming mathematical ability. Asked “what is four times three?” Hans would tap his hoof twelve times. He was usually right even when his owner wasn’t present; and even when given the questions in writing!

Animal intelligence — and consciousness — are age-old puzzles to us. French philosopher René Descartes saw other animals as, in effect, mechanical contrivances. And even today many see all their behaviors as produced not by intelligent consciousness (like ours) but rather by instinct — pre-installed algorithms that dictate responses to stimuli — like computers running programs.

Clever Hans’s story is recapped in Yuval Noah Harari’s book, Homo Deus. It was eventually proven that Hans knew no math at all. Instead, he was cued to stop tapping his hoof by onlookers’ body language and facial expressions. But, Harari says, that didn’t debunk Hans’s intelligence; it did the opposite. His performance required far more brain power than simple math! You might have memorized 4×3=12 — but could you have gotten the answer the way Hans did?

This points up the difficulty of inferring animal mentation using human yardsticks. Harari explains Hans’s abilities by noting that horses, unequipped for verbal language, communicate instead through body language — so they get pretty good at it. Much better than us.

So if horses are so smart, why aren’t they sitting in the stands at Saratoga while humans run around the track? Well, for one thing, building that sort of facility would have been a lot harder for horses with hooves rather than our dextrous five-fingered hands. Our tool-making capability is a huge factor. And our intelligence, taken as a whole, probably does outstrip that of any other animal. It had to, because early humans faced far more complex survival challenges. Countless other species failed such tests and went extinct. We survived because an evolutionary fluke gave us, just in time, an extreme adaptation in our brains, unlike any other animal’s. Our equivalent of the narwhal’s huge tusk or the giraffe’s neck.

That happened around a couple of hundred thousand years ago. Yet for around 98% of those years, humans achieved little more than mere survival. Only in the last few thousand have we suddenly exploded into a force dominating the Earth as no creature before.

Why that delay? In fact, Harari notes, our stone age ancestors must have been even smarter than people today. After all, their lives were much tougher. One mistake and you’d be dead; your dumb genes would not make it into the next generation.

Harari thinks — I tend to agree — that cooperation proved to be humanity’s killer app. PBS TV’s recent “Civilizations” series illuminates how things really got going with the development of agriculture about 10,000 years ago. Arguably farmers were actually worse off in many ways; and maybe even humanity as a whole for about 9,800 of those years. But agriculture, and the production of food surpluses, did make possible the rise of cities, where people could specialize in particular enterprises, and interact and exchange ideas with large numbers of other people. That eventually paid off spectacularly, in terms of human material well-being, in modern times.

Harari notes that ants and bees too live in large cooperative communities. So why haven’t they developed computers and spaceships? Our super intelligent consciousness also gave us great flexibility to adapt to changing circumstances. Insects have a far more limited repertoire of responses. As Harari writes, “If a hive faces a new threat or a new opportunity, the bees cannot, for example, guillotine the queen and establish a republic.”

Modern life: the big challenge we face

June 23, 2018

Tom Friedman’s latest book made my head spin. It’s Thank You for Being Late: An Optimist’s Guide to Thriving in the Age of Accelerations. He’s a bigger optimist than me.

The “accelerations” in question concern technology, globalization, and climate change, all transforming the world at breakneck speed. Faster, indeed, than human psychology and culture can keep up with.


What spun my head was Friedman’s rundown of technology’s acceleration. He sees 2007 as an inflection point, with the iPhone and a host of other advances creating a newly powerful platform that he calls not the Cloud but the “Supernova.” For instance there’s Hadoop. Ever heard of it? I hadn’t. It’s an open-source software framework (not a company) that also emerged around 2007, revolutionizing the storage and processing of “Big Data” (as best I understand it), making possible explosions in other technologies. And GitHub — 2007 again — blasting open the ability to create software.*

All this is great — for people able to swim in it. But that’s not everybody. A lot of people are thrown for a loop, disoriented, left behind. Bringing them up to speed is what Friedman says we must do. Otherwise, we’ll need a level of income redistribution that’s politically impossible.

The age-old fear (starting with the Luddites) is “automation” making people obsolete and killing jobs. It’s never happened — yet. Productivity improvements have always made society richer and created more jobs than those lost. But Friedman stresses that the new jobs are of a different sort now. No longer can routine capabilities produce a good income — those capabilities are being roboticized. However, what robots can’t substitute for is human social skills, which are increasingly what jobs require. AI programs can already, for example, match or outperform human doctors on some diagnostic tasks, so the role of a doctor will become more oriented toward patient relations, where humans will continue to outperform machines.

But schools aren’t teaching that. Our education system is totally mismatched to the needs of the Twenty-first Century. And I can’t see it undergoing the kind of radical overhaul required.

I’ve often written how America’s true inequality is between the better educated and the less educated, which have become two separate cultures. Friedman says a college degree is now an almost indispensable requirement for the prosperous class, but it’s something children of the other class find ever harder to obtain. All the affirmative action to help them barely nibbles at the problem.

On public radio’s This American Life I heard a revealing profile of an apparently bright African-American kid who did make it into a good college, with a scholarship no less. But he had no idea how to navigate in that unfamiliar environment, and got no help there, left to sink or swim on his own. He sank.

Friedman talks up various exciting innovative tools available to such people not born into the privileged class, to close the gap. But to take advantage of them you have to be pretty smart and clued in. I keep thinking about all the people who aren’t, with no idea how they might thrive, or even just get by, in the new world whooshing up around them. I’ve written about them in discussing books like The End of Men and Hillbilly Elegy. It wasn’t just “hillbillies” Vance was talking about there, but a big swath of the U.S. population. A harsh observer might call them losers; throw-away people.

I’m enraged when charter schools are demonized as a threat to public education. That’s a Democrat/liberal counterpart to Republican magical thinking. These liberals who spout about inequality and concern for the disadvantaged are in denial about how the education system is part of the problem. Public schools do fine in leafy white suburbs; schools full of poor and minority kids do not. For those kids, charter school lotteries offer virtually the only hope.

Of course, the problem of people unfitted for modernity isn’t unique to America. There are billions more in other countries. Yet most of us don’t realize how fast an awful lot of those people are actually coming up to speed. But there’s still going to be a hard core who just cannot do it, and no conceivable government initiative or other innovation will be a magic wand transforming them. Instead it seems we’re headed toward one of those future-dystopia sci-fi films where humanity is riven between two virtually distinct species — the golden ones who live beautiful lives, forever, and the rest who sink into immiseration. I do think most people can be in the former group. And I hope they’ll be generous enough to carry the others at least partway to that Eden.

But what Friedman keeps stressing is the need for culture, especially in politics, to change along with the landscape. He applies what he says is the real lesson of biological evolution: it’s not the strongest that thrive, but the most adaptable. In many ways America does fulfill this criterion. Yet in other ways we’re doing the opposite, especially in the political realm where so much of the problem needs to be addressed. The mentioned need for radical education reform is just one example. Our constitution worked great for two centuries; now, not so much. Our political life has become sclerotic, frozen. Add to that our inhabiting a post-truth world where facts don’t matter. Can’t really address any problems that way.

Friedman enumerates an 18-point to-do list for American public policy. Mostly no-brainers. But almost none of it looks remotely do-able today. In fact, on a lot of the points — like opening up more to globalized trade — we’re going the wrong way.

He concludes with an extended look at the Minnesota community where he grew up in the ’50s and ’60s. It echoed Robert Putnam’s description of his own childhood community in Our Kids. Both were indeed communities, full of broad-based community spirit. Friedman contrasts them with the poisonously fractious Middle East where he spent much of his reporting career. He also reported a lot about Washington — and sees U.S. politics increasingly resembling the Middle East with its intractable tribal conflicts.

I’ve seen this change too in my lifetime — remembering when, for all our serious political disagreements, adversaries respected each other and strove to solve problems in a spirit of goodwill. Most politicians (and their supporters) embodied civic-mindedness, sincerity, and a basic honesty. No longer. Especially, sadly, on the Republican side, which for decades I strongly supported. Now it’s dived to the dark side, the road to perdition.

Friedman wrote before the 2016 election — where America turned its back on all he’s saying. Can we repent, and veer toward a better road, before it’s too late?

*Microsoft has just bought GitHub.

Adventures with technology

May 19, 2018

Turning on the TV, we’d see a gal from cable company Spectrum talking about the switch to all-digital. I’d click past her, thinking we were all set with a big modern flat-screen, and anyway wasn’t the digital thing old news?

Then I remembered the other, old little TV in my upstairs home office where I like to hear the PBS Newshour while working. So I finally listened to the gal, and it seemed I’d soon need a new gizmo, which they’d supply free. So I called the number and was told, yes, free . . . for a year, then there’d be a monthly charge. Screw that, I said (to myself). I asked if there was another option. The guy said Roku.

I knew nothing of Roku. When I mentioned it to my wife, she said she had a Roku! A recent freebie from her Great Courses program; she hadn’t known what to do with it.

She meanwhile suggested I could simply tune in to PBS on my desktop computer. But I’m a stubborn old cuss, and this is 2018, can’t I have a TV in my office if I want?

So I happily took the Roku device up there, turned around my little ancient TV to look for the port to connect it, and of course there wasn’t one.

So I needed a newer TV; no taller than 15 inches to fit on my shelf. Called various stores — no dice. I was finally told they just don’t make TVs that small any more.

Not long before, I’d wanted to replace my office lamp — simple thing with two spotlights adjustable on a pole. Went to three stores; no dice. Then I called a giant lighting specialty place. “They just don’t make lamps like that anymore.” Okay, so I’ll go to eBay, where you can find any old thing (like the obsolete Walkmans I’ve replaced that way several times). But no such lamps on eBay either.

But I did find a TV there — 19″ flat screen, brand new, 75 bucks, shipping included, which arrived in two days. (TV screens are measured diagonally to make them sound bigger; this fit on my shelf with room to spare.)
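For what it’s worth, the diagonal-versus-height arithmetic checks out. Assuming a standard 16:9 screen shape (an assumption on my part; the listing didn’t specify), a 19-inch diagonal works out to a screen height of only about

$$\text{height} = 19 \times \frac{9}{\sqrt{16^2 + 9^2}} \approx 9.3 \text{ inches,}$$

well under the 15-inch shelf limit even after adding an inch or two for bezel and stand.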

So I managed to get it set up and working, except of course no cable signal, needing instead the Roku, which did connect to it. Following the instructions in the Roku booklet, I got some screens with more instructions; had to dig up my wireless router password; go to a website on my computer to enter a code to get some software sent to the TV; then similarly to PBS’s website to enable the Roku to have intercourse with PBS.

So finally I had PBS on my TV screen. I was so proud of myself! I could bring up any number of past PBS shows. But simply watch PBS real-time? Uh-uh. Nothing I tried would allow that.

So I went back to the Roku booklet, hoping for a tech support number. Nope. Then to their website, clicking through a bunch of screens and FAQ answers. After much searching around, it sounded as if watching PBS live would actually require another piece of equipment, an antenna. Oy!

So I thought I’d try my luck calling Spectrum, expecting to hear, “Sorry, we don’t service Roku.” But to my surprise, the tech guy (he sounded like he was in the Philippines) said sure, he’d walk me through it. This involved downloading onto the TV yet another software package, from Spectrum (which entailed laboriously picking out the letters S-P-E-C-T-R-U-M on one of those patience-shredding onscreen keypad thingies), and many further steps. Unfortunately, while trying to juggle the remotes for both the TV and Roku plus my phone, I accidentally disconnected the call. When I redialed, their system insisted on playing an automated follow-up message about the previous call on top of the prompts for the new one. Eventually I reached another tech guy and completed the process.

And lo, some PBS kids’ cartoon started playing on my little TV! O frabjous day! Callooh! Callay! He chortled in his joy.

Except that when I switch on the TV, PBS still doesn’t simply come on, I have to click through several screens. I timed it, 37 seconds. An eternity by modern tech standards. But I’m an old dinosaur, I guess I can live with that.

What occurs to me is this. I’m pretty smart. How do all the idiots out there (without a ten-year-old helping them) manage with this stuff?

Stephen Hawking

March 28, 2018

Stephen Hawking had a horrible illness and was given only a few years to live.

He lived them, and then fifty more. He had ALS (motor neuron disease), which destroys muscle control. There is no cure, and no treatment that stops it.

You know that sci-fi trope of the disembodied brain in a vat? That was Stephen Hawking, more or less, because his body was so ruined he might as well have had none. All he had was his brain. But what a brain.

So despite losing virtually everything else, against all odds his brain kept him going for over half a century. To me, this is the Stephen Hawking story. I’m unable to appreciate fully his scientific achievement. But I’m awed by its being achieved in the face of adversity that also defies my comprehension. Stephen Hawking represents the godlikeness of the human mind.

Another awesome thing about humanity is the ability to adapt. That’s why our species thrives from the Gobi Desert to the Arctic tundra. And as individuals we often make truly heroic adaptations to what life throws at us. Viktor Frankl wrote (in Man’s Search for Meaning) about accommodating oneself psychologically to surviving in a concentration camp. Stephen Hawking too adapted to horrible circumstances. Perhaps he did not curse the fates for that, instead thanking them for vouchsafing his mind. Which, undaunted, he employed to get on with his life and his calling.

That included authoring the least read best-selling book ever, A Brief History of Time. I actually did read it, and was on board till the last chapter, which kind of baffled me.

A character conspicuous by his absence in that book was God. We have trouble wrapping our heads around how the cosmos can have come into existence without him. Of course, that merely raises the question of where he came from. But Hawking’s scientific work (as partly embodied in his book), while not dotting every “i” and crossing every “t” in explaining the existence of existence, did carry us closer to that ultimate understanding. He didn’t conclusively disprove God — but did make that superstition harder to sustain. (And why would God create ALS?)

Hawking was a scientist, but not a “hands-on” scientist, because he soon lost use of his hands, could not even write. Communicating became increasingly difficult. Only thanks to advanced computer technology was he able to produce that familiar mechanized voice — in the end, only by twitching a muscle on his cheek. This too a triumph of mind over matter.

And so it was literally only within the confines of his brain that he worked, probing at the profoundest mysteries of the Universe by pure thought alone. (That was true of Einstein as well.) Of course, lots of other people do likewise and produce moonshine. Hawking (like Einstein) produced deep wisdom, expanding our understanding of the reality we inhabit. An existence upon which his own frail purchase was so tenuous.

An existence that’s poorer without him.

The book that changed America: Darwin, Slavery, and God

February 27, 2018


The Book That Changed America is the title of a book by Randall Fuller. It’s about Darwin’s On the Origin of Species, looking at its impact particularly in Concord, Massachusetts.

That wasn’t just Anytown, U.S.A. Concord was the center of America’s intellectual ferment. The protagonists in Fuller’s book include Emerson, Thoreau, Hawthorne, Bronson and Louisa May Alcott, Franklin Sanborn, Louis Agassiz, and Asa Gray — all living in or near Concord and interacting with each other and with Darwin’s bombshell book.


It hit Concord almost simultaneously with another bombshell in late 1859: John Brown’s attack on the Harper’s Ferry arsenal and his subsequent execution. Brown was not, as often portrayed, a madman. He considered slavery a great sin that could be undone only through war, which he aimed to start. He was just about a year early.

America was already, of course, hotly divided over slavery, and Harper’s Ferry raised the temperature further. So did Darwin’s book.

How so? The only possible excuse for slavery was the idea of blacks’ racial inferiority. Thus their constant denigration as a degenerate, brutish species. And slavery apologists, being besotted with religion, had to believe God intentionally made blacks separately and enslavement-worthy. Efforts to prove their inferiority litter Nineteenth Century science. (See Stephen Jay Gould’s The Mismeasure of Man.)

(Even most abolitionists thought blacks inferior. But they opposed slavery nonetheless because it was cruel and unjust. This applies to every pogrom, genocide, or other ethnically based abuse or exploitation. Even if its victims were lesser, degraded creatures — it’s never true, but even if it were — their mistreatment would still be cruel and unjust. The creatures proven inferior and degraded are the perpetrators.)

Anyhow, the races’ biological separateness continued to be a matter of intense science-oriented debate.* That’s where Darwin came in.

His book prudently refrained from specifically addressing human origins. (Darwin bit that bullet later in The Descent of Man.) Origin discussed living things in general, and all its numerous examples and case studies concerned non-human life. Many at the time imagined humans were something apart from all that. Yet many others were not so deluded, and they realized that if Darwin’s varied finches and so forth were all close cousins, branches of the same tree, obviously then so were whites and blacks. (We now know that blacks came first, and whites descended from them.)

Thus did Origin explode the moral underpinnings of slavery. And Darwin was not just another polemicist with an axe to grind. Not only was his a science book, it was powerfully supported and argued, hence a devastating blow.

Yet still it was disputed. Inevitably, for a book that gored cherished oxen. And slavery was not the only ox. The other was God himself.

Gods have always been the answer for natural and cosmic mysteries people couldn’t otherwise penetrate. That territory used to be huge. But science has progressively answered those mysteries, inexorably shrinking godly territory.

To naive eyes, the world might look designed; design can seem the only possible way to explain life’s diversity and complexity. Literature is filled with rhapsodizing on this theme. Though would any intelligent designer have so filled creation with pain and suffering? Calling this a mystery is no answer.

Thoreau had studied nature intensively, and likewise studied Darwin’s book. He got it, completely; it explained so much of what he’d actually observed. Fuller casts Thoreau as holding that the world is indeed filled with magic and mystery — just not the kind religion postulates.

But Darwin greatly demystified life. His theory was a revelation, a revolution. He called it “natural selection” and “descent with modification;” for short, evolution. His book explained it thoroughly and cogently; there’s hardly a word in it that doesn’t still hold up. A stupendous achievement of human intellect.

And once Darwin unveiled it, the idea of evolution was actually obvious. (I recall Richard Milner’s song, wherein other scientists of the time moan, “Why didn’t I think of that?!”) As Thoreau found, evolution instantly made sense of everything observable about the natural world, everything previously so puzzling. The great geneticist Theodosius Dobzhansky put it this way: “Nothing in biology makes sense except in the light of evolution.”

Yet, to this day, half of Americans reject it. Fuller’s book recaps the opposition to evolution as it played out at its advent, with famed scientist Louis Agassiz in the attack’s vanguard. Its essence remains unchanged. Evolution shrinks God almost to irrelevance. And not just in biology. If life is attributable to natural, not supernatural causes, couldn’t the same be true of the entire cosmos? To Agassiz, all this was something literally unthinkable.** As it is for his modern counterparts.

Likewise that we “come from monkeys” (or even lesser creatures). Some believe that degrades us. But “there is grandeur in this view of life,” connecting us to every other living thing. And our animal antecedents make us all the more remarkable. It’s sublime that a Darwin, descended from apes, could have the insight to see it. All we’ve achieved we’ve done ourselves, with no help from any god.

A reader of Fuller’s book must be struck by how one key mistake — belief in a god — traps you in a carnival house of mirrors, distorting everything about life and the world. Escape it and all becomes clear. This is the main reason why Agassiz and other scientists of the time failed to see what Darwin saw. Religion blinded them. And even when shown the light, such believers hold tight to their blindfolds. They torture facts, evidence, and logic, struggling to hammer the square peg of their belief into the round hole of reality.

I find it far better to just accept reality.

* Some even argued for different species on the basis (by analogy to mules) that mixed-race people tend to be sterile — simply untrue. Furthermore, the vast genre of argument that race mixing somehow “pollutes” and degrades the quality of the white race likewise contradicts manifest biological fact: mixing different gene pools improves strength and quality. It’s called hybrid vigor.

** Scientist Asa Gray entered the fray on Darwin’s side, but even he was unmoored by God’s banishment, coming up with the fallback idea that evolution is God’s method for managing life’s pageant. And even Darwin himself seemed queasy about a purely mechanistic view of creation.

Being and nothingness: How the brain creates mind and self

February 14, 2018

Phineas Gage was a big name in brain science. Not a scientist — but a railroad construction foreman. Until in 1848 an accidental explosion rammed a three-foot iron rod through his cheek and out the top of his head.

Gage actually recovered, with little outward impairment. But his character and personality were transformed. Previously admirable, he became an irresponsible jerk. A part of his brain governing temperament was destroyed.

This famous case opens Antonio Damasio’s landmark 1994 book, Descartes’ Error: Emotion, Reason, and the Human Brain. Though it’s not the latest word in neuroscience, I felt it was worth reading, in my eternal quest to understand the most important thing in the world — my self. What, in that sentence, “I” and “felt” really mean.

I’ve written about this before.

Of course, like everyone, I know perfectly well what being me feels like. But why does it feel that way? Why does anything feel like anything? By what mechanism?

Obviously, it has to do with the workings of the brain. I say “obviously,” but some might disagree. In fact, that was the “Descartes’ error” of the book title — the famous philosopher posited that the mind is something apart from anything physical. Like the idea of a soul. But these are nonsensical concepts. Not only is there no evidence for them, there’s no possible coherent explanation for how they could be true. There’s no plausible alternative to our minds being rooted in the workings of our brains.

Yet it’s difficult to come up with a coherent explanation for that too (so far, anyway). Brains have been analogized to computers, but computers aren’t conscious (so far, anyway). It’s been suggested that the difference is complexity — the brain’s trillions of synapses vastly dwarf what’s in any computer. Still, this seems more like a label than an explanation.

Some common-sense ideas don’t work. Like there’s somebody in charge in there, a captain at your helm. That’s certainly an illusion — the mind is bottom-up, not top-down. That is, whatever you think you are thinking, it’s not the work of some central command, but a product of a lot of undirected neuronal signaling, actually distributed among various brain modules, that somehow comes together. Similarly, we imagine seeing as a “Cartesian theater” (named for the same Descartes), i.e., as if a signal coming in from the eyes gets projected onto a screen in the brain, viewed by a little person (“homunculus”) in there. But does the homunculus have a Cartesian theater — and a smaller homunculus — in its brain? And so forth? The idea falls apart.

Further, not only is the mind not somehow separate from the brain, it’s not even separate from the whole rest of the body. Another point Damasio makes clear. “Keeping body and soul together” is a paradoxically apt expression here, because the brain evolved, after all, as a device to keep the body going, for its ultimate purpose (to the genes) of reproducing. So the body is the brain’s primary focus, and monitoring and regulating the body, and responding to its cues, is most of what the brain is doing at any given moment. (Thus the sci-fi notion of a disembodied brain in a vat, having normal consciousness, is probably absurd.)

To understand how the mind works, the concept of representation seems crucial. (No mentation without representation!) Start with the idea of reality. There is a reality that obtains within your body; also a reality outside it, that you interact with. But how does the mind perceive these realities? Through senses, yes; but they can’t give the brain direct contact with reality. The reality outside — it’s raining, say — cannot itself get inside your head. It can’t be raining in there. It’s even true of your inner bodily reality. If your stomach hurts, you can’t have a stomachache in your brain. But what your brain can do is construct a representation of a stomachache, or rain shower. Like an artist creates a representation of a still life on his canvas.

Of course the brain doesn’t use paints; it only has neurons and their signaling. Somehow the brain takes the incoming sensory information — you see it raining — and translates it into a representation constructed with neuronal signaling. A mental picture of the raining. And notice this can’t merely be like snapping a photo. The representation has to be sustained — continually refreshed, over some length of time.

This is starting to be complicated. But more: how do “you” (without a homunculus) “see” the representation? Why, of course, by means of a further representation: of yourself perceiving and responding to the first one.

But even this is not the end of it. It’s actually three balls the brain must keep in the air simultaneously: first, the representation of the reality (the rain); second, the representation of the self reacting to it; and, finally, a third-order representation of your self in the act of coordinating the prior two, creating a bridge between them. Only now do “you” decide you need an umbrella.

This at least is Damasio’s theory, insofar as I could understand it. Frankly that third part is the hard one. I’m a little queasy that we might have here another endless homuncular recursion: the representation of the self perceiving the representation of the self perceiving . . . . Yet we know the buck must stop somewhere, because we do have selves that somehow know when it’s raining, and know they know it, and grab umbrellas. And one can see that the first two representation levels don’t quite get us there. So there must be the third.
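To make that three-level scheme concrete for myself, here is a toy sketch in Python. It is purely my own illustration, not anything from Damasio’s book; the class names and the umbrella rule are invented for the example. One object stands in for each representation: the outside reality, the self reacting to it, and the bridging representation that coordinates the two and issues the decision.

```python
# Toy illustration only: three nested "representations," loosely following
# the three-level scheme described above (not Damasio's own formalism).
from dataclasses import dataclass

@dataclass
class WorldRepresentation:        # first order: the reality outside ("it's raining")
    raining: bool

@dataclass
class SelfReaction:               # second order: the self perceiving and reacting
    percept: WorldRepresentation
    discomfort: float             # how much "you" mind getting wet (0..1)

@dataclass
class BridgeRepresentation:       # third order: the self coordinating the other two
    world: WorldRepresentation
    reaction: SelfReaction

    def decide(self) -> str:
        # Only at this level does "you" turn perception plus reaction into action.
        if self.world.raining and self.reaction.discomfort > 0.5:
            return "grab umbrella"
        return "carry on"

world = WorldRepresentation(raining=True)
reaction = SelfReaction(percept=world, discomfort=0.9)
print(BridgeRepresentation(world, reaction).decide())   # -> grab umbrella
```

The brain, of course, does nothing so tidy; the sketch only marks where, in the scheme, perception and self-reaction get bridged into a decision.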

Pain too is a representation. When the body signals the brain that something’s amiss, it could register the fact without suffering. The suffering is an emotion, triggered by the brain creating a representation of “you” experiencing that feeling. That’s why it hurts. Of course, we evolved this way to make us respond to bodily problems. Rare individuals who can’t feel pain damage themselves — very non-adaptive. And Damasio tells of one patient with an extremely painful condition. After an operation snipping out a bit of brain, he was thoroughly cheerful. Asked about the pain, he said, “Oh, the pains are the same, but I feel fine now.” His brain was no longer representing pain as suffering.

Meantime, while the mind is doing all that representation stuff — continually, as new signals keep arriving — keeping “you” in touch with what’s going on — there’s yet another ball it must keep aloft: who “you” are. Part of it again is the bodily aspect. But you’re not an empty vessel. Damasio likens the representation of your self to the kind of file J. Edgar Hoover’s FBI might have kept on you. Though it’s not all in one file, or file cabinet, but distributed among many different brain modules. It includes data like what you do, where you live, other people important to your life, knowledge of your entire past, and your ideas looking ahead to your future. Everything that makes you you. And it’s not just filed away; all of it the mind must constantly refresh and update. To keep in being the “you” in its representations of “you” interacting with realities like rain or pain.

Of course all the foregoing is merely schematic. We know how painters paint pictures, but how, exactly, neuronal signaling does it remains a very hard problem. But yet again we know it must. There’s no alternative.

And for humans at least, we do know at least part of the answer. We know how to paint word pictures. And they entail a lot of metaphors — another form of representation. In fact, thinking this way is so second-nature that most of us have struggled to imagine what thinking without words could be like. Of course, other animals do it, and have consciousness, without language. But undoubtedly having it is a tremendous enhancer for the three-stage model via representation that I’ve described. I think it gives humans a much deeper, higher-level self-awareness than other animals enjoy. (Damasio, somewhat enigmatically, says this: “Language may not be the source of the self, but it certainly is the source of the ‘I.'”)

What Damasio’s book is really famous for is his take on reason and emotion. Phineas Gage’s iron rod opened not only a hole in his head but a window on the subject. Damasio also discusses the similar case of “Elliot,” a normal, smart, successful man until a lesion destroyed a bit of his brain. He was still perfectly rational. But like Gage’s, his life fell apart, because he could not behave as reason dictated. The explanation turned out to be a loss of emotional capacity. Emotions give us the reasons to utilize our reason! Elliot no longer cared about anything; not even his life falling apart. The lesson is that emotion and reason are not, as many people imagine, separate or even at odds with one another. They are bound together. Moreover, emotion on its own terms isn’t unreasonable. There are always reasons for the emotions we feel (or if not, that’s insanity).

A final point. While Damasio’s book helped a bit, I still can’t say I have a good handle on what accounts for this phenomenon I experience as being me. It still feels like a will-o’-the-wisp that slithers away whenever I try to grasp it. And as difficult as it is to grasp being in existence, it is likewise difficult to grasp the idea of nonexistence.

Upgrading to Humanity 2.0

February 4, 2018

Tech guru Ray Kurzweil called it “The Singularity” – the point when artificial intelligence outstrips human intelligence and starts operating on its own. Then everything changes. Some, like Stephen Hawking, fear those super-intelligent machines could enslave or even dispense with us.

But in my famous 2013 Humanist magazine article, The Human Future: Upgrade or Replacement, I foresaw a different trajectory – not conflict between people and machines, or human versus artificial intelligence, but rather convergence, as we increasingly replace our biological systems with technologically better ones. The end result may resemble those cyborg superbeings that some fear will supplant us. Yet they will be us. The new version, Humanity 2.0.

I call this debiologizing, not roboticizing. We may be made mostly if not wholly of artificial parts, but won’t be “robots,” which connotes acting mechanically. Humanity 2.0 will be no less conscious, thinking, and feeling than the current version. Indeed, the whole point is to upgrade the species. Two-point-zero will think and feel more deeply than we can. Or, perhaps, can even imagine.

This transformation’s early stages fall under the rubric of “enhancement,” referring, generally, to improving individual capabilities via pharmacology, hardware, or genetic tinkering. This gives some people the heebie-jeebies. But technological advancements have always evoked dystopian fears. The first railroads were denounced as inhuman and as dangerously messing with the natural order of things. A more pertinent example was organ transplants, seen as crossing a line, somehow profoundly wrong. Likewise in-vitro fertilization. The old “playing god” thing.

The fact is that we have always messed with the natural order, in countless ways, to improve our lives. It’s the very essence of humanity. And the “enhancement” concept is not new. It began with Erg, the first human who made a crutch so he could walk. (No doubt Glorg scolded, “if God meant you to walk . . . .”) Today people have prosthetics controlled by brain signaling.

A lot of it is to counter aging. Euphemisms like “golden years” can’t hide the reality of decline, always physical, and usually (to some degree) mental. We’ve already extended life far longer than nature intended, and we keep people healthier longer too. If all that’s good, why not strive to delay decrepitude further still – or reverse it?

And why not other interventions to improve human functionality? If we can enable the disabled, why not super-able others? If we use medicines like Ritalin to improve mental function for people with problems, why not extend the concept to improving everyone’s abilities? Through all the mentioned means – pharmacology, hardware, genetics – we can make people stronger, healthier, and smarter.

Yet some viscerally oppose all this, as a corruption of our (god-given?) human nature. Paradoxically, some of the same people are cynical pessimists about that human nature, vilifying it as a fount of evil. Is it nevertheless sacred, that we shouldn’t tamper with it? Steven Pinker argued persuasively, in The Better Angels of Our Nature: Why Violence has Declined, that humanity has in fact progressed, gotten better, and better behaved, mainly because in many ways we’ve gotten smarter. If we can make people smarter still, through all those kinds of technological enhancements, won’t that likely make us better yet, kissing off the ugliest parts of our (god-given) nature?

The idea of people being able to choose enhancements for themselves also irks misanthropes who see in it everything they dislike about their fellow humans. It’s the ultimate in sinful consumerism. An illegitimate “shortcut” to self-improvement without the hard work that it should rightly entail, thus cheapening and trivializing achievement. Life, these critics seem to say, should be hard. By this logic, we should give up washing machines, microwaves, airplanes, all those “shortcuts” we’ve invented to make life easier. And go back to living in caves.

A perhaps more serious version of their argument is that enhancement, taken sufficiently far, would strip human life of much of what gives it meaning. Much as we’ve progressed, with washing machines and microwaves, etc., and with health and longevity, still a great deal of what invests life with meaning and purpose is the struggle against the limitations and frailties and challenges we continue to face. Remove those and would we become a race of lotus-eaters, with an empty existence?

But consider that early peoples faced challenges of a wholly different order from ours. Getting food was critical, so they sacralized the hunt, and the animals hunted, which loomed large in their systems of meaning. Now we just saunter to the grocery, and that ancient source of meaning is gone. Does that make us shallower? Hardly. Instead it liberates us to focus upon other things. Maybe higher things.

The fundamental mistake of enhancement’s critics is to imagine life for a Human 2.0 by reference to life for a Human 1.0, when they will be as different as we are from our stone age ancestors. Or more so. Our future descendants, relieved of so many concerns that preoccupy us (and not detoured by supernatural beliefs), will find life richer than we can dream.

Of course there will be profound impacts – economic, environmental, cultural, social. Not only will 2.0 be very different, their world itself will be transformed by that difference. But with greater smarts and wisdom they should be able to deal with the challenges.

Our species is only a couple hundred thousand years old; civilization, ten thousand. Billions of years lie ahead. Thus we are in humanity’s infancy. Adulthood will be really something.