Archive for the ‘Science’ Category

My pro basketball experience

March 31, 2019

This pic of me at the game didn’t come out so good

Last Sunday we went to Boston for a Celtics game. I’m no sports fan. In fact, the last pro sports event I attended was a Dodgers baseball game. When they were still in Brooklyn (and Ike was president).

But my wife is a basketball aficionado, and we’ve been hosting a gal from Somaliland who plays it in high school. So I went with them.

 

I really enjoyed the fan-cam and people’s reactions seeing themselves on the jumbotron. Most didn’t immediately realize they were having their fifteen nanoseconds of fame. A few never did, eyes glued to their phones. Most did exuberant dancing and arm-waving. One woman grabbed her husband’s head and kissed him on the lips. But I thought the most romantic one was the gal holding up a sign saying, “Marcus Smart will you marry me?” — until (silly me) I learned Smart is a Celtics player, not (presumably) her inamorata.

The game itself was less entertaining. Very much the same thing repeated over and over. Speaking of repetition, the jumbotron kept showing the word “DEFENSE” in giant block letters crashing down and crushing a bunch of what appeared to be pick-up sticks. And the crowd would duly pick up the chant, “DEFENSE! DEFENSE!” I waited, in vain, for a little offense; especially as the Celtics’ defense was being crushed by the San Antonio Spurs.

 

They lost 486 to 9. Or something like that.

Wrong

I am no basketball expert. Yet I could have offered one piece of advice to improve their score: shooting free throws underhand (“granny style”) rather than overhand. Studies have in fact been done, and it’s been shown that the former gives a higher success rate. Yet players universally ignore this. Why? They think it looks girly, not macho. So Vince Lombardi was actually wrong — winning isn’t the only thing.

 

Anyhow, some fans were deflated by the Celtics’ drubbing. Some even left early, in disgust, or perhaps to avoid the traffic crush. But most seemed to have a good time nevertheless. Even sports nuts ultimately understand that these games are Not Really Truly Important. They’re harmless. At least we no longer gather in stadiums to watch combatants literally kill each other. And at least these Celtics fans wore green hats, not red ones, and their chants weren’t hateful.

And I achieved my own personal goal for the evening: home and snug in bed by 1:30 AM.


The truth about vaccines, autism, measles, and other illnesses

February 26, 2019

The left derides the right for science denialism, on evolution and climate change. But many on the left have their own science blind spots, on GM foods and vaccination.

The anti-vax movement is based on junk science. The fraudulent study that started the whole controversy, by Andrew Wakefield, supposedly linking vaccines and autism, has been totally debunked. The true causes of autism remain debatable, but in the wake of Wakefield there have been numerous (genuine) scientific studies, and now at least one thing can be ruled out with certainty: vaccination.

“But my kid became autistic right after vaccination” — we hear this a lot. Post hoc ergo propter hoc (“after this, therefore because of this”) is a logical fallacy. One thing may follow another with no causal link. Kids are typically scheduled for vaccinations at right around the same age that autism first shows up. It’s just coincidence.

Anti-vaxers throw up a flurry of other allegations of harm, and keep insisting science hasn’t answered them. Not so. All such claims have been conclusively refuted. True, it’s possible to have a bad reaction to any injection, but with vaccination such cases are so extremely rare that all the fearmongering is totally disproportionate. The fundamental safety of vaccines is proven beyond any rational doubt.

I’ve heard it reported that parents objecting to vaccination actually tend to be smarter than average. Proving you can be too smart for your own good. Tom Nichols’s book The Death of Expertise shows how education often leads people to overrate their own knowledge, making them confident enough to simply reject conventional medical science. They make the mistake of deferring instead to a movement rooted in hostility toward elites and experts of all stripes, and in receptiveness to conspiracy theories, ready to believe that big pharma, the medical establishment, and of course the government all promote vaccination for evil purposes. People go online and find all this nonsense, and because it fits their pre-existing mindset, they become impervious to the facts.

Still, we’re told this is a free country and people should be allowed to make these decisions for themselves and their own children. Such pleas resonate with my libertarian instincts; I don’t like government telling us what to do. But the vaccination issue isn’t so simple. Children are unable to choose for themselves. While parents are free to raise kids as they see fit, we don’t allow child abuse. And the law steps in, rightly, when Christian Scientists for example want to deny their kids needed medical treatment.

The same principle should apply to vaccination. Indeed, more so — because parental decisions here don’t just affect their own kids. When a high enough share of a population is vaccinated, a disease is blocked from propagating, so even the unvaccinated are safe. It’s called “herd immunity.” But with enough unvaccinated available victims, the disease can get a toehold and spread. Vaccinated people are still safe, but not babies too young for vaccination, and people who can’t be vaccinated, for various legitimate medical reasons.
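For readers who like to see the arithmetic behind herd immunity, here is a minimal sketch using the standard threshold formula, 1 minus 1/R0, where R0 is the average number of people one infected person infects in a fully susceptible population. The R0 values below are rough, commonly cited ballpark figures I’ve supplied for illustration, not numbers from any particular study.

```python
# Minimal sketch: herd immunity threshold = 1 - 1/R0.
# R0 = average number of secondary infections caused by one case in a
# fully susceptible population. The R0 values below are rough, commonly
# cited ballpark figures, used here only for illustration.

def herd_immunity_threshold(r0: float) -> float:
    """Approximate fraction of the population that must be immune
    to keep a disease from spreading."""
    return 1.0 - 1.0 / r0

for disease, r0 in [("measles", 15.0), ("whooping cough", 14.0), ("seasonal flu", 1.5)]:
    pct = herd_immunity_threshold(r0)
    print(f"{disease:15s} R0 ~ {r0:>4}: about {pct:.0%} of people must be immune")
```

The more contagious the disease, the higher the threshold, which is why even a modest dip in measles vaccination rates is enough to let outbreaks reignite.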

Our herd immunities are now in fact being broken by the widespread refusal of vaccination. Thus dangerous illnesses, like whooping cough and measles, that had been virtually eradicated, are making a big comeback, with sharply rising infection rates.

This is a serious public health issue, and for once the solution is simple. Vaccination must be mandatory, absent valid medical reasons. Opt-outs on religious or “philosophical” grounds should be ended. There are no arguably legitimate religious or other doctrines that could justify refusal to vaccinate. These are just pretexts by people suckered by the pseudo-scientific anti-vax campaign.

We all should be free to do as we please, as long as it harms no others. The freedoms that matter are living as one chooses, and self-expression. Requiring vaccination does not violate these freedoms in any meaningful way, while refusing it does harm others. You might argue that you have a right against unwanted injections, but they are a far less drastic impingement upon personal freedom than quarantining people with contagious illnesses; and even the quarantined person’s freedom is surely trumped by society’s right to protect others from disease.

To anti-vaxers, the minuscule risk from vaccination may seem larger than the risk from illnesses like whooping cough. That’s only because vaccination had practically eradicated those diseases. Anti-vaxers are getting a free ride from the herd immunity conferred by the vaccination of others. Anti-vax parents act as though only their kids matter, other kids and the herd immunity do not. Where is the social solidarity? Doing something because it’s good for all of us together?

Vaccination is a fantastic accomplishment of humankind, conquering the dread specters of so many diseases that afflicted life, and brought early death, throughout most of history. If you want to shout from the rooftops arguing that vaccination is a devil’s plot, you should have a right to do so. As long as you’re vaccinated.

Pachinko by Min Jin Lee — a novel of identity

February 22, 2019

Min Jin Lee

I read this 2017 novel for a book group. A nice thing about such groups is exposure to rewarding reads you’d never otherwise pick up.

Japan occupied Korea from 1910 to 1945. Sunja is born there around 1916. Her mother subsists running a humble boarding house. Teenaged Sunja is pursued, and impregnated, by businessman Koh Hansu. She vaguely expects marriage; but surprise surprise, he already has a wife back in Japan.

Then an ethereal young Korean Christian minister, Isak, rescues Sunja by marrying her. They relocate to Japan, where he has a posting waiting, and live with his brother and sister-in-law. The child is named Noa; later Isak and Sunja have their own son, Mozasu. (Their names are derived from Noah and Moses.) Both eventually wind up running pachinko parlors; pachinko is a pinball-like game very popular in Japan.

But the book’s main focus is on Korean identity in a Japanese culture that despises Koreans. They are stereotyped negatively and suffer systematic discrimination (despite the impossibility of identifying Koreans by appearance). Japan’s forcing many thousands of Korean women into brothels for soldiers during WWII is well known. Japan (unlike Germany) has been recalcitrant on repentance for this and other crimes.

The novel barely mentions those “comfort women,” but describes much other mistreatment suffered by Koreans. Isak is jailed, suspected of insufficient loyalty to the Emperor, and dies from his horrible ordeal.

Koreans living in Japan remain distinctly second-class citizens — if allowed citizenship at all, after generations of residence. Mozasu’s son, in 1989, works there for an investment bank, until he’s screwed over because he’s Korean.

But what really prompts me to write is Noa’s story. (BIG SPOILER ALERT) He didn’t know Koh Hansu was his real father. Koh reappears, now quite wealthy, as Noa’s benefactor, financing his much coveted university education. Noa and his mother Sunja are resistant, but accept Koh’s largesse. But then Noa’s girlfriend meets Koh, sees the resemblance, and taunts Noa with the obvious. Also that Koh must be a yakuza — a gangster.*

These revelations crush Noa. Cursing what his mother did, he runs away to start a new life, cutting all ties to his family and building a new one of his own, with a wife and children (and passing as Japanese). He sends Koh money to repay what he’d received. He also sends Sunja money but never divulges contact information. For sixteen years.

Finally Koh locates Noa, now 45, and Sunja goes to him, in his office. The reunion is difficult but doesn’t go too badly. Noa promises to come visit her. Then he shoots himself.

He had thought he’d escaped his parentage, but now must have realized he could not. And he could not live with that.

Koh was indeed a gangster. A nasty piece of work, as revealed in only a few glimpses. But as far as Sunja’s family knew, he was just a “businessman.” Noa’s girlfriend could not have known the truth about Koh, nor could Noa; it was just an unsubstantiated suspicion. Perhaps Noa should have probed further before shooting himself.

Or perhaps that’s nitpicking. The real issue here is the heart of human identity. Noa felt himself irremediably contaminated. He had bad blood.

This idea of “bad blood” reverberates throughout human history. The sins of the father visited upon the sons. How many people have indeed been punished for crimes or derelictions (real or just imagined) by forebears?

It’s the heart of racism. The notion that all members of some group are birds of a feather, sharing some (stereotyped) characteristics. As vividly depicted in this book, where the antipathy of Japanese toward “those people” (Koreans) is a constant.

Here’s some science. Biology is not destiny. Even where genes are indicative of certain behavioral traits (and there are such), genes never determine how any individual will behave in any situation. At most, they may delineate proclivities, but an individual’s actual behavior results from too many variables to be predicted by genes or anything else. And it’s certainly untrue that any human subgroup shares biologically determined behavioral traits (different from other subgroups).

Of course there are human behaviors, genetically evolved, which we share as a species. But they don’t differ among subgroups. And even if there were such subgroup-specific genes, their effect would be overwhelmed by all the other factors influencing a given individual’s personal behavior.

That’s not to deny cultural differences. Cultural groups do have their own characteristics; that’s the definition of culture. But it’s not genetic. Remove an individual at birth from their specific culture, and there’s no innate biological reason they would replicate behavior particular to that culture.

So Noa’s human identity was not dictated by his father’s gangsterhood. His blood was no more bad than anyone else’s. It was up to him to shape his own life. And, even if there were gangster genes inherited from his father (a dubious idea), those genes would not anyway determine his own character, which would still be his to create.

You can be what you choose to be.

*An echo of Great Expectations? Noa studies literature — he loves Dickens!

Evolution by natural selection is a fact

February 5, 2019

My recent “free will” essay prompted some comments about evolution (on the Times-Union blog site). One invoked (at verbose length) the old “watchmaker” argument. Nature’s elegant complexity is analogized to finding a watch in the sand; surely it couldn’t have assembled itself by random natural processes. There had to be a watchmaker.

This argument is fallacious because a watch is purpose-built and nature is not. Nature is not the result of a process aimed at producing what we see today, but of one that could just as well have produced an infinity of alternative possibilities.

Look at a Jackson Pollock painting and you could say that to create precisely this particular pattern of splotches must have (like the watch) taken an immense amount of carefully planned work. Of course we know he just flung paint at the canvas. The complex result is what it is, not something Pollock “designed.”

Some see God in a similar role, not evolution’s designer but, rather, just setting it in motion. Could life have arisen out of nowhere, from nothing? Or could the Universe itself? Actually science has some useful things to say about that — better than positing a God who always existed or “stands outside time and space,” or some such woo-woo nonsense. And for life’s beginnings, while we don’t have every “i” dotted and “t” crossed (the earliest life could not have left fossils), we do know the basic story:

Our early seas contained an assortment of naturally occurring chemicals, whose interactions and recombinations were catalyzed by lightning, heat, pressure, and other natural phenomena. Making ever more complex molecules, by the trillion. One of the commonest elements is carbon, very promiscuous at hooking up with other atoms to create elaborate combinations.

Eventually one of those had the property of duplicating itself, by glomming other chemical bits floating by, or by splitting. Maybe that was an extremely improbable fluke. But realize it need only have happened once. Because each copy would go on to make more, and soon they’d be all over the place.

However, the copying would not have been perfect; there’d be occasional slight variations; with some faulty but also some better at staying intact and replicating. Those would spread more widely, with yet more variations, some yet more successful. Developing what biologist Richard Dawkins, in The Selfish Gene, called “survival machines.” Such as a protective coating or membrane. We’ve discovered a type of clay that spontaneously forms such membranes, which moreover divide upon reaching a certain size. So now you’ve got the makings of a primitive cell.

Is this a far-fetched story? To the contrary, given early Earth’s conditions, it actually seems inevitable. It’s hard to imagine it not happening. The 1952 Miller-Urey experiment reproduced those conditions in a test tube and the result was the creation of organic compounds, the “building blocks of life.”

That’s how evolution began. The duplicator molecules became genes (made of DNA). Their “survival machines” became organisms. That’s what we humans really are, glorified copying machines. A chicken is just an egg’s way to make another egg.

Of course DNA and genes, and Nature itself, do nothing with conscious purpose. Replicators competing with each other is simply math. Imagine your computer screen with one blue and one red dot. And a program saying every three seconds the blue dot will make another blue dot; but the red one will make two. Soon your screen will be all red.
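If you want to see that math run, here is a toy version of the two-dot program; the specific replication rates and the ten-tick horizon are mine, purely for illustration.

```python
# Toy replicator race: each tick, every blue dot makes one more blue dot
# (so blues double), while every red dot makes two more (so reds triple).
# Illustrative numbers only; the point is how quickly the faster
# replicator crowds out the slower one.

blue, red = 1, 1
for tick in range(1, 11):
    blue *= 2                      # one new blue per blue dot
    red *= 3                       # two new reds per red dot
    red_share = red / (red + blue)
    print(f"tick {tick:2d}: red is {red_share:.1%} of all dots")
```

After just ten ticks, red already accounts for about 98% of the dots, and it only gets more lopsided from there.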

A parable: A king wishes to bestow a reward, and invites the recipient to suggest one. He asks for a single rice grain — on a chessboard’s first square — then two on the second — and so on. The king, thinking he’s getting away cheaply, readily agrees. But before even reaching the final square, it’s all the rice in the kingdom.
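For the record, the rice adds up like this (a standard bit of arithmetic, not part of the parable’s original telling): square n holds 2^(n-1) grains, so the whole board comes to

$$\sum_{n=1}^{64} 2^{\,n-1} \;=\; 2^{64}-1 \;\approx\; 1.8\times 10^{19}\ \text{grains},$$

and the 64th square alone holds about nine quintillion, vastly more rice than the kingdom could ever have had.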

This is the power of geometric multiplication. The power of genes replicating, in vast numbers, over vast time scales. (A billion years is longer than we can grasp.) And recall how genes are effectively in competition because occasionally their copies are imperfect (“mutations”), so no two organisms are exactly identical, and some are better at surviving and reproducing. Those supplant the others, just like red supplanted blue on your computer screen. But the process never stops, and in the fullness of time, new varieties evolve into new species. It’s propelled by ever-changing environments, requiring that organisms adapt by changing, or perish. This is evolution by natural selection.

Fossils provide indisputable proof. It’s untrue that there are “missing links.” In case after case, fossils show how species (including humans) have changed and evolved over time. (The horse is a great example. My illustration is from a website actually denying horse evolution, arguing that each of the earlier versions was a stand-alone species, unrelated to one another!)

We even see evolution happening live. Antibiotics changed the environment for bacteria. So drug-resistant bacteria rapidly evolved. Once-rare mutations enabling them to survive antibiotics have proliferated while the non-resistant are killed off.

Note that evolution doesn’t mean inexorable progression toward ever more complex or “higher” life forms. Again, the only thing that matters is gene replication (remember that red computer screen). Whatever works at causing more copies to be made is what will evolve. Humans evolved big brains because that happened to be a very successful adaptation. If greater simplicity works better, then an animal will evolve in that direction. There are in fact examples of this.

Another false argument against evolution is so-called “irreducible complexity.” Author Michael Behe claimed something like an eye could never have evolved without a designer because an incomplete, half-formed eye would be useless, conferring no advantage on an organism. In fact eyes did evolve through a long process beginning with light-sensitive cells that were primitive motion detectors, not at all useless. They conferred a survival advantage, albeit a small one, which compounded over eons as the eye improved through gradual incremental tweaks. So the eye, far from rebutting evolution, beautifully illustrates how evolution actually proceeds, and refutes any idea of intelligent design.

In fact, because our eyes evolved in the undirected way they did, they’re quite sub-optimal. A competent designer would have done far better. He would not have put the wiring in front of the light-sensitive parts, blocking some light (one reason we see poorly in dim light), nor bunched the optic nerve fibers to create a blind spot. Some other animals (like squids) have much better eye design. And wouldn’t a really intelligent design include a third eye in the back?

Evolution by natural selection is the one great fact of biology. Not merely the best explanation for what we see in Nature, but the only possible rational explanation, and one that explains everything. As the geneticist Theodosius Dobzhansky said, “Nothing in biology makes sense except in the light of evolution.”

Consciousness, Self, and Free Will

January 29, 2019

What does it really mean to be conscious? To experience things? To have a self? And does that self really make choices and decisions?

I have wrestled with these issues numerous times on this blog. Recently I gave a talk, trying to pull it all together. Here is a link to the full text: http://www.fsrcoin.com/freewill.html. But here is a condensed version:

It might seem that the more neuroscience advances, the less room there is for free will. We’re told it’s actually an illusion; that even the self is an illusion. But Daniel Dennett, in 2003, wrote Freedom Evolves, arguing that we do have a kind of free will after all.

The religious say evil exists because God gave people free will. But can you really have free will if God is omniscient and knows what you will do? This equates to the concept of causation; of determinism. Laplace was a French thinker who posited that if a mind (“Laplace’s demon”) could know every detail of the state of the Universe at a given moment, it would know what will happen next. But Dennett says this ignores the random chance factor. And quantum mechanics tells us that, at the subatomic level at least, things do happen randomly, without preceding causes.

Nevertheless, the deterministic argument against free will says that everything your brain does and decides is a result of causes beyond conscious control. That if you pick chocolate over vanilla, it’s because of something that happened among your brain neurons, whose structure was shaped by your biology, your genes, by everything that happened before. Like a computer program that cannot “choose” how it behaves.

Schopenhauer said, “a man can do what he wants but cannot will what he wants.” In other words, you can choose chocolate over vanilla, but can’t choose to have a preference for chocolate. Or: which gender to have sex with.

And what does the word “you” really mean? This is the problem of the self, of consciousness, entwined with the problem of free will. We all know what having a conscious self feels like. Sort of. But philosopher David Hume said no amount of introspection enabled him to catch hold of his self.

Another philosopher, Rene Descartes, conceived mind as something existing separately from our physical bodies. This “Cartesian dualism” is a false supernatural notion. Instead, mind and self can only be produced by (or emerge from) physical brain activity. There’s no other rational possibility.

Let’s consider how we experience vision. We not only see what’s before us, but also things we remember, or even things we imagine. All of it could be encoded (like in a computer) into 1s and 0s — zillions of them. But then how do “you” see that as a picture? We imagine what’s been called a “Cartesian theatre” (from Descartes again), with a projection screen, viewed by a little person in there (a “homunculus”). But how does the homunculus see? Is there another smaller one inside his brain? And so on endlessly?

A more helpful concept is representation, applicable to all mental processing. Nothing can be experienced directly in the brain. If it’s raining it can’t be wet inside your brain. But your brain constructs a representation of the rain. Like an artist painting a scene. And how exactly does the brain do that? We’re still working on that.

Similarly, what actually happens when you experience something like eating a cookie, or having sex? The experience isn’t mainly in the mouth or genitals but in the mind. By creating (from the sensory inputs) a representation. But then how do “you” (without a homunculus) see or experience that representation? Why, of course, by means of a further representation: of yourself having that experience.

And according to neuroscientist Antonio Damasio, in his book Descartes’ Error, we need yet another, third order representation, so that you not only know it’s raining, but know you know it. Still further, the mind also must maintain a representation of who “you” are. Including information like knowledge of your past, and ideas about your future, which must be constantly refreshed and updated.

All pretty complicated. Happily, our minds — just like our computer screens — hide from us all that internal complexity and give us a smooth simplified interface.

 

A totally deterministic view might make our lives seem meaningless. But Dennett writes that we live in an “atmosphere of free will” — “the enveloping, enabling, life-shaping, conceptual atmosphere of intentional action, planning and hoping and promising — and blaming, resenting, punishing and honoring.” This is all independent of whether determinism is true in some physical sense.

Determinism and causality are actually tricky concepts. If a ball is going to hit you, but you duck, would Laplace’s demon have predicted your ducking, so you were never going to be hit? In other words, whatever happens is what had to happen.

Dennett poses the example of a golfer missing a putt who says, “I could have made it.” What does that really mean? Repeat the exact circumstances and the result must be the same. However, before he swung, was it possible for him to swing differently than he wound up doing? Or was it all pre-ordained? Could he have, might he have, swung differently?

Martin Luther famously said, “Here I stand, I can do no other.” Was he denying his own free will? Could he have done otherwise? Or was his stand indeed a supreme exercise of personal will?

Jonathan Haidt, in his book The Righteous Mind, likened one’s conscious self to a rider on an elephant, which is the unconscious. We suppose the rider is the boss, directing the elephant, but it’s really the other way around. The rider’s role is just to come up with rationalizations for what the elephant wants. (This is a key factor in political opinions.)

And often we behave with no conscious thought at all. When showering, I go through an elaborate sequence of motions as if on autopilot. My conscious mind might be elsewhere. And how often have I (consciously) deliberated over whether to say a certain thing, only to hear the words pop suddenly out of my mouth?

A famous experiment, by neuroscientist Benjamin Libet, seemingly proved that a conscious decision to act is actually preceded, by some hundreds of milliseconds, by an unconscious triggering event in your brain. This has bugged me no end. I’ll try to beat it by, say, getting out of bed exactly when I myself decide, bypassing Libet’s unconscious brain trigger. I might decide I’ll get up on a count of three. But where did that decision come from?

However, even if the impetus for action arises unconsciously, we can veto it. If not free will, this has been called “free won’t.” It comes from our ability to think about our thoughts.

There’s a fear that without free will, there’s no personal responsibility, destroying the moral basis of society. Illustrative was a 2012 article in The Humanist magazine arguing against punishing Anders Breivik, the Norwegian mass murderer, because the killings were caused by brain events beyond his control. But “free won’t” is a helpful concept here. Psychiatrist Thomas Szasz has argued that we all have antisocial impulses, yet to act upon them crosses a behavioral line that almost everyone can control. So Breivik was capable of choosing not to kill 77 people, and can be held responsible for his choice.

As his book title suggests, Dennett maintains that evolution produced our conscious self with free will. But those were unnecessary for nearly all organisms that ever existed. As long as the right behavior was forthcoming, there was no need for it “to be experienced by any thing or anybody.” However, as the environment and behavioral challenges grow more complex, it becomes advantageous to consider alternative actions. In developing this ability, Dennett says a key role was played by communication in a social context, with back-and-forth discussion of reasons for actions, highly enhanced by language. Recall the importance of representation. I mentioned the artist and his canvas. Our minds don’t have paints, but create word pictures and metaphors, multiplying the power of representation.

Another book by Dennett, in 1991, was Consciousness Explained. It said that the common idea of your self as a “captain at the helm” in your mind is wrong. It’s really more like a gaggle of crew members fighting over the wheel. A lot of neurons sparking all over the place. And what you’re thinking at any given moment is a matter of which gang of neurons happens to be on top.

Yet in Freedom Evolves, Dennett now winds up insisting that we can and do use rationality and deliberation to resolve such internal conflicts, and that “there is somebody home” (the self) after all, to take responsibility and be morally accountable. This might sound like positing a sort of homunculus in there. But let me offer my own take.

When the crewmen battle over the wheel, to say the outcome is deterministically governed by a long string of preceding causes is too simplistic. Instead, everything about that competition among neuron groups embodies who you are, your personality and character, constructed over years. Shaped by many deterministic factors, yes — your biology, genes, upbringing, experiences, a host of other environmental influences, etc. But also, importantly, shaped by all your past choices and decisions. We are not wholly self-constructed, but we are partly self-constructed. Your past history reflects past battles over the wheel, but in all those too, personality and character factors came into play.

They can change throughout one’s life, even sometimes from conscious efforts to change. And no choice or decision is ever a foregone conclusion. Even if most people, most of the time, do behave very predictably, it’s not like the chess computer that will play the same move every time. Causation is not compulsion. People are not robots.

Nothing is more deterministically caused than a smoker’s lighting up, a consequence of physical addiction on top of psychological and behavioral conditioning, and even social ritual. Seemingly a textbook case of B.F. Skinner’s deterministic behaviorism. Yet smokers quit! Surely that’s free will.

Now, you might say the quitting itself actually has its own deterministic causes — predictable by Laplace’s demon — whatever happens is what had to happen. But this loads more weight upon the concept of determinism than it can reasonably be made to carry. In fact, there’s no amount of causation, biological or otherwise, that predicts behavior with certainty. There are just too many variables. Including the “free won’t” veto power.

And even if Libet was right, and a decision like exactly when to move your finger (or get out of bed) really is deterministically caused — how is that relevant to our choices and decisions that really matter? When in college, I’d been programmed my whole life to become a doctor. But one night I thought really hard about it and decided on law instead. Concerning a decision like that, the Libet experiment, the whole concept of determinism, tells us nothing.

This is compatibilism: a view of free will that’s actually compatible with causation and determinism.

We started with the question, how can you have free will if an omniscient God knows what you’ll do? Well, the answer is, he cannot know. But — even if God — or Laplace’s demon — could (hypothetically) predict what your self will do — so what? It’s still your self that does it. A different self would do different. And you’re responsible (at least to a considerable degree) for your self. That’s my view of free will.

 

No, Virginia, there is no Santa Claus

December 22, 2018

We gave our daughter the middle name Verity, which actually means truth, and tried to raise her accordingly.

About the Easter Bunny and the Tooth Fairy, she wised up pretty early, as a toddler. About Santa, she was skeptical, but brought scientific reason to bear. She doubted a big unwieldy rocking horse could have gotten into the house without Santa’s help. So that convinced her — for a while at least.

Recently a first grade teacher was fired for telling students there is no Santa (nor any other kind of magic). This reality dunk was considered a kind of child abuse; puncturing their illusions deemed cruel; plenty of time for that when they grow up. However, the problem is that a lot of people never do get with reality. As comedian Neal Brennan said (on The Daily Show), belief in Santa Claus may be harmless but is a “gateway drug” to other more consequential delusions.

People do usually give up belief in Santa. But not astrology, UFOs, and, of course (the big ones) God and Heaven. The only thing making those illusions seemingly more credible than Santa Claus is the fact that so many people still cling to them.

America is indeed mired in a pervasive culture of magical beliefs, not just with religion, but infecting the whole public sphere. Like the “Good guy with a gun” theory. Like climate change denial. And of course over 40% still believe the world’s worst liar is somehow “making America great again.” (History shows even the rottenest leaders always attract plenty of followers.)

Liberals are not immune. Beliefs about vaccines and GM foods being harmful are scientifically bunk. In fact it’s those beliefs that do harm.

I’ve written repeatedly about the importance of confirmation bias — how we love information that seemingly supports our beliefs and shun anything contrary. The Economist recently reported on a fascinating study, where people had to choose whether to read and respond to eight arguments supporting their own views on gay marriage, or eight against. But choosing the former could cost them money. Yet almost two-thirds of Americans (on both sides of the issue) actually still opted against exposure to unwelcome advocacy! In another study, nearly half of voters made to hear why others backed the opposing presidential candidate likened the experience to having a tooth pulled.

And being smarter actually doesn’t help. In fact, smarter people are better at coming up with rationalizations for their beliefs and for dismissing countervailing information.

Yet a further study reported by The Economist used an MRI to scan people’s brains while they read statements for or against their beliefs. Based on what brain regions lit up, the study concluded that major beliefs are an integral part of one’s sense of personal identity. No wonder they’re so impervious to reality.

Remarkably, given the shitstorm so totally perverting the Republican party, not a single Republican member of Congress has renounced it.

The Economist ended by saying “accurate information does not always seem to have much of an effect (but we will keep trying anyway).”

So will I.

The REALLY big picture

December 19, 2018

We start from the fact that the Universe was created by God in 4004 BC.

Oops, not exactly. It was actually more like 13,800,000,000 BC (give or take a year or two). The event is called the Big Bang — a name given by astronomer Fred Hoyle intended sarcastically — and it was not an “explosion.” Rather, if you take the laws of physics and run the tape backwards, you get to a point where the Universe is virtually infinitely tiny, dense, and hot. A “singularity,” where the laws of physics break down — and we can’t go farther back to hypothesize what came before. Indeed, since Time began with the Big Bang, “before” has no meaning. Nevertheless, while some might say God did it, it’s reasonable instead to posit some natural phenomenon, a “quantum fluctuation” or what have you.

So after the Big Bang we started with what’s called the “Quantum Gravity Epoch.” It was rather brief as “epochs” go – lasting, to be exact, 10⁻⁴³ of a second. That’s 1 divided by the number 1 followed by 43 zeroes.

That was followed by the “Inflationary Epoch,” which also went fairly quick, ending when the Universe was still a youngster 10⁻³⁴ of a second old.

But in that span of time between 10⁻⁴³ and 10⁻³⁴ of a second, something big happened. You know how it is when you eat a rich dessert and virtually blow up in size? We don’t know what the Universe ate, but it did blow up, going from a size almost infinitely small to one almost infinitely large, in just that teensy fraction of a second; thus expanding way faster than the speed of light.

After that hectic start, things became more leisurely. It took another few hundred million years, at least, for the first stars to twinkle on.

This is the prevailing scientific model. If you find this story hard to believe, well, you can believe the Bible instead.

Here are some more facts to get your head around. Our galaxy comprises one or two hundred billion stars, and is around 100,000 light years across. A light year is the distance light travels in a year – about 6 trillion miles. And ours is actually a pipsqueak galaxy; at the bottom of the range which goes up to ten times bigger. And how many galaxies are there? Wait for it . . . two trillion. But that’s only in the observable part of the Universe; we can only see objects whose light could reach us within the 13.8 billion years the Universe has existed. Because of its expansion during that time, the observable part actually stretches 93 billion light years. We don’t know how much bigger the total Universe might be. Could be ten trillion light years across. (I don’t want to talk about “infinite.”)
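To check that “about 6 trillion miles” figure yourself, here is a back-of-the-envelope calculation of my own, using the familiar roughly-186,000-miles-per-second value for the speed of light.

```python
# Back-of-the-envelope check on the "about 6 trillion miles" figure:
# one light year = speed of light x number of seconds in a year.

MILES_PER_SECOND = 186_282                  # speed of light, roughly
SECONDS_PER_YEAR = 365.25 * 24 * 60 * 60    # about 31.6 million seconds

light_year_miles = MILES_PER_SECOND * SECONDS_PER_YEAR
print(f"one light year is roughly {light_year_miles:,.0f} miles")   # ~5.9 trillion
```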

Now, it was Hubble who, in the 1920s, made the astounding discovery that some of the pinpoints of light we were seeing in the sky are not stars but other galaxies; and then, in 1929, that they are moving away from us, the farther away, the faster. Actually, it’s not that the galaxies are moving; rather, space itself is expanding. Jain analogized the galaxies to ants on the surface of a balloon. If you inflate it, the distance between ants grows, even while they themselves don’t move. And note, space is not expanding into anything. It is making more space as it goes along.

But there are two big mysteries. Newton posited that the force of gravity is proportional to mass and diminishes with the square of the distance between masses. However, what we see in other galaxies does not conform to this law; it’s as though there has to be more mass. We don’t yet know what that is; we call it “dark matter.” (There is an alternative theory, that Newton’s law of gravity doesn’t hold true at great distances, which might account for what we see with no “dark matter.”)
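For reference, here is the law being described, together with the orbital-speed relation it implies for a star circling a galaxy (a standard textbook result, not something taken from the talk itself):

$$F \;=\; G\,\frac{m_1 m_2}{r^2}, \qquad v(r) \;\approx\; \sqrt{\frac{G\,M(r)}{r}},$$

where M(r) is the mass enclosed within the star’s orbital radius r. If M(r) stopped growing beyond a galaxy’s visible disk, orbital speeds should fall off with distance; instead, measured rotation curves stay roughly flat, which is the puzzle “dark matter” was invented to solve.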

The other problem is that what we know of physics and gravity suggests that the Universe’s expansion should be slowing. But we have found that at a certain point during its history, the expansion accelerated, and continues to do so. This implies the existence of a force we can’t yet account for; we label it “dark energy.”

“Ordinary matter” (that we can detect) accounts for only 5% of the Universe. Another 24% is dark matter and 71% dark energy. (Remember that matter and energy are interchangeable. That’s how we get atom bombs.)

But, again, the story is a lot simpler if you choose instead to believe the Bible.

(This is my recap of a recent talk by Vivek Jain, SUNY Associate Professor of Physics, at the Capital District Humanist Society.)

“The Discovery” — Scientific proof of Heaven

December 6, 2018

Our daughter recommended seeing this Netflix film, “The Discovery.” It starts with scientist Thomas Harbor (Robert Redford) giving a rare interview about his discovery proving that we go somewhere after death.

This has precipitated a wave of suicides. Asked if he feels responsible, Harbor simply says “no.” Then a man shoots himself right in front of him.

Next, cut to Will and Ayla (“Isla” according to Wikipedia) who meet as the lone passengers on an island ferry. Talk turns to “the discovery.” Will is a skeptic who doesn’t think it’s proven.

Turns out Will is Harbor’s estranged son, traveling to reconnect with him at Harbor’s island castle. Where he runs a cult peopled with lost souls unmoored by “the discovery.” While continuing his work, trying to learn where, exactly, the dead go.

Meantime, people keep killing themselves, aiming to “get there” — wherever “there” is. Will saves Ayla from drowning herself and brings her into the castle.

Harbor has created a machine to get a fix on “there” by probing a brain during near-death experiences — his own. It doesn’t work. “We need a corpse,” he decides.

So Will — his skepticism now forgotten — and Ayla steal one from a morgue. This is where the film got seriously silly. (Real scientists nowadays aren’t body snatchers.) The scene with the dead guy hooked up to the machine and subjected to repeated electrical shocks was straight out of the 1931 Frankenstein.

This doesn’t work either. At first. But later, alone in the lab, Will finds a video actually had gotten extracted from the corpse’s brain. Now he’s on a mission to decode it.

I won’t divulge more of the plot. But the “there” in question is “another plane of existence.” Whatever that might actually mean. There’s also some “alternate universes” thing going on, combined with some Groundhog Dayish looping. A real conceptual mishmash.

One review faulted the film for mainly wandering in the weeds of relationship tensions rather than really exploring the huge scientific and philosophical issues. I agree.

The film’s metaphysical incoherence goes with the territory of “proving” an afterlife. There was no serious effort at scientific plausibility, which would be a tall order. Mind and self are entirely rooted in brain function. When the brain dies, that’s it.

The film didn’t delve either into the thinking of any of the folks who committed suicide, which would have been interesting. After all, many millions already strongly believe in Heaven, yet are in no hurry to go. But, as I have said, “belief” is a tricky concept. You may persuade yourself that you believe something, while another part of your mind does not.

The film’s supposed scientific proof presumably provides the clincher. Actually, religious people, even while professing that faith stands apart from questions of evidence, nevertheless do latch on to whatever shreds of evidence they can, to validate their beliefs. For Heaven, there’s plenty, including testimonies of people who’ve been there. But there’s still that part of the brain that doesn’t quite buy it. Would an assertedly scientific discovery change this?

I doubt it. Most people have a shaky conception of science, with many religious folks holding an adversarial stance toward it. Science is, remember, the progenitor of evolution, which they hate. Meantime — this the film completely ignored — religionists generally consider suicide a sin against God. Surely that can’t be your best route to Heaven!

The film did mention that people going on a trip want to see a brochure first. That’s what Harbor’s further work aimed to supply. Without it — without “the discovery” having provided any idea what the afterlife might be like — killing oneself to get there seems a pretty crazy crapshoot. Even for religious nuts.

Truth, beauty, and goodness

September 20, 2018

Which among the three would you choose?

I read Howard Gardner’s 2011 book, Truth, Beauty, and Goodness Reframed: Educating for the Virtues in the Twenty-first Century. (Frankly I’d picked it up because I confused him with Martin Gardner; but never mind.)

Beauty I won’t discuss. But truth and goodness seem more important topics today than ever.

Many people might feel their heads spin as age-old seeming truths fall. “Eternal verities” and folk wisdom have been progressively undermined by science. Falling hardest, of course, is God, previously the explanation for everything we didn’t understand. He clings on as the “God of the gaps,” the gaps in our knowledge that is, but those continue to shrink.

Darwin was a big gap-filler. One might still imagine a god setting in motion the natural processes Darwin elucidated, but that’s a far cry from his (God’s) former omnicompetence.

While for me such scientific advancements illuminate truth, others are disconcerted by them, often refusing to accept them, thus placing themselves in an intellectually fraught position with respect to the whole concept of truth. If one can eschew so obvious a fact as evolution, then everything stands upon quicksand.

Muddying the waters even more is postmodernist relativism. This is the idea that truth itself is a faulty concept; there really is no such thing as truth; and science is just one way of looking at the world, no better than any other. What nonsense. Astronomy and astrology do not stand equally vis-a-vis truth. (And if all truth is relative, that statement applies to itself.)

Though postmodernism did enjoy a vogue in academic circles, as a provocatively puckish stance against common sense by people who fancied themselves more clever, it never much infected the wider culture, and even its allure in academia deservedly faded. And yet postmodernism did not sink without leaving behind a cultural scum. While it failed to topple the concept of truth, postmodernism did inflict some lasting damage on it, opening the door to its abuse in all sorts of other ways.

All this background helped set the stage for what’s happening in today’s American public square. The pathology might have advanced more gradually, but Trump greatly accelerated it by testing the limits and finding they’d fallen away. Once, a clear lie would have been pretty much fatal for a politician. Now one who lies continuously and extravagantly encounters almost no consequences.

It’s no coincidence that many climate change deniers and believers in Biblical inerrancy, young Earth creationism, Heaven, and Hell, are similarly vulnerable to Trump’s whoppers. Their mental lie detector fails here because it’s already so compromised by the mind contortions needed to sustain those other counter-factual beliefs.

But of course there’s also simple mental laziness — people believing things with no attempt at critical evaluation.

A long-ago episode in my legal career sticks with me. I was counsel for the staff experts in PSC regulatory proceedings. We had submitted some prepared testimony; the utility filed its rebuttal. I read their document with a horrible sinking feeling. They’d demolished our case! But then we went to work carefully analyzing their submittal, its chains of logic, evidence, and inferences. In the end, we shot it as full of holes as they had initially seemed to do to ours.

The point is that the truth can take work. Mark Twain supposedly said a lie can race around the globe while the truth is putting its shoes on. Anyone reading that utility rebuttal, and stopping there, would likely have fallen for it. And indeed, that’s how things usually do go. Worse yet, polemical assertions are often met with not critical thinking but, on the contrary, receptivity. That’s the “confirmation bias” I keep stressing. People tend to believe things that fit with their preconceived opinions — seeking them out, and saying, “Yeah, that’s right” — while closing eyes and ears to anything at odds with those beliefs.

A further aspect of postmodernism was moral relativism. Rejection of empirical truth as a concept was extended to questions of right and wrong — if there’s no such thing as truth, neither are right and wrong valid concepts. The upshot is nonjudgmentalism.

Here we see a divergence between young and old. Nonjudgmentalism is a modern tendency. Insofar as it engenders an ethos of tolerance toward human differences, that’s a good thing. It has certainly hastened the decline of prejudice toward LGBTQs.

Yet tolerance and nonjudgmentalism are not the same. Tolerance reflects the fundamental idea of libertarianism/classical liberalism — that your right to swing your fist stops at my nose — but otherwise I have no right to stop your swinging it. Nor to stop you from, for example, sticking your penis into a willing orifice. Nonjudgmentalism is, however, a much broader concept, embodying again the postmodernist rejection of any moral truths. Thus applied in full force it would wipe out even the fist/nose rule.

That is not as absurd a concern as it might seem. Howard Gardner’s book speaks to it. He teaches at Harvard and expresses surprise at the extent to which full-bore nonjudgmentalism reigns among students. They are very reluctant to judge anything wrong. Such as cheating on exams, shoplifting, and other such behaviors all too common among students. A situational ethic of sorts is invoked to excuse and exculpate, and thereby avoid the shibboleth of judgment.

Presumably they’d still recognize the clearest moral lines, such as the one about murder? Not so fast. Gardner reports on conducting “numerous informal ‘reflection’ sessions with young people at various secondary schools and colleges in the United States.” Asked to list people they admire, students tend to demur, or confine themselves only to ones they know personally. And they’re “strangely reluctant” to identify anyone they don’t admire. “Indeed,” Gardner writes, “in one session [he] could not even get students to state that Hitler should be featured on a ‘not-to-be-admired’ list.”

Well, ignorance about history also seems lamentably endemic today. But what Gardner reports is actually stranger than might first appear. As I have argued, we evolved in groups wherein social cooperation was vital to survival, hence we developed a harsh inborn judgmentalism against anything appearing to be anti-social behavior. That (not religion) is the bedrock of human morality. And if that deep biological impulse is being overridden and neutered by a postmodernist ethos of nonjudgmentalism, that is a new day indeed for humankind, with the profoundest implications.

Was America founded as a “Christian nation?”

August 13, 2018

We’re often told that it was. The aim is to cast secularism as somehow un-American, and override the Constitution’s separation of church and state. But it’s the latter idea that’s un-American; and it’s historical nonsense. Just one more way in which the religious right is steeped in lies (forgetting the Ninth Commandment).

Jacoby

They assault what is in fact one of the greatest things about America’s birth. It’s made clear in Susan Jacoby’s book, Freethinkers: A History of American Secularism.

Firstly, it tortures historical truth to paint the founding fathers as devout Christians. They were not; instead men of the Enlightenment. While “atheism” wasn’t even a thing at the time, most of them were as close to it as an Eighteenth Century person could be. Franklin was surely one of the century’s most irreverent. Washington never in his life penned the name “Christ.” Jefferson cut-and-pasted his own New Testament, leaving out everything supernatural and Christ’s divinity. In one letter he called Christian doctrine “metaphysical insanity.”

The secularism issue was arguably joined in 1784 (before the Constitution) when Patrick Henry introduced a bill in Virginia’s legislature to tax all citizens to fund “teachers of the Christian religion.” Most states still routinely had quasi-official established churches. But James Madison and others mobilized public opinion onto an opposite path. The upshot was Virginia passing not Henry’s bill but, instead, one Jefferson had proposed years earlier: the Virginia Statute of Religious Freedom.

It was one of three achievements Jefferson had engraved on his tombstone.

The law promulgated total separation of church and state. Nobody could be required to support any religion, nor be penalized or disadvantaged because of religious beliefs or opinions. In the world of the Eighteenth Century, this was revolutionary. News of it spread overseas and created an international sensation. After all, this was a world still bathed in blood from religious believers persecuting other religious believers. It was not so long since people were burned at the stake over religion, and since a third of Europe’s population perished in wars of faith. Enough, cried Virginia, slashing the Gordian knot entangling governmental power with religion.

Soon thereafter delegates met in Philadelphia to create our Constitution. It too was revolutionary; in part for what it did not say. The word “God” nowhere appears, let alone the word “Christian.” Instead of starting with a nod to the deity, which would have seemed almost obligatory, the Constitution begins “We the people of the United States . . . .” We people did this, ourselves, with no god in the picture.

This feature did not pass unnoticed at the time; to the contrary, it was widely denounced, as an important argument against ratifying the Constitution. But those views were outvoted, and every state ratified.

It gets better. Article 6, Section 3 says “no religious test shall ever be required” for holding any public office or trust. This too was highly controversial, contradicting what was still the practice in most states, and with opponents warning that it could allow a Muslim (!) president. But the “no religious test” provision shows the Constitution’s framers were rejecting all that, and totally embracing, instead, the religious freedom stance of Virginia’s then-recent enactment. And that too was ratified.

Indeed, it still wasn’t even good enough. In the debates over ratification, many felt the Constitution didn’t sufficiently safeguard freedoms, including religious freedom, and they insisted on amendments, which were duly adopted in 1791. That was the Bill of Rights. And the very first amendment guaranteed freedom of both speech and religion — which go hand-in-hand. This made clear that all Americans have a right to their opinions, and to voice those opinions, including ideas about religion, and that government could not interfere. Thus would Jefferson later write of “the wall of separation” between church and state.

All this was, again, revolutionary. The founders, people of great knowledge and wisdom, understood exactly what they were doing, having well in mind all the harm that had historically been done by government entanglement with religion. What they created was something new in the world, and something very good indeed.

Interestingly, as Jacoby’s book explains, much early U.S. anti-Catholic prejudice stemmed from Protestants’ fear that Catholics, if they got the chance, would undermine our hard-won church-state separation, repeating the horrors Europe had endured.

A final point by Jacoby: the religious attack on science (mainly, evolution science) does not show religion and science are necessarily incompatible. Rather, it shows that a religion claiming “the one true answer to the origins and ultimate purpose of human life” is “incompatible not only with science but with democracy.” Because such a religion really says that issues like abortion, capital punishment, or biomedical research can never be resolved by imperfect human opinion, but only by God’s word. This echoes the view of Islamic fundamentalists that democracy itself, with humans presuming to govern themselves, is offensive to God. What that means in practice, of course, is not rule by (a nonexistent) God but by pious frauds who pretend to speak for him.

I’m proud to be a citizen of a nation founded as a free one* — not a Christian one.

* What about slaves? What about women? Sorry, I have no truck with those who blacken America’s founding because it was not a perfect utopia from Day One. Rome wasn’t built in a day. The degree of democracy and freedom we did establish was virtually without precedent in the world of the time. And the founders were believers in human progress, who created a system open to positive change; and in the centuries since, we have indeed achieved much progress.