Friday, December 30, 2005

Catch That And Paint It Green


A few postings ago (last week, I think) I was talking about Dr. Richard Dawkins, a British biologist who thinks the so-called "Intelligent Design" theory is just about the dopiest thing he ever heard of, except maybe for the idea of God itself. Dawkins is a militant atheist, very in-your-face about it, and I took him to task, second-hand as it were, for being so snotty and disrespectful to people whose beliefs might be different from his.

At the same time, I tried to make it clear that I don't have much more use for the Intelligent Design theory than he does. Dawkins comes across as a most unpleasant individual, and I wouldn't want to have lunch with him, but I agreed with him that I.D. doesn't meet the most basic requirement of real science: that you begin with no assumptions and work forward from the available evidence. I.D. starts off with the assumption that an intelligent creator lies behind the universe, based in turn on the idea that some life forms are too complex to be accounted for by natural selection. Proponents of Intelligent Design are selective about the evidence they accept: if it doesn't fit their assumption, it's tossed out. This is the same mistake that orthodox Marxists made in the last century when they tried to assert the "scientific" nature of Marxism--any evidence they encountered that didn't dovetail with the Marxist vision of economic and social apocalypse, followed by the establishment of the workers' paradise on earth, was dismissed.

Marxism belongs (if it still belongs anywhere) in the sociology classroom, not the scientific lab. And Intelligent Design's logical place is in the theological seminary, not in biology class.

While kicking all of this around, I got to talking about the whole creation-vs.-evolution brouhaha, of which the Intelligent Design controversy is a sideshow, and expressed my amazement that now, 80 years after the Scopes "Monkey Trial" in Tennessee, in which a biology teacher was put on trial for teaching Darwin's theory of natural selection, people are still arguing about this.

Conservative Christians, with the Christmas season over, can now take a break from the trenches of the "war on Christmas" and get back to their other pet conspiracy, the claim that evolution is being taught in classrooms as established fact and not as theory.

Fundamentalist religion is, and always has been, bellicose. From Savonarola to William Jennings Bryan to Osama bin Laden, fundamentalism has always been more interested in fighting enemies than in discussing ideas. It has little use for the Areopagitica notion of letting opposing ideas clash in the public square, free of censorship, on the theory that if truth and falsehood can duke it out on equal terms, without help from the sidelines, truth will prevail. (John Milton, who advocated just this when he wrote the Areopagitica, was no slouch of a Protestant believer himself, but we'll leave that discussion for another day.) Religious fundamentalists, by and large, don't have much patience for the Areopagitica approach: they think their enemies already have the deck stacked. "They're out to get us" has been a favorite religious rallying cry since the Reformation (when it was often true).

A generation ago, conservative Christians, led by Rev. Tim LaHaye (who has made buckets of money off the end of the world with his Left Behind novels), threw a collective hissy-fit over their claim that so-called "secular humanism" was being taught in public school classrooms as a form of religion. The uproar embodied evangelicals' frustration as they watched school districts in the early 1980s, knuckling under to pressure from people like the late Madalyn Murray O'Hair and groups like the ACLU, move to rid public school classrooms of the Judeo-Christian references that had hitherto been quite common. Obvious counter-tactic: claim it was all a secret, secular-atheist anti-God conspiracy.

I was a young newspaper reporter in 1981-82 and, since education was my "beat," I got to cover this controversy for my paper. What fun I had: born-again Christians wanted to talk about this endlessly, but when I went in search of one of their villainous "secular humanists," I couldn't find one. I ended up interviewing the superintendent of the local elementary school district, who wasn't quite sure what I was talking about and had to go do his homework first.

One pentecostal lady in particular gave me a head-scratch: alleging the conspiracy in the classroom to be very real, she told me earnestly that she believed in the hydra of "circular humanism."

I asked around the newsroom if anyone had any idea what "circular humanism" might be.

Our courthouse-beat guy, an editorial wit who also happened to be a bit portly, spoke up.

"A circular humanist is a humanist who eats too much," he explained. "I'm a circular humanist."

The "secular humanism" thing had pretty much blown over by the end of the first Reagan administration. But the creation-vs.-evolution thing really has legs: it's been going on at least since the drama in that broiling-hot Tennessee courtroom in 1925, and shows no signs of losing steam.

Which brings me to Christmas Eve. Last week we had some friends over for a Christmas Eve pig-out: roast turkey with all the trimmings, hors d'oeuvres, a fancy dessert, lots of white wine. A good gorge was had by all, including a good friend of mine who might have been created by Dostoevski. He is both passionately religious...and a drunk. A would-be journalist, in 2004 he took up the cudgels for Mel Gibson's film The Passion Of The Christ against an army of supercilious unbelievers in the media who, he insisted, were indulging in a regular sneer-fest over this movie because of its religious content (as well they may have been). He fired off a spirited letter to the editor, exulting in the victory of Gibson's film over the sneerers, and shortly after was found passed out under a tree in a public park. Something of a sad case, my pal, but he's passionate about what he believes in, and also about what he doesn't. And he most decidedly does not believe in Charles Darwin (whom, by the way, I don't think he's ever read).

Now, I'll give my friend his due: on Christmas Eve he was reasonably sober. But he was also determined to make his point, which was that evolution is being taught as FACT in public classrooms, and not as theory. He and I amused the rest of the company by practically getting into a shouting match about this, my point being that most of the world, even most of the religious world, including the late Pope John Paul II, now accepts that Darwin's ideas about natural selection are so well-supported by scientific evidence that they must be treated, if not as established Truth, at least as very solid theory.

My friend was having none of this. While he pulled up short of defending the literal truth of the account of creation in the book of Genesis, six days to make the universe, the Garden of Eden, the Adam and Eve story, etc., he nevertheless insisted at the top of his lungs that evolution is ONLY A THEORY, evidence supporting it be damned, and in defense of I.D. he offered me some story he'd heard in biology class about how life may have begun when a lucky pool of amino acids was struck by a well-directed bolt of lightning.

We dropped this discussion when it was time to eat. But a few days later, in an e-mail, he wanted to get started again.

Why is he so exercised over this, I kept asking myself? What is it about Charles Darwin that makes so many religious people (most of whom have never been in the same room with a copy of The Origin of Species) feel their faith so threatened that they have to wage constant war against ideas laid out in a book they've never read?

They don't seem particularly bothered by other scientific theories, even ones that are as well supported by evidence as those of Darwin. When was the last time you heard an evangelical Christian getting apoplectic over Einstein's theory of relativity? Or the big bang theory? (actually, religious conservatives like the big bang for the same reason that atheists hate it: it supports the idea that the universe was created, somehow.) But there's something about the idea that our species evolved from lower species that just drives them nuts. "My Ancestors Ain't Apes!" screamed one of the placards waved by the faithful in Inherit The Wind, the Broadway dramatization of the 1925 Scopes trial. Well, no, they probably weren't, says Darwin. Darwin doesn't say that people evolved from apes. He offers evidence that people and apes had a common ancestor. But to evangelicals like my pal the holy drunk, this is so much pettifogging. To teach natural selection in biology class is just part of the great secularist conspiracy against America's religious foundations.

Heck, if he wants to get upset about a scientific theory, I could refer him to an essay by Tom Wolfe, Sorry, But Your Soul Just Died. Without taking sides one way or the other, Wolfe discusses current research in the relatively young field of neuroscience, which is taking materialism to the nth degree. In picking apart the secrets of the brain, neuroscience is calling into question not only the existence of the soul, but the very notion of selfhood. There may be no such thing as "me." According to the neuroscientists, the most fundamental of all human ideas, that I am I and you are you, may be nothing but an illusion based on electrochemical impulses popping around. Jesus! If my fundamentalist pal thinks Darwin is still something to feel threatened by, let him consider the implications of all this, and chances are he'll forget all about fretting that his ancestors may be apes.

And by the way, if the very idea of selfhood is nothing but an illusion, what does that portend for all of our other cherished notions about what's true and what's false, including the latest scientific innovation or the oldest and most cherished religious belief?

Remember the old conundrum that the 16-year-old atheist in the high-school hallway used to whip out in order to bedevil his believing friends? I used it in the earlier posting; here it is again: "If God can do anything, can he make a rock so heavy that he can't pick it up?" Of course the fun in this question is that it contains its own contradiction. Its logic is so faulty as to be laughable, but plenty of credulous high-school kids were stumped by it: either way they answered, God apparently couldn't do everything.

Well, apparently neither can the Devil. Long before I heard the rock conundrum for the first time, I remember hearing this joke on the playground:

"Three guys died and went to hell. The Devil told them that if they could come up with one thing he couldn't do, he'd let them go to heaven instead. The first guy says, 'OK, chop off the nose of everybody in the world.' The Devil promptly swoops out and does that. No cigar. The second guy says, 'OK, chop off the ears of everybody in the world.' The Devil immediately does that as well. When it comes to the third guy's turn, he offers the Devil a big, loud fart. 'Catch that and paint it green,' he says. Well, even the Devil can't do that, so the third guy gets to go to heaven.

I doubt whether, even if all the school districts in the world were to suddenly agree to teach evolution as "only a theory," my friend and his cohort would be satisfied. In due time they'd be quibbling about how earnestly it was being put forth as a theory, or complaining that textbooks didn't offer alternative theories, even though they have yet to come up with one that merits anything more than a quick once-over followed by a "try again." Meanwhile, he might want to ponder what there is to be gained from continuing to holler about evolution, a theory now nearly 150 years old, when clearly the blasphemies of science, as one of the lead characters in Inherit The Wind called them, are taking us to much more "dangerous" shores.

Lightning struck a lucky pool of amino acids? Catch that and paint it green.

Thursday, December 29, 2005

Omphalos Eve



My old friend Jeff Bertolucci is a freelance writer these days, living, I think, somewhere out there between Los Angeles and Thousand Oaks. Every now and then we swap e-mails, but I haven't actually seen him in more than 20 years.

We used to be roommates, J.B. and I. (We got into the habit, back there in the mid-1980s, of addressing each other by our initials, and we still do.) Roommates, and partners in hardship.

We were two members, circa 1984-85, of a three-member news department at radio station KUIC, 95.3 FM, in Solano County, California. (Solano County, if anyone cares, lies between the Sacramento Valley and the San Francisco Bay area. Its southwestern boundary abuts the Strait of Carquinez, where the little city of Benicia looks across the water at Martinez, in Contra Costa County, the birthplace of Joe DiMaggio.)

J.B. and I got along just fine, but it wasn't friendship that had thrown us together. We were so poorly paid that we had to double up and share an apartment in Vacaville, where the station's studio was. We weren't alone: a succession of disc jockeys and newscasters had shared that same apartment before us, and we passed it along like a family heirloom as the names changed at Quick-95, the air staff's perennially miserable pay being a guarantee of high turnover. The two-bedroom apartment came to be known as "The Quick-95 Refugee Camp."

Since we never had any money, we seldom went out in the evenings, which left plenty of time for philosophical discussion (when we weren't watching reruns of Barney Miller, lampooning egotistical disc jockeys with the use of a tape recorder, or writing scurrilous little songs about station management.) One such philosophical discussion concerned the premise that "Everything happens for a reason," something you often hear optimists say. J.B. had no use for optimists, and would dismiss the notion with a snort. He preferred to believe in something he called "The random crapshoot:" if you come up seven, you win. If you come up snake-eyes, you lose. It's all chance and probability; no teleology involved.

I suspect that, every year from now on, about this time of year, I'll be wondering about that.

For nearly 18 years, the duration-on-paper of my first marriage, December 30 was my wedding anniversary. My first wife, Chris, and I were married in Vienna on Dec. 30, 1987. It had been just over two years since J.B. and I had said our farewells and parted paths in Vacaville--I'd decided to give up radio and join the foreign service, where I remained for 13 years. I met Chris in Frankfurt, Germany, my first overseas post.

These days my wedding anniversary is October 29--I remarried just this past fall. But December 30 will remain as much an "anniversary" for my second marriage as it was for my first.

My current wife, Valerie, was not only present at that first wedding in Vienna, but she was one of two specially-invited friends who had come along to witness the event. The other was a co-worker of mine from the U.S. Consulate in Frankfurt named Peter, whom Valerie was seeing at the time. In effect, the woman who would become my second wife was maid of honor at my wedding to my first. That's a bit unusual, but I suppose it's not unheard-of.

What sets me wondering about the random-crapshoot versus the everything-happens-for-a-reason argument is what happened last December 30.

Following my wedding to Chris, the four of us went back to Frankfurt and promptly scattered. Valerie was an official with the Immigration and Naturalization Service; I was a State Department employee. Chris and I, newly-married, promptly went off to my second overseas tour, which turned out to be in Brasilia, the capital of Brazil. We were there for three years, then went on to Ivory Coast. From Frankfurt, Valerie was sent back to Washington, D.C. and eventually got posted at an INS training center in New Mexico. I heard that she and Peter had gotten married, but that was the last Chris or I ever heard of either of them. For the next 17 years, nary a word was exchanged between any of us.

I don't know about Chris, but after one perfunctory, unsuccessful attempt, early in our marriage, to get an update on Valerie and Pete, I more or less forgot about both of them. Chris and I eventually went our separate ways. But way leads on to way, as Robert Frost wrote, and I don't recall giving either Valerie or Pete another thought as the 1990s spun out and the 2000s came on. After playing their respective bit parts in our lives, they vanished into the past.

Then, on the night of Dec. 30, 2004, an odd thing happened. Chris and I had been separated for more than ten years at that point. We hadn't bothered getting a divorce because it didn't seem that pressing--neither of us had remarriage prospects and we had no children either. The fact that December 30 was Chris' and my anniversary remained in the background, so to speak: when that date would come around each year, I'd usually recall the fact but give it no more thought than that.

But that night, 12/30/04, I was minding my own business, sound asleep in California. And after not having given Valerie a thought that I could recall in more than 17 years, I dreamed about her. All night. I could no longer even remember her last name, but I dreamed about her. Perhaps it had something to do with the fact that it was my wedding anniversary--the subconscious often messes with us--but I had had 16 previous anniversaries, and had never dreamt about Valerie on any of them. What was going on here?

The next day, New Year's Eve, I was intrigued enough by the dreams to jump on the Internet and "google" Valerie. She appeared immediately; seemed she had left INS, had divorced Peter years earlier, and was now a Washington, D.C. realtor. I e-mailed her. She was preparing to go out for New Year's Eve, but she e-mailed me right back. Late that night, she came home from her party and read my blog. The next day the e-mails continued. They started flying thick and fast. Then there was a phone call.

Then, on October 29 of this year, there was another wedding.

Friends have offered me all manner of explanations for this, including the intervention from beyond of my late younger sister, who died three months before Valerie and I "found" each other again. Perhaps it was just the right time for such a thing to happen. Perhaps it was just J.B.'s random crapshoot, in the form of my fevered neurons playing with history on the night of my 17th anniversary (old style.) But whatever the explanation, the date of Dec. 30th has been a fateful one for me, one upon which, when it comes around each year, I now have food for serious late night thoughts.

At least now I have someone to share those late night thoughts with. And she knows about this particular daisy chain of dates. She's here now, and she was there when it started.

Cue the music from The Twilight Zone, somebody. And Happy New Year.

Saturday, December 24, 2005

Dickens Redux

Last night my wife and I were watching the 1938 film of A Christmas Carol on TCM, the version starring Reginald Owen as Scrooge, with Leo G. Carroll (whom I remember as Mr. Waverly on The Man From U.N.C.L.E.) as Marley’s ghost. I’ve always been so partial to the 1951 version of this story, with Alistair Sim as Scrooge, that I had never seen this particular version.

I was surprised to find that in some ways it’s actually better than the 1951 version, though perhaps I shouldn’t have been: it’s an MGM production, and MGM doubtless had more money to spend than the UK production company which made the later Alistair Sim movie, Britain's postwar economy being the shambles that it was. The story is “fleshed out” a bit more here, with additional plot details, including some that don’t occur in Dickens’ story, such as Bob Cratchit’s actually getting sacked on Christmas Eve after he throws a snowball at Scrooge and knocks his hat off.

Why do these quaint images of snowbound London circa 1840 have such an emotional appeal for us Americans? Our experience is so different from that of people who lived in England during the Industrial Revolution. Well, I guess the answer lies in a combination of things, collective nostalgia being one: when the camera slowly does a “zoom out” at the film’s opening, offering a panoramic view of snow falling on Victorian London, we’re looking at a goodly number of all the Christmas cards we’ve ever seen. And then of course there is Dickens’ ingenious tale, which like all ingenious tales functions on more than one level. It can be read as a naturalistic story (with ghosts) about early-industrial London. For example, a detail that’s never mentioned in any of the film versions I’ve ever seen, but which Dickens provides, is that Scrooge lived alone in upstairs rooms in a building most of which had been rented out as office space, above a wine-merchant's cellar, which explains the kegs over which he hears the chains being dragged as Marley’s ghost approaches. It also accounts for the pervasive darkness of the place on Christmas Eve. Business has been concluded for the day, the offices have been vacated and locked, the lights put out. Scrooge is the only occupant of the building who actually lives there. That fact doesn't bother him, nor does the deep gloom. "Darkness is cheap, and Scrooge liked it," Dickens writes.

It would be ridiculously easy to give this story a “Marxist” reading, right down to the stuff about religion as the supposed “opiate of the masses:” it is set amidst the very place and time in which Marx conducted his badly-flawed and highly spurious “research” and came up with his apocalyptic notion about the workers rising up and throwing off their chains. And had Dickens stuck to a Zola-like, naturalistic approach and left the ghostly (i.e. spiritual) themes out, for example restricting Scrooge's encounters to earthbound folk only, perhaps the story might be forgotten today. But he didn’t, and generations of readers (not to mention at least four generations of moviegoers—the earliest film versions of A Christmas Carol date from the silent era) have every reason to be grateful: with the accompanying ground bass of early industrial London being played with his left hand, so to speak, Dickens played out, with his proverbial right, a melody for all time. The tale is an allegory, that most unfashionable of literary forms today, and the themes of rage and redemption, injury and forgiveness, that permeate this little story are ubiquitous in western literature. They can be traced all the way back to the Old Testament, and very easily to the New—if you think about it, the way in which Dickens allows Scrooge to suffer, “die” and be reborn might be read as a sly metaphor, just as some interpreted E.T. The Extra-Terrestrial, nearly 140 years after Dickens, as a not-so-subtle retelling of the Passion story. Whether Dickens was having some fun with us or not (I tend to think he was, in fact I tend to think he couldn’t resist), he gave us a story that endures because there is so much in it that’s perennial, even if we have moved on from his world, befogged with coal dust, plagued by tuberculosis, workhouses and debtors’ prison. (For the definitive word on this last, check out Dickens' Little Dorrit, a big masterpiece.)

In a nasty, satirical little “Christmas carol” written in the late 1950s, iconoclastic songwriter Tom Lehrer exhorts one and all, tongue bulging in cheek, to “Kill the turkeys, ducks and chickens/Mix the punch, drrrraag out the Dickens.” I for one plan to "drag out the Dickens" this year, next year and in every coming holiday season I can foresee. Webster defines a classic as “a work of enduring excellence.” Well, if something is of enduring excellence, it deserves to be read (and watched) over and over. When I was young, I read Death In Venice once a year. Hemingway claimed he did the same thing with King Lear. A Christmas Carol deserves similar appreciation, because, in its limited way, it’s similarly great.

Happy Holidays, one and all.

Thursday, December 22, 2005

Shoulda Been There

I was reading, just a week or two ago, about the popularity--nay, ubiquity--of Top Ten Lists in contemporary America.

People seem to just love these things. Differing explanations have been offered for this, the most obvious one being that Top Ten Lists give consumers a quick shorthand version of the best things they should be buying, watching, listening to, eating, downloading or whatever.

Also, Top Ten (or 20) lists are a quick-and-easily-digestible version of entertainment. They spring from the same source as the editorial policy that Jeff Goldblum had to follow as a People magazine journalist in the film The Big Chill. His stories, Goldblum explained, could be no longer than it would take the average person to read while taking the average shit. Top Ten lists run in that crowd.

I've seen so many of these things that I began wondering if I could come up with a Top Ten list of my own. But what hasn't been Top Ten listed? Books, movies, plays, CDs; scenes from movies, scenes from plays, tracks from CDs; television shows, celebrity puke moments, sports bloopers, martini recipes...they've all pretty much been done.

But then, as I was lying in the bathtub reading the Los Angeles Times last weekend (I love to read the newspaper in the bathtub, and there is definitely an art to doing it without getting the paper wet), it hit me that not only do we live in tumultuous times, but that we have for as far back as I can remember.

Then, suddenly, an idea came to me for my own, customized Top Ten list: Moments I Wish I Could Have Witnessed.

Right away, this called for some ground rules. Making the parameters too broad could result in mere silliness: "Gee, it would have been nice to meet Jesus," that sort of thing. (Frankly, I don't know how Jesus and I would have managed anyway: I don't speak Aramaic, and when he was around, English hadn't been invented yet.)

I decided, therefore, to restrict my Top Ten list of things I wish I had been there to see, to events that have occurred in my own lifetime. No shortage of possibilities there: I'm 50 this year, and have seen much, though usually at a distance, which is the reason for my list.

Here then, are ten moments in my lifetime I wish I could have witnessed up-close, rather than seeing them on TV or reading about them later in a book or magazine:

10. The moment in April, 1970, at the Manned Spacecraft Center in Houston, when Mission Control burst into celebration upon learning that the crew of the crippled Apollo 13 spacecraft had made it safely back to earth.

9. Pope John Paul II's prison-cell conversation with Mehmet Ali Agca, the man who had tried to kill him two and a half years earlier.

8. The night Norman Mailer slugged Gore Vidal at a cocktail party.

7. Televangelist Jimmy Swaggart's tearful "I have sinned" speech after he got caught at a motel with a prostitute.

6. Hank Aaron's 715th home run in 1974, breaking Babe Ruth's record for career homers.

5. The Bayreuth Wagner Festival. (I lived in Germany twice, and never got there.)

4. President Ronald Reagan's "Mr. Gorbachev, tear down this wall" speech in Berlin in 1987. (Ditto: at the time, I was in Frankfurt, just a few hundred miles away.)

3. Alexander Solzhenitsyn's 1994 return to Moscow after having been exiled by the Soviet government exactly 20 years earlier. (I JUST missed this, by the way. I was in Moscow that spring, but by the time Solzhenitsyn's train pulled in, I was back in Washington.)

2. The 1974 Academy Awards show, at which David Niven's presentation was interrupted by a streaker, prompting Niven to deliver a truly "withering" quip.

1. The Boston Red Sox' victory in the 2004 World Series.

And I'll brook no quibbling about priorities: these came off the top of my head, in no particular order.

And so to breakfast.

Tuesday, December 20, 2005

Wabbit Season! Duck Season!


Like just about everyone my age, I love the old Warner Brothers cartoons.

I mean, I really LOVE 'em. My younger sister and I were so fond of watching Looney Tunes together on the Cartoon Network that, for several months after she died in 2004 at age 47, I couldn't bear to watch them. They were something special that we shared, Lynne and I. Of course we were re-connecting with our childhood, and there's no childhood memory more idyllic for a baby-boomer than that of sitting in front of the tube on Saturday morning, eating cold cereal and watching one cartoon after another in anticipation of a long, lazy Saturday, with the school classroom a blissful day-and-a-half away (what a long time a weekend is when you're 9!)

My favorite in the Saturday-morning lineup of the mid-1960s was The Bugs Bunny/Roadrunner Show. And I'd be willing to bet that almost everyone my age can sing the opening song: "Overture, curtain, lights, this is it, the night of nights; No more rehearsing and nursing a part; we know every part by heart..."

Those marvelous cartoons, so sophisticated in their humor because so many of them were originally made to be seen in movie theaters, where their creators knew there would be adults in the audience, have been entertaining me and my kind for 50 years and more. But what's surprising is to realize not just how entertaining they have been, but how influential, and I'm not only talking about the world of animated cartoons. I'm talking about popular culture in general, and how people my age perceive the world.

Last Sunday afternoon I was watching James Cagney in White Heat on the Turner Classic Movies channel. It's a gangster classic, and the most intense performance Jimmy the C ever gave on film. Cagney pulls out all the stops as Cody Jarrett, a psychotic killer with a mother complex and a tendency toward migraine headaches. Edmond O'Brien plays an undercover cop who joins Cagney's gang in an attempt to bring him to justice. They escape from prison together, taking off with hostages in a stolen car.

As I watched this film unfold, what was I thinking about?

Bugs and Thugs.

Fellow members of the cartoon-buff inner circle, (those who bother to remember the titles) know which cartoon I'm talking about. Bugs Bunny gets abducted by two gangsters, "Rocky" and "Muggsy." Rocky is the mean little guy with the big gun and the enormous hat that hides his eyes; Muggsy is the big, stupid one in the too-tight suit who says "Okay, Boss." During the course of the seven-minute cartoon, Bugs proceeds to torment his captors to the point where, at the end of the cartoon, they throw themselves into the arms of the police, begging to be arrested.

In White Heat, when Cagney, O'Brien and company pull into a gas station with an overheated radiator, O'Brien goes into the men's room and scrawls a message on the mirror for the police.

I thought immediately of the same scene in Bugs and Thugs: Bugs suggests that he's going to the john, then heads straight for a public phone and gives the cops all the information they need, screaming out a detailed description of the car even as he's caught in the act and he (and the phone booth) are being dragged back to the getaway car.

If you think about it for a moment, it's...kinda scary, to use that trendy word. And the more I thought about it, the more I came to realize that Bugs Bunny, Daffy Duck and company had truly colored my perceptions of works of art at all levels. Did I say popular culture? It goes way beyond that. The most obvious example: poor old Richard Wagner. If Wagner had any notion that, a few decades after his death, a bunch of predominantly Jewish guys in Hollywood (Wagner was a rabid anti-semite) would be using his music to orchestrate the antics of a cast of cartoon characters, mostly chasing each other, I'm sure he would have...well, had sleepless nights anyway.

Let's pretend you don't know anything at all about classical music, and wouldn't know what I was talking about if I were to mention Wagner's Ride of the Valkyries from his opera Die Walkuere (The Valkyrie, actually). But if I were to put it on the CD player, oh, you'd recognize it all right. In your mind's eye you'd see Elmer Fudd jumping around with a poorly-fitting horned helmet rattling around on his head, jabbing a spear into a hole in the ground and shouting, "Kill the wabbit! Kill the wabbit!" I swear, when I first brought home from the public library, at about age 17, an album by George Szell and the Cleveland Orchestra of excerpts from Wagner's Ring operas, the first notes of the Ride of the Valkyries brought one visual image to my mind: a television screen.

When my high school chum Randy first heard Rossini's William Tell overture, he described the slow section toward the end of it as "Bugs Bunny waking up music."

Talk about the power of music.

Of course one of the greatest things about the Warner Brothers cartoons, particularly the ones made in the mid-1950s, was the way they were forever satirizing Hollywood. Daffy Duck and Porky Pig, better character actors than Bugs apparently, teamed up for wacky takes on Sherlock Holmes, Charlie Chan and the western and science-fiction genres. Dragnet got a futuristic treatment in Porky and Daffy's classic Rocket Squad. Errol Flynn is never far away: not only are his films of derring-do sent up in Daffy Duck's turns as The Scarlet Pumpernickel and an extremely inept Robin Hood, but Flynn himself actually makes a cameo appearance in one of Bugs' films, in full costume as the master of Sherwood Forest. Drip-Along Daffy takes the boys to the Wild West ("Anybody care to slap leather? At high noon? West of the Pecos?"), and in the dawn of the space age, WB satirized the old Buck Rogers serials: "Duck Dodgers in the Twenty-Fourth and a Half Century" cast Daffy as the caped space hero and Porky as his "eager young space cadet."

I read somewhere that film director George Lucas was so fond of this cartoon that he wanted to run it in theaters ahead of the first Star Wars movie, as a way of reminding people that he, too, was sending up the Buck Rogers genre with his outer space shoot-'em-up. Copyright problems, I recall, prevented him from tipping his hat to the Warner Brothers' space adventure.

I fear that this is all over and done with now, and its like will never be seen again. The great Warner Brothers cartoons were products of an era when most of the public was reading off the same bulletin board: from the 1930s to the 1980s, most people were pretty much seeing the same movies, listening to the same radio programs and later, in the world of the Big Three networks, watching the same TV shows. It was easy for everyone, even kids, to get the jokes. A cartoon short is no longer part of the moviegoing experience: instead of a good laugh ahead of a feature film, we now get bombarded with 15 minutes of high-decibel advertising when we go into a movie theater.

Without the adult audience out there, cartoons have been dumbed down to the lowest common denominator, as befits an audience of children who just want to see colorful shapes and hear loud noises. And don't talk to me about The Simpsons or South Park. I've watched both. There is nothing in demeaning, mean-spirited humor or potty-mouth talk that's going to give much to the next generation. And anyway, with 200-channel TV now, not to mention the Internet, the great audience out there is getting so compartmentalized that it's becoming increasingly difficult to identify a broad, general group that will recognize the thing being ridiculed, pick up the cultural references, get the jokes.

So what'll I do? DVD to the rescue: I already have the first DVD set of the Warner Brothers classics, and plan to get the others as they come out. I don't have any children, but I'm happy to share. Pack up your PJs and come on over some Saturday morning. I'll bust out the Kix and the Cocoa Puffs. Here's a favorite: Operation Rabbit. You see, Wile E. Coyote, Super Genius, thinks he's going to catch Bugs Bunny and eat him for supper. So he builds this contraption...

Saturday, December 17, 2005

If You Can't Say Something Nice...

When I was a senior in high school (1972-73), I was on the speech squad: I participated in speech competitions.

I wasn't a debater. I left that for all those future lawyers out there. No, my specialty areas were impromptu and dramatic interpretation. In impromptu you were given a topic, and then had two minutes to prepare a five-minute speech. Impromptu taught me that I had a natural talent for B.S., because that's what impromptu is. You can't do much research in two minutes--it all has to come off the top of your head. I was pretty good at this, actually; I won a third-place trophy once at a speech tournament for impromptu speaking.

My other specialty, dramatic interpretation, was actually solo acting. In D.I., as we called it, you would take a scene from a famous play or novel, usually involving two characters, memorize it and then stand in front of a judge and act out both parts. I never won any trophies at this, but it was fun.

For most of my year as a Spartan Speaker (my high school's nickname was "The Spartans"), my D.I. was a cutting from Jerome Lawrence and Robert E. Lee's Inherit The Wind, a Broadway play of the mid-1950s that made it to the silver screen in 1960 as a vehicle for Fredric March and Spencer Tracy. As those who have caught this flick on AMC or TCM know well, it is a fictionalized reshaping of the events around the John Scopes "Monkey Trial" of 1925, in which a schoolteacher in Dayton, Tennessee was put on trial for teaching Darwin's theory of evolution in a biology class. March and Tracy played characters representing the two stellar attorneys of the day who slugged it out in that Tennessee courtroom: Clarence Darrow and William Jennings Bryan.

I loved this play when I was 17. Heroic Spencer Tracy, representing urban sophistication and scientific inquiry, goes up against March, whose character is something of a mountebank, a thrice-failed presidential candidate hopelessly addicted to public applause. But he is also, from the viewpoint of modern, urban-based belief in science and "progress," defending the indefensible, or perhaps I should say attacking the unassailable: as prosecutor, it's his job to put the young teacher in jail for teaching evolution. So he spouts off through half the film about the truth of the Bible and the lies of science, and how we should believe the Book of Genesis and not Darwin, and so on. In one particularly piquant scene, Tracy's character is badgering March's on the stand, and gets him to admit that he has never even read Darwin. The oafish Bible-banger is made into a figure of fun, as are the equally oafish, agrarian "true believers" in the courtroom who support him with hosannas and amens.

For those who haven't seen this classic film, I won't spoil the outcome of the trial, but I will tell you that March's character, Matthew Harrison Brady, who has been stuffing himself with food throughout the movie, drops dead of a coronary at the trial's end. Serves him right, I thought at 17. I truly relished getting up in front of judges at speech tournaments and re-enacting that courtroom scene in which Henry Drummond, who represents Clarence Darrow, punctures the hot air balloon that is his opponent, Brady, the religious fundamentalist seeking to ban science from the classroom. Take that, you knuckle-dragging, Bible-pounding, backwoods neanderthals!

Yes, by and large we're not very tolerant of other people's opinions, or indeed of their feelings, when we're young. Ever get into an argument with a teenager? Chances are you'll discover that he's read one book on the subject under discussion, and that one book has made him The World's Foremost Authority. Anything you say questioning anything he says is going to elicit eyeball-rolling, deep sighs and body language that silently shouts, "How can anyone as ignorant as this old fart who stands before me possibly manage to feed himself?" That was me at 17, and I have nieces and nephews today who remind me of myself at that age.

Frankly, I am surprised that the creation-vs.-evolution debate is still raging. The year 2005 marked the 80th anniversary of the Scopes trial. Radio made its courtroom debut in that trial--the first "CNN moment." Since then, we've seen the advent of TV, then cable TV, then satellite TV, then the Internet. In 1925 Lindbergh had yet to fly the Atlantic; now we've been to the moon and back, and are about to launch a probe to Pluto. We've split the atom. Conquered diseases. Extended life expectancy. Cloned sheep. We have digital music downloads. Global positioning systems. Stealth bombers. Pop Tarts.

And yet we're still arguing about Genesis vs. The Origin Of Species, the most recent skirmishes involving so-called Intelligent Design, the search for...well, just that: intelligent design behind the universe, and life itself. In other words, the age-old search for God in nature.

Up to this point, even my most liberal friends would have to agree that I am impeccably secularist. I'm not the firebrand trying to pronounce ecrasez l'infame that I was at 17, but I will agree that Intelligent Design is not science and has no place in biology class. In theology class yes, in biology class, no. Science, true science, begins with this statement: "We don't know where the evidence is going to lead us. We're going to follow the evidence and see where it goes."

Intelligent Design, on the other hand, begins with the statement: "We believe in God, the creator of the universe, and we're going to look for evidence to support that belief." That's faith, not science. Faith, of necessity, puts the chicken before the egg: Science goes at it the other way around, starting with unbelief and proceeding, hopefully in an unprejudiced way, to puzzle out the way things work.

As we get older, if we have managed to mature at all, we become less passionate about things. Zealotry is dangerous, the most dangerous people in the world being those who think they hold the sole keys to the repository of truth. This isn't only true with regard to religion, but with any system of belief. Look at totalitarian regimes based on utopian notions, such as the late, unlamented Soviet Union. I'll leave Hitler out of the discussion--he's too obvious. But look at Freud: he had himself so convinced that his every idea should be Holy Writ in the world of psychology that he regarded any student who questioned him as an apostate, worthy only of banishment. And look at how discredited Freud is today.

I read something yesterday that really got me going, even as one who stands firmly in the secular camp with regard to the teaching of biology. It was a Q&A session on the web site Beliefnet.com, which in turn I had arrived at through one of my favorite web sites, Arts and Letters Daily (www.aldaily.com). The Q&A was with Dr. Richard Dawkins, a British biologist. Dawkins is passionate about the argument from design: he not only hates it, he hates the very idea of God in general. Dawkins is the atheist version of the True Believer. If it is indeed true, as has been asserted many times, that science is what replaced religion in our lives during the last century, Dawkins is one of its high priests. After reading the interview, I had this to say about it in my journal:

"Went on Arts and Letters Daily.com this afternoon and read a Q&A with Richard Dawkins, in which, with great gusto, he trashes the idea of intelligent design and religious belief generally, and speaks up boldly for the immutable truths of Darwin. We should be years beyond such arguments now—even Pope John Paul II confirmed that Darwinism is so well-founded in science as to be beyond theory. It’s unfortunate that we have some die-hard Bible-belt types in this country who never got past the Scopes trial.

"But what an obnoxious, childish boor this Dawkins is! It’s one thing to be an atheist, but to cockily sneer at the idea of belief, dismissing anyone who disagrees with you as stupid and ignorant, is what one expects of an emotional adolescent, the kind of snotnosed loafer-about-the-hallways who likes to try and start trouble with questions like “If God can do anything, can he make a rock so heavy he can’t lift it up?” I heard plenty of this kind of thing in high school from assorted James Dean wannabes. And should anyone take the showy arrogance of an adolescent seriously? From his tenured position at Oxford, Dawkins noisily opposed the establishment of a chair of theology at Cambridge, simply because he happens to be an atheist-materialist who has no use for such things. Is there so much difference between that and demanding that evolution be banned from classrooms?"

I will defend to the death--well, maybe not quite that far--Dawkins' right to scream "There is no God." I'm not particularly religious myself. But what he's saying is by no means revolutionary, fresh or new: go back to Baron d'Holbach in the 18th century and you'll get the atheist-materialist position with a special wine sauce, the way they serve everything in France. In fact you can go back much further: Bertrand Russell, the last century's most famous atheist, said he got his worldview essentially from Lucretius, who died around 55 B.C. It was not Dawkins' belief or lack of it that I found off-putting, but his snotty arrogance. It served to remind me that scientists are not always humanists. Not every scientist is a Loren Eiseley, a Jacob Bronowski or a Carl Sagan. The humanistic tradition calls for studying--and embracing--every aspect of human experience, not just the spirit of scientific inquiry. I have tremendous respect for science, as we all do these days. But scientists like Dawkins are essentially technocrats, and I wouldn't want to live in a world run by them. They may take great pride in being soulless, and in telling the rest of us that we are too, but in dismissing so huge a part of the human experience as the religious impulse, calling it stupid or irrelevant or whatever, they are only showing themselves to be as narrow-minded as those they sneer at.

Bob Dylan wrote "Negativity won't pull you through." And he didn't write that line when he was going through his "Christian" period either, but when he was young, proud and questioning many things. The spirit of inquiry is perhaps the greatest human birthright. But it only forms part of the human experience, and those who would write off as meaningless any part of the great human experience are only cheating themselves. And perhaps, in their most disingenuous moments, they are also attempting to cheat others, even if they think they're doing it for some noble reason, as I'm sure Dawkins thinks he's doing when he scoffs at all religious belief, unconcerned about those to whom he is deliberately showing a downright pimply lack of respect.

I say let's go ahead and ban intelligent design from the public classroom.

I also say let's go ahead and establish a chair of theology at Cambridge.

Saturday, December 10, 2005

An Update from the "war on Christmas"

DEC. 10, 2005--Before you read this posting, read this disclaimer: I am not a Christian. I have tried batting from both sides of the plate in that department: I have been both a Protestant and a Catholic during my life. I have problems with both, and haven't been anywhere near a church in more than 20 years. Now, read on.

The "culture wars" in America have a new battlefield on which the two sides now slug it out every year starting right after Halloween.

The two sides, to use our fuzzy present-day notions of "the left" and "the right," are those who favor a public secularism resembling that of Europe, and those who wish to conserve America's Judeo-Christian traditions.

The battlefield is, of course, Christmas. Some on the right are claiming there is an out-and-out conspiracy to remove Christmas from public view and replace it with "holiday," e.g. people saying "Happy Holidays" instead of "Merry Christmas." Christmas trees are being renamed "Holiday trees," and so forth. Right here in my home town of Chula Vista, California, the annual Yuletide Parade is now called "The Starlight Parade" (apparently even the word "yule" is too religious for some people.) Snowmen and candy canes are allowed in these "Holiday" parades, but Santa Claus is being increasingly banned. (And it's not just here in California, of course. A few years ago I read where a small town in Maryland caved in to the demands of the local atheist and banned Santa Claus from its annual parade. In response, about 300 guys showed up at the parade in Santa suits.)

Fox News' John Gibson has written a book about all of this, The War on Christmas. Gibson certainly believes that secularists have mounted a premeditated attack on religious tradition, deliberately attempting to remove the essence of Christmas from the Christmas season, to purge the season of any notion or acknowledgement of a supreme being. And the secularists have fired back at claims such as Gibson's, insisting there's no such thing as a war on Christmas: tax-subsidized institutions, they argue, should not endorse any one religion, and the separation of church and state demands that nativity scenes not be permitted at city hall, nor holiday displays and programs that center around the baby Jesus be allowed in public schools or in any other place that receives tax money.

The whole idea of a war on Christmas, they claim, is just another right-wing bugbear; they're only defending the constitution.

And then, under their breath, they'll add, "From all those red-state troglodytes who want to turn this country into a theocracy."

A generation ago the left was raging impotently over America's choice of Ronald Reagan for president over the limp Jimmy Carter, screaming itself hoarse trying to sell the idea that Reagan wanted to start a nuclear war, which of course he never did.

In those days the catchphrase was "nuclear war." These days, the left's favorite word has become "theocracy." The evil right, we keep hearing, is scheming to turn America into a "theocracy," just as surely as Reagan wanted to mash on that nuclear button. And never mind the fact that George W. Bush, even if he were a confirmed antidisestablishmentarian (look it up), is a lame-duck president scheduled to leave office in January, 2009. Man the secular parapets: the theocrats are at the gates.

Theocrats at the gates? Because some store clerk says "Merry Christmas" to a shopper?

I was in the fourth grade, at Castle Park Elementary School right here in Chula Vista, at Christmas of 1963. President John F. Kennedy had been dead for a month, killed by an assassin. A lot of people weren't feeling very festive that holiday season. But we kids were rehearsed and gotten up for a Christmas pageant at our (public) school nonetheless. And that's what it was called, by the way: a Christmas pageant, not a "winter festival." We did a choral reading from the second chapter of Luke. I kid you not: the second chapter of the Book of Luke, in the New Testament. Nobody filed a lawsuit, nobody screamed about "theocracy," and everyone went quietly home to bed afterwards. Now, if a public school could have a Christmas pageant during the Kennedy era without putting America in peril of being taken over by Christian ayatollahs, why is uttering "Merry Christmas" such a danger now? Surely even those who have hatred for George W. Bush engraved upon their livers wouldn't credit him with a charisma approaching JFK's. And I'm not advocating readings from Luke in today's public schools, either. I'm just saying that, if such a thing didn't destroy the republic in the JFK era, it probably wouldn't now.

Nevertheless, pundits will assure you that insisting Wal-Mart employees say "Happy Holidays" to customers instead of "Merry Christmas" is to truly and courageously protect the American way, not to mention the sensitive toes of all those non-Christians out there. Better to deny a million shoppers a cheery "Merry Christmas" than make one whiny atheist feel "excluded." (Complaints about Santas and creches don't usually come from Jews, Muslims or other believers. They usually come from atheists, by way of whatever lawyer the atheist is paying.)

Of course these are the same people who will also try to tell you there's no such thing as political correctness. PC, they'll tell you with earnest, wide-open eyes, is just another "right-wing bugbear." It doesn't exist.

Uh-huh. And I root for the Stanford Cardinal in college football, too.

In the midst of all this fuss n' feathers, I went to the post office this morning. Today is Saturday, and it's the second week of December: there were many people lined up waiting for the doors to open so they could mail or pick up packages. When the doors finally did open, promptly at 9 a.m., we all filed in.

I looked around and noticed that there were holiday decorations hung in the post office lobby. No Santas or creches, though, just good old inoffensive fake holly and ivy. A snowman here, a candy cane there.

But wait: one of the postal employees had obviously brought in her own boom-box. It stood on the counter beside her station, and from it, a female singer was belting out O Come, All Ye Faithful. In the post office, a government building! Shocking, just shocking.

But I looked around to see if anyone in line were getting ready to protest. No one was. Everyone continued to quietly wait their turn at the window. "Well, just wait," I thought. "Later today some atheist will come in here for a book of stamps, and the next thing you know, the postmaster will get a threatening letter from the ACLU." But it didn't seem pending at that moment. Everyone continued to wait on line. Some hummed along.

Eventually I got to the window. I was looking for some returned mail. I was told to go stand in another line, one for pick-ups that didn't involve cash. So I went and stood over by the door, where the no-cash people were queuing.

The crowd in the lobby grew, and the postal workers decided it was time to open one more window. I watched as an older postal employee came out to the front lobby with some mail in his hands and removed the cover from the postal scale.

"I'm going to open another window," he announced to the crowd. "But I'm going to sing to you first." With that, he launched into a chorus of "We Wish You A Merry Christmas." In the post office.

No one protested. A few people actually applauded.

With that, and with fond, loving memories of the Christmas holidays I remember from childhood and youth, including those spent right here in Chula Vista, I wish one and all...Season's Greetings.

Thursday, December 08, 2005

What John Lennon Means To Me (and doesn't)


DEC. 8, 2005--The segment of the American population that can remember what it was doing on Nov. 22, 1963 is beginning to thin.

Ten years ago, in the fall of 1995, I learned from a radio broadcast that the majority of Americans alive in 1995 had not yet been born in 1963. As a baby boomer who had been in the fourth grade on the day John F. Kennedy was shot and killed in Dallas, accustomed to the idea that my generation was important and that this was something that would always be remembered, I found that something of a shock.

With 11/22/63 fading, and 9/11/01 still a bit recent, the moment in history that virtually everyone my age--and even a little bit younger--seems to remember exactly, down to where they were and what they were doing, is the one that occurred 25 years ago tonight: Dec. 8, 1980. That was the night John Lennon was murdered in New York by a crazed fan.

Since this is my blog and I can be as self-indulgent as I please, I'll tell you where I was and what I was doing. I had just turned 25. I was a newspaper reporter in California's Imperial Valley, and was sitting in my apartment in El Centro that Monday night when my mother called from Chula Vista, two hours away over on the coast. She and my father were watching Monday Night Football on ABC, and the football broadcast had been interrupted so the announcer--was it Howard Cosell?--could share the awful news with the nation. Not a football fan myself, had I not had that phone call from my mother, I probably wouldn't have learned about Lennon's murder until the Today Show the next morning.

It was a HUGE moment for the baby-boom generation. The oldest of us were in our mid-thirties then (born in 1955, I'm a "trailing-edge" boomer), and the Beatles had been so much a part of our collective upbringing that it must have seemed to those thirtysomethings that night as if some sort of umbilical cord tying them back to their youth in the 1960s had just been severed. Certainly this twentysomething had a hint of such a feeling, and I was only 14 when the Beatles broke up in 1970.

The outpouring of tributes was prompt and massive. The fact that Lennon died just before Christmas, when the nation is trying its best to be in a festive mood, only made the nostalgia and the feeling of loss that much worse. How many young journalists, when they sat down to write their pieces about What John Lennon Meant To Me, were remembering Christmas mornings when Beatle albums might have been among their gifts? Whatever the reason, in the weeks following Lennon's murder (I refuse to call it an "assassination," and explained why this week in the pages of National Review magazine) it seemed as if every journalist in the country between the ages of 25 and 40 was tapping out his or her version of Lennon In My Life, replete with memories of seeing the Fab Four on the Ed Sullivan Show, sporting that long hair which would soon spawn so much imitation--and so many dinner-table arguments--throughout the western world, or of Dad entering the bedroom with a mop on his head, singing "Come to dinner, yeah, yeah yeah!"

I didn't write my version. In part that was because I knew my editor would never print such a thing: he was an old fart from the days of Your Hit Parade who thought music meant Snooky Lanson, Shrimp Boats and How Much Is That Doggie In The Window? He conceded the front page to the Lennon murder that day because even he had to admit that it was news, but he refused to do anything gratuitous. I didn't even ask.

In truth, even had my editor been a big Beatles fan, I probably would not have asked. Because frankly there are two pop phenomena I find it utterly impossible to relate to: Elvis-olatry and the canonization of John Ono Lennon. (Well, actually make that three phenomena: I'm unable to hear anything the slightest bit redeeming in the sound of some functional illiterate grunting and shouting in front of an artificial rhythm-machine, so the appeal of rap will also remain a mystery to me.) Elvis was a nice young man with one of the most remarkable singing voices of his time. But when he had to wrestle with growing up, it seems to me he lost the war. I can enjoy his songs, but can't think of him as a cultural hero. I feel the same way about John Lennon.

I am, and have been most of my life, a Beatles fan. What music lover could fail to be? John Lennon and Paul McCartney, as a songwriting team, rank right up there with the Gershwins, Rodgers & Hart, Cole Porter and Duke Ellington. But that's actually the problem: I recognize the Beatles for what they were, the songwriting team of Lennon and McCartney, with George Harrison and Ringo Starr playing backup. The Beatles were a great band, but the true magic was in John and Paul's compositions. Had John and Paul not been such an incredible team, the Beatles would probably be no better remembered today than The Dave Clark Five.

In short, the Beatles were greater than the sum of their parts. Individually, they were not. After the group broke up in 1970, I watched John and Paul, free from the restraint that each had placed on the other, spin off in their individual directions--Paul toward icky-sweet pop sentimentality, John toward wacky, self-indulgent forays into art and febrile politics, staging "bed-ins" and recording mediocre albums on which he vented his spleen at Paul over the personal differences that had broken up the group. True, John found his equilibrium a few years later and gave the world some good rockers, but I don't think he or Paul ever did anything separately as well as they did it together, and please don't get me started on Imagine.

I cannot stand that song. I admit it, openly and freely. It's a pleasant little tune, yoked to the most sententious, shallow, childish lyric since Paul Simon allowed himself I Am A Rock. In a world in which Alexander Solzhenitsyn had already spent years exposing the lies behind Soviet power, Lennon was crooning about the sort of world Marxists used to pitch before their mendacities were exposed: no possessions, no borders, no religion...just a big, global, atheistic warm bath. Please. Some are inclined to give him a pass for his naiveté; I'm not. Sorry. A streetwise kid from Liverpool, he should have had better sense. And that line! No one seems inclined to point it out, so I'll remind one and all: "And no religion too," he sings. To mangle basic grammar for the sake of a cheap rhyme is something a sixth grader writing his first love poem might do. For one of the century's greatest lyricists and songwriters to get away with this, unremarked, is unconscionable. Didn't Rolling Stone or Newsweek or somebody call him on this egregiously bad line? Apparently not, and Imagine still draws sighs of admiration. Not from me.

I was saddened by Lennon's death, and still am. He died too young, obviously, but he also died at the very moment he seemed on the verge of a creative rebirth. He had just released a remarkably good album, Double Fantasy, and good things seemed to be waiting on the horizon for him. I heard the radio interview he gave just hours before his death. He sounded upbeat, energized and ready for work. The whining and carping of past days had been put behind him; he criticized his younger self for behaving childishly, and as he signed off the interview, he and his wife Yoko Ono wished one and all "a happy Christmas." Had he not been killed that night, had he celebrated his 65th birthday this year with his old friend Bob Dylan (now 64) and perhaps even with a reconciled Paul McCartney, who knows what might have accrued between that upbeat radio interview and now? Perhaps I might be writing a birthday tribute including praise for half a dozen excellent albums that were never made.

Just imagine.

Tuesday, December 06, 2005

Apocalypse Passe


I'm a book collector. I'm also a music lover. The two things sometimes cross like intersecting orbits, and sometimes result in revelation. Sometimes not.

As a book and music collector, and also a big fan of the music of Gustav Mahler, I was a sucker for a title I spotted on the local public library's "giveaway" shelf one morning: Late Night Thoughts On Listening to Mahler's Ninth Symphony by Lewis Thomas. Thomas, I gathered from reading the dust jacket, was a sort of updated Loren Eiseley, a scientist who writes about science and other things human. The book was published in 1983, a while back to be sure (hence, I'm also sure, the library's giving it away), but the title was too intriguing to resist, as was the price (free.) I took it home.

For some time, I merely gazed at the little book's spine as it sat on my shelf. I was saving it for a treat. What, I wondered, could these "late night thoughts" engendered by Mahler's sublime Ninth Symphony possibly be? I had no idea, but naively assumed that the essay in question would be some meditation on music and subjects related to it. I have Leonard Bernstein's 1973 Charles Eliot Norton lectures from Harvard on videotape, and have watched many times his dissertation on the Mahler Ninth, which closes out the fifth of the series of six lectures. Yes, Bernstein's comments are a bit dated now--the USSR was still in business in 1973, the cold war was at its chilliest, and Bernstein used the last movement of the Mahler Ninth as a springboard to talk about "global death," i.e., nuclear holocaust. The last movement of the Ninth, at least its final bars, seems to simulate the act of dying: Bernstein used it as a starting point for his own little pedagogical thanatopsis.

Fair enough. I turned 18 in 1973 and remember that time. But imagine my disappointment when I opened Thomas' book to the title essay and found it little more than a rehashing of what I had heard Leonard Bernstein say on tape many times before. And it was published 10 years after Bernstein's lectures!

And it's so quaint. Imagine: late night thoughts (or any kind of thoughts) tied to a masterpiece like Mahler's Ninth, becoming quaint. Still, quaint it is, because Thomas' message was politically loaded, and loaded for its time. He was purveying the conventional wisdom of 1983, which turned out to be so wrong, and which in turn accounts for its quaintness. Bernstein's ruminations about "global death" a decade earlier seem ingenuous by comparison: after all, in 1973 the Vietnam war was not yet over; Richard Nixon and Leonid Brezhnev were running the show in Washington and Moscow, and China was a good five years away from admitting Coca-Cola behind the wall. No doubt Bernstein thought he was doing his duty as a good liberal, sounding the warning. No doubt Thomas was doing the same. But the agenda in Thomas' case is so much more transparent, and from the standpoint of more than two decades later, so much more deserving of criticism.

So what was the conventional wisdom of 1983? Well, as handed to us by the television networks, talking heads and eastern university pundits, it was that President Ronald Reagan was some sort of nuclear nut, an out-of-control cowboy (sound familiar?) who was going to send us all up in festering heat by dint of his insane insistence upon standing up to the Soviets instead of accommodating them at every turn. The hand-wringers were having quite a big time that year: 1983 was the year of The Day After, a TV movie that got more media hype prior to its airing than any other TV movie in the history of the medium. Its subject was, of course, the impending nuclear holocaust. But this nuclear-war show exploded with all the impact of a wet firecracker: when the dust had settled, it turned out to be just another TV movie, hardly worth a critic's nod. Everyone yawned and moved on to TV's finest moment of that year, the last episode of M*A*S*H. The movie WarGames also appeared that year, another nuclear-apocalypse fantasy.

And of course 1983 was also the year in which the Strategic Defense Initiative, known derisively to the hand-wringers as "Star Wars," ignited nationwide debate about the wisdom of trying to protect ourselves against nuclear attack. Mutually Assured Destruction, or MAD, as it was called, had prevented a nuclear duke-out between Washington and Moscow up till then, or so we were told, and to deviate from the status quo would surely "upset the apple cart," as one of my Rainbow Coalition friends told me.

Oh yeah? Well, as it turned out, the Lewis Thomases, E.L. Doctorows and Leonard Bernsteins of the world (remember "Peacequake?") were...wrong. Not only did we not go up in festering heat, but by raising the stakes on the table to the point where the Soviets could not say "call," because SDI or anything like it was far too expensive for them to build and they damn well knew it, Reagan cut them off at the knees, and with a smile on his face, no less. He even offered to share SDI technology with the Russians, knowing that he was safe in making the offer: their economy, trashed by decades of witless central planning, could not bear the expense of anything so sophisticated. When Mikhail S. Gorbachev came to power in Moscow in 1985, it was already commonplace to refer to the USSR as "Burkina Faso with missiles." No wonder the Russians threw a global hissy fit over SDI, screaming and shaking their fists. Oh, it wasn't that they were incapable of duplicating it. After all, the Soviets had led the way into space in the late 1950s, and for a number of years, their space program was significantly ahead of America's. There was nothing wrong with Russian science. They had the know-how; what they didn't have was the money, and Reagan knew it. The Soviets were poker players without a poke. They couldn't call the bet, so they did all they could do: they undertook what was called in the psychedelic era "agonizing re-appraisal." Gorbachev did exactly that, and found his position decidedly wanting. The result, a half-dozen years later, was the dismantling of the Soviet empire and shortly thereafter, the death of the Soviet Union and the removal of communism as a threat to peace and stability in Europe.

So, in the light of history, Thomas and his cohort were wrong: blatantly, glaringly wrong. Did any of them ever admit it? Not to my knowledge.

Yes, the world is still a dangerous place. It always was. Nuclear war was a very real danger before the implosion of the Soviet empire, and could become one again if the United States and China truly end up on the sort of global collision course that some of the pundits are now telling us is imminent within the next decade or so.

But even before the atom was split, the world was a dangerous place. It always has been. Would you want to have been raising a family on the banks of the Seine in the first decades of the Tenth Century, when those first Viking ships came sliding silently past the Ile de la Cite, manned by crews to whom murder, rape and pillage were all in a day's work? Would you have wanted to be anywhere in western Europe in 1348, when the Black Plague was marching across the continent, decimating populations (as the Asian Bird Flu may soon do worldwide)? Probably not.

No doubt the world will be a dangerous place until the end of recorded time. But the way to deal with danger isn't always by cringing and hand-wringing. Sometimes bold and decisive action will win the day. It did in the 1980s, even as a chorus of nay-sayers was announcing the apocalypse and insisting that we should leave the Soviets alone to pursue their goal of hegemony in all corners of the globe, in the interest of preserving the peace. Instead, we ended up ridding the world of one of its most insidious and dangerous police states.

Mahler's Ninth Symphony has nothing to do with any of this. Instrumental music is abstract; it is capable of carrying a "message" only in the most general of ways. Where Lewis Thomas' mind wandered as he listened to the Ninth is a matter of relative unimportance now. Mahler knew nothing of nuclear holocaust, or of Asian Bird Flu for that matter. He did know, when he wrote the Ninth, that he was dying. And we all know that eventually every one of us will die in one circumstance or another. So the message of the Ninth, if it has one, is perennial and for all time. The message of Lewis Thomas' meditations on the Ninth belongs to another time, one that is now past.

Open Letter To A Dirty Bastard

This is addressed to the son of a bitch who sold my wife a sick puppy, no play on words intended.

First, the good news. The puppy survived, no thanks to you. When he started vomiting, three days after my wife brought him home, she was concerned enough to get him to the pet clinic right away. We assumed he had eaten something that made him sick. We also have two cats; perhaps, we thought, he had ingested some of the cat litter that had gotten scattered around on the floor near the cats' food dish, since he does like to nosh on the cats' food, and dogs, after all, will eat just about anything.

But no, that wasn't it. Even the vet said that having eaten something he wasn't supposed to probably wouldn't have caused such protracted vomiting. He was nearly dehydrated. I left him at the clinic and came on home.

They called an hour later. The dog had parvo virus. The incubation period for parvo virus is at least three days, which means the dog had parvo virus when you sold him to my wife. It also means that the other dogs you're keeping on your property, and selling to other people in our community, probably have it too. So what did you say when we called you up and told you about it? You denied that the dog could possibly have parvo virus. You refused to pay for any part of his treatment, but offered to take him back across the border to Mexico where you got him from, and take him to a veterinarian there, "Where they're cheaper."

Bullshit. If we had returned that puppy to you, you would have tossed him back into your back yard and let him die along with the other dogs you have, who are probably dying now.

But what do you care? You'll just drive back down across the border and round up another bunch of strays for sale, won't you? Nice, lucrative business you have there, hanging around supermarket parking lots, selling sick dogs to people. Why screw up something that pays so well?

I went and spoke with the director of the local animal shelter. You're quite well known in this area. Lots of people have complained about your shenanigans. Since you speak Spanish, you prey particularly on low-income Spanish-speakers who may want pets for their children but can't afford to pay breeder's prices. So these people take puppies home; the puppies promptly get sick, and since their owners can't afford to pay a vet, the puppies die. Happy kids. Nice guy.

Your filthy business has been covered in the press, but I'm told you're not doing anything illegal. Yet. There is a bill pending in the state legislature that would penalize people who do what you do. The moment the Humane Society is empowered to nail bastards like you, I hope they get you first.

Merry Christmas, you asshole. I hope one of those strays in Mexico turns on you and tears your heart out. If he can find it.

Sunday, December 04, 2005

Naming Names


Same name, two generations: My dad and me, 1959



My ancestors on my father's side came from Trois Rivieres, Quebec. My last name, Dupuis, is quite common there, as it also is in Louisiana, and in the area around New Bedford, Massachusetts, where my father was born.

And it goes without saying that it's a very common name in France: the seducer of Emma Bovary in Gustave Flaubert's most famous novel was a Leon Dupuis. I met a man years ago who shared my surname, and he informed me that the French city of Loudun, where they burned witches in 1634, is the family's ancestral hub. We Dupuis (Dupuises? I've never been able to sort that out) have even bumped into the movies: the next time you rent Steel Magnolias, take note of the fact that Daryl Hannah's ex-boyfriend in the film is a Jasper Dupuis. We're all over the place, usually somewhere in the background.

But sometimes I wish my name were Sam Huck or Bill Jones or Bruce Springsteen. The only place anyone ever got it right on the first try was Paris, where a hotel desk clerk greeted me cheerfully as "Monsieur Dupuis," pronouncing it not only correctly but with the elegant labio-dental slide that it's supposed to have.

People on this side of the pond, even people I've known for a while, have trouble either pronouncing it or spelling it or both. My friend Jennifer, who handles media calls for San Diego County Supervisor Diane Jacob, was jotting down my name just this morning. Admittedly, Jennifer and I hadn't spoken on the phone for a while, but she asked me, "So...it's spelled 'D-U-P-R-I-S,' right?" I corrected her. "There's no 'r' in it." (You're not alone, Jennifer. I've told hundreds of people that there's no 'r' in it.)


I recently got married again. At the wedding reception I informed our guests that my wife was going to keep her surname, "Blake." This wasn't because I've become some Alan Alda feminist; mostly it had to do with logistics. "She offered to take my name," I explained, "but in view of what I've had to put up with all my life, having a French surname that most of my fellow Americans can neither pronounce nor spell, I decided I didn't want to put her through that."

That was only part of the story, of course. My wife is in business, and changing all of her business-brandings to read "Dupuis" rather than "Blake" would have cost a lot of money that we could just as easily spend on a Greek island cruise.

There is one area where having a gnarly last name has worked to my advantage: dealing with telemarketers. It's a dandy 'flag' for identifying uninvited and unwelcome telephone calls. If my phone rings and the person on the other end wants to speak to "Mr. Dew-pewis" or "Mr. Dew-pew" or "Mr. Dew-pwah," the chances are very good that that person doesn't know me and wants to sell me something. I mean, if they knew me, they'd know how to pronounce my name, right? Call me up looking for Mr. Dew-pewis, and all you're going to hear in reply is a really loud click.

I have thought of moving, either to Quebec or to France, just so this won't be so much of a problem. (Since I don't like hot, sticky weather, Louisiana never was an option, and that state's recent tragic bad luck with hurricanes makes me less inclined than ever to go there.) But of course in Canada or France I'd just be another Tom, Dick or Jean-Louis in the crowd. Here at home, having a slightly off-the-wall last name gives me a certain cachet--I once showed up wearing a beret to cover a speech at a Rotary Club meeting, and the Rotarians gave me a round of applause, for being properly costumed, I would guess. People sometimes ask me if I speak French. I don't, but my father did. (When my father was in the Coast Guard in the 1930s, he was "Frenchy" on every ship he ever served on. But of course those were the pre-Political Correctness days when ethnic nicknames weren't forbidden by the protocols of cultural sensitivity.)

I have also thought, naturally, of legally changing the name to something a bit more anglophone, if only for convenience's sake. Perhaps if I were 21 again, just starting out in the business, ambitious and in want of a real snappy-sounding by-line, I might. Maybe I'd be "J.D. Hawk," a great by-line if ever I heard one. But I happen to have a friend who's already using that name. No, I think I'll just stick with Dupuis until the bell rings. I know how to spell and pronounce it, and teaching others to do so can be a great ice-breaker ("That's 'D, U, P as in Paul, U, I, S as in Sam,'" I've been saying on the phone for years.)

As for telemarketers, let the caller beware. I have a few choice names for them, too.

Talk To The Golf Ball

I lost my younger sister in September, 2004, to a combination of drugs and alcohol.

She was an alcoholic who was also addicted to painkillers. She abused vicodin for years, mixing it with bottle after bottle of E&J brandy. Vicodin is a prescription drug, and she did not have her own prescription for it. She abused others' prescriptions for as long as she could get away with it. It cost her at least one friendship.

Ultimately it was our father, 90 years old, who became her unwitting connection. He had a prescription for vicodin, and my sister, as his caregiver and therefore the person responsible for having his prescriptions refilled, was getting all she wanted and then some.

But then we noticed that Dad was getting even more confused, particularly in the late afternoons and early evenings, than he had been before. We reported this to his doctor, who guessed that it might be the vicodin causing this "sundowning," as it's called. Vicodin, the doctor said, sometimes has that effect on elderly people. It should have occurred to me that this probably wasn't the real problem, as my sister was getting most of the vicodin; she was only sharing it with Dad when he was obviously in pain (he suffered from chronic pain in one ankle, the result of a line drive that bounced off it at one of his grandsons' Little League games.)

The doctor changed Dad's painkiller prescription from vicodin to methadone.

One week later my sister was dead, the result of having taken 15 mg. of methadone (three times my father's dose) and chasing it down with brandy. She was 47. She didn't know, and no one bothered to tell her, that methadone is stronger than vicodin. In this case it was strong enough to make her heart stop beating.

With an experience like that in my recent life, I'm a bit sensitive to the sight of anyone I know wrestling with a substance-abuse problem.

Now, I'm not some teetotaling sniffer: I don't do drugs or pills, but I do drink, and I enjoy it. I've been known to drink too much. But I can tell when someone has obviously developed a problem with alcohol, and I have a friend who has.

He showed up on my doorstep a couple of Fridays ago, around noon. I had invited another friend to lunch and we were out on the patio when the doorbell rang. I answered the door, and there was my pal and onetime blogging partner, teetering a bit as he stood there, with the dopey expression on his face that told me he'd been drinking for at least a couple of hours already. I beckoned him to come on in; instead he looked down at the bushes and pointed. Again I told him to come in. Again he cast his eyes down at the bushes beside the door and pointed.

I stepped out to see what he was trying to call my attention to. It was a 12-pack of Coors Light that he had brought along, and for some reason had stashed in the bushes.

I hauled the beer--and him--into the house, took him out onto the patio to join my luncheon group, and tried to get him to eat a sandwich, to get something in his stomach besides alcohol. He showed little interest in food, however; he just kept asking for another beer. He was also babbling incoherently, interrupting the conversation, and in general making a drunken pest of himself.

“Wait a minute!” he said suddenly, as my former editor and I were discussing business. “I’m gonna go get my tape recorder! I want all this on tape!” Laughing mischievously, he left the table and stumbled out the front door.

He returned a few minutes later...with a golf ball. “That’s your tape recorder?” I asked him. “Well, there you have it,” I said to my former editor. “Talk to the golf ball.”

Presently my wife and my former editor’s wife together “ganged up” a bit on my drunken pal, told him point-blank that he has a serious problem and encouraged him to get help. Offended, he got up and left suddenly. I haven’t seen him since.

I know he’s been to Alcoholics Anonymous; his parents forced him to go. But AA doesn’t work if someone has to force you to go. It has to be your idea. He has yet to reach the “rock bottom” point that often drives people to AA. When I suggest to him that he go back, he voices objections. He dislikes AA’s dogmatic approach, he says, by which I assume he means that AA refuses to be open-minded about his drinking. The nerve of them.

I still have the golf ball. My cat plays with it. I have an uneasy feeling that alcohol is playing with my friend the same way my cat plays with his golf ball. Not much you can do with a golf ball. A golf ball is a dead thing; rolling around on the floor or on a fairway is about all it’s good for. To my friend I’d like to say, quit acting like a golf ball. Quit rolling around and get up off the floor. Because you’re not dead like a golf ball. Yet.