Wednesday, May 28, 2008

Take Notes, There's Going To Be A Quiz Next Period


Okay, everybody, listen up.

Old K.D. has been kicking around this big blue marble for more than 52 years, and he's picked up some truths you can live by. Get out your pencils. And there won't be any obvious ones you've heard before, like the one about washing your car as a guarantee of rain.

We all know that, don't we?

No, these are things I've noticed that I haven't noticed anyone else noticing.

-- My father lived to be 91. And of all the experiences he had in his long life, the following may be the most worthy of note. Dad started out as a Republican. Richard Nixon turned him into a Democrat. Three years of working for a Democratic state senator turned him back into a Republican.

-- Among the significant differences between women and men is that women tend to hug themselves while they walk down the street, while men shove their hands in their pockets.

-- Among the significant differences between women and men is that men generally can't resist the meat and potatoes, while women generally can't resist the dessert tray.

-- Contrary to popular myth, women and men tend to be slobs in equal numbers. I've known tidy women and sloppy men, but I've also known plenty of fastidious men, and plenty of women who treated their surroundings as one gigantic dumpster.

-- Dogs may be more openly affectionate than cats, but they're also more high-maintenance. You can leave your cat alone for a weekend provided you leave him with enough food, water and litter. Don't try that with a dog.

-- Abstract art is a con game, period. My first wife liked to paint, and she was good at it. Still is. One day when I was serving at the American Embassy in Brasilia, they hung an abstract painting in the lobby for which the U.S. State Department had paid $5,000 under the official "Art In Embassies" program. The DCM, looking at it, asked me, "What do you think, Kelley?" I replied, "I think my wife could have done it and she would have charged you $200 for her time." He wasn't pleased with that answer, but that was my moment of "speaking truth to power."

-- Life has gotten too damn complicated when rearranging your furniture so you don't trip over it is called "Feng Shui."

-- Black America gave America, and the world, America's best music and also its worst. The best is of course jazz. The worst is rap, a one-syllable word for barnyard noise. Make that sociopathic barnyard noise.

-- Starbucks is as much a con game as abstract art. What idiot wants to pay $4 for a cup of coffee? I'll pause here while you count the idiots, and there aren't enough hours in the day. Listen, folks. Michelle Malkin thinks Dunkin' Donuts makes great coffee. She may be right; I've never tried theirs. But I can tell you this for certain: for my money, the fast-food outlet with the best coffee is Burger King. Every Burger King I've ever walked into, whether in Spokane, Washington, Fredericksburg, Virginia or Munich, Germany had GREAT coffee.

-- Paris in the spring is well and good. Been there, done that. But for my money (and I'd better have plenty of it) there's no place quite like Moscow in the late spring. You heard me right. Moscow, the capital of the Russian Federation. The best time to visit Moscow is in late May/early June. Winter has been over for a couple of months by then; the weather is generally mild, everything's in bloom and you have about 20 hours of daylight owing to the fact that you're at latitude 56 North. The Seine is wonderful, but there's nothing quite like strolling along the Moscow River in full view of St. Basil's cathedral at 10 p.m. when it's still broad daylight. Believe me, it's worth looking at Vladimir Putin's sour puss to experience this. And don't fail to get out of town and experience the Russian countryside in the late spring. But take PLENTY of mosquito repellent.

-- If you're going to put air in the tires of your car, make sure you have a handful of quarters. Air isn't free any more.

-- When you go to a baseball game, take public transportation if at all possible. That way you can enjoy the ninth inning without having to worry about getting caught in a rush of traffic on the way out. And you'll save at least 15 bucks on parking.

-- I never cared for Mary Tyler Moore in the first place, and the older she gets, the more she looks like Tom Petty.

-- Martina Navratilova already looks like Tom Petty.

-- Don't go to first-run movies. Why pay $9 to sit in a glorified closet staring at a screen the size of your garage door, and have to sit through 15 minutes of commercials before the film starts? Wait till the damn thing comes out on DVD, rent it, and watch it at home in your underwear. You'll save a drive, you'll be more comfortable, and you can mute the commercials.

-- Speaking of that, the mute button was the greatest invention of modern times, and the cellphone the most obnoxious.

-- Don't reheat coffee in the microwave. It gets cold too fast. Throw it out and make a fresh pot.

-- Scotch whisky melts ice cubes faster than any other kind of liquor. Don't ask me why. But I have this from a guy who once tended bar as well as from personal experience.

-- Buttered toast really will land on the floor butter-side down eight times out of ten.

-- The best rag to clean windows with is newspaper. Makes 'em sparkle like gems. Now remember you heard that from me, not from Martha Stewart.

-- If you have to put a Phillips screw in a hard-to-reach place, stick a piece of chewing gum on the end of the screwdriver and mash the screw into the gum.

-- Cigar humidors are an attractive but expensive waste of money. A plastic bag with a slice of cucumber or apple in it will keep your cigars moist better than a $200 humidor with a fancy humidification system.

-- There are certain activities you simply should not bet on. Professional wrestling comes to mind.

-- If they tell you the delivery truck will arrive "between two and four," it will arrive at 3:58.

-- Don't start making hard-boiled eggs and then start drinking liquor. Ditto anything involving hot oil -- that's a guaranteed kitchen fire.

-- Eating peanuts is OK at the ballpark, but I don't recommend it at the opera. I once wolfed down a bagful of peanuts between Acts I and II of Die Walküre, forgetting that peanuts are, technically, legumes and not nuts. Act II of Die Walküre was never so long. (And believe me, it's long.)

-- If you're getting ready to buy a house, unless it's a one-story house, check and make sure there's a bathroom on the first floor.

-- Never buy a house anywhere near a hospital or a fire station. The sirens all day will drive you nuts.

-- Ditto a house that's in the flight-approach path of an airport.

-- I'm a tough sell. Salesmen fear me. And when the Jehovah's Witnesses are ringing doorbells in my neighborhood, I deal with them the bold way. The assertive way. The macho way. I hide in the kitchen until they're gone.

-- Speaking of homegrown American religions, if you're looking to round up a work crew for a project like house painting, Mormons are a good bet. You will have NO problems with lost work due to substance abuse. These folks don't even drink Pepsi.

-- Don't try to freeze cheese. When you thaw it, you'll be sorry.

-- NEVER put red wine in the refrigerator. Kills it dead.

-- Only use animal waste for fertilizer if it comes from vegetarian animals, e.g. horses and cows. The shit of dogs, cats or any other animal that eats other animals doesn't make good manure.

-- I know this makes me sound like some old geezer from Minnesota, but the tomatoes that you grow in your back yard really do taste better than the ones that come from the store.

-- Younger sisters are a good thing to have. I used to have one.

-- The idea that dogs and cats are natural enemies is not just a myth, it's a stupid one. It's true that dogs like to chase cats, but that's because dogs like to chase everything. We have three dogs and three cats, and they get along better with each other than I do with any of them.

-- Soft drinks may taste good, but they'll give you a gut like the ass end of a Volkswagen Beetle. If you're thirsty, drink water (and filtered is cheaper than bottled).

-- If you keep things like books, paintings, CDs and other valuables in an upstairs room, make sure you close the door every time you leave. That way when a fire breaks out in the kitchen and destroys the entire downstairs, you won't be stuck with a roomful of books, paintings, CDs and other valuables completely covered in greasy soot. (And believe me, it doesn't clean off easily.)

-- Anything you download onto your iPod, back it up somewhere. iPods swoon, and they don't give any warning when they're getting ready to.

-- If you're having a quarrel with your significant other, make sure it's resolved, or at least a truce called, before you go to bed.

-- Pancakes and bacon actually make a very good quick supper. I recommend them with ice-cold Blue Nun.

-- If you're picking up a friend for work, get your ass out of the car and go knock on the door. Don't sit in the middle of the street like a spaz, honking your horn and waking up all of his neighbors.

-- On a related subject, if you deliver newspapers for a living, keep your goddamn voice down. Not everyone else is up at 5:30 a.m.

-- The difference between "having coffee" and "having A coffee" is not just a semantic distinction between a down-to-earth red stater and a hoity-toity blue stater. "Coffee" is the stuff that comes out of a Folgers can and is served in a mug. "A coffee" is a one-ounce shot of espresso, consumed with extended pinky in a cup the size of a large thimble. Everybody got that?

-- Sex is great, but as lingering pleasures go, it can't hold a candle to revenge.

-- Never go sky-diving with someone who was known in your fraternity for practical jokes.

-- Don't lift the lid until you hear the toilet gurgle.

-- If you have dogs, purchase Carpet Fresh, Formula 409 and paper towels by the case-lot.

-- Cancel your cable television service and pick up a book. And I said a BOOK, not a "Kindle."

-- When your dog, cat or any other pet reaches age 15, give it a hug and a cuddle every day.

-- Don't store your cutting board where you keep the bug spray and the ammonia cleaner.

-- Trust, but verify.




Monday, May 26, 2008

My Favorite Quotations


This one's pretty self-explanatory. We all have our favorite bons mots uttered by the famous, the obscure, the rising and the falling. Here are a few of mine:

"Most rock journalism is people who can't write interviewing people who can't talk for people who can't read." -- Frank Zappa

"Any asshole can write a tone-row. It takes a composer to write a tune." -- Leonard Bernstein

"People who want to share their religious views with you almost never want you to share yours with them." -- Dave Barry

"I hate rap music, which to me sounds like a bunch of angry men shouting, possibly because the person who was supposed to provide them with a melody never showed up." -- Ibid.

"We humans do not need to leave Earth to get to a hostile, deadly, alien environment; we already have Miami." -- Ibid.

"If you surveyed a hundred typical middle-aged Americans, I bet you'd find that only two of them could tell you their blood types, but every last one of them would know the theme song from The Beverly Hillbillies." -- Ibid.

"The one thing that unites all human beings, regardless of age, gender, religion, economic status or ethnic background, is that, deep down inside, we ALL believe that we are above- average drivers." -- Ibid.

"You can say any foolish thing to a dog, and the dog will give you a look that says, 'My God, you're right! I never would've thought of that!'" -- Ibid.

"I'm fat and I'm lazy and I'm proud of it." -- Garfield the Cat

"Eighty percent of success is showing up." -- Woody Allen

"On the plus side, death is one of those things that's done just as easily lying down." -- Ibid.

"I can't listen to that much Wagner. I keep getting the urge to conquer Poland." -- Ibid.

"I took a speed-reading course and read War & Peace in 20 minutes. It involves Russia." -- Ibid.

"Gentlemen, you may smoke." -- Edward VII, announcing to the press in 1901 that his mother, Queen Victoria, had died.

"Put that back! He didn't say 'Abandon Ship!'" -- Ward Bond, playing a Navy chief in the film Mister Roberts, commanding two sailors to cease putting a life raft over the side during a General Quarters alarm.

"It's from my mother. All she ever says is, 'Stay away from Japan.'" --Ibid., Jack Lemmon as Ensign Pulver, opening a letter during mail call.

"If there are any souls in hell, it is because that is where they insist on being." -- W.H. Auden

"To the man-in-the-street, who I'm sorry to say, is a keen observer of life,/The word 'intellectual' suggests right away a man who's untrue to his wife." -- Ibid.

"When statesmen gravely say 'We must be realistic,' chances are they're weak, and therefore pacifistic. But when they speak of 'principles,' look out: perhaps/Their generals are already poring over maps." -- Ibid.

"The man who thinks he can solve the world's problems has probably never faced up to his own." -- Henry Miller

"I wouldn't want to belong to any club that would have me for a member." -- Groucho Marx.

"I've decided that instead of getting married again, I'll just find a woman I don't like and buy her a house." -- Overheard in a bar

"A group of politicians deciding to dump a president because his morals are bad is like the mafia getting together to bump off the Godfather for not going to church on Sunday." -- Russell Baker.

"Ever notice that the words 'acrimony' and 'matrimony' rhyme, as do 'married' and 'buried'?" -- Actually, I said this. Hey, I have my moments too.

"A camel is an animal that looks like it was put together by a committee." -- Anonymous

"Football combines two of the worst features of American life: violence and committee meetings." -- George F. Will

"An actor is a guy who, if you ain't talking about him, he ain't listening." -- Marlon Brando

"I feel sorry for people who don't drink. When they get up in the morning, that's as good as they're going to feel all day." -- Frank Sinatra

"I know what custody of the children means: get even." -- Lenny Bruce

"Satire is tragedy plus time." -- Ibid.

"In the halls of justice, the only justice is in the halls." -- Ibid.

"Idealism is fine, but as it approaches reality, the cost becomes prohibitive." -- William F. Buckley, Jr.

"Too bad all the people who know how to run the country are busy driving cabs and cutting hair." -- George Burns

"Society is cancerous and bureaucracy is its cancer." -- William Burroughs

"People are hypocrites. If you ask them what they want to see on TV, they'll tell you they want better quality programming. And then what do they watch? 'Gilligan's Island.'" -- Johnny Carson

"All it takes to write 5,000 words a day of that kind of prose is a quart of whiskey and a lack of interest in syntax." -- Ernest Hemingway, trashing his rival William Faulkner

"Did you ever notice that people who are good with a computer don't use it for much of anything except being good with a computer? They know all about information technology, but they don't have much interest in the information. I'm the opposite." -- Andy Rooney

"There goes the good time that was had by all." -- Bette Davis, commenting on a passing starlet

"The pursuit of the uneatable by the unspeakable." -- Oscar Wilde's description of a fox hunt.

"A jury consists of 12 persons chosen to decide who has the better lawyer." -- Robert Frost

"A liberal is a man too broadminded to take his own side in a quarrel." -- Ibid.

"We are the unwilling, led by the unqualified, doing the unnecessary for the ungrateful." -- 1960s Vietnam veteran's graffiti

"When people are free to do as they please, they usually imitate each other." -- Eric Hoffer

"Patriotism is the last refuge of a scoundrel." -- Samuel Johnson

"Hell hath no fury like a liberal scorned." -- Dick Gregory

"Politicians are the same all over. They promise to build a bridge even where there is no river." -- Nikita Khrushchev

"How do wars start? Diplomats tell lies to journalists, then believe what they read." -- Karl Kraus

"Books -- what they make a movie out of for television." -- Leonard L. Levinson

"A promiscuous person is usually someone who's getting more sex than you are." -- Victor Lownes

"The longer the title, the less important the job." -- George McGovern

"There are four stages to a marriage: first there's the affair, then there's the marriage, then there's the children, then finally there's the fourth stage, without which you cannot know a woman, the divorce." -- Norman Mailer

"Men are those creatures with two legs and eight hands." -- Jayne Mansfield

"Never send a monster to do the work of an evil scientist." -- the 'evil scientist' in the Bugs Bunny cartoon Water, Water Every Hare.

"What men value in this world is not rights, but privileges." -- H.L. Mencken

"When women kiss it always reminds me of prizefighters shaking hands." -- Ibid.

"Puritanism -- the haunting fear that somewhere someone might be happy." -- Ibid.

"Fascist: a word the Left uses to describe anyone and anything it doesn't like." -- George Orwell

"So, does your long hair make you a girl?" "Not necessarily. Does your wooden leg make you a table?" -- An exchange between Joe Pyne and Frank Zappa in a television interview

"Government is like a baby: an alimentary canal with a big appetite at one end and no sense of responsibility at the other." -- Ronald Reagan

"Fanaticism consists in redoubling your effort when you have forgotten your aim." -- George Santayana

"Few revolutionists would be such if they were heirs to a baronetcy." -- Ibid.

"You can't put sour cream on shit." -- Alexander Solzhenitsyn

"The chief cause of problems is solutions." -- Eric Sevareid

"Morality consists in suspecting other people of not being legally married." -- George Bernard Shaw

"Alcohol is a very necessary article. It enables Parliament to do things at eleven o'clock at night that no sane person would do at eleven o'clock in the morning." -- Ibid.

"An alcoholic is someone you don't like who drinks as much as you do." -- Dylan Thomas

"The trouble with the rat-race is that even if you win, you're still a rat." -- Lily Tomlin

"A critic is a man who knows the way but can't drive the car." -- Kenneth Tynan

"For certain people, after age 50, litigation takes the place of sex." -- Gore Vidal

"In general, the art of government consists in taking as much money as possible from one party of citizens to give to the other." -- Voltaire

"When you are down and out something always turns up, and it is usually the noses of your friends." -- Orson Welles

"When choosing two evils, I always like to take the one I've never tried before." -- Mae West

"Murder is always a mistake. One should never do anything one cannot talk about after dinner." -- Oscar Wilde

"Experience is the name everyone gives to their mistakes." -- Ibid.

"A cult is a religion with no political power." -- Tom Wolfe

"An expert is a man who has stopped thinking. Why should he think? He's an expert." -- Frank Lloyd Wright

"Obscenity is whatever gives a judge an erection." -- Anonymous

Thursday, May 22, 2008

Lay Down Your Weary Tune



Today I'm blogging on a subject that's of absolutely no interest to anyone.

It's my blog, right?

Listen up, world! I don't write poetry anymore.

You heard me correctly, and you may put your false teeth back in. Get your lower jaw off the floor. I know it comes as a shock, and most likely you're reaching for the Jack Daniel's bottle to steady your nerves.

I TOLD you it was a subject of no interest to anyone.

I started writing poetry when I was in junior high school, circa 1970. My school chum, ASB President Chris Anderson (who later went on to teach at Oregon State), had taken it up, and he encouraged me to do the same. We wrote poems together. We collaborated on a mercifully forgotten "anthology." I moved away, but throughout high school we sent our poems to each other. When I was in my mid-forties, going through a cardboard box of old stuff, I came across a letter he had written me in 1972, with a couple of his teenage verses in it. We lost touch after graduating from high school, Chris and I. But I continued writing poetry. The only place I ever managed to publish any of it was in STATE magazine, the in-house monthly organ of the U.S. Department of State, where I worked for some years. My STATE magazine poems all had "foreign service" themes. (If one is permitted to have favorites among one's own poems, my favorite was one called Bach In Mato Grosso, a reflection upon listening to the Second Brandenburg Concerto on my Sony Walkman while driving through Brazil's Pantanal, looking at crocodiles.)

To come to an understanding of why I've stopped writing poetry, we need to come to an understanding of what poetry actually is.

And there's the first problem right there. No one seems real sure. Our prevailing notion about poetry, when we think about it at all, is that it has something to do with the expression of high-flown sentiments and emotions. I once heard no less a personage than Tom Brokaw conclude an interview with a poet by remarking, "That's what poetry is all about. Feelings."

Uh-huh.

Well, okay, so we need to discuss how we got there. Which means I'm going to spend just a minute or two discussing not what we think poetry is now, but what it originally was.

There isn't a whole lot of agreement there, either. The late Robert Graves held to the interesting-but-crackpot notion that poetry originally arose as part of a religious ceremony, that being worship of the triple moon-goddess, later incarnated as the nine muses of antiquity. His book The White Goddess is worth a read.

But read it in context. Graves came out of the trenches of World War I an unmistakable shell-shock case. He saw ghosts, and sublimated his desire to serve the moon-goddess into the abject worship of a shrill, unpleasant woman named Laura Riding, who mistook her own talent for genius and resented the world for not agreeing.

But of that another time.

Concerning poetry, I go with the school of thought that it no doubt did arise, in the gray mists of ancient times, as part of religious ritual. The epics of Homer arose from this tradition. But the most important thing to keep in mind about poetry's origins is that they go very far back. Prose is a relatively recent invention, prose fiction even more so. Poetry is ancient.

And there's a perfectly good reason for this. Until the Germans invented the printing press in the 15th century, the overwhelming majority of people in the western world couldn't read. For most of history most people couldn't read. Reading was the domain of priests, scribes, lawyers and other educated folk.

But all civilizations have their stories, and stories have to be told. If no one in town or at the court can read, there are two ways left to tell stories: either in pictures or in some way that makes them easy to remember and recite. Hence, medieval churches were festooned with stained glass and statuary that communicated the stories of the Bible to illiterate parishioners. When printing took over from oral and visual tradition, the newly-minted Protestants, brandishing The Word, declared war on "images" and ran around Europe smashing stained-glass windows and breaking up statues.

And indeed the Gutenberg Bible may be viewed as the first salvo in the victory of prose over poetry.

But of that also another time.

The fact is that for thousands of years poetry's chief use and function was the transmission of stories, not so much the expression of subjective feelings. Sure, the great epics from The Iliad to The Divine Comedy are shot through with powerful emotion, but it's the emotion of the theater. It aims at arousing certain emotions in the reader, as great drama does with its audience. Homer isn't interested in telling you how he feels; he wants you to know how Achilles felt. Dante is a first-person narrator, but he's taking his readers on a much broader journey than simply one through the garden of his own emotions.

Now. Go and try to memorize a page of say, Henry James. I'll wait.

Couldn't do it, could you?

On the other hand, I have personally witnessed performances by people who were able to memorize and recite whole pages, whole books of Milton. And there's a perfectly good reason for this. Poetry, as it was originally conceived, enables that sort of thing. Because for the longest time, poetry was written to be recited, not read. Ancient Greek lyric poetry was meant to be sung, you know, like Garth Brooks. Well, maybe not like that, but you get the idea. The Romans employed poets to reel off favorite epics at parties. Poets made their living wandering around propagating the old stories, most of which they carried in their heads. Tradition has it that the bards of ancient Wales had to go through something like "memorization boot camp," where they underwent such trials as being required to recite long verses to themselves while submerged in water, with only their noses poking out so they could breathe.

The ancient devices of poetry, its measures so to speak, were developed to facilitate these virtuosic feats of memorization. Meter, stress, syllable-counting, rhyme, assonance, alliteration and all the rest of it served a twofold purpose: they helped to emphasize the action of the story being told, and they facilitated its memorization, both in the reciter and in the audience. Measured poetry is extremely mnemonic. I can recite Yeats' Sailing To Byzantium or Robert Frost's Come In, but I can't recite Yeats' account of his first meeting with Lionel Johnson, nor can I recite Frost's prose.

Nor, I daresay, can I recite anything by John Ashbery or Charles Olson. I've read Allen Ginsberg's Howl plenty of times, but I'm damned if I can recite any of it without the printed page in front of me. "I've seen the best minds of my generation, something something something...angry fix...machinery of night..." I'm lost.

In the wake of the French Revolution a revolution also occurred in the arts. We now call it the Romantic movement. When I was in high school, my very hip, product-of-the-sixties English teacher, Ms. Rochelle Terry (who was still called MRS. Terry then), derided William Wordsworth as an anemic old fuddy-duddy. (In fairness to her, so did Lord Byron, but he, as they say, "was there.") Mrs. Terry, in her early twenties then, preferred to let us decide for ourselves what poetry was, and that meant that when it came time for each of us, as part of the curriculum, to get up in front of the class and "teach" a poem, what we served up to ourselves was a cornucopia of Top 40 lyrics by the Beatles, James Taylor, Black Sabbath, Richie Havens, Joni Mitchell, etc. etc. At age 15, I got the scent in the wind and decided to treat the whole thing as a joke: when it was my turn, I selected "National Brotherhood Week" by Tom Lehrer, rather than something by Dylan Thomas, which would have been my preference.

But you see, I was already writing poetry then myself, and such things mattered to me.

But despite Mrs. Terry's sneering, Wordsworth was actually a revolutionary figure in his own time. He was among the first to abandon the mannered style of poetry, with its heroic couplets and hard-working rhymes, that had characterized most English poetry of the 18th century and found its apotheosis in Alexander Pope. Wordsworth famously announced that he wanted to write poetry in "the language really used by men." He wasn't the first English poet to write blank, that is, nonrhyming verse; Shakespeare and Milton were just two who had gone before. But he was among the first to use blank verse to express a highly personal view of the world. His blank-verse epic The Prelude is essentially an autobiography. One of my personal favorites among his poems, the beautiful Tintern Abbey, is a personal reflection on revisiting a favorite spot which, before it's over, morphs into an outburst of love poetry directed at his sister.

Wordsworth and his cohort opened a floodgate. It had its corollary in the other arts of course; Wordsworth's famously fecund period, roughly 1799-1807, was followed by Beethoven and Berlioz, Delacroix and Goya, Verdi and Wagner. It was Romanticism; it was all about ego and it virtually defined the 19th century in the arts. And it came concomitant with the triumph of literacy, as the industrial revolution and the rise of bourgeois democracy gradually resulted in nearly-universal education.

So now everyone could read and write, and everyone was not only having feelings, but celebrating the fact that they did. "Feeling is all," Goethe wrote in Faust, and a Europe grown weary of Enlightenment reason and rhyme was only too happy to go along with that sentiment. In just a few short years Romanticism took the inevitable turn for the morbid as poets like Baudelaire and Rimbaud began probing the darker side of the psyche, taking their cue from the crown prince of morbidity himself, Edgar Allan Poe.

Eventually all this wallowing in emotion spun itself out, as movements will, and then the global explosion that followed Sarajevo suddenly seemed to render everything the 19th century stood for as just horribly moot. But even after the First World War blew the last shreds of romanticism to smithereens, the new, interior-focused nature of poetry that had arisen in the earlier century remained. The disillusion of the postwar generation merely reshaped its attitude. The celebration of feelings was replaced by the pathology of society and the individual. Yeats retreated into personal myth; T.S. Eliot, following in the French symbolist traditions of Baudelaire and Mallarme, took refuge in an intellectual hermeticism, dipping into the past to write verse shot through with allusion and allegory, cast ironically among the fog-shrouded cities, rat-infested alleyways and iron bridges of Europe entre les deux guerres, as his generation used to say.

But the triumph of the printed word, and of poetry meant to be read more than recited, also brought about the triumph of vers libre. As Yeats himself put it in a 1932 lecture, the prevailing attitude among poets was "The past has deceived us, let us accept the worthless present." That past included nearly all of the traditions of poetry, including its forms. In America, Walt Whitman had already assured the victory of "free" over measured verse when he shook up Ralph Waldo Emerson with the first edition of Leaves of Grass. So poetry became a personal affair not only in its content but also in its form. Anything that could be spilled on paper could be called a poem, and the more unintelligible or symptomatic of personal neurosis it was, the more likely someone would think of it as "profound."

By the time I hit Mrs. Terry's 10th-grade English class, the war was over, if you care to call such a tempest-in-a-teapot a war. Allen Ginsberg certainly did; he proclaimed the so-called "poetry wars" of the 1950s as having been won, and won by his side. The new free-verse crowd, including such as Lawrence Ferlinghetti, Charles Olson, Gregory Corso, Robert Duncan and Ginsberg himself, had triumphed over what he derisively called "The Bread Loaf poets," people like John Ciardi, Louis Simpson and Richard Wilbur who still wrote measured verse. To the kids in my high school this attitude was a "given." You didn't even think about it. "Free verse" was the only honest kind. Verse that rhymed was old-fashioned, out of touch, phony, because it wasn't as conducive to the direct expression of feelings. The feminist movement just gave this more impetus when its need for martyrs prompted poets like Anne Sexton, Adrienne Rich and Sylvia Plath (the last of whom, by the way, had mean technical chops when she wanted to use them) to turn poetry into a medium for expressing how neurotic and suicidally unhappy the bad old patriarchal world had made them.

Poetry is now entirely a personal matter, and what constitutes it entirely a matter of opinion. When Anthony Hecht died in 2004, the New York Times reported his death as that of a "traditional" poet, not just a poet. Hecht had never abandoned measures, and measures had long since been thrown out by the academic establishment in favor of complete solipsism. So now "traditional" poetry has been marginalized as a subcategory of poetry, like kickboxing is a subcategory of boxing. Vers libre alone gets respect from critics as the genuine article.

The most damning argument I can come up with against the digitalization of reading is that gadgets like Amazon's Kindle need batteries, and books don't. By the same token, I can't think of a single free-verse poem that I've ever been able to commit to memory, so ultimately what good is it, since poetry had its birth in the necessity for oral transmission? Then again, I guess that's the idea, right? We get back to the triumph of the written word. Since vers libre is intended to be read silently to oneself, not recited, it isn't necessary that it provide any mnemonic cues. But goddamn, what fun I have at my annual St. Patrick's Day party reciting The Fiddler of Dooney and The Song of Wandering Aengus. I can't imagine getting a similar thrill from trying to recite one of Charles Olson's "Maximus" poems. I don't think my guests would appreciate it, either.

I would not personally have had any problem with the two traditions existing side-by-side, but the vers libre crowd would have none of it. Academia had decided that it could not serve two esthetics. John Ashbery may be the finest poet of his generation (judged by what standard? I always find myself asking), but if he ever starts worrying about dactyls and spondees, we're not going to take him seriously anymore. He'll be buried over there with Anthony Hecht and the "Bread Loaf" crowd.

Believe it or not, I was bothered by this as early as high school. My teachers, particularly my creative writing teacher during my senior year, Mrs. Joanne Massie, were very impressed with my poetry. More than I was, in fact. Because I knew, deep down inside, that it didn't take any particular talent, outside of a knack for descriptive language, to do what I was doing. Now, my friend Charlie Berigan, he could play the piano. He could play Beethoven. There was an acquired skill. Me, I could ramble on in free verse about any subject at hand and my teachers would be impressed. But even at that early age I was haunted by the feeling that there wasn't much percentage in playing a game that didn't have any rules. Robert Frost whacked a major nail on the head when he remarked that writing free verse was like "playing tennis without a net." I felt that even as a teen whose teachers were ooh-ing and aah-ing over his supposed precocity. All I was doing at 17 was shooting off my mouth with a bit of panache, a vocabulary beyond my years and a dash of attitude. Gertrude Stein told Hemingway, "Remarks aren't literature." Shooting off your mouth isn't poetry.

And my feelings haven't changed much in 35 years, which now and then gets me into interesting exchanges with people who never thought much about poetry, but who remember that Rod McKuen was once cool, that poetry is supposed to be all about "feelings," and who assume that "rules" (the word non-poets use for what poets call "measures") are something inhibiting to the direct expression of feelings and therefore must be something bad. It's interesting: The same people who will agree with me when I say that Chopin is music, while random fist-banging on a piano keyboard (which can certainly express strong feelings) is not, will respond to anything I say in defense of measured verse with the sort of look a guy gets who has just admitted that he collects kiddie porn. These are the same people who listen to Barbra Streisand and think she's great, but if they hear Arnold Schoenberg's atonality they'll say they aren't hearing music because they aren't hearing a tune. But that doesn't apply to poetry, which is actually closely allied to music, or once was. "Rules": music sí, poetry no.

So there you have half my beef: poetry has been "defined" as a free-for-all with no rules, the more invertebrate the more sincere, and therefore, as far as I'm concerned, a hopelessly debased currency. And every issue I receive of the American Poetry Review slams that message home ever louder. Some of the dreck I read in that journal, written by famous people I never heard of, sets my teeth on edge. Some of it reminds me of the junk I was writing in high school. And these are adults.

I just made mention of the other half of my beef, i.e. "famous people I never heard of." In a culture where anything is considered poetry if the lines don't reach the right side of the page, anyone can be a poet, right? Of course. And America currently has more poets than it has poetry-readers. The practitioners of what W.H. Auden called "this unpopular art" are more numerous, and more unpopular, than ever. Don't believe me? Go to www.poetry.com and check out the poets. It's like looking at the penguins at the South Pole. And every one of them has an audience: themselves. Even the "famous" ones, the ones who get their pictures and their poems in publications like APR, have audiences consisting of as many students and colleagues from the English department as they can herd into Barnes & Noble for a book-signing of their most recent university-press-published 67-page paperback tome, which won the 2008 Apple Tree Award -- adjudicated by their former teachers and colleagues in the English department -- and is destined to sell 142 copies.

I wrote poetry off and on for 35 years, from about age 15 to about age 50. I have a footlocker full of it in my basement, some more in various folders on my computer.

And, as Bugs Bunny said, walking away, brushing the dust off his paws, that's that. Poetry will have to get along without me, thank you. I have more skill-intensive things to try and master. It won't miss me, I'm sure. Since Anthony Hecht died I don't think it misses anybody. Come to think of it, the hordes who congregate at poetry.com see to that.

Sunday, May 18, 2008

Another Update From The Digital Dome



A few years ago I wrote a short novel entitled The Coming Forth by Day Of Mr. McCone. In it, a couple of young fellows accidentally stumble across an old geezer living upstairs from one of them in a New York brownstone, who not only lives a very hermetic life, but appears at first glance still to be living in the 1940s or '50s. He's an opera nut and an audiophile, but all of his sound equipment is decades old. He almost never leaves his apartment, so he's nearly oblivious to the world outside. The boys befriend Mr. McCone, and they and their little group of Manhattan bon vivants gradually draw him into the contemporary age, at least as regards sound technology. The story was set in the 1980s, so the cutting edge was still VHS recorders hooked up to TV sets with enhanced sound systems. Mr. McCone becomes acquainted with the dizzy world of what was then called "home video," with surprising, unhappy results.

Well, I'm here to say that I am not Mr. McCone. Like most guys I'm fascinated by "neat stuff," which Dave Barry defined as anything mechanical and unnecessarily complicated. This especially applies to sound technology. I don't have the resources to put together a $10,000 home theater system, but you may rest assured that if I did, I would. I'm always perusing the Crutchfield and other high-end video/audio catalogues, drooling over the latest generations of tuners, speakers, amplifiers and such. And although I was relatively slow to join the compact disc revolution -- CDs first appeared in 1982 and I didn't buy my first one until nearly five years later -- in the years to follow I threw my arms around the technology to the tune of 1,000 or more of the little buggers.

In 2005 my wife gave me an iPod for Christmas. Again, it took me a few months to "warm" to the damn thing. I not only couldn't quite get the hang of how to use it, but had no clear concept of what a large amount of stuff "20 gigabytes" represented. But eventually, with a little help from my friends, I figured it out. Valerie upgraded me to the 80-gig model this past Christmas. I have something like 400 CDs loaded on it, and it's not half full yet.

So my point is, I may be a little behind the curve, but I'm no techno-troglodyte. And I suspect I have plenty of company in being a little behind the curve. Dare I also suspect that some of that company is as skeptical of the new wave of self-appointed "futurists" as I am? When Amazon introduced its "Kindle" gadget recently -- a digital gizmo that enables its users to treat books the way iPod treats CDs -- Apple CEO Steve Jobs gave an interview in which he proclaimed that books -- paper-and-binding books, that is -- were the last analogue holdouts in a world destined to be entirely digital, and that the age of ink-and-paper is now as definitively over as the age of horse-and-buggy. In the case of music, the war is over. The Futurists have proclaimed that by 2015 even the compact disc will be gone. No more media, in other words. No more books, no more discs, just a phantom world of downloads, downloads, downloads.

Hogwash, pardon my French.

This sounds very similar to -- and for me, anyway, has about as much appeal as -- the world encountered by The Flintstones and the Rubbles in that episode of The Flintstones when the Great Gazoo, Fred and Barney's pal from another planet, zapped the two stone-age families temporarily into the age of The Jetsons so they could get a look at what life in the future would be like. In one scene, they go into a restaurant where, instead of real food, they're served pills they can pop which will have the same effect as the real foods they're supposed to represent.

That's what the phantom world of downloads, downloads, downloads represents to me. I go to order a steak with onions and I get a steak-with-onions pill. Yum.

I've already discussed the aesthetic "disconnect" between the experience of reading, say, Tender Is the Night in an elegant leatherbound edition with tooled cover and gilded endpapers, and reading it on a white plastic gizmo that runs on batteries. Ditto music. Something was lost when the experience of purchasing a vinyl disc in a 12" by 12" shrink-wrapped cardboard sleeve, imaginatively decorated with cover art, then taking that disc home, slipping it out of its sleeve and onto a softly-clicking turntable, then watching the needle lower itself into its groove and begin issuing forth music, gave way to buying a hunk of plastic five inches square, pulling another hunk of plastic out of it and tossing it into a machine where you couldn't even see it spinning, just hear the sound it poured out.

Yes, something was lost when vinyl LPs were replaced by CDs, despite the convenience of the smaller, lighter media.

And there's the rub. The Futurists are saying that the next step is to get rid of the media altogether and make it all virtually invisible. Download the album directly from the Internet, pull the list of tracks up on your computer screen, click on the one you want to play, and enjoy. Or download it in turn to your iPod and head out the door. Convenience was never more convenient.

Or experience more sterile. I currently have, in my sun room, no less than five different ways to listen to music. I have my choice of terrestrial radio, satellite radio, compact disc, iTunes and Internet stream. Lately I've gotten into the habit, as have many people, of buying a CD, loading it into my iTunes library, then stashing the actual CD in the basement. But when I go down into the basement, I look at these stacks of CDs in their jewel boxes with just a hint of a sigh. "You spent 20 bucks for that recording of Britten's War Requiem," a little voice in my head says, "and here it sits, collecting dust. It's been transformed into nothing more than an invisible presence in your iTunes library."

Do we really intend to divorce the listening experience entirely from the tactile? Have our lives really become THAT compartmentalized?

Actually, I need to re-think this whole paradigm. Because I'm finding increasingly that when I sit down in my sun room to read or surf the 'net or whatever, if I decide to listen to some music while I do it, iTunes is generally my last choice. I'll switch on WETA, or go to one of the Internet streams on Live 365, before I'll go poking around in my own iTunes library. People, as we all know, tend to take the path of least resistance. And it's just simpler to switch on the radio or select a stream from a menu of streams than it is to go scrolling through my iTunes library and choose something to listen to. The truth is, there is so much stuff on my iPod and in my iTunes library now that I can't possibly remember everything that's in there. And whereas in the old days (four or five years ago) I might go to my shelf full of CDs and pick out something, I'm less inclined to go scrolling through a list of tracks. The sense of visual pleasure in perusing a panorama of CD packages isn't there, and therefore I'm less inclined to bother. I switch on Live 365 instead.

It's been more than 40 years since Glenn Gould predicted the death of the public concert and its replacement by the interaction of listener and stereo system. "Dial twiddling" would replace sitting in a darkened hall actually watching and listening to a breathing human being interpret a piece of music for an audience of other breathing human beings. And while the technology of dial-twiddling has advanced by light years since Gould made his bold prediction in the 1960s, the public concert is very much alive. But while he forecast the death of the concert -- a bit of wish-fulfilment, by the way; Gould was a "control freak" who hated the concert stage because he couldn't manipulate its environment as he could a recording studio -- he still envisioned music as something of a participatory ritual for the listener, at least insofar as he or she was expected to make "dial twiddling" part of the experience. But with the age of the download, the participatory experience is vanishing. The listener is merely being acted upon by something initiated in invisible space by a mouse click, not in any sense taking part.

This is not necessarily a good thing. At some point we need to have a serious discussion of where convenience trumps the basic human need to feel a concrete connection between the object being contemplated and the individual doing the contemplating. In ancient Greek theater the audience was part of the performance. Modern opera continues this tradition, as do public concerts at their finest. All that is obvious enough, but even with the flourishing of recorded music, slightly more than 100 years ago, the listener was still as much participant as listener; there is that wonderful chapter in Thomas Mann's novel The Magic Mountain in which the hero, Hans Castorp, makes the delightful discovery that the mountain sanatorium where he is recovering from tuberculosis has purchased a brand-new, state-of-the-art gramophone. Castorp lovingly operates the machine, takes charge of locking it up when it's through being used, monitors the use of the needles, takes care of the records, marking and cataloguing them. He isn't just listening to the music, he's an active participant in making it happen and seeing that it will continue. His creator, Mann, was said to sometimes play records for friends on a Sunday afternoon, and if the needle skipped or got stuck, he would be as embarrassed as a pianist who had made a faux pas in concert.

While I have done my share of downloading, and am sure I will continue to do so, I also have down in my basement an "old-fashioned" stereo system. By "old-fashioned" I mean one that has nothing whatever to do with a computer. You know, the old architecture of tuner, speakers, turntable (yes!), CD and cassette players all hooked up together. When it's time to get away from the computer (and there is such a time, believe it or not), I savor the experience of going down there, hunting through my old LPs for, say, the old Jascha Heifetz recording of the Saint-Saens Sonata in C minor, taking the disc out of the box, putting it on the turntable and watching it spin, crackle...and play. It's playing. We're playing. And music never sounded so good.

Friday, May 16, 2008

You Always Hurt The Ones You Love



House Fire
There’s an ineluctable,
and at the same time
fugitive intimacy
in watching your own house
burn down on videotape.

Strangers in helmets
smashing windows with
axes: the most obvious
analogy (so let’s skip that)
would be with rape.

No. This is even more
impersonal; let it seem odd.
Here’s a monitor-full
of what the actuaries mean
by an act of God.

That's a poem I wrote in January of last year after a house fire at a property my wife and I owned in Spokane, Washington. It was a distinctly spooky experience, going out to the website of KHQ Channel 6 TV and seeing a piece of videotape showing my own house ablaze, with fire crews breaking windows and such.

The fire itself was confined to the kitchen, and the kitchen was really the only part of the house that was completely and utterly destroyed. (Along with my collection of baseball memorabilia, with which I had decorated the kitchen. And that included a mounted, autographed color picture of Nolan Ryan.)

But a worse surprise awaited me upstairs. The house was unoccupied; it was a property we were hoping to rent out as office space. Still, I had installed my library in one of the upstairs rooms. A contractor had come in and built floor-to-ceiling bookshelves. I had put in a stereo sound system. I had moved in a desk, my computer, a rocking chair and an armchair. All of my books and CDs were in that room.

I had come in two days before the fire, and upon leaving, had left the door to that room open. Thick, black, greasy smoke from the kitchen fire had permeated the entire house, and smoke goes UP, as we all know.

My entire library, with very few exceptions, was completely "smoked." Nearly all of my 1,000-some books were coated in thick, black soot.

Our insurance didn't cover it. We carried about $10,000 of insurance coverage on the entire contents of the house. That didn't come close to what it would have cost to replace my collection of books and music. The insurance adjuster advised me to just trash the whole collection. Take them to the dump, start over.

No way.

You see, I had parted with my library once before, and that time it was a collection I'd spent about 30 years amassing. When I left Maryland in 2003 to return to California, it was under somewhat desperate circumstances. I'd been unemployed for nearly six months and was forced to head back out to the west coast where my family was. I had to travel light, and I simply did not have the resources to store 1,400 books, 600 LP records, 500 CDs and assorted household effects.

I gave the whole lot away, donating them to the Wheaton Public Library's thrift shop.

For a guy who agrees with Montaigne that "My home is where my library is," that was an extremely painful decision. Once I was settled in California and found a new job, I began slowly, laboriously rebuilding. Modern conveniences like the Internet made it somewhat easier than it had been in the 1970s and '80s, when buying books meant visiting bookstores and writing snail-mail letters to book dealers.

Gradually my collection grew again. Naturally I sought out and re-acquired many of the titles I had given away, in addition to new ones. Within three years I had amassed, once again, several hundred books.

Then came that fire in Spokane.

Well, the insurance adjuster be damned. I wasn't about to part with my library a second time. I went out and bought a couple of gallons of Simple Green and about 10,000 paper towels and began the long, laborious process of cleaning my books, one by one.

Let me tell you, there are few jobs so nasty in all the world. Greasy, black soot is all-pervading and yields no quarter. Some books simply had to be tossed. Others, despite my labors, will always be tattle-tale gray around the edges. But I was determined not to part with my library a second time. After cleaning each stack of books, I packed them into cardboard boxes, sprinkled in stink-remover (for soot stinks as bad as it stains), and put them into storage.

Then my wife and I sold the bed-and-breakfast we were running in Spokane and returned to her home town of Washington, D.C. The boxes of books, along with nearly everything else I own, wound up stacked in the basement of our new house, which we plan to turn into yet another library just as soon as I can come up with $5,000 to pay yet another contractor to build some shelves down there.

Then, about a week ago, it started to rain. It rained all day Thursday, and Friday. It stopped for a while on Saturday, long enough for me to barbeque steaks on the grill in the backyard, but then on Sunday it started again. It continued on Monday, as I went downtown in the morning to take care of some business at Voice of America.

When I returned home around 12:30 that afternoon, my wife had left a note on my desk. "BASEMENT FLOODED."

Huh boy. I went down to take a look. Apparently the heavy rains had seeped in through the basement windows on the east side of the house.

That just happens to be the wall against which my boxes of books were stacked. Naturally it was the boxes on the bottom of each stack that had gotten wet. I started lifting and moving to get at those boxes, afraid of what I might find when I tore them open.

Actually, the damage could have been worse. The basement floor apparently has a slight south-to-north tilt to it, so the boxes at the far end of the room were untouched. Also, numerous boxes of books were stacked on top of two footlockers, which kept them protected.

But a number of books were indeed water-damaged, some so badly that I had to just toss them out, write down their titles and figure on re-acquiring them once again. Some, including a few "Library of America" volumes, went into the oven at 175 degrees for a quick dry-out. They'll always show slight water damage, but I can live with that as long as they're not unreadable. At least I got down there fast enough to prevent mildew from forming. I had a similar experience more than 20 years ago in an apartment where I was living, only that time I was unaware of the water damage until it was too late. Some books, including my cherished copy of Joyce's "Ulysses," had actually gotten mildewed, and there's no cleaning that up. This time I was a bit luckier than that, anyway.

But there you have it. I can now officially state that my library has been through fire and flood. If we ever get boll weevils around here, I can add pestilence. Fortunately I have yet to hear of an outbreak of boll weevils in the nation's capital. Oh, we do have pestilence here, but the last time I checked, members of Congress weren't eating books. And probably won't, as long as there are loopholes in the lobbying regulations that allow them thousand-dollar lunches.

But maybe I'd better think about having those new shelves built on stilts, like one of those bamboo houses you see in pictures from the Amazon. I don't know if climate change has anything to do with this, but when being a bibliophile becomes a risky proposition, we are living in dangerous times.

Wednesday, May 14, 2008

Reagan and Me: A ‘70s Kid Remembers




(I originally wrote this in June, 2004, on the occasion of the death of former President Ronald Reagan.)

I
This week my subscription to National Review magazine, ordered about two weeks ago, kicked in.

In yesterday’s mail I received NR’s special commemorative edition on the recent passing of Ronald Reagan.


I was reading it last night in class, to prevent myself from falling asleep out of sheer boredom during the first of a month’s worth of lectures on bankruptcy, the final section of a paralegal course I’m taking at the University of San Diego. Every essay in this issue is dedicated to some aspect of Reagan or his legacy.

I found something in one of the essays thought-provoking to say the least: I reflected on it in the car all the way home from school.


Among the authors who contributed encomiums to the late president was historian Paul Johnson, author of “Modern Times,” one of the best books on 20th Century history I ever read, by the way.

Describing the situation Reagan inherited when he was first elected president in 1980, Johnson wrote:

“The 1970s had seen a president forced to retire in disgrace, and an unelected president with no mandate, beaten in turn by a feeble Democrat from the south who had no obvious policy or coherent view of the world. In Washington, a triumphant but leaderless Congress usurped executive authority, allowing a triumphant Soviet Union, and its surrogates in Cuba and Vietnam, to do what they willed in Africa and Asia. America’s apparent decline as a great power was symbolized, in a terrible moment early in 1980, by a shocking military fiasco in Iran.”

I could add my own lugubrious memories of the 1970s to Johnson’s: gas lines. The Chevy Vega and the Ford Pinto. Stagflation. Unemployment soaring toward 10 percent. The U.N. transformed into a Third World debating society, with America blamed for every problem on earth. Mason Reese.

And Johnson didn’t bother to mention, but I will, the spectacle of Americans clinging to helicopters to get out of Saigon as the situation in Vietnam finally collapsed on April 30, 1975.

On that same subject, I also remember the 1975 Academy Awards show. The anti-American, pro-Communist film "Hearts and Minds" won the Oscar for Best Documentary. As if that weren’t bad enough, two of the American communist doofuses who had been involved in making it got up to accept the award, and they crowed to an approving audience about how South Vietnam was about to be “liberated.” In Hollywood anyway, little has changed.

As for roller disco, polyester leisure suits and big, ugly medallions, well, to paraphrase Mark Twain, we’ll draw the curtain of charity on that.

It’s hardly surprising that Jimmy Carter, when he sought re-election as president in 1980, had to resort to desperate scare-tactics: Democrats pitched a vision of Reagan as a bug-eyed, right-wing maniac who would abolish Social Security with one breath and mash his thumb on the nuclear button with the next. There wasn’t a single thing in his own record that Carter could point to as evidence that we should re-elect him, which left his campaign with no strategy except to demonize Reagan, which is all it did, and to no avail, because by 1980 America had clearly had enough.

In November 1979, when our embassy in Iran was overrun (it was just a month later, by the way, that Soviet tanks rolled into Afghanistan), things had been sliding steadily from bad to worse for six years, starting with the Watergate mess in ’73, and there was very little expectation amongst Americans that they would ever get better.

Tom Wolfe famously called the 1970s the “Me” decade, but the catch-phrase I remember much better from those years is “lowered expectations.” The Oscar for Best Picture of 1976 went to Sylvester Stallone’s "Rocky," a movie in which a from-nowhere prizefighter is offered a miraculous shot at the heavyweight title, only to realize before the fight even occurs that he has no chance of winning, and announces that he just wants to “go the distance” -- in other words, he’s willing to settle for second best. Willingness to settle for second best had never been an American core value, and perhaps "Rocky" reflected the era’s malaise in much the same way that the film that conquered the box office the very next year, "Star Wars," reflected a concomitant thirst for pure escapism. In 1933 Americans crowded into theaters to watch top-hatted Fred Astaire dance with slinky Ginger Rogers and thereby escape for a couple of hours the horrors of the Great Depression. In 1977 they crowded into theaters to watch Luke Skywalker dance with Darth Vader, probably for much the same reason. I remember well the fanfare that attended the appearance of "Star Wars," the runaway summer hit of that year: “It’s the return of...ENTERTAINMENT!” the reviews trumpeted, and all summer long, John Williams’ stirring orchestral prelude to the movie blared from radios all over the country. In the long view off the caboose of the train, all that hyperbole and fanfare seems a collective sigh of relief.

In short, the 1970s were a very bad decade for America, and Reagan, according to Johnson, effected an almost miraculous turnaround in the nationwide and worldwide situation during his eight years in office. He restored our national confidence, stage-managed the destruction of the Soviet empire and brought America back to a role of pre-eminence on the world scene. The ‘70s were the disease and Reagan was the cure.

But here’s what got me thinking: those godawful ‘70s were also the decade in which I came of age. And despite all of the introspection I’ve been doing in my personal journals for more than 30 years now, I have never really made any serious attempt to come to grips with my own relationship to the era in which I grew up, and how it may have affected my entire life. I’ve written at length of how my father, who came of age during the Depression, let that traumatic experience shape his attitudes and behavior for the rest of his days, but I have written little or nothing of how my own experience of being a “‘70s kid” might have profoundly influenced the kind of man I became in the ‘80s, ‘90s and right up until today, as I write this at the age of 48.

Surely, some of the most important years of anyone’s life are the period between their teens and early twenties. Goals, dreams, ambitions and attitudes that will last a lifetime are forged between early adolescence and the time when you’re launching yourself on the great world as a young adult. On January 1, 1970 I was 14 years old. On January 1, 1980, I was 24. I graduated from high school in 1973, from college in ’77. If anyone can legitimately claim to be a product of the 1970s, it’s those of us who were born around the end of the first Eisenhower administration, circa 1955. Our parents came of age in a time of national economic disaster and psychological pain, and we in turn came of age in a period of national economic lassitude and psychological numbness. (By the way, I’ve mentioned the fate of the Soviets a couple of times; it’s a curious fact that Russians tend to view the 1970s in much the same way we do: the years of Leonid Brezhnev are referred to in post-Soviet Russia as “the period of stagnation.”)

It’s an accepted principle that our capitalist, free-market economy runs in cycles of boom and bust. Perhaps the national mood follows a similarly cyclical pattern. The malaise of the ‘30s was corrected by World War II, which in turn ushered in a period of such confidence that many were speaking of an “American Century” beginning in 1945. Our sudden postwar affluence resulted in a burgeoning middle class whose satisfaction with its newly-found prosperity shaped the 1950s, a decade (perhaps unjustly) characterized as a spiritual and intellectual wasteland, whose contrary symbols were the gray flannel suit and the beatnik T-shirt that was a response to it. The confidence in their own and America’s future which was instilled into babies born during the war and nourished by the optimism of the Kennedy years, led in turn to the waves of college-campus idealism of the early 1960s. That spirit promptly found fertile ground in the Civil Rights movement, and created a general confidence among the younger generation that they could change the world for the better, which was then dashed to pieces by the Vietnam war. In response to the war, the hippies, who were the younger descendants of the beats of a decade earlier, turned on, tuned in and dropped out as the country watched war, assassination and inner-city rioting explode all over its TV sets. My older sister once characterized the 1960s as a period when, for ten years, “the whole country threw up.” Pop culture, so often a good reflection of the time that produces it, telescoped the experience of the decade quite neatly as it came to a close: Woodstock engendered Altamont in short order. In the words of one commentator, the counter-culture “went from flower-power to death-tripping in a matter of months.”

Tired of the constant turmoil and upheaval that the years of John F. Kennedy and Lyndon Baines Johnson had brought with them, in 1968 America turned to Richard Nixon, and the rest is not just history, it’s my personal history, mine and everyone else’s who was born about the same time I was. Nixon was elected for his first term on Nov. 4, 1968. Reams have been written about what a tumultuous year ’68 was; I won’t get into that here. The important thing is, on the night Nixon was elected, I had just turned 13. I was primed and ready to become a ‘70s kid.

I don’t even have to think very long or very hard to come up with an example of how I was sideswiped by the 1970s in a highly personal way.

As the decade began, I was a teenager with understandably little perspective on what was happening with the country or the world at large. We were Republicans and conservatives in my family; on the morning when Nixon was declared the winner of that very close ’68 campaign against Hubert Humphrey, my father got up and hugged my mother. It was good news in our household: after eight years of chaos ruddered by Democratic presidents, the country would now get back to normal. So we thought, anyway. I was of course aware of Vietnam as a child, and at our house we supported the war effort; as a Republican family we believed that the crusade against world Communism was rightly America’s number-one priority. Social issues could wait until the Communist hydra had been slaughtered. Of course, as the war dragged on, even my parents’ perspective on it began to change. My father, as staunch a Republican as you could ask for, declared some time in the early ‘70s that he was “By-God beginning to understand how the young people in this country feel” about the war. Clearly, even his patience with it was beginning to wear thin, and My Lai certainly didn’t help, although my father was with those who felt that William Calley was just a patsy for the higher-ups. I had no inkling of it at the time of course, but I think now that, as 1973, the year in which I would turn 18 and become eligible for the draft, approached, my father was beginning to worry about Vietnam getting its clutches on me.

As it turned out, I was spared by inches: the last American to be drafted into the armed forces was inducted on June 30 of that year. I turned 18 on October 12. Draft registration was still required of course, and I dutifully went over to city hall and filled out my draft form. I promptly received a notice in the mail that I had been classified “1-H,” a holding category. They were no longer drafting anyone. Two years later the draft was formally abolished.

In that summer of 1975, I was visiting Jim Provenza. These days Jim is a middle-aged lawyer with college-age kids, but in those days he was a young firebrand of the left, ardently loyal to and active within the Democratic Party, who dreamed of becoming the next John F. Kennedy and changing the face of America. It was Jim who informed me that summer day that I no longer even needed to carry my draft card in my wallet.

We quickly organized a little ceremony. "Hey, everybody!" Jim called into the house. "Come on out here! Kelley's going to burn his draft card!" To the huge amusement of Jim and his family, I took the card out into the Provenzas' driveway, got out my trusty Bic...and set that sucker aflame. Burning your draft card in 1970 would have gotten you tossed in jail, but by 1975 it was about as inflammatory an act (no pun intended) as saying “The South will rise again.”

My father not only changed his mind about the war, but as time went on, he changed his mind about Nixon. In fact, Richard Nixon accomplished something that I don’t think anyone else on this earth possibly could have: he turned my father into a Democrat, albeit for a short time. By 1974, with Watergate clearly about to become Nixon’s Waterloo, my father got so mad at Nixon that he went out and changed his voter registration to Democrat. He remained a Democrat for several years; in fact, after his retirement from the Immigration Service, he did a stint as an administrative assistant to a Democratic state senator in California. That turned him back into a Republican.

I had just finished my junior year of high school when the Watergate burglars broke in. Again, I watched TV news and read the papers now and then, so I knew what was going on in a general way, but I was too busy being a teenager to worry about it very much. In the summer of ’72 I was more interested in ogling Olga Korbut, the excruciatingly adorable little Soviet gymnast who was the darling of the Munich Olympics, than I was in anything I was seeing in the papers about Nixon or McGovern. (I signed up to do campaign work for Nixon that fall, but again, being a teenager won out: I think I only showed up at campaign HQ one time, then lost interest.) Come to think of it, the same was true of such issues as the energy crisis and the gas lines of 1974; by then I was using the family Chevrolet to attend junior college, but my dad was paying for all the gas: what did I care if it had just reached the outrageous price of 50 cents a gallon? (I pumped gas in high school for $1.50 an hour. At that time, gas cost about 32 cents a gallon. When I tell that story to today’s twentysomethings, their jaws drop at BOTH numbers.)

By 1976 I had finished community college and transferred to San Diego State University to do my upper-division studies. I loved history and wanted to make that my major subject, but I could see that my father was uncomfortable with my getting a liberal-arts degree. It was the Old Story of “What can you do with that?” Many of my classmates were struggling with the same dilemma: the Class of ’77 was loaded with Business Administration majors who would have preferred to be English majors but had to worry about finding a job. I had decided, early in college, not to major in English myself, though I had considered it. At 19, I allowed myself to be swayed by a purely romantic notion: I loved poetry and literature so much that I decided not to allow my love for these things to be poisoned by a lot of academic BS. Somehow I had the idea that a literary degree would take the fun out of reading Yeats, Shakespeare and Tolstoy, so I decided against it. History was a subject I also loved, but not with the level of passion I had for literature. Besides, history is all about scholarship: classrooms can’t hurt that.

Still, I had to face the Old Story, and so I struck a compromise with my dad and with myself. Since the age of 16 I had never wanted to be anything but a writer, so I decided that an acceptable halfway measure would be to double-major, in history and journalism. I entered the College of Liberal Arts to pursue a major in history, and then walked across campus to the College of Professional Studies to pursue a major in journalism. I had no notion of becoming a history professor; I simply liked the subject. After college, I figured, I could go to work as a reporter on a newspaper or magazine. What the heck, my 21-year-old self figured. It was all just time-serving anyway, until I managed to explode upon the literary scene with my first big novel. Yeah, right.

Setting aside the nonsense about big (or small) novels, little did I know that my decision to major in journalism had put me on a collision course with history, or at least with cultural trends. Talk about bad timing. And Nixon, appropriately enough, lay at the bottom of it all.

I made the decision about going for a journalism major during my junior year at State: 1976. The year of "Rocky."

But it was also the year of "All The President’s Men." The movie version of Bob Woodward and Carl Bernstein’s adventures as Washington Post reporters bringing down the President of the United States hit the silver screen that year, with rugged Robert Redford and handsome young Dustin Hoffman as their nowhere-near-as-good-looking real-life counterparts. (Why does Hollywood always do this, by the way? One thinks also of unbearably-glamorous revolutionaries Warren Beatty and Diane Keaton in "Reds." In real life, John Reed and Louise Bryant were a couple of two-baggers.)


I heard it said, back around that time, that at the moment in the film when Robert Redford bats his beautiful orbs inquisitively and utters the line, “Who’s Chuck Colson?” 250,000 college students immediately ran out and changed their majors. Suddenly, with "All The President’s Men," journalism became cool. Everybody wanted to be the hotshot investigative reporter who goes around pulling down the establishment. From the release of the film until the end of the decade and beyond, the nation’s journalism schools were chock full of wannabe Woodwards and Bernsteins, and I, who had no Watergatey pretensions at all (remember, my plan was to become Scott Fitzgerald, not Bob Woodward), found myself standing in a field swamped by heavy traffic. Everybody and his dog wanted to be a reporter, and every newspaper opening in the country had 50 people (some still fighting pimples) lining up to interview for it.

Needless to say, it took me a long time to get my foot in any sort of door. In fact, it wasn’t until February, 1979, more than a year and a half after getting my B.A., that I managed to glom on to a tiny position with a tiny, independent news service in San Diego that was the very definition of “shoestring.” (In the meantime, I had worked at a series of minimum-wage jobs, including security guard and 7-Eleven clerk.) The County News Service of San Diego covered the city and county beats, and the courthouse, for subscriber weeklies countywide that did not have the resources to cover these beats for themselves.

How shoestring were we? The era was not only pre-Internet by more than a decade, but pre-computer by maybe three or four years. I would cover a meeting of the County Board of Supervisors, then type out my stories on an ancient Smith-Corona portable. At the end of the day, whoever’s turn it was to do the mailing that week would gather everyone’s copy, determine how many copies needed to be made using a chart of our client list, then drive over to radio station KGB-FM, with whom we had a trade-off agreement: in return for tip service, they let us use their Xerox machine.
Once the copies had been made, the “mailer” then had to drive them over to the main post office (not a branch, that would slow things down) and drop them in the mail to our clients. (Yes, we ran a news service using “snail mail.”) We had complimentary subscriptions to all the client newspapers, and once a month we would all get together with pencils and rulers and go through the tear sheets, measuring in inches how much of our copy they had used. We then billed the clients $1.00 per column inch. Each reporter got to keep 60 cents on the dollar for whatever we managed to get into print. The other 40 cents were set aside for overhead, basically envelopes and postage. No by-line, and 60 cents a column inch: that was payday. (We had a joke amongst ourselves: “Welcome to County News Service: 60 cents an inch and all the pride you can swallow.”)

For the rest of that year, I went back and forth between home and downtown San Diego, by car, by moped and sometimes by bus, to put in eight-hour days for what was usually somewhere between $250 and $300 a month. To supplement my meager newspaper income, I went back to minimum-wage work as a part-time security guard, spending my Saturday and Sunday afternoons and evenings walking a beat at a local tuna cannery. My father, who initially threw up his hands in despair at my 60-cents-per-column-inch gig, went along with this: I think he understood that I viewed this as my last chance, at age 23, to get into the newspaper business, and if I were willing to work seven days a week to do it, that meant I was serious.

By December (just about the time those Soviet tanks were rolling into Afghanistan), I managed to land my first real newspaper job, on The Imperial Valley Press, a daily newspaper of about 15,000 circulation in El Centro, California. The “Woodstein” crowd notwithstanding, I got the job through an acquaintance who covered the county beat for the now-defunct Escondido Times-Advocate, had worked on the Imperial Valley paper previously, and still knew the managing editor. That inside track helped, as did the fact that nobody in his right mind would want to live in the Imperial Valley, where the average high temperature between June and September is somewhere between 110 and 118. (For my European and Russian friends, that’s 43 to 48 Celsius.) But I wanted to be a journalist, so off I went.

And that’s where I was the night in 1980 when Reagan was elected: I was living in a one-bedroom apartment in El Centro, for which I was paying $180 a month rent out of the roughly $850 a month which was my full salary. (I started at $720, but the managing editor liked the cut of my jib, so he gave me a raise.)

And I wasn’t expecting a whole lot more, which is my whole point in telling this story.

My father, who turned 20 in 1934, spent the rest of his life expecting the Depression to come back and pounce on him again at any moment, and he lived his life accordingly. He set his sights low, seldom dared to dream, and even when he did dream, for example of buying a farm, he talked himself out of it. He insisted, sometimes loudly, that job security and the guarantee of a retirement pension were the highest things anyone had a right to hope for. He worked 30 years for the Border Patrol and the Immigration Service, retired and spent his dotage sitting on the front porch watching the world go by, boasting of the size of his retirement income, but inwardly seething with resentment, convinced that life had somehow cheated him.

My outlook at 24 wasn’t as extreme as my father’s at the same age, for a couple of good reasons. For one, as bad as they were, the 1970s were not the 1930s, and for another, I had had the advantage of four years of college, something my father never got.

Still, like my father, I had come of age expecting little. The Nixon-Ford-Carter years were conducive to that.

Reagan and his team did, in eight years, gradually manage to turn things around. The change didn’t come quickly, in fact 1982, the second year of Reagan’s first administration, was a severe recession year, one in which I had the experience of being unemployed—for me, the ‘70s seemed to be continuing. But we all know that economic upturns and downturns both tend to lag several years behind changes in economic policy: the positive effects of “Reaganomics” didn’t really start to bear fruit until after he had left office, and after his successor, George Herbert Walker Bush, continued those policies into the 1990s. The Clintonistas tried to take the credit for the 1990s economic boom, but it was not theirs to take. They bashed the memory of Reagan’s administration while basking in the benefits of his economic legacy. (Later, Democrats tried to lay the blame for the 2000-2003 recession on George W. Bush, changing the subject when anyone happened to mention that it began in March, 2000, on Bill Clinton’s watch.)

The change in attitude between my generation and the one that followed it is hard to ignore. When I talk to twentysomethings, and even thirtysomethings, nowadays, I’m amazed at how high their expectations are. Babies born after 1970 have grown into adults who, I would not be surprised to learn, scratch their heads in bewilderment at the willingness of Rocky Balboa to settle for second best in 1976. I was a federal employee from 1985 until 1999, and until I left the federal work force and went back to the private sector, I didn’t know what a corporate recruiter was. I’d never heard of one. Imagine my surprise when a corporate recruiter e-mailed me in 1999 and asked if I might be interested in a job writing for the marketing department of a custom software company. The idea that a company might hire people, and pay them salaries, simply to hire other people, was beyond anything in my experience. And yet when I talk to college students today, their expectation is that corporate recruiters will be looking for them upon graduation. Their expectation (happily unrealistic, even in today’s world) is that they’ll walk away from their college graduation ceremonies and be pulling down 75 thou a year the following week.

Talk about a generation gap. The war babies and their parents didn’t see eye to eye over Vietnam, long hair, rock music and drugs. Now the generations may differ over rap, tattoos and body piercing, but I think there’s also a divergence on something fundamentally more important, namely, what they expect as regards the quality of life. I don’t have any children, but friends my own age who do are amused, and sometimes understandably annoyed, by the way their high school and college-age children seem to expect so much more coming out of the gate than we did. We ‘70s kids, for the most part, expected to start out humble and slow. I don’t say that to try and make us sound more virtuous than our progeny; it was simply a fact, a reflection of the era in which we grew up. Today’s college grads want it all, and they want it now. And they expect to get it, too.

It all came along too late for me, unfortunately. I’m getting close to 50, and the idea of even owning my own home is something I have only recently begun to think about. When I was young during the Carter years, interest rates stood at 21 percent. I assumed you had to have a huge pile of money to buy a house, and since I never raised a family, I never really thought I needed a house, so my thinking on that subject never changed. By contrast, I have a 26-year-old acquaintance who is aggressively buying and selling one house after another even as I write these words, claiming he’s going to be “the next Donald Trump, but a nice one.”

A few weeks ago I interviewed an 82-year-old World War II veteran, a man who had been at Normandy, for a newspaper story. While he pooh-poohed Tom Brokaw’s “Greatest Generation” applause for himself and his contemporaries, he nevertheless echoed Brokaw’s implied message about today’s young: “They have too much stuff and they got it too damn easy,” he said with genial scorn.

Well, the grumblings of the old about how soft and easy the young have it are themselves as old as the pyramids and then some, I’m sure.

But after interviewing this friendly, loquacious old vet, and realizing that it was people his grandchildren’s age that he was talking about, I surprised myself by realizing that my own attitudes aren’t that much different from his. My 22-year-old nephew drives a teal 1994 Honda Civic, fully loaded and souped up for drag racing (I hope the cops nail him), that his grandfather bought for him for $4,500. My niece, 21, has a late-model Saturn. The only time my father ever bought me a car, it was a ’72 Chevy Luv pickup that cost $1,300, and I had to pay him back. That same nephew, by the way, is talking about opening up his own business as soon as he finishes college. (At the rate he’s taking classes, this should be around 2023.) I saw a newspaper article a few days ago about students at an affluent high school in California whose parents reward them for getting good grades with Jeep Cherokees, Lexus sedans and the like.

Today’s kids have too much stuff, and they got it too damn easy.

Like it or not, Reagan lies behind all of this. When he was reshaping economic policy in the early 1980s, the lefty-liberal crowd alternately worked themselves into a state of high dudgeon, and when that didn’t convince the country that Reagan was evil, beat their breasts and moaned about how this non-compassionate old meany was taking money from the poor, unleashing the forces of greed, glorifying selfishness, etc. etc. Yes, this non-compassionate cowboy who didn’t love the poor was determined to change policies that had given us 21 percent interest, 13 percent inflation and 10 percent unemployment. The bastard. Because Reagan was sworn in as president the year my nephew was born, that nephew can now go around talking about opening up his own business when he finishes college.

Oh, and by the way, like many of his contemporaries, my nephew proudly sports a Che Guevara T-shirt. Times may change, and attitudes may change, but kids will always be kids.

As for me, my editor at the newspaper, who is about the same age as me (and facing his fourth marriage) has his eye on a little fixer-upper over in National City. Three bedrooms, two baths. $350,000, as reasonable a price as you’re going to find in San Diego County these days. He thinks I should take the plunge, too. But I’m not taking that old fart’s word for anything. I think I’ll go and have a talk with my friend who’s planning to become the next Donald Trump. He’s 26, so I’m sure I can count on him to give me good, practical advice for the 2000s.

Sunday, May 11, 2008

The Necessity Of The Long View


I've been reading in Karl Keating’s Catholicism and Fundamentalism: The Attack on ‘Romanism’ by ‘Bible Christians,’ and concurrently reading Saved In Hope, Pope Benedict XVI’s second encyclical, the latter of which led me to the following passage in Paul’s letter to the Hebrews: “Let us then stop discussing the rudiments of Christianity. We ought not to be laying over again the foundations of faith in God and of repentance from the deadness of our former ways, about cleansing rites and laying-on of hands, about the resurrection of the dead and eternal judgement. Instead let us advance toward maturity, and so we shall, if God permits.” (Heb. 6:1-3)

This brought to mind nothing more powerfully than the fundamentalist “prayer parties” that my sister Carla used to drag me to when I was in high school. She was hellbent to save my soul in those days. But I was having none of it, and now I can clearly see why. These “Bible Christians” were people who couldn’t get their religion beyond the kindergarten level, and didn’t want to. They thought they were all the more blessed for it. It was all about creating an emotional atmosphere and not much else, which explains why so many of the “Jesus freaks” at my school turned right around and dropped their “Christian” pretensions the moment the party wound down.

That in turn got me to thinking about some of the other reading I’ve done recently, Francis S. Collins’ The Language of God in particular. More than 80 years after the Scopes Trial, Darwin is still a flashpoint between believers and non-, and Richard Dawkins, the Al Franken of the “new atheism,” has only made matters worse by going around squawking that Darwin definitively proves the truths of atheism, that anyone who questions any part of Darwin’s theory is a science-hating moron, and by god, that’s that! Well, no, that’s not “that.” Scientists and philosophers who write about God are split on the subject of Darwin; some, like Michael Behe, think Darwin’s theory is on pretty weak ground. Others, like Collins, see the evidence for it as undeniably strong, but admonish evangelicals who reflexively scream “blasphemy!” at the very mention of Darwin’s name that they shouldn’t be so afraid of him. The Catholic Church wisely stays out of it, the last two popes having simply said that even if natural selection becomes a given, it’s all part of God’s plan.

Which leads me to these thoughts, which I jotted down last Wednesday:

It seems to me that religious fundamentalism is profoundly insulting to God. Its adherents claim strict fealty to the word as it appears on the printed page and assume that they are expressing their devotion to God and His word by clapping on horse-blinders, believing the Garden of Eden had a geographic location and telling themselves, “God loves me, and therefore I don’t have to think.” Setting aside the obvious absurdity of insisting on a word-for-word literal interpretation of a translation, since the Old Testament was originally written in Hebrew and the New Testament in Greek, while American fundamentalists are reading it in English, what they do is to impose the limits of their puny imaginations on God Himself. Fundamentalism denies God one, no, three aspects of His glory, namely, that He is the greatest architect, philosopher and poet in all the universe. Jimmy Swaggart doesn’t deny metaphor to Robert Frost, but he denies it to God. Fundamentalists, with the rejection of form, history and metaphor which they mistake for devotion, actually belittle God’s majesty. They think they’re honoring God with their narrow little minds. Actually, they are dishonoring Him by assuming that the scope of His vision, and the depth of His capability of expression, are no broader than theirs. Why did God give us the imagination and cognitive abilities that He denied even our closest relatives among the higher vertebrates, if He didn’t want us to glimpse and appreciate, if possibly never fully understand, the majesty of His creation better than they do?

And yet fundamentalists want to burn copies of The Origin of Species for the same reason Dawkins would like to remove Bibles from the public library and replace them with Darwin. Dawkins puts as much faith in Darwin as Swaggart does in Genesis, and for the same reason, only turned on its head. Darwin is a sacred text to Dawkins (and much of the “new atheist” crowd) because he sees in it (as do they) the definitive proof of the truths of atheism, which is his religion. The Swaggarts of the world fear Darwin for the same reason. In truth, Dawkins is unjustifiably smug and Swaggart needlessly hostile. The evidence for natural selection is extremely strong, although in fairness to the churchy crowd, there are phenomena that don’t quite square with Darwin’s baby-step vision of the development of species, such as the Cambrian explosion 530 million years ago, which the fossil evidence supports and which even Darwin admitted gave his theory some problems. But at the end of the day it really doesn’t matter all that much, I think. Human beings are in fact closely related to the higher apes. There is no getting around that fact; Jimmy Swaggart might just as well try to argue that Fred and Wilma Flintstone were real people. (And there are some fundamentalists who come close to this level of risibility: the so-called Young Earth Creationism crowd, which insists that the planet can’t be more than a few thousand years old, thinks dinosaurs and people coexisted at some point, like they do in the Flintstones' city of Bedrock.)

But having said that, and even admitting that certain behaviors are common between humans and higher vertebrates, there is still that quantum leap that separates man from even his closest ape relatives, and no, it isn’t simply his higher intelligence. It’s a little mystery called imagination.

Animals simply do not possess it. No animal, even the most clever, can imagine itself out of its current frame of reference and into another. For animals there may be a vague “yesterday” in the sense that they remember having supper last night and expect to have it again tonight, but there is no “yesterday” or “tomorrow” in their cognition. By the same token, as I think John Updike pointed out somewhere, animals, even when they seem to be behaving in a perfectly beastly manner (no pun intended), as when a cat plays with a mouse or a lion devours a baby zebra, are not being “cruel” as humans understand it. They can’t be. “Cruelty” is beyond their capabilities. Cruelty is the enjoyment of your victim's suffering. To enjoy your victim’s suffering requires the capability to imagine yourself in your victim’s position. Animals can’t do that. A lion isn’t being cruel when it eats a baby zebra, it’s only being hungry. It sees the baby zebra as food, not a victim. Anyone who has ever watched a cat play with a mouse has probably also watched a cat play with a ball. To the cat there’s no difference: mouse, ball, both are just things that move and cats like to play with things that move. That the mouse is frightened and suffering doesn’t – can’t – occur to a cat. He doesn’t have the imagination.

Our imaginations get sent on some strange errands in the name of an agenda. Take the cliché about an infinite number of monkeys jumping up and down on an infinite number of typewriters for an infinite amount of time eventually producing the complete works of Shakespeare. This came up Friday night in the Voice of America newsroom. One of my young newswriting colleagues invoked those monkeys one more time, and objected when I dismissed the notion. Well, it is, actually, rhetorical hogwash, a bit of locker-room sophistry on the level of "If God can do anything, can he make a rock so heavy he can't pick it up?"

“Infinite number of monkeys,” “infinite number of typewriters” and “infinite amount of time” are all impossible concepts. The universe itself is finite: it’s roughly 13.7 billion years old, and even if you had a trillion monkeys and a trillion typewriters, probability still dictates that the universe would have to be much older than it is before they even came up with “To be or not to be.” Don’t take my word for it; ask a mathematician. But setting that argument aside, those infernal typing monkeys we’re always hearing about would be performing nothing more than the act of jumping up and down. Not one would imagine – could imagine – that he or she was doing anything else. We, on the other hand, can imagine an infinite number of typing monkeys. Think that doesn’t make us somehow qualitatively different from the monkeys, and on a more important level than the fact that they can swing from trees better than we can, and we both have opposable thumbs? Chimps and people can share the same space, eat much of the same food, watch the same TV shows and even enjoy each other’s company. Bond in friendship. But no chimpanzee is ever going to build a TV set, write a novel or fly a Boeing 757 from Minneapolis to Atlanta.
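For anyone who does want to ask a mathematician, the back-of-envelope version is easy enough to sketch. Here is a minimal calculation in Python, taking “To be or not to be” as shorthand for the full 39-character line of the soliloquy; the trillion monkeys come from the paragraph above, while the 27-key typewriter (26 letters plus a space bar) and the rate of one random line per monkey per second are my own illustrative assumptions:

```python
# Back-of-envelope check on the typing monkeys. Assumed for illustration:
# a 27-key typewriter (26 letters plus a space bar), a trillion monkeys,
# each producing one random 39-character line every second.
target = "to be or not to be that is the question"  # 39 characters
alphabet_size = 27
monkeys = 10 ** 12
lines_per_monkey_per_second = 1

# A random line matches the target with probability (1/27)**39,
# so the expected number of tries before a hit is 27**39.
expected_tries = alphabet_size ** len(target)

# Spread those tries across the whole troop, then convert to years.
seconds = expected_tries / (monkeys * lines_per_monkey_per_second)
years = seconds / (365.25 * 24 * 3600)

universe_age = 13.7e9  # years
print(f"Expected wait: about {years:.1e} years")
print(f"Roughly {years / universe_age:.1e} times the age of the universe")
```

On these assumptions the expected wait works out to something on the order of 10^36 years, a figure that dwarfs the age of the universe by many powers of ten, which is exactly the point: “infinite” is doing all the work in the cliché.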

Clearly, there is a wide gap between chimps and people, despite all of our similarities. Dawkins and his crowd say it’s all an accident, and when mathematical probabilities introduce problems for that idea, they resort to a concept even more far-fetched in a lot of people’s eyes than the idea of God: the so-called “Landscape” or multiverse, the idea that our universe is just one of many, perhaps an infinity of universes. That allows them to believe that even if our universe appears to be “fine-tuned” to support our form of life (which it does), that appearance can nevertheless be dismissed as mere chance: in an infinite number of universes, one simply had to “get it right.”

Cool. Dawkins, as has been pointed out, prefers many universes to one God. Fine and dandy. But how does the same guy who claims that the idea of God should be subjected to the same scientific analysis as any other idea, put forth as an alternative an idea that’s every bit as unprovable? Stephen Hawking has come up with an elegant little piece of mathematical legerdemain that makes such a notion theoretically possible, but when the sun sets and the bar opens, it’s the cosmological equivalent of “how many angels can dance on the head of a pin?” Now the medieval scholastics may have been wasting their time debating such questions, but stop and think about what’s implicit in the idea that they were able to imagine such a thing. It’s mind-boggling.

The poet William Blake actually considered the world of the imagination more valid and more real than the world around him, because it carried more symbolic value. And he wasn’t crazy, either. He was perfectly sane, as are most Hindus who speak of the “veil of Maya,” essentially the same idea Blake was putting forth in his own version of Christian mysticism.

Some years ago I wrote a little poem about Blake, which I think is as suitable a way as any to conclude this rant:

Blake

Challenged to sketch
The soul of a flea,
He jumped at the chance:
“That’s it! Can’t you see?”
Then he glanced, unimpressed
At London’s gray streets.
“Reality’s nothing
But a chain of defeats.”

1/30/96