Stamp Acts

A current off-off-Broadway production, Miami Beach Monsters, revolves around the notion that several classic movie monsters, now retired to Florida, are rediscovered thanks to a new issue of commemorative postage stamps. (Dracula, apparently, complains that he never consented to the use of his image.) This charming notion is already old-hat: the United States Postal Service has honored Frankenstein, the Mummy, and the Wolfman, not to mention Daffy Duck, Bugs Bunny, Sylvester, and Tweety Bird. Wile E. Coyote and the Roadrunner will join them later this year.

Stamps 2000, a poster at the General Post Office on 34th Street, illustrates the commemorative stamps the Postal Service intends to issue during the coming year. There will be at least 113 of them. Indeed, the United States now issues so many varieties of postage stamps with such enthusiasm that one might forget they are merely adhesive receipts for prepaid postage on items of mail.

The English “commemorative” is derived from the Latin commemorare: to recall or put on record. Its postal use is now sufficiently common that the Random House Dictionary of the English Language (second edition, unabridged) includes it among its definitions: postage stamps “issued to commemorate a historical event, to honor the memory of a personage, etc.” They are themselves memorials or reminders, making honorable mention of something worth remembering.

Consider that: they remind us of the things that should be remembered, that deserve to be.

Why does the United States issue so many commemorative stamps? It is not that we have so many great men and women or so many more notable events. It is for money. There are millions of stamp collectors willing to pay for nearly every bit of postal paper that drops from a government press.

A century ago, the United States rarely issued commemoratives. A non-collector would probably find a good collection of late 19th- and early 20th-century U.S. stamps quite boring. Throughout the Gilded Age, the post office used a series of classical designs picturing the nation’s great dead men, all nearly as forgettable and monotonous as the interchangeable presidents of the time (each looking like one of the Smith Brothers, Trade and Mark). In 1869, the post office released its first pictorial issues: stamps that bore images other than those of dead politicians. Each denomination illustrated something different—a ship, an eagle or a steam locomotive, for example. Some were even printed in two colors. They were quite controversial: James Gordon Bennett the elder wrote in the New York Herald that he feared the government might be “changing stamps as often as every six months, not giving the people a chance to get used to one variety before it was withdrawn and the people’s eyes startled by another.”

But as a rule from 1840, when Britain issued the first postage stamp, until, say, 1894, most countries viewed stamps as utilitarian: more or less elegant as the nations’ tastes required (if memory serves, Luigi Barzini argued in “Italy and Its Aristocracy” that an evidence of the decline of the nobility in Italian public life was reflected in the architecture of its buildings, the courtesies, even in the typography of official documents and the design of its postage stamps).

In 1894, Their Excellencies Tonnini and Marcucci, co-Regents of the Most Serene Republic of San Marino (a tiny independent country in the northern Italian peninsula), were professional politicians pressured by a rising national debt yet averse to increasing taxes. They envisioned that the pocketbooks of the world’s millions of stamp collectors, rather than those of the few thousand Sammarinese, might be opened to swell the coffers of San Marino. They released a special issue of finely engraved stamps bearing portraits of the co-Regents with views of the interior and exterior of the National Palace.

Several months later, stamp sales alone had retired the national debt and financed a sewage system. San Marino has since issued many, many, many different kinds of stamps: internal postage, external postage, airmail, semipostal, postal tax, postage due, airmail postage due, postal tax due, and thousands of commemoratives: all beautifully produced stamps showing national and international heroes, ships, locomotives, military uniforms, dinosaurs, aircraft, castles, temples, and so forth. All are valid for postage, of course, and all are intended not to carry mail but to land in stamp albums around the world. The sale of postage stamps is perhaps the country’s leading industry, having edged out wine and marble some time ago, and one understands that heroic equestrian statues of Tonnini and Marcucci have been raised in the city of San Marino itself. Forty years ago, the little country was issuing fifty-six different stamps a year. Now, it releases new issues several times a week, and apparently there is no end to the demand. San Marino, all thirty-eight square miles of it, has issued more stamps than nations a thousand times its size, and without shame.

The United States Post Office first established a philatelic agency in 1921 (“philately” is the English term for stamp collecting, from the Greek philos, “fond of,” and ateleia, “exemption from tax”; together, the words mean nothing, though they may suggest that a sender’s prepayment of postage exempts a receiver from paying it). In 1932, the United States elected a philatelist to the presidency. One of FDR’s enduring achievements was increasing the number of commemorative postage stamps issued by the United States and his successors have followed his example.

As I get older, the stamps commemorate people and events I remember myself. Sometimes, the result is surprisingly good. The ongoing “Black Heritage” series hit its high note last year by honoring Malcolm X: a splendidly designed stamp showing him alert, active, and thoughtful. Greatness is a remarkable thing; thus we can honor without irony a man once known as “Detroit Red” and “the Harlem Asp,” “a hustler, a pimp, a dope addict, a gambler, a numbers pusher and a thief”—as the political historian George Thayer noted—because he transformed himself into a dynamic, vitriolic preacher and teacher, and then into a practical, heroic visionary. I think of Ignatius Loyola, who aspired to a life of unending sensation: wenching, drinking, and fighting, until he turned to God from sheer boredom; and Malcolm X somehow comes to mind.

As I believe this would be a better country if he had lived, so I used dozens of these stamps on my mail. They are handsome, and I wanted to honor not only what was, but also what might have been. Apparently, I am not the only one who felt this way. The stamp honoring Malcolm X is one of the few recent commemorative stamps to sell out long before its planned withdrawal from sale.

This year’s honoree in the “Black Heritage” series, Patricia Roberts Harris, is something of a contrast. She is pictured with a nice smile, a nice hairdo, and a nice, puffy yuppie woman’s bowtie. The Honorable Harris was a Jimmy Carter Cabinet official and college professor—an upper-middle-class political hack, one of that elite whose finest flower was the late Ron Brown, high class legal hustler and Cabinet officer immortalized by Al Sharpton as “Ron Beige.” Harris was even ambassador to Luxembourg, the foreign service’s most blatantly political appointment. Without white folks’ patronage, she was nothing: her 1982 campaign for mayor of Washington, D.C., against the egregious crack-smoking adulterer Marion Barry—he stomped her into the ground—illustrated the hoary truth that universal suffrage eliminated the elite and their tools from elective public office.

Besides, the revolution meant justice for all, not the gravy train for some.

Another set of stamps, “Stampin’ the Future,” used four designs submitted by children from eight to twelve years old. The set’s title, with its dropped final consonant, is as condescending as its designs are crudely repellent.

And even where the design is classic, the publicity is bland. The Florida statesman Claude Pepper is being honored this year in the “Great Americans” series. The poster describes him as a “champion of elderly rights.” Thus, one might never know he was a tough, wily, radical politician and a magnificent orator, whether on ceremonial occasions or on the stump, tie askew, fists waving, and the crowd surging to its feet. V.O. Key wrote of him, “In Senator Pepper’s races the division has been most concretely drawn. There is never much doubt about where Claude stands.”

Because he was a great whirlwind campaigner who brought New Deal projects and military bases to his state, he was elected twice to the U.S. Senate, only to lose in 1950 to a McCarthyite opponent who called him “Red Pepper.” More than a decade later, Pepper went to the House of Representatives at an age when most men retire, and held his seat until his death in 1989.

I chatted with him, briefly, when he held a hearing with the City Council President in City Hall’s old Board of Estimate room in the mid-80s: he was nearly as old as the century himself. His eloquence and genuine though elaborately Southern courtesy nearly concealed an extraordinarily alert, subtle intelligence, a genius for cross-examination, and a gentle admiration for the splendid charms of our vivacious staff intern. He was among the last men in Congress to have served there during the New Deal. Long after he must have realized that social justice would not be realized in his lifetime and perhaps never, he still remembered what it was to be poor and to have no hope. He stood for so much more than “elderly rights”—yet another intrinsically meaningless phrase that, in context, merely signifies taking tax money from the working poor, skimming off salaries for public and not-for-profit sector administrators, and passing the rest to the impoverished elderly.

Worst of all is “Celebrating the Century,” a series of ten sets of fifteen stamps, one set for each decade. Perhaps the concept itself is flawed. Certainly, the means of selecting the stamps for the last few decades have been. Postal customers have chosen the topics by voting: yet another weakness of universal suffrage. As Albert Jay Nock observed, as against a Jesus, the historic choice of the common man goes regularly to some Barabbas.

Thus, for example, the stamps commemorating the 70s honor Big Bird, disco fever, the smiley face and Secretariat winning the Triple Crown. (No one thought of Nixon in China.) The set for the 80s, which was released on January 13, 2000, is worse: Cabbage Patch Kids; cable television; video games; The Cosby Show; and E.T.: The Extra-Terrestrial. There is a stamp, too, for Cats. The American musical theater of our time is represented thus: music composed by Sir Andrew Lloyd Webber with verse by T.S. Eliot, a St. Louis-born Harvard man and anti-Semitic elitist who renounced American citizenship to become a British subject.

Are these the things that should be—deserve to be—remembered? I think not. Such choices are the fruit of our society’s truly remarkable ignorance of our own history—even that of our own times. Cicero observed that those who have no knowledge of what has gone before them must forever remain children, and the infantile nature of the selections disturbs me.

As does the evasive quality of what we are commemorating. Last year the stamp “Honoring Those Who Served” purported to pay tribute to “the many millions of courageous men and women in public service and the military who have served or presently serve our country.” This vague, ambiguous purpose is a politician’s dream. What is courage in these diverse contexts? Thus we reach for the lowest common denominator. Everyone goes into the pool: George Washington, Andrew Jackson, Abraham Lincoln, Bill Clinton, Douglas MacArthur, and the creep behind the counter at the Parking Violations Bureau.

For some reason, I think of my father’s eldest brother, a paratrooper who vanished in the fighting for the Remagen Bridge more than ten years before I was born. For my family his remains are the War Department telegram and a few fading photographs. His body was never found. We don’t even know if he was brave. My father has no memory of him. Perhaps the stamp honors my uncle, too. But the wider we cast the net and the more thoroughly we work out the logic of the language defining the stamp’s subject—the person, institution, or event we honor—the less meaning it has. The stamp might honor even me for two decades’ public service, working for the City of New York.

But when everyone without distinction is honored, no one is distinguished. That’s not much of an honor at all.

New York Press, February 2, 2000

Junk Geniuses

I came straight home from The Dazzle, Richard Greenberg’s three-character play about the Collyer Brothers (at the Gramercy Theater through May 12) and threw out all the plastic shopping bags that had been mounting up in the pantry closet. Then I went at the piles of newspapers waiting to be gone through—two in the kitchen and three or four on the hallway bookcase. I was heading for another mound, on the living room futon, but got distracted by the laundry cart and sorted some socks instead. Such is the transforming power of art.

Greenberg’s play, which the Roundabout is presenting in a production directed by David Warren, is a sheer delight. A fanciful meditation on the legend of New York’s most famous recluse pair, it’s not so much based on their shared life together as it is inspired by the image of how they died, buried alive in their Harlem mansion under a lifetime’s worth of clutter and debris, victims of fatal paranoia and a pathological inability to throw anything out.

Greenberg goes back to a time before the Collyer boys were quite so peculiar, furnishing them with a putative youth and respectability. He also invents a woman (as in cherchez la) and a crisis (a marriage that never took place). The play is sort of an anti-period piece. It juxtaposes flagrant anachronisms (conversational and circumstantial) with tropes and figures from the fiction and drama of the first half of the last century. One brother, Langley (Reg Rogers), talks like a Wildean protagonist; the other, Homer (Peter Frechette), comes on like a character out of Noel Coward; while the woman (Francie Swift), a young heiress who attaches herself to the brothers for reasons of her own, seems at times like a heroine from one of Shaw’s pleasanter plays. Occasionally, the situation in The Dazzle calls to mind one of the triangular relationships in an Edith Wharton or Henry James novel. (More often, it doesn’t.) Out of this ragbag of literary forms, Greenberg emerges with a genre all his own—call it screwball tragedy.

In terms of the actual facts of the Collyers’ story, The Dazzle plays fast and loose with recorded history. Matters of which brother seems to have been the caregiver at what point in their lives, where and when they died, at what ages, how, in what order, and in what sort of physical state and proximity to each other—all these were aspects of the affair that shocked the public or fed speculation when the story broke in 1947, as did the bizarre manner in which the brothers had been living. (A more serious historical account of the case can be found in The Collyer Brothers of Harlem, William Bryk’s 1998 “Old Smoke” column in New York Press.)

The truth is considerably more horrific and peculiar than Greenberg makes out. But as a program note makes clear, he isn’t all that interested in the historical Homer and Langley Collyer. He’s interested in what the haunting specter of their life together can be made to represent for us. The play is a meditation on two types of neurosis that feed off each other. Greenberg’s Langley is an artist of the helpless-genius variety, a gifted pianist whose career is derailed by his own refusal or inability to behave according to conventional expectation, whether meeting concert obligations or playing a familiar piece of music at a reasonable tempo. Overwhelmed by the big picture, Langley fixates on minutiae—a hair, a thread, a leaf, a fraction of a tone—which he seems able to lament or contemplate endlessly. In his brother’s diagnosis, he’s simply unable to let the notes go. It’s a lovely conceit (not to mention an intriguing if somewhat poeticized vision of anal retentiveness). Rogers has been rightly praised for his portrayal of Langley, whose lines he intones with an adenoidal languor that suggests a reluctance to part with even the breath it takes to utter them. But it’s Frechette’s Homer—his inability to leave is the other half of the story—who really breaks your heart.

The real-life Langley Collyer actually was a concert pianist, but making him a genius was Greenberg’s idea. That part is pure invention, and it makes The Dazzle the latest in a recent spate of plays and movies about intellectual prodigies. I wouldn’t have noticed this if Ben Brantley hadn’t made the connection in his review of Greenberg’s play; I was busy tracking tales of scientists and mathematicians, which seem to be unusually plentiful of late: Darren Aronofsky’s offbeat thriller Pi (about an obsessive math/computer whiz); Michael Frayn’s Copenhagen (about the Nobel physicists Niels Bohr and Werner Heisenberg); David Auburn’s Proof (about the brilliant daughter of a University of Chicago mathematician), which won a Pulitzer Prize for drama last year; Peter Parnell’s QED (about another Nobel physicist, the American Richard Feynman). And, of course, there’s the Hollywood blockbuster A Beautiful Mind, about the Princeton mathematician John Forbes Nash, who suffered from schizophrenia for much of his adult life and in 1994 won a Nobel Prize in economics for a paper on game theory produced forty-five years before. American popular culture would seem to be going through one of its periodic love affairs with intellect just now.

It’s always nice to see a high value placed on intelligence (it happens rarely enough), but some of these plays and movies seem to worship genius without really knowing why. I recently caught up with QED, which reopened at Lincoln Center in February after a two-month hiatus, and found it mildly cringe-making. Ostensibly based on Feynman’s own writings (the program credits a book he co-authored called Tuva or Bust!), it takes place in a single evening in which the physicist is trying to decide whether to undergo a particularly risky form of surgery that may finish him off or may trounce a cancer that has been diagnosed as inoperable. Mostly, the play consists of Feynman talking to the audience, sharing hopes and dreams and memories, reminiscing about the heady days at Los Alamos working on nuclear fission, playing phone tag with various physicians, and voicing the odd qualm about the moral rightness of having helped to build the bomb. Except for a couple of interruptions from a student (Kellie Overbey), nothing else happens. Since nothing Feynman says or does offers a window onto what is supposed to be an uncommon mind, the impression left by the play is that we’re to take an interest in this man because he is super-smart and because he may be dying.

Proof is similarly frustrating for anyone who approaches it expecting to gain insight into the beauties of higher mathematics—the more so because its protagonists are constantly alluding to them. There’s endless talk in the play about the “elegance” of a particular mathematical proof whose authorship is in question, but Auburn makes no attempt to explore what that means. What does “elegance” in a mathematical proof consist of? And how does it relate to other brands of human endeavor? We’re never told, just as we’re given no inkling as to how the formal mathematical proof reflects the beauty of the world or the poetry of human and divine intelligence.

What if the heroine of Auburn’s play were not a “genius”? If it turned out that she hadn’t written the amazingly brilliant proof found in her late father’s papers, would the years she’d lost looking after him be less wasted and poignant? Would her plight—the fact that her sister has sold the house and that she fears she’ll go crazy just like Dad—be deemed less interesting or important? That seems to be the implication.

In Copenhagen, Frayn used the mysterious and much-pondered wartime meeting between Bohr and Heisenberg to suggest how essentially unknowable human motivation is—as “unmappable” as anything in the behavior of the atomic particles whose observation inspired Heisenberg’s Uncertainty Principle. Thoughtful people may differ as to the moral implications of this idea—and they have differed about it, particularly in recent weeks (in The New York Review of Books, mostly) with the release of certain papers bearing on the subject of the play. What is never in any doubt is the impulse behind it: Frayn’s fascination with quantum mechanics and his interest in locating in its theories a metaphor for human action and experience. What gives plays like QED and Proof the patina of literature is the reverence they evince for the idea of ideas. In fact, these plays couldn’t be less interested in imponderables or abstractions.

Ron Howard’s biopic about John Nash doesn’t get any nearer to the essence of the latter’s genius, but it does have one thing going for it, which is the expression of paranoid delusions in terms of a Cold War spy thriller. In Akiva Goldsman’s screenplay, Nash’s delusions partly take the form of very real fantasies about being recruited into the Secret Service. Nash’s actual real-life delusions were antigovernment and nonpatriotic (he spent some of his worst years living in Europe trying to relinquish his citizenship and to get the bewildered representatives of European officialdom to offer him political asylum) but that’s no more relevant than the content of his ravings, anti-Semitic or not, or his early sexual ambivalence, real or perceived (both of which were much discussed in the media in the weeks leading up to the Oscars). What’s clever about the spy conceit is the way it forces us to experience the plight of delusion ourselves. Because Nash’s fantasies take the form of familiar genre tropes, until we begin noticing the wit behind Ed Harris’s deadpan performance as an OSS mystery man, we believe in them too.

A couple of times, Howard uses cinematic gimmickry to probe into the nature of Nash’s ideas and quality of mind. In one scene, we watch a set of choreographed arrows head…now for a sultry blonde…and now away from her. (This is Howard and Goldsman’s attempt to explain something called the Nash Equilibrium.) There’s also a sequence in which Russell Crowe (playing Nash) demonstrates to Jennifer Connelly (playing his bride-to-be) that there is no shape or object which he cannot find in the multitude of stars. Crowe lifts his finger to the heavens and charts a course, and—voila!—a line of stars lights up in the shape Connelly has named, embarrassingly like something in a PowerPoint presentation. The scene is oddly distasteful. It’s pure Disney, for one thing. But it’s also unsettling since Nash’s ability to see things that aren’t there is his whole problem.

Or, rather, it is and it isn’t. A Beautiful Mind, which sets out to celebrate Nash’s triumph over schizophrenia, is itself of two minds about madness. It needs for madness to be a bad thing—in order for Nash’s redemption and rehabilitation to seem meaningful—but it also wants it to be a metaphor for creativity. The movie wants us to see Nash as an artist, possibly because that’s how Howard sees himself. What else is the artist if not he who reimagines the world in visions of his own devising?

It’s probably impossible for the stage and cinema not to glamorize craziness, given that—like plays and movies themselves—craziness brings forth images that aren’t there. Drama makes us see things, which is one reason why theater (from a Greek verb meaning “to see”) has traditionally been the best medium for exploring it.

Alone of the recent plays and movies about mind and madness, The Dazzle chooses to glamorize neither. It’s what gives the play, for all its hijinks, a flavor of tragedy. Greenberg doesn’t value brilliance per se, but he manages to convey some inkling of what it means to have transcendent thoughts. There’s a wonderful speech, one of two from which the play derives its title, in which Langley describes a piece of string that he’s kept since infancy. “I first saw this when I was in my crib,” he murmurs lovingly: “I looked up—and saw it—it was—dazzlingly colored—nothing is ever lost on me, nothing ever leaves—it was the first thing of its kind I’d seen—and though I didn’t know about words yet, I wanted desperately to name it.” He goes on:

I remember the desperation…there were tears off to the left—a glittering of porcelain below—and a spoon with the sweet white was thrust into my mouth—and piano notes played in the next room—and everything was entirely itself, and all at the same time.

It’s a beautiful evocation of the piecemeal way an infant probably experiences the world. It also suggests the infantile selfishness of the career-neurotic, his naive willingness to feed off the love and goodwill of others. “All the frail, frail people with their iron imperatives,” Homer laments, in the other title speech, contemplating the spectrum of different types of nutcase “that modern life presents in such dazzling array.”

Oh, I mustn’t sit here, I must sit there; I can’t wash dishes, I’m afraid of foam; I couldn’t possibly work, I have a terror of energy—they’re everywhere, it seems: The People Who Simply Can’t. And they have a simple thing in common: they always get their own way.

And we romanticize and adore them.

New York Press, April 2, 2002

The Towers Gain a New Perspective

We must not forget how undistinguished the World Trade Center was. “When completed,” the authors of the 2000 edition of the AIA Guide to New York City wrote, “these stolid, banal monoliths came to overshadow Lower Manhattan’s cluster of filigreed towers, which had previously been the romantic evocation that symbolized the very concept of ‘skyline.’” Ada Louise Huxtable, former architecture critic for The New York Times, described their style as “General Motors Gothic.” Her successor, Paul Goldberger, called them “boring, so utterly banal as to be unworthy of the headquarters of a bank in Omaha.” Wolf Von Eckardt published an article in Harper’s calling the World Trade Center a “fearful instrument of urbicide,” and “one of the ugliest buildings in the world.” They were monuments to money and power, brutish and ugly, and only in the agony of their final hour did they take on nobility from the valor of those who sought to save the people who worked within their walls.

Until some forty years ago, the lower west side was an industrial neighborhood. Washington Market, a block-square two-story building at Washington Street between Fulton and Vesey Streets, where 175 merchants dealt in meat, poultry, cheese, butter, and garden produce delivered by boat, wagon, and truck, dominated the local economy. At Cortlandt Street, the huge, faintly seedy Hudson Terminal buildings towered above the terminus of the Hudson Tubes, now the PATH line. Printing plants, warehouses, and factories jostled delicatessens, bars, cobblers, hardware stores and barbers. Cortlandt Street had so many electronics retailers that some called it “Radio Row.”

David Rockefeller, chairman of Chase Manhattan Bank, had plans for lower Manhattan. After Chase built its sixty-story tower on Pine Street in 1960—the first new skyscraper in the financial district in a generation—he sought lower Manhattan’s redevelopment through a form of central planning reconciling private interest and public power, rather than the largely spontaneous entrepreneurial development that had historically molded the city’s economy. His urge for civic uplift dovetailed with the real estate interests of his bank and his family. He founded a civic group, the Downtown-Lower Manhattan Association, which in 1958 commissioned the architectural firm of Skidmore, Owings & Merrill to develop an overall plan for lower Manhattan. It included a World Trade Center, which Rockefeller believed would catalyze regional development. He forwarded the plan to the Port of New York Authority, now the Port Authority of New York and New Jersey.

Austin Tobin, the Port Authority’s executive director from the 1940s into the 1970s, presided over the decline of New York’s port into what often seems a virtual harbor, a thing of nostalgia trading on the images of its past as at South Street Seaport. Like a minor Robert Moses, Tobin built highway tunnels, airports and the container port at Elizabeth, New Jersey. Tobin often quoted Daniel Burnham, a brilliant 19th-century Chicago architect and apostle of centralized urban planning: “Make no small plans, for they have no power to stir the blood.” Thus, Tobin’s pride and the Port Authority’s internal culture fueled the ambition to build the world’s tallest building. Government power advanced the World Trade Center: the power of the Authority to condemn land and avoid most local environmental laws; of the city to close streets and issue permits; and of the state to transfer its offices into the building to provide it with tenants.

The vision of David Rockefeller and his allies transformed the city’s economy from its traditional base as an industrially diverse seaport to one narrowly dependent on finance, insurance, and real estate. The World Trade Center, the transformation’s monument, was an esthetic debacle: merely the first of the “million-square-foot, flat-topped boxes” that, as the AIA Guide notes, “muffle…the constellation of tall, slender, 1920s and 1930s Art Deco office buildings and the flamboyant pinnacles of their earlier, shorter, neo-Classical cousins, the structures that made up the inspired—if unplanned—Lower Manhattan skyline that was once the world-renowned symbol of New York City.”

For some four years, a friend daily rode an early ferry to Manhattan. He often reflected on the oddly rootless, alienating quality of the World Trade Center. Last week, he was horrified by his instinctive response to films of the skyline after the fall of the World Trade Center. He found beauty and harmony restored by the towers’ absence. They had merely been big buildings, not great buildings such as the Chrysler, Empire State, or Woolworth buildings.

To be sure, mere size does not make a building inhumane. Buildings as large as St. Peter’s Basilica in Rome or St. Paul’s Cathedral in London retain humanity in their proportions, perhaps because they were designed to exist in a harmonious relation with the society that created them. In the case of cathedrals, they glorified something greater than man, and at least half of the idea of building their spires ever higher was that the created might reach closer to his Creator.

The World Trade Center, despite its Gothic touches, glorified only power. And, by God, its towers became symbols of power last Tuesday. In the moments between the first and second attacks, one saw the compelling, powerful image of a symbol of power with a great gaping hole. For all our horrified wonderment at the scale, timing, and organization of the attack, the buildings were nonetheless destroyed by men who simply stole airplanes—using ingenuity and intelligence to overcome the security systems that, though able to make airplane travel obnoxious, could not prevent a group of people utterly dedicated to the destruction of the society represented by those buildings from their obscene and symbolic act.

Minoru Yamasaki, the Center’s architect, had supposedly claimed the towers were engineered to withstand the impact of a Boeing 747. His calculations, like his personality, were a little off.

He designed the Pruitt-Igoe public housing project in St. Louis, which was built in 1955 and entirely leveled by implosion in July 1972. One can only note that the local politicians, probably not profound esthetes, seriously held that the buildings themselves were alienating: that they somehow actually encouraged criminal behavior. Like the terrorists who last week destroyed his most famous project, Yamasaki apparently viewed, as Eric Darton wrote, “living processes in general, and social life in particular, with a high degree of abstraction,” perhaps seeing the people who used his buildings as so many ants.

Yet Yamasaki had a keen business sense. The Port Authority was obsessed with maximizing the rentable square footage of the towers. Yamasaki obliged by discarding the conventional interior support columns: the steel framework used in older buildings, such as that which permitted the Empire State Building to survive the crash of a B-25 bomber into its upper floors in 1945. Instead, he suspended the towers from their own skins and from the core columns containing the buildings’ machinery: air conduits, electric and telecommunications cables, and water pipes. This meant that, last Tuesday, once the buildings’ envelopes were violated and the burning jet fuel swiftly weakened the steel supports linking the upper floors with the walls, the towers pancaked within ninety minutes because they could no longer support themselves.

The walls consisted of alternating surfaces: eighteen inches of metal and twenty-two inches of glass. This created the structures’ least humane quality: their oddly narrow windows, “projecting an image that seems more radiator than building,” as John Tauranac wrote, that made them seem windowless at a distance of only a few blocks. Until the sickening moment when the towers began to fall, one could not comprehend the damage because the building was incomprehensible.

Now one can comprehend it. At dawn last Tuesday, the World Trade Center’s towers were 110 stories tall. Their remains piled about ten stories high.

New York Press, September 25, 2001

Pluck and Luck

In 1928 Herbert Asbury published The Gangs of New York, his masterwork on nineteenth-century New York’s virile young ruffians. That same year Herbert R. Mayes published Alger: A Biography Without a Hero, the first biography of Horatio Alger Jr., whose works—countless moralizing books for boys—presented his view of the same class at the same period.

Asbury’s book was founded on the general knowledge and gossip he’d picked up as a New York reporter, research among old newspapers and court records, and numerous interviews with those who had participated, or known participants, in the crimes and adventures that he recounted. Mayes claimed his book was based on exclusive facts derived from a diary that Alger had started at Harvard and maintained throughout his life. Vaguely inspired by the debunking biographies of Lytton Strachey, the book portrays Alger as a skirt-chasing sexual athlete.

The critic Malcolm Cowley inquired about the diary, murmuring that Mayes’ facts were so exclusive that they could not be documented at all. But no one else questioned them. Once Mayes became editor of Good Housekeeping and a director of The Saturday Review, his biography became the accepted truth: the basis for all future critical discussion and analysis of Alger and his works. The Dictionary of American Biography, Stewart Holbrook’s Lost Men of American History, and John Tebbel’s 1963 biography, From Rags to Riches, all rely on Mayes.

Tebbel even claimed he had verified Mayes’ sources. He was being less than truthful. Mayes admitted in the 1970s that his book was a work of fiction, largely invented by its author. There were no sources to verify. The diary did not exist. Nonetheless, in 1978, on the occasion of the book’s golden anniversary, Mayes published a new edition, featuring a new self-debunking introduction. Mayes delighted in committing and then safely revealing literary fraud. Evidently, he also liked the income derived from a successful book.

Alger, who had himself written for money, would have understood.

Alger’s name is wedded to a particular image of the American dream: that anyone can rise from rags to riches through his own efforts. It is derived from the writings of Herbert Spencer, a 19th century English agnostic philosopher who was once taken very seriously indeed. Spencer, who had come to believe in the evolution of animals by natural selection before Charles Darwin published The Origin of Species, believed this notion equally applicable to the social sciences. Spencer’s American disciples, particularly the sociologist William Graham Sumner, popularized his ideas as Social Darwinism.

The crude product of subtle minds, Social Darwinism applied natural selection—the notion of “the survival of the fittest”—to nearly every area of human life. Social Darwinists believed that even as the physical order was fixed by certain natural and implacable laws with which men ought not to interfere, so was the social order.

Sumner preached that the rule of life was “root, hog, or die.” He opposed anything—the minimum wage, the eight-hour day, government regulation of economic activity, even private charities such as soup kitchens for the homeless—that might interfere with his notion of social evolution. Slums, low wages, and other indices of human misery were not to be reformed. Those living in squalor deserved no better: it was a symptom of their unfitness. Sumner lacked the common touch, but then, as a tenured Yale professor, he didn’t need it.

The Horatio Alger novel advocated this kind of rugged individualism far more effectively. Alger published upwards of 125 novels during his lifetime (about half a billion of his books have been sold since the 1860s) and some 500 short stories. Another 280 serialized novels were never put into book form.

One wonders how much of his output was actually read. Despite the collective reputation of Alger’s work as inspiring tales of hardworking, go-getting young entrepreneurs, his stories are often far from preaching a virile gospel of success through struggle. Rather, they often seem like the passive romantic fantasies of a lonely mid-Victorian pederast.

Born on Friday the 13th in January 1832 in Revere, Massachusetts, Horatio Alger Jr. was the first son of a Unitarian minister. Having graduated Phi Beta Kappa with Harvard’s Class of 1852, he spent the first few years of his working life as a starving freelance journalist before returning to Harvard for his divinity degree, which he received in 1860.

Pleasant-faced, gray-eyed, balding, and mustachioed, Alger was soft-spoken and shy. Twice rejected by the Union army for chronic chest trouble, Alger wrote novels instead. The New York Weekly serialized Marie Bertrand, a romance, in 1864. A year later, he published Frank’s Campaign. This was his first book for juveniles: a tale of a boy who ran the family farm while his father served in the Union army, outwitted the villainous squire who held the mortgage, and succeeded in all he undertook.

In November 1864, he was called to the pulpit of the First Unitarian Church of Brewster, on Cape Cod. Until March 1866, the Rev. Horatio Alger Jr. preached the Gospel (he was a superb public speaker), visited the sick, and comforted the aged. He took a very special interest in the boys of the parish, taking them on walks in the woods, playing ball, and organizing games, entertainments, and festivals. This idyll ended on March 20, 1866 when, according to Edwin P. Hoyt in Horatio’s Boys, a parish committee heard a report that Alger had buggered the son of an influential parishioner and at least one other boy.

The parish minutes read, “We learn from John Clark and Thomas S. Crocker that Horatio Alger Jr. has been practicing on them at different times deeds that are too revolting to relate.” Alger did not deny the charges, saying that he had been “imprudent” and that he considered his connection with the parish severed. That afternoon, he caught the next train out of town. The former minister was neither arrested nor indicted, and the charges were quickly forgotten. No one would uncover the facts of his departure from Brewster for a century.

Alger traveled directly to New York with a carpetbag of manuscripts and a desire to dedicate his life to writing and to boys—an interest he had announced to an acquaintance, William Taylor Adams, who wrote books for juveniles under the pen name Oliver Optic. Adams, who also edited Student and Schoolmate, which Stewart Holbrook characterized as “a goody-goody periodical for boys,” seems to have taken this statement at face value. Soon after his arrival, Alger promised Adams a new serial novel for Student and Schoolmate, set among the homeless waifs, bootblacks, and newsboys of New York in whom Alger took a keen interest. Within a few days, he delivered three chapters to Adams.

This was Alger’s first success, Ragged Dick, the story of a youth attempting to survive on the streets of New York City. Student and Schoolmate flew off the stands. At the end of the following year, when A.K. Loring published the serial as a book, it became a runaway nationwide bestseller. Alger endlessly reused this story over the next thirty-two years, usually changing only the titles, the names of the characters, and sometimes the setting.

Contrary to popular belief, the protagonists of these books are not so much adventurous youths rising to riches as male Cinderellas, sycophants pleasing their employers to gain lives of modest comfort. As Michael Moon notes in his essay, “The Gentle Boy from the Dangerous Classes,” mere luck, rather than an increased understanding of the world, sets the Alger hero on his way. Alger’s protagonists are attractive adolescents—“well-formed and strong” or “well-knit,” with “bright and attractive faces”—who, through chance encounters, usually involving some spontaneous display of strength and daring, are befriended by older, wealthier men. Often, the relationship seems based upon a quick physical assessment. The lads become protégés and flourish under their mentors’ genteel patronage. Intriguingly, Alger heroes only rarely make their fortunes by marrying the boss’s daughter.

Within five years of his arrival in New York, Alger had published seven serial novels in Student and Schoolmate alone. He usually wrote several books simultaneously. He would churn out a few pages of one before boredom set in; then he would turn to another and then another before returning to the first. He worked fifteen hours at a stretch, often living on coffee to stay awake as the prose gushed from his pen. He wrote quickly: Frank and Fearless, 80,000 words long, took two weeks. On finishing, according to Holbrook, he took a walk around the block and returned to start Upward and Onward, polishing that off in thirteen days. His life became his books: Fame and Fortune, Rough and Ready, Rufus and Rose, Strive and Succeed, Tattered Tom, Paul the Peddler, Phil the Fiddler, Slow and Sure, Try and Trust, Bound to Rise, The Young Acrobat, Sam’s Chance, Risen from the Ranks, and dozens more.

Of course, none of these books is any good. His writing is clichéd and pompous. Heroes invariably assume manly stances and villains charge like bulls to no avail. Characters are interchangeable: there is no difference between Tom Temple and Tom Thatcher or between Tom Thatcher and Walter Sherwood. Though later novels are set in the Wild West, San Francisco, Australia, or England, the stories never change. His Native Americans and Asians are stereotypes, his Negroes subhuman.

Moreover, his work was amazingly sloppy. He forgot whether his current hero was Andy Gordon or Andy Grant or Bob Burton or Herbert Carter, and sometimes a single hero might bear five or six different names in a manuscript. With age, he became intellectually flaccid: Brave and Bold, though a novel of a factory boy, fails to show its hero doing a day’s work in a factory or even to identify the factory’s product.

Yet these incredibly bad works were incredibly good magazine serials, as Edwin Hoyt noted: each episode rises to a climax, leaving ’em panting for more. Alger slowed only slightly with age, still producing three books or more a year until his health began failing during the winter of 1898. He was planning to visit his sister in New England when an attack of asthma overcame him. He died on July 18, 1899.

But death had no dominion over his product. His publishers hired Edward Stratemeyer, the future creator of Tom Swift, the Hardy Boys, and Nancy Drew, to squeeze new Alger novels from the plot outlines and incomplete serials left in the dead man’s bottom drawer. Alger remained a bestseller until World War I, when changing tastes in children’s books marginalized him.

As literature, Alger’s work is trash. As propaganda, its effect was stupendous. His books and, more importantly, the code they were believed to preach may have influenced more Americans in his day than the work of any other contemporary writer. Not bad for a child molester.

New York Press, February 5, 2003

The Prince on 36th Street

James Aloysius Harden-Hickey, Baron of the Holy Roman Empire by command of the Supreme Pontiff, editor, novelist, swordsman, and adventurer, who would proclaim himself James I, Prince of Trinidad, and die by his own hand, was born in San Francisco on December 8, 1854. The flavor of San Francisco life in the 1850s was still affected by the Gold Rush of 1849. The enforcement of rough justice through committees of vigilance and lynch mobs was merely one of many indications that the city by the bay was not yet housebroken. Accordingly, Hickey’s French-born mother took her son to Paris.

They arrived amidst the glittering reign of Napoleon III. If one believes Richard Harding Davis, “When Harden-Hickey was a boy, Paris was never so carelessly gay, so brilliant, never so overcharged with life, color, and adventure.” Davis, one of the most glamorous reporters of the early 20th century, was also one of the most wildly overrated; yet there is something to what he says here. Within a generation, France had seen two revolutions, two kings, and a republic. Now she was an empire again, under the rule of a Bonaparte. Napoleon III, the nephew of Napoleon I, had seized power in 1852. He transformed Paris with his massive public works and bewitched the public with theatrical display and gorgeous ceremonies. The Second Empire sparkled with flamboyance, imagination and a kind of passionate worldliness.

Harden-Hickey spent little of his childhood and adolescence in Paris. The Jesuits taught him at Namur, Belgium; he studied law at the University of Leipzig. Yet his politics, tastes, point of view, and appearance were molded by Napoleon III and the Second Empire. At nineteen he passed the examinations to enter Saint-Cyr, the French military academy, from which he graduated with honors in 1875. His father died shortly before his graduation.

Having inherited a small income and mastered French, and enjoying the reputation of a master swordsman (like Scaramouche, he could pick the buttons off one’s waistcoat with a foil), Harden-Hickey foreswore the profession of arms for Parisian literary life. In 1878, he married the Countess de Saint-Pery; they had a son and a daughter.

He published eleven novels between 1876 and 1880. Irving Wallace, in his essay on Harden-Hickey (collected in The Square Pegs), calls the plots naive, the characters stereotypical, and the language flat. One novel is obviously borrowed from Jules Verne’s Michael Strogoff, another from Don Quixote; all are bluntly monarchist and antidemocratic. Harden-Hickey’s polemics were more successful: his vehement defense of the church won him ennoblement as a Baron of the Holy Roman Empire.

The political upheaval after the fall of Napoleon III in 1870 subsided into the regime we call the Third Republic. Much as a tempest in shallow waters stirs up sludge, so the transition raised up a new set of corrupt politicos whose embezzlements and bribery led to a seemingly unending succession of scandals. With the concomitant abolition of press censorship, France saw an explosion in the number of newspapers being published. Most were edited in the spirit of Villemessant, founder of Le Figaro, who observed, “If a story doesn’t cause a duel or a lawsuit, it isn’t any good.”

The royalists, who wanted to restore the kings of France, unleashed their own media blitz by financing newspapers. They wanted a Parisian illustrated weekly, something like London’s Punch. Harden-Hickey’s swordsmanship and polemical skills made him its perfect editor.

On November 10, 1878, Harden-Hickey first published Triboulet, named for a jester of King Louis XII. The cover illustration showed the jester beating Marianne, the symbol of the Republic, and various politicians with a war club. The writing was as vigorous as the artwork. Within weeks, Triboulet had a paid circulation of 25,000. Within the year its business manager had been imprisoned, its staff had collectively served some six months in jail, and the paper had been fined 3000 francs. Within two years, it had become a daily. Harden-Hickey was sued forty-two times for libel and fought at least twelve duels, believing that his opponents should meet him either “upon the editorial page, or in the Bois de Boulogne.” The fun lasted until 1887, when the royalists’ money gave out.

He had changed. He divorced his wife, largely renounced Catholicism and flirted with theosophy and Buddhism. He began a journey around the world, spent nearly a year in India learning Sanskrit, studying the Buddha’s teachings, and even—he claimed—traveling to Tibet. Along the way, he made a short stop in the South Atlantic. Some 700 miles from the Brazilian coast, his ship hove to off the deserted island of Trinidad.

“Trinidad is about five miles long and three miles wide,” Davis wrote, “but a spot upon the ocean. On most maps it is not even a spot.” Its residents were birds, turtles and land crabs. Harden-Hickey went ashore, explored the island and claimed it in his own name.

He was not the first to land there. An Englishman named Halley had made his way there in 1698. Two years later some Brazilian Portuguese settlers built stone huts, the ruins of which survived into Harden-Hickey’s time. A 1775 book claimed that one Alexander Dalrymple had taken possession of the island in the name of the King of England in 1700. Nonetheless, mariners landing in 1803 and 1822 found no inhabitants save “cormorants, petrels, gannets, man-of-war birds, and turtles weighing from five hundred to seven hundred pounds.” This gave Harden-Hickey’s claim color under international law: the English never settled the island; the Portuguese abandoned it. Trinidad was there for the taking.

Harden-Hickey returned to Paris in 1890, where he met Anne Flagler, daughter of John H. Flagler, an American financier. On St. Patrick’s Day 1891, he married her at the Fifth Avenue Presbyterian Church. For the next two years, he lived quietly with the Flaglers in New York. At some point during this period, he traveled to Mexico, where he purchased at least one ranch with money from his father-in-law. Apparently, Flagler would support his son-in-law quite generously without permitting him the control of any sizable sum of money. This restraint galled Harden-Hickey, who seems never to have considered earning his own living.

On Sunday, Nov. 5, 1893, the New York Tribune gave him front-page publicity with an exclusive story on his scheme to transform Trinidad into an independent country. Harden-Hickey argued that “…the inland plateaus are rich with luxuriant vegetation… The surrounding seas swarm with fish… Dolphins, rock-cod, pigfish, and blackfish may be caught as quickly as they can be hauled out…the exportation of guano alone should make my little country prosperous…”

Harden-Hickey’s announcement did not precipitate a world crisis. In January 1894, when he proclaimed himself James I, Prince of Trinidad, some nations even recognized him. One reporter interviewed his father-in-law, who seemed surprisingly tolerant of the adventure. He said, “My son-in-law is a very determined man… Had he consulted me about this, I would have been glad to have aided him with money or advice… But my son-in-law means to carry on this Trinidad scheme, and he will.”

The Prince announced that Trinidad would be a military dictatorship. Its flag would be a yellow triangle on a red ground. He began selling bonds for 1000 francs or $200, announcing that anyone purchasing ten of them was entitled to a free passage to the island. In San Francisco, he purchased a schooner to transport colonists and ferry supplies and mail between Trinidad and Brazil. He hired an agent to negotiate the construction of docks, wharves and houses. He also contracted for Chinese coolies to constitute an instant proletariat. On December 8, 1893, he instituted the Order of Trinidad, an order of chivalry in four classes to reward distinction in literature, the arts, and the sciences. He then commissioned a firm of jewelers to make a golden crown and issued a set of multicolored postage stamps.

An old friend from Paris, the Count de la Boissiere, became his secretary of state for foreign affairs. After working out of the Flagler residence at 18 W. 52nd Street, he opened a chancellery at 217 W. 36th Street. It was a room in a brownstone just west of 7th Avenue. Davis would visit it in the summer of 1894. Children were playing on the stoop with dolls. A vendor was peddling vegetables in the street. On the front door was a piece of paper bearing a handwritten note: “Chancellerie de la Principaute de Trinidad.”

In July 1895, the British government, then constructing a submarine cable to Brazil, landed troops and took possession of Trinidad as a cable station, based on Halley’s discovery in 1698. The Brazilians asserted a claim based upon the Portuguese occupation of 1700. Diplomatic representations were made; a mob at Bahia stoned the British consulate. No one remembered Harden-Hickey.

The Count de la Boissiere opened fire. His protest to Secretary of State Richard Olney pointed out that Harden-Hickey had notified the powers that he had taken possession of the uninhabited island of Trinidad in 1893. None of the powers had objected or opposed him. He asked the United States government to recognize the Principality of Trinidad and guarantee its neutrality.

August is the silly season, when no real news happens. In that long-ago summer of 1895—decades before electric fans or air conditioning—Olney may have been feeling the heat. He gave copies of the protest to the press corps for their amusement. The New York papers, much to the Count’s horror, ran stories poking fun at Prince James and at himself, his chancellery, his broken English, the formal manners so at odds with his squalid surroundings, and even his clothes.

It was around this time that Davis, then writing for The Evening Sun, called at the chancellery. On the wall he saw a notice of “Sailings to Trinidad.” It listed two: March 1 and October 1. The Count’s desk was piled with copies of proclamations, postage stamps, bonds and, in pasteboard boxes, gold and red enameled crosses of the Order of Trinidad. Davis found the Count “courteous, gentle, and…distinguished,” and gave Harden-Hickey a straight treatment. The other newspaper that treated Harden-Hickey with compassion was, odd though it may seem to us today, The New York Times. Reporter Henri Pene du Bois (grandfather of children’s author William Pene du Bois) and managing editor Henry Cary felt that Harden-Hickey and the Count were both in earnest and that their only fault was having a dream and the imagination to strive for it. One day, a pasteboard box appeared on the desk of each man: the Prince had awarded them the Order of Trinidad.

During the next two years, Harden-Hickey spiraled into depression. Without his island, he had nothing; furthermore, much of the world laughed at him for having tried to make his dream come true. No one, not even those who loved him, ever suggested that he had much of a sense of humor. And clearly, no one ever persuaded him to just go and get a job.

In 1897, Harden-Hickey completed plans for an invasion of England from Ireland. He asked Flagler to finance it. His father-in-law, perhaps not unreasonably, declined. Harden-Hickey never spoke with him again. He had drifted from his wife, too. While he had been in San Francisco hiring coolies and buying schooners, she had been in Paris; when she went to San Francisco, Harden-Hickey vanished to his Mexican ranch. Moreover, as Davis primly observes, Harden-Hickey “was greatly admired by pretty women.”

In early 1898, Harden-Hickey’s attempts to raise money by selling his Mexican land fell through. On February 2, 1898 he registered at the Pierson Hotel in El Paso, Texas, where he remained a week. According to Wallace, he was overheard to say that he was waiting for money from friends. He went up to his room at 7:30 p.m. on February 9. The following noon, the maids found him on the bed, a half-emptied morphine bottle on the nightstand. A letter to his wife was pinned to a chair. In his trunk was the crown of Trinidad.

New York Press, December 11, 2003

Small Change

The Lincoln cent was first struck in 1909. Its obverse portrait of the sixteenth president, designed by Victor David Brenner and bulldozed through the Mint bureaucracy by President Theodore Roosevelt, is the nation’s oldest circulating coin design. With over a billion pennies minted annually, this portrait is the world’s most frequently reproduced work of public art. One can find it in any gutter.

Nearly 225 years after the first Independence Day, we still call the cent, our smallest copper coin, after a British coin, the penny, which dates back over two thousand years. In the third century B.C., the Romans first coined the denarius, their standard silver coin, which was about the size of our dime. It was probably the coin handed to Jesus of Nazareth when he was asked whether paying taxes to Caesar was lawful; thirty of them were the blood money paid Judas Iscariot.

The emperors gradually debased the denarius, alloying its silver with ever-larger proportions of copper until no silver remained. After the Legions withdrew from Britain in the fifth century A.D., British kings struck crude imitations of the denarius, which invading Angles and Saxons called “penning” or “pfennig.” Hence, penny.

The North American British colonies used English money—pounds, shillings and pence—yet they lacked small change. Some places used wampum, Native American money made from beads. Connecticut made corn legal tender at two shillings a bushel and New York at five shillings a bushel. Congress made nails legal tender by size, which is why some are still called three-penny nails. The most common coins were Spanish milled dollars, the famous “pieces of eight” (so called because their value in Spanish money was eight reales). The Americans, logically, valued a real at 12½ cents (the old jingle, “Shave and a haircut, two bits,” refers to a 25¢ visit to the tonsorial parlor).

The federal government was too broke to strike its own coins; small change was a chaos of state and foreign coins, and merchants’ advertising tokens. The Continental Congress and the colonies had financed the Revolution with oceans of heavily discounted paper money. A horse worth 80 Spanish dollars cost $2,500 in paper money. Nor were the colonies’ monies at par with one another: one New Hampshire shilling was worth nearly five-and-a-half South Carolina shillings.

In 1782, Gouverneur Morris wrote a report describing the “perplexing…and troublesome” exchange rates among state currencies. He suggested a federal mint and a national coinage that would reconcile the different state currencies with an extremely complicated system involving a common denominator of 1,440 units.

Perhaps only two men in the United States understood what Morris had in mind. The other man, Thomas Jefferson, had a better idea. In July 1784, he published an article in The Providence Gazette and Country Journal stating the obvious: Morris’s scheme would prove enormously cumbersome in day-to-day business. Jefferson proposed a decimal system: “The most easy ratio of multiplication and division is that by ten. Everyone knows the facility of Decimal Arithmetic.” He suggested using the dollar, divided into dimes (even today the coin worth one-tenth of a dollar is denominated “one dime,” not “ten cents”), and copper coins worth one one-hundredth of a dollar, which became the cent. Finally, in April 1792, Congress established the U.S. Mint, which struck the first copper one-cent pieces in 1793. From the beginning, the public called them pennies.

The first cents were larger than today’s quarters. They were unpopular and expensive to produce. From 1850, the Mint experimented with possible replacements. In 1857 the Mint struck the first small cents, bearing a flying eagle. Two years later, Liberty appeared on the penny, wearing an Indian war bonnet. The model was no Indian: the designer, James Longacre, apparently popped an Indian headdress on his pretty daughter Sara. But charming Sara was merely a model for an allegory. Placing a historical person on a coin was seen as monarchical and inappropriate for a republic. This suited the Mint bureaucracy, who seemed more concerned with whether their coins would stack for easy counting than with their beauty.

At the beginning of the 20th century, Charles Barber had been chief engraver of the Mint for nearly a generation. Barber was competent without the slightest spark of creativity. His masterpieces were the so-called Barber dime, quarter and half dollar, first minted in 1892. The three designs were virtually identical, distinguished only by size and denomination. However, they were easy to mint and stacked well.

In 1905 Theodore Roosevelt dined with Augustus Saint-Gaudens, whose sculpture the President admired. Behind the Rough Rider wielding the Big Stick was a naturalist, man of letters, and aesthete, who took as strong an interest in public art as any president since Jefferson. The conversation drifted to the beauty of ancient Greek coins, which Saint-Gaudens described as almost the only ones of real artistic merit. The President, asking why the United States could not have coins as beautiful, challenged the artist: if Saint-Gaudens would design them, Roosevelt would mint them.

The sculptor and the statesman collaborated to make coins beautiful. Their triumph is Saint-Gaudens’ “Standing Liberty” twenty-dollar gold piece, first struck in 1907. On the obverse, a voluptuous Liberty holds aloft in her right hand the torch of Enlightenment and in her left the olive branch of peace. On the reverse, an eagle soars above a sun rising in glory. Saint-Gaudens died before he could design a new cent. Roosevelt had another artist for the job.

Victor David Brenner (1871-1924) was born in Lithuania on June 12, 1871. His father, an artisan, taught him engraving and jewelry-making in his shop (Father also carved gravestones). He also saw that Victor was instructed in history, languages, and the Talmud. In 1890, Brenner arrived in New York, and worked as a die cutter while attending Cooper Union, the Art Students League, and the National Academy of Design. Five years later, he began executing art medals for the American Numismatic Society. In 1898, he began three years’ work and study in Paris, where he won a bronze medal at the 1900 Paris Exposition.

In 1908, Brenner was commissioned to create a bronze plaque for the centennial of Lincoln’s birth in 1909. Around the same time, the Panama Canal Commission also retained him to design a medal. Its obverse would bear President Roosevelt’s profile. T.R. posed at Brenner’s studio. The President openly admired the plaster patterns for the Lincoln plaque and later suggested to his secretary of the treasury that Brenner’s portrait of Lincoln should appear on a coin. Consequently, Brenner was the only person invited to participate in formulating the new design, much to the chagrin of Charles Barber.

Brenner’s constraints were set by law. A one-cent piece was 19 millimeters in diameter and weighed 3.11 grams. It would be struck on bronze planchets, blank discs of copper alloyed with tin. The design had to include a date and mintmark; the words “Liberty,” “In God We Trust,” “E Pluribus Unum” and “The United States of America”; as well as the denomination, “One Cent.” The design also had to prove simultaneously easy to strike, long-wearing in circulation, and appealing to the eye.

The coin’s obverse replicates the centennial plaque’s bust. However, Brenner’s challenge in designing a penny was the relief, the relationship of the coin’s features to its field, the flat surface of the coin. When a coin executed in bas-relief (with the features extending above the field) is circulated, the design’s high points show the most wear. Brenner executed the cent in lower relief than the plaque, sacrificing depth and detail to meet coin production requirements and extend its circulating life.

Brenner’s design for the reverse was brilliantly successful. The words are rendered in a slender Gothic font inspired by the Vienna Secession, and two stylized ears of durum wheat enwreath the denomination. Though the obverse is traditional, the reverse is late art nouveau, perhaps with a strong flavor of art deco. At the bottom, Brenner placed his initials, V.D.B.

He created three-dimensional plaster models, twelve inches in diameter, of the coin’s obverse and reverse, with the relief in proportion to the finished coin’s. Once approved, each model was thinly plated with copper and placed on a pantographic reduction lathe. This machine’s tracing tool transferred the details in miniature into a soft steel blank, the master hub, which was then hardened by heat treatment and used to make the master dies from which the working dies were made.

The planchets were mechanically fed into the coin presses, which struck each planchet with three dies in a cold-strike process. A feeder placed the planchet on the lower or anvil die; a ram then forced the upper die against the planchet. The pressure compressed the metal, which flowed outward against the collar, which in turn forced it back into the cavities of the upper and lower dies. This took a fraction of a second.

The Lincoln cent was first issued on August 2, 1909. Crowds swarmed the Subtreasury on Wall Street to purchase them. Three days later, however, coin production was halted by a media-created controversy over Brenner’s initials on the reverse. Barber gleefully ground the initials entirely off the master dies. In 1918, after Barber’s death, Brenner’s initials were quietly placed on the lower edge of the truncation of Lincoln’s bust.

Brenner’s reputation soared. He executed hundreds of tablets, commemorative medallions and prize medals. His work is in the New York Public Library, the Brooklyn Museum, the Metropolitan Museum of Art, the Musée d’Orsay, the Boston Museum of Fine Arts and the Smithsonian. He was considered the most capable designer, engraver, and cutter of medals of his time. He arrived in New York a tradesman; his industry and determination transformed talent into genius.

However, none can foresee the enduring power of mediocrity. In 1959, to commemorate the 150th anniversary of the birth of Lincoln, the Mint replaced Brenner’s simple reverse with a bland rendition of the Lincoln Memorial. The design was created in-house; no outside submissions were considered. Some compare it to a bus or a trolley car.

After more than two centuries, the penny, now struck on lightweight copper-plated zinc planchets, lives on borrowed time. Its major constituency, the zinc industry, helps keep it alive through its support of a lobby, Americans for Common Cents. Its strongest argument against abolition is that all prices would then rise to the next five-cent increment. No politician wants that responsibility.

The penny is now only the small change required by state sales taxes. Once it was part of the blood-money of God.

New York Press, May 15, 2001

A Poor Printer of New York

William Bradford published Manhattan’s first newspaper, the New York Gazette, on November 16, 1725. According to F.L. Mott’s American Journalism, it was two pages long. Each page was ten by fifteen inches with two columns of text, “chiefly foreign news from three to six months old, state papers, lists of ships entered and cleared, and a few advertisements.” There were no illustrations. Its weekly circulation ranged from 300 to 350 copies.

John Peter Zenger would publish its first competitor. Born in Germany in 1697, he arrived in 1710 as a bonded apprentice to Bradford. Zenger served Bradford for eight years, learning the printer’s trade while repaying his passage. Later he opened his own printing shop.

On Aug. 1, 1732 Col. William Cosby became captain general, vice admiral, and governor in chief of His Majesty’s Province of New York and the Jerseys. The Colonel had needs. He demanded 1,000 pounds sterling from the Governor’s Council for lobbying services in London. Cosby also demanded that Rip Van Dam, the Council president, split the salary Van Dam had received while serving as acting governor and then sued for the money. Van Dam’s attorneys, James Alexander and William Smith, persuaded Chief Justice Lewis Morris to dismiss the suit. Cosby, in a fit of rage, replaced Morris with the “young and arrogant” James De Lancey.

The Morris family was wealthy, powerful, and proud. With other “gentlemen of the landed interest,” they organized against Cosby. In the fall of 1733, at a special assembly election, Morris crushed his Cosbyite opponent. Parenthetically, their campaigns largely consisted of treating voters to free drinks on Election Day, a tradition worth reviving.

On November 5, 1733 the Morrisites unleashed the New York Weekly Journal, edited and published by Zenger. Its articles attacked Cosby as an idiot, a Nero, a rogue and a lawbreaker, “tyrannically flouting the laws of England and New York.” The paper accused him of “incompetence, influence peddling, corruption, collusion with the French, election fraud, and tyranny.” It also exposed his padded expense accounts, mysterious dealings in government-owned lands, and greed for every imaginable perquisite. The Morrisites won the September 1734 city elections. On Sunday, November 17, 1734 Cosby ordered Zenger’s arrest for seditious libel. Seditious libel was the publication of statements intended to arouse the people against the government by either bringing it into contempt or exciting dissatisfaction. Truth was no defense to the charge. The hangman publicly burned the paper. Chief Justice De Lancey set bail far beyond Zenger’s means, requiring his imprisonment until trial.

The arrest prevented the paper’s publication on November 18. A week later, the Journal appeared with a front page apology:

As you last week were Disappointed of my Journall, I think it Incumbent upon me, to publish my Apoligy which is this. On the Lord’s Day, the Seventeenth of this instant, I was Arrested, taken and Imprisoned in the common Gaol of this Citty, by Virtue of a Warrant from the Governour where upon I was put under such Restraint that I had not the Liberty of Pen, Ink, or Paper, or to see, or speak with People, till upon my Complaint to the Honourable the Chief Justice, at my appearing before him upon my Habia Corpus on the Wednesday following. I hope for the future by the Liberty of Speaking to my Servants thro’ the Hole of the Door of the Prison, to entertain you with my weekly Journal as formerly.

Anna Zenger, John Peter’s wife, thereafter published the Journal, becoming New York’s first woman editor and publisher.

On April 15, 1735 Alexander and Smith appeared as Zenger’s counsel before Chief Justice De Lancey, challenging the court’s legality by arguing that Cosby’s appointment of De Lancey was unlawful. De Lancey held both lawyers in contempt, disbarred them and ejected them from the courtroom. He then appointed the honest and competent John Chambers as Zenger’s counsel, who took care of loose ends left behind by Alexander and Smith, such as entering a plea of not guilty.

On August 4, 1735, before a packed courtroom, the Attorney General opened for the prosecution, arguing the Governor, “the King’s immediate representative here, is greatly and unjustly scandalized [as a] person that has no regard to law or justice.”

Then, to nearly everyone’s surprise, an elderly man strode to the defense table and bowed to the Chief Justice. Andrew Hamilton, born in Scotland around 1676, had arrived in America, like Zenger, as an indentured servant. He practiced law in Pennsylvania, where he had been attorney general and was presently speaker of the assembly; he was also a practicing engineer, architect, and builder. (Hamilton’s most famous structure is Independence Hall.) Now he would argue his most famous case.

The Attorney General had used canned language in his pleadings, which charged Zenger with publishing “a certain false, malicious, seditious, and scandalous libel.” Each adjective thus became an element of the crime, requiring each to be proven at trial. Hamilton initially offered to concede that Zenger had printed and published the articles. The Attorney General claimed Hamilton was admitting libel: “I think nothing is plainer than that the words in the information are ‘scandalous, and tend to sedition, and to disquiet the minds of the people’ of this Province. And if such papers are not libels, I think it may be said there can be no such thing as a libel.”

Hamilton replied, “I must insist that what my client is charged with is not a libel; and I observed just now that [the Attorney General] in defining a libel omitted the word false.”

The Attorney General said, “But it has been said already that it may be a libel notwithstanding it may be true.”

Hamilton now had his opening. “We are charged with printing and publishing a certain false, malicious, seditious, and scandalous libel. This word false must have some meaning, or else how came it there? No, the falsehood makes the scandal, and both make the libel. [The Attorney General] has only to prove the words false in order to make us guilty.”

The Attorney General seemed irritated: “We have nothing to prove; you have confessed the printing and the publishing.”

Hamilton riposted, “We will prove those very papers that are called libels to be true.”

Now, the Chief Justice interjected, “You cannot give the truth of a libel in evidence.”

Hamilton briefly discussed the law of seditious libel, arguing that the cases creating the doctrine all involved false statements, making falsehood an element of the crime. He then distinguished the common law of England and of the colonies. An act punishable as seditious libel in England might not be in New York, for colonials enjoyed greater liberty than Englishmen.

Finally, Hamilton argued the jury’s inherent power to judge the law as well as the facts and refuse to convict if the law is unjust, a doctrine called jury nullification. He discussed a 1670 case, involving William Penn’s arrest for breaking the laws establishing the Church of England as the only lawful religion, by preaching a public sermon on Quakerism. At trial, Penn freely admitted preaching. The judge directed the jury to find Penn guilty. Four jurors voted to acquit. The judge ordered them jailed without food or water. After four days, they still voted to acquit. The judge fined them and ordered them imprisoned until they paid the fines. One juror, Edmund Bushell, sought a writ of habeas corpus. The Lord Chief Justice of England ordered the jurors’ release, ruling they could not be punished for their verdict. It followed that defendants were entitled to trials before a jury unintimidated by the government.

As great defense lawyers will, Hamilton redefined the issue at trial from whether Zenger was guilty of libel to whether a free people might criticize their rulers.

The question before the Court and you gentlemen of the Jury, is not of small or private concern, it is not the cause of a poor Printer of New York alone, which you are now trying; No! It may in its consequence affect every Freeman that lives under the British government on the main of America. It is the best cause. It is the cause of Liberty; and I make no doubt but that your upright conduct this day will not only entitle you to the love and esteem of your fellow citizens; but every man who prefers freedom to a life of slavery will bless and honor you as men who have baffled the attempt of tyranny; and by an impartial and uncorrupt verdict, have laid a noble foundation for securing to ourselves, our posterity, and our neighbors that to which nature and the laws of our country have given us a right; that liberty, both of exposing and opposing arbitrary power by speaking and writing the Truth.

After the Attorney General closed for the government, De Lancey instructed the jury. Their role, he said, was merely determining whether the statements had been published and, if so, whether they referred to the persons or institutions described in the charges. The truth of the statements was irrelevant and immaterial.

Zenger later wrote:

The Jury returned in about Ten Minutes, and found me Not Guilty; upon which there were immediately three Hurra’s of many Hundreds of People in the presence of the Court.

Forty Morrisites hauled Hamilton to dinner at the Black Horse Tavern, near William Street and Exchange Place. The next morning, as Hamilton sailed for Philadelphia, “he was saluted with the great Guns of several Ships in the Harbour, as a public Testimony of the glorious Defense he made in the cause of Liberty.”

After Gov. Cosby’s death in 1736, John Peter Zenger became public printer of the Province of New York. He published the Journal until he died on July 28, 1746.

Jury nullification, “non-cooperation with injustice,” as Clay S. Conrad of the Cato Institute called it, flourished until the last century. Jurors routinely refused to enforce the Alien and Sedition Acts, the Fugitive Slave Act, and Prohibition as unjust laws. In 1895, the United States Supreme Court held that trial courts need not inform jurors of this prerogative. Today, a trial judge would hold Hamilton in contempt for attempting to advise the jury of it.

New York Press, June 12, 2001

McCain and Me

John S. McCain won the New Hampshire Republican Presidential primary on February 1, 2000. No one reported the result of the Republican vice-presidential primary. Even the victor didn’t know the results for a week, until he checked the web site of the New Hampshire Secretary of State on Tuesday, February 8 and then downloaded the results.

I won. I polled 23,808 votes. Russell J. Fornwalt, of New York City, polled 18,512. Ours were the only names printed on the ballot.

As New Hampshire uses paper ballots, casting a write-in vote is much easier than in New York. The write-in candidates and their votes included Elizabeth Dole, 9,492; Alan Keyes, 5,426; John S. McCain, 3,994; Steve Forbes, 3,822; George W. Bush, 2,659; Colin Powell, 736; Gary Bauer, 496; Orrin Hatch, 218; Bill Bradley, 129; Albert Gore, 73; Wladislav D. Kubiak, 40; and Sam Costello, 35.

Finally, there were 3,908 scattering votes: people writing in themselves, or Donald Duck, or Donald Trump, which is all much the same thing.

Obviously, someone persuaded Mrs. Dole or her sorority sisters to spend time and money on telephone calls and a pulling operation to show her desirability as a running mate. The same seems true for Keyes. The other numbers seem to represent the usual falloff from the Presidential primary. People vote for their favorites for both offices. Perhaps also, as one sensible woman speculated to me, “Maybe it’s ‘I won’t vote for you for President, but I’ll vote for you for Vice President.'”

Whatever. The New Hampshire vice-presidential primary is meaningless. No delegates are bound to vote for me. I am far likelier to win the lottery or be struck by lightning than to find myself in the winner’s circle at this summer’s Republican National Convention, the Presidential candidate and I with arms raised in victory, ready to lead the GOP into battle against eight years of Democrat corruption and dishonor.

Why did I enter the primary? I wanted to test how many people would vote for someone of whom they knew nothing. Gary Bauer uttered some kind words about the voters in his withdrawal speech after polling one percent of the vote in New Hampshire: “These are serious people here. They take their citizenship seriously. They’ve done a good job of looking at all of us.” This is untrue, of course. Bauer was being a good loser.

That kind of statement could otherwise come only from someone who either was punch-drunk or believed he had a future in politics. A republic, literally, is public property: res publica. Its owners, the public, must take an interest in its affairs or it becomes the property of anyone who takes possession—seizes power, if you will. That happens. Our political system is an insecure oligarchy, seeking periodic moral ratification from the people. Perhaps the best rejection of the system is withholding one’s sanction by not voting.

But I digress. By contrast with the presidential candidates who spent millions of dollars on television, radio, and direct mail advertising, I spent nothing beyond my filing fee. In fact, I returned to an old American custom: I waged a rocking chair campaign (unlike McKinley, who waged a front porch campaign because he had a front porch) and let the office seek the man. I did not even go to New Hampshire.

Vice-presidential primaries grew out of the reform impulse of the Progressive era: to smash the power of bossism by placing all nominations in the hands of the people. Apparently, some states took this to a logical extreme. Maybe it was a kind of philosophical idealism: even an office for which nominees are customarily chosen by their running mates should lie–or seem to lie–in the gift of the people.

The first presidential primaries as we know them were in 1908. New Hampshire adopted a direct primary law in 1913 and applied it to the selection of delegates and alternates to the National Convention in 1920. In 1952, the state added a beauty contest for President and, inexplicably, vice-president.

Now there are only two vice-presidential primaries: New Hampshire and West Virginia. Ohio and Maryland had them too at one time. They were usually uncontested. Often, some local elder statesman would have his name placed on the ballot as a favorite son. (In Ohio, several aging Civil War generals were put up to it.) H. L. Mencken considered entering the Maryland Democratic vice-presidential primary in 1912: he declined when a mayor of Baltimore, convinced the lightning would strike him, entered his name. Of course, Mencken’s candidacy would have been just a sick joke.

Parenthetically, the eventual 1912 Democratic vice-presidential nominee and victor, the dapper, witty Thomas Marshall of Indiana, is remembered only for a response to Senator Francis Newlands of Nevada, a would-be Cicero fond of stringing sentences beginning with “What this country needs…” “What this country needs,” the vice-president riposted from the chair he occupied as President of the Senate, “is a good five-cent cigar.”

In New Hampshire, at least one candidate usually enters each party’s vice-presidential primary. One such was Austin Burton, a Republican who won the 1968 vice-presidential primary while–or perhaps by–arguing that we should return the country to the Indians. (He campaigned in a feathered headdress and claimed to have been made “Chief Burning Tree.”) In 1972, Endicott Peabody, a one-term governor of Massachusetts, entered the Democratic vice-presidential primary, espousing the view that the vice-presidency was important. He won, without opposition, only to receive around 100 votes at the Democratic National Convention out of some 3,000 cast.

Fewer candidates enter the West Virginia vice-presidential primary, and in some years no one does. I briefly considered doing both: then I decided I could do better things with West Virginia’s filing fee for vice-presidential candidacy, which is $1,750. In 1976, Ray Rollinson, who opposes abortion on demand and favors the re-legalization of marijuana (not a bad platform, that), entered both primaries. He won New Hampshire. Alas, in West Virginia, he was soundly defeated by Dale Reusch, an Imperial Wizard of the Ku Klux Klan from Medina, Ohio.

Entering a primary in New Hampshire is simple. One fills out a short form, attaches a check or money order for the filing fee, and either mails it or files it in person or by representative. The filing period for the 2000 primary ran from November 1 to November 19, 1999. Many candidates appear in person to file. Indeed, the Secretary of State notes in his records whether a candidate appeared in person. I mailed mine, as did most candidates. Then I contemplated the great questions of the day from my rocking chair.

The only controversy involving my candidacy, and everybody else’s, was a complaint to the state’s Ballot Law Commission by one Joseph S. Haas Jr., an occasional candidate for state office. The gravamen of his complaint was that the candidates had violated the law by tendering checks in payment of the filing fees and must be disqualified because, under the state’s Coinage Act of 1752 and an 1898 New Hampshire Supreme Court decision, State v. Jackson, the only money recognized by the state was coinage of gold and silver.

This seemed strange.  After all, most of New Hampshire’s taxpayers pay their taxes with checks rather than large sacks of coin.

Mr. Haas’s papers were even stranger.  Some were written in a crabbed hand; others were typed, with self-conscious eccentricities of usage such as “UN-answered” and “PAYment” or “‘pay’ment,” and annotated with hand-written corrections. And he signed his name as Joseph Sanders Haas Jr., Joseph S. Haas, and Joe Hass.  Maybe he was suffering from schizophrenia.  Maybe he was just a slob.

Maybe he wanted to create a Constitutional crisis on the cheap. The United States demonetized gold in 1933. A few years later, the U.S. Supreme Court ruled that bond indentures requiring payment of principal and interest in gold coin were enforceable only as obligations payable in legal tender, i.e., paper money. And our last circulating silver coins were struck back in the 1960s.

In common with the other candidates, I ignored Mr. Haas. The Ballot Law Commission held a hearing on December 17, 1999. Mr. Haas first argued that the candidates had defaulted by not responding to his complaint. The Commission did not consider his motion. Then he argued that a check was not immediate payment, but merely a promise to pay.  The Hon. Richard Marple, a State Representative, also argued that checks were not legal payment and only gold and silver coins were legal tender.

New Hampshire’s legislature has 400 Representatives in its lower house; they are paid $125 a year which, if Mr. Marple’s capacity for self-delusion is an indicator, is excessively generous.

The Commission, after analyzing the Uniform Commercial Code, determined that a check was not a promise to pay, but an order, a written instruction to pay money signed by the person giving the instruction, acceptable as a payment, and dismissed Mr. Haas’s complaint. The Republic was saved from the spectacle of a primary with no candidates on the ballot.

The next interesting development came on January 25, 2000, when I heard from my opponent. An envelope addressed to Hon. William Bryk from Russell J. Fornwalt landed in my mailbox. Mr. Fornwalt sent me a small “Russell J. Fornwalt for U.S. Vice President” calendar printed in blue, red, and black inks on light cardstock. He included a reproduction of his advertisement from the Carriage Town News, of Kingstown, N.H., in which he promised to restore dignity and integrity to the office of vice-president and called himself “the choice of the voters,” inviting people to learn more about his campaign by sending a stamped, self-addressed envelope to his post office box.

Finally, he enclosed a press release, bearing a black and white photocopy of a color snapshot of a benign, grandfatherly man (Mr. Fornwalt, presumably), seated at a secretary with several papers before him. The release was unsteadily typed with a manual typewriter, and two errors had been neatly corrected with a blue ball-point pen. Mr. Fornwalt wrote the following:

Russell J. Fornwalt, Republican candidate for vice-president in the New Hampshire Primary Election on February 1, 2000, has only one goal: Get the Vice out of the vice-presidency and put Virtue back in.

What is “Vice”? (Dictionary Definition No. 1: moral depravity or corruption; wickedness; a moral fault or failing).

Among other things, Candidate Fornwalt says he will not stand for the pardon of terrorists. He will not use YOUR telephones in YOUR White House for political fundraising. He will not resort to any kind of funny-money monkey-business. He will not be the Bag or BEGman for any national committee…

Fornwalt points out that there have been 45 VEEPS, starting with an Adams in 1789, including an Andrew, an Arthur, an Adlai, an Alben, an Agnew, ET AL. Fourteen of these 45 (about one out of 3) later became Presidents (with or without “VICE”) one way or another.

Mr. Fornwalt’s campaign literature bore neither telephone number nor e-mail address. So I mailed him an acknowledgment and wished him well.

Two journalists tracked me down. A charming woman from the Press Trust of India, who combined a sultry alto with a pukka accent, was irresistible and knew it, which made her even more so. The other, Mr. Al McKeon of the Milford Cabinet, combined excellent questions about my motives with an appeal to help him meet his deadline. I couldn’t resist that, either.

Both harped a bit on my comparative obscurity. “I’ve talked to reporters on papers throughout the state,” the woman informed me, “and they say they’ve never heard of you.” I murmured something about censorship by the liberal media. The fellow was less combative. “They say people like you who run for President or Vice President—people they’ve never heard of—tend to be nuts.” I said I’d never heard of them, either, but that “I’d never make that kind of generalization–not even after reading the Manchester Union-Leader.”

I don’t believe either interview has seen print.

Primary Day and Night came and went. I periodically glanced over the newspapers and checked the Internet for the results, until the moment of glory came. One can’t say it was my Warholian fifteen minutes. No one noticed. And McCain hasn’t called me.

February 16, 2000,  New York Press

The New Drag (And Why It Matters)

In the 1999 film adaptation of An Ideal Husband that recently came out on video, there’s a scene in which Lord Goring, the play’s hero, attends the opening of Oscar Wilde’s The Importance of Being Earnest. All London seems to have turned out for the occasion, which ends with Wilde himself appearing in response to cries of “Author, author!” while Goring, played by Rupert Everett, looks on approvingly. The scene (invented by Oliver Parker, who wrote and directed) is metaphysically daunting: a Wilde character at the premiere of another of Wilde’s plays. Chronologically, it has an historical elegance. An Ideal Husband and The Importance of Being Earnest were both running in the West End at the time of Wilde’s arrest in 1895 for the crime of homosexuality.

Parker’s film isn’t, as far as I know, based on Sir Peter Hall’s 1992 London production of the play, but I doubt if it would have been made if Hall hadn’t revived interest in the play. New York audiences who saw Hall’s production on Broadway four years later, with Nicky Henson in the role of Lord Goring, will find it hard to understand how revelatory it was. As Goring in the original production, Martin Shaw had been got up to look exactly like Wilde. He’d comported himself like Wilde, too, intoning his lines with the sonorous languor we’ve learned to associate with Wilde’s own speech mannerisms. In New York, Henson just wore a fat-suit and a poppy or a lily (I forget which) and trotted out the stock figure of the stage-dandy. Parker’s film goes a long way toward restoring the central mechanism of Hall’s production by casting Everett in the role of Goring.

What had been brilliant about Hall’s conception was its recognition of the coded message in the play. Hall was mining a literary observation: two characteristics that Wilde’s most likable heroes share with their creator are a flair for aphoristic reversals and a fondness for pretending to be something they are not. Lord Darlington, in Lady Windermere’s Fan, pretends to be “wicked” and is really the best of fellows. Lord Goring, in An Ideal Husband, pretends to be trivial and ineffectual and is really serious and resourceful. When Hall showed us Goring recast in Wilde’s own image, the double meaning of the title became clear: an “ideal” husband, whatever else he might be, was arguably a bisexual one.

Both plays actually contained covert references to homosexuality. An Ideal Husband was just a little bit more opaque than Earnest. The subtext of Wilde’s effervescent comedy about a sober, respectable young man inventing a secret identity for himself so as to be able to disappear periodically to do who-knows-what, seems so patently obvious to us now, it’s hard to credit even Victorian audiences with missing it. (What can they have thought all that stuff about “Earnest in town and Jack in the country” and going “Bunburying” was all about?) Of course it was essential that the majority of Wilde’s contemporaries not get the joke. What Hall’s production of An Ideal Husband brought out, which had hitherto gone unnoticed, was that Wilde’s dramaturgy operated on two levels, relying on a potent literary device that amounted to a sort of moral “drag.”

Actually, it was more properly meta-drag. We now use the term “drag” mostly to mean people cross-dressing for non-artistic reasons—because they like to, because it’s funny or fun. But of course theatrical cross-dressing is as old as theater itself, because before the invention of the actress in the Restoration period, characters of both sexes were played by males. (One theory is that “drag” referred to the long skirts that boys had to wear to indicate the gender of the characters they were playing.) What I’m calling meta-drag emerges with plays in which characters themselves were cross-dressing and the fundamental meaning of a play was dependent on the audience’s awareness that they were watching a male playing a female playing a male. A boy playing Ophelia in Shakespeare’s day was drag; a boy playing Rosalind or Viola disguised as a boy was meta-drag.

Wilde’s plays presented saints who purported to be sinners—or rather good men who liked to pretend to pretend to be bad. For Wilde’s protagonists, the affectation of affectation was a moral imperative given the institutionalized hypocrisy of society. On another level, they really were “sinners”—in society’s view—if you were hip enough to get the subtext.

Wilde didn’t invent meta-drag, but he seems to have invented the idea of applying it to non-gender-based antitheses—wicked or pretending-to-be-wicked, frivolous or affecting-a-pose-of-frivolity, affected or just saying that you were. (Years later, Joe Orton would invent a brand of comedy based on the premise that homosexuality doesn’t exist, but whose whole force and authority really relied on an audience’s knowledge that it does.) Exactly what makes meta-drag so potent and dangerous—and it almost can’t help being subversive—is a mystery. But no self-respecting culture can afford to be without it, particularly one (like ours) that’s stuck in the idea that theater is supposed to be realistic.

Our drama thinks that the only way for theater to arrive at truth is by saying things directly, a notion that Wilde’s protagonists tended to dismiss and his plays to refute. A culture without meta-drag or its equivalent is one whose theater (like ours) is going to be short on irony, ambiguity, metaphor, artifice, and moral complexity—all the things that great drama tends to revel in.

I remember some years ago having a worried conversation with a friend about what would happen to meta-drag now that there was no longer any real need for it. (My friend is fond of Ronald Firbank, another master of covert homosexual literature.) We actually discussed, though probably not seriously, which was more important: the ability to lead an openly gay life or a form of literature that gave rise to works of such complex reverberating genius.

In fact, what’s happened is that new forms of meta-drag have begun to emerge in the last few decades, very much on the Wildean model but using other binary oppositions—questions of humanity, race, identity, species, even ontology and metaphysics. You see it all over the place, any time an actor purports to play something an audience knows he is not. You saw it years ago in the sitcom episode in which the straitlaced neighbor played by the openly gay actress hit on a man who turned out to be gay. You saw it in Nicholas Hytner’s production of Carousel, in which Audra McDonald played Carrie Pipperidge, and she and Mr. Snow marched around lily-white New England attended by offspring of all different colors and races. You saw it in American Beauty, when the character played by Kevin Spacey repulsed an unlooked-for advance from the gay-bashing colonel played by Chris Cooper.

More recently, you saw it in Being John Malkovich in the scenes in which Malkovich was supposed to be playing someone else inhabiting his own body. There may even have been an element of meta-drag in the episode of The Sopranos in which Michael Imperioli came close to writing his own character out of the series.

Meta-drag, which is “non-vertical”—it creates links of a moral and thematic nature after the fashion of hypertext—makes us momentarily question what does or doesn’t matter or what might or might not be true. For a split second, some piece of knowledge that you knew you possessed but had forgotten because artistry—good acting or the compelling nature of the story—had made you oblivious to it (someone’s gender, the rumors about them, their racial background) flits through your mind. The thought it prompts is gone in a moment, but it leaves an afterimage.

Meta-drag thinks it’s important to remember that nothing is as simple as black-or-white, real-or-fictional, alien-or-human, crazy-or-sane. And, yes—as with the earlier meta-drag, the fundamental opposition lurking behind all these others is probably the primal old right-and-wrong opposition. But that would be boring to dwell on. That would be banal. So meta-drag dresses it up in all these other outfits, the more elaborate the better. That’s what those often dizzying degrees of reality are about: they’re fun and impressive. Also, there’s a moral truth they tend to reenact over and over, which is that when you expose one hypocrisy there’s always another ready to spring up and take its place.

New York Press, July 4, 2000

The Merchant of Death

Twenty-five years ago, I was traveling by train up the Hudson’s east shore. About an hour out of Manhattan, I glanced up from my book. We were about four miles north of West Point, near Storm King. About a thousand feet away, a great red castle towered above a craggy, lush green island. The massive keep was a roofless ruin. But it was nonetheless a vision unutterably romantic, and I wondered why and how it had come to stand there.

The island is called Pollepel, from the Dutch for “spoon” or “pot ladle.” It is mostly rock and covers six and three-quarters acres with a maximum elevation of 115 feet.

Craig Poole, an area resident, noted in a recent article that the Indians believed the island was haunted and would not stay on it at night. This made Pollepel a useful refuge from Indian attacks. Early Dutch mariners believed Pollepel marked the northern limit of the domain of the Heer of Dunderberg, the fiend of the Hudson Highlands. New sailors were inoculated against the goblin king by ducking in the river as their boats passed the island. During the Revolution, patriots constructed a chevaux de frise—an underwater fence of 106 iron-tipped logs, designed to impale and sink ships—between the island and Plum Point on the western shore. For most of the next century, Pollepel was a center for rumrunners and moonshiners.

Then, on December 5, 1900, Pollepel was sold to Francis Bannerman VI and all things changed. Frank Bannerman was born in Dundee, Scotland, in 1851. His family name had been granted to an ancestor by King Robert the Bruce at Bannockburn as an honor for valor in regaining a banner momentarily taken by the enemy.

Frank’s family emigrated to Brooklyn in 1854. Francis Bannerman V, Frank’s father, began buying flags, rope, and naval stores at Navy auctions for resale. When he joined the Union army, Frank took his place. After the Civil War, the War and Navy Departments discarded weapons by the ton, often for mere scrap value. Frank bought as much as he could. During the next five decades, Bannerman’s became the world’s largest buyer of military surplus.

Frank loved old weapons and uniforms as historical artifacts and promoted their sale as decorative items. Thom Johnson, who lectures on Bannerman and his island, claims roughly half the cannon used as war memorials throughout the United States originally came from Francis Bannerman’s Sons.

Most arms dealers are brokers: one orders the weapons and waits. Frank maintained a huge stock in his warehouses, ready for immediate delivery. After moving from Brooklyn to New York City, Frank began outfitting entire militia regiments. Later, he would equip whole armies.

During the 1870s, Frank began publishing a catalogue, which was revised and published annually into the 1960s. The catalogue was far more than a price list: it was illustrated with thousands of line drawings, engravings, and photographs. Frank described his guns, swords, and other militaria in mouth-watering detail with lavish accounts of their history and use. The New York Sun wrote, “Bannerman could tell an interesting story about everything he had for sale.”

Frank Bannerman reveals himself in his catalogues: a practical romantic, enamored with tales of valor, intrigued by how things work, and a good writer with a dry sense of humor and an unembarrassed religious faith. He pointed out that the New Testament shows the Apostles carrying at least two swords with them as they accompanied Jesus in his preaching, Peter using “one to good effect,” and argued that “the carrying of weapons met with the approval of the Prince of Peace.” Yet he prayed for the day that Bannerman’s Military Museum would become “The Museum of Lost Arts.”

In the meantime, as “St. John’s vision of Satan bound and the one thousand years of peace” was not yet in sight, Frank was making money in the second-hand arms market, particularly in South and Central America. Smaller countries needed weapons, but not state-of-the-art equipment. Frank was their man.

Haiti, in particular, often dealt with him. Indeed, Bannerman could be considered almost a part of Haiti’s political process. Robert Debs Heinl’s admirable Written in Blood, perhaps the only extensive modern history of Haiti in English, points out that late Victorian Haiti enjoyed liberal democratic constitutions. They were ignored by all players. Every few years, some politician conceived presidential ambitions. He went unto the hills and sounded out the caciques, the local politico-military bosses. This is analogous to entering the New Hampshire primaries. If he found support, he then went to campaign contributors, usually promising that bread cast upon the waters now would return a thousand-fold upon his victory when his supporters could steal the customs house receipts. This is like raising campaign funds from road contractors when running for County Superintendent of Highways.

The money went to arms dealers in New York, comparable to political consultants who produce commercials. One gathers Bannerman was among them: the Haitian common soldiery were largely clothed in Civil War castoffs, a market Bannerman had long since cornered. Bannerman’s catalogues offered everything a candidate might need, from muskets and Gatling guns to second-hand steam yachts, suitable for conversion to warships “fully armed and equipped,” FOB New York—all, of course, strictly cash in advance.

The candidate then returned to Haiti, landing outside Port-au-Prince to link up with cacique armies streaming down from the hills. The incumbent president calculated his chances, looted the Treasury for the last time, and took the steamer for Jamaica. The victor seized the National Palace while the troops had a good time in the big city. A few years later, it would begin all over again.

Around the turn of the century, Frank purchased 501 Broadway, near Greene Street, for a retail store and free military museum, opened to the public in 1905. The New York Herald said of it, “No museum in the world exceeds it in the number of exhibits.” There were the thousand different guns (“from the early matchlock, up to the present day automatic”), the thousand different swords (from the Roman “bronze blade…to the present day regulation”), the thousand different pistols, and the other appurtenances of the pride, pomp, and circumstance of glorious war.

Amid all this, after the Spanish-American War of 1898, Frank purchased ninety percent of all captured Spanish war materiel in a sealed bid auction. Buying the accumulation of Spain’s four centuries in Cuba was all in a day’s work for Frank: flags, body armor, medals, uniforms, swords, saddles, Gatling guns, field and coastal artillery, 200,000 Mauser rifles, 30 million cartridges, pistols, shells, and thousands of tons of black powder.

The City government took a dim view of keeping it within city limits. Hence Frank’s purchase of Pollepel. For the next seventeen years, Frank Bannerman designed and supervised the construction of the island’s storehouses, workshops, docks, gardens, and moat. He built a power plant and laid telephone lines to the mainland. He created an artificial harbor by sinking in place old wheat barges and railroad floats filled with stones and dirt, covering them with concrete, and building an ornate breakwater with towers and bridged gateways.

Then, in 1908, Frank Bannerman started his private castle, the fortress of his dreams, armed with an extraordinarily dark, brooding, and passionate misunderstanding of Scottish Victorian architecture. The entire complex — breakwaters and all — became an encrusted mass of crenellations, turrets, moats, and battlements. The Bannerman arms were above nearly every doorway; the street lights were formed like thistles; Biblical quotations were cast into the fireplace mantels. Almost all of it was done without the help of an architect.

Pollepel became, as one commentator said, “a bit of Scotland…deposited on a bare bit of rock mid-river.” A heavily illustrated slick-paper pamphlet, Bannerman Island, published by the firm in 1918, captures Pollepel’s eccentricity. On the cover, the warehouse glowers above the landing dock, ornamented with six-inch naval guns, with a road leading to a gate tower and portcullis. Along the roof line, just below the crenellations, the words “BANNERMAN’S ISLAND ARSENAL” stand out darkly against the light brick.

The largest cannon could be barged up the Hudson to his dock, while “the surrounding water provided at least moderate protection from casual visitors,” as Joseph Schroeder wrote in his introduction. Frank brought guests out to the island by motorboat and gave them a great time with swimming, hiking, picnics, and firing cannon from his castle walls.

Frank died in November, 1918, worn out by the First World War. Two years later, on August 15, 1920, his arsenal survived a spectacular accident: two hundred pounds of black powder exploded, heaving “brick, munitions, and equipment high into the summer sky.” The blast flung a twenty-five-foot section of wall nearly a quarter mile, landing it across the New York Central tracks along the east shore. The cities of Hudson and Peekskill were shaken by the explosion. Schroeder wrote that “Contemporary newspaper reports state that rescue boats were kept from approaching the island for some time by…exploding ammunition.” Yet only three people were injured.

The Bannerman family continued using the island as a summer residence. In a recent interview, Jane Bannerman, the widow of Frank’s grandson Charles, said that the castle was well-maintained as late as the thirties, with the grounds cluttered with relics such as the great chain placed across the Hudson at West Point during the Revolution, sleighs and other equipment used by Peary to seek the North Pole, and a table used by George Washington.

The catalogue continued offering the stuff little boys’ dreams are made of: African spears, Moroccan saddles, cannon, swords, scimitars, uniforms, helmets, cocked hats, armor, medals, pistols, rifles, artillery, armed yachts. But the Second World War bred massive competition: Frank Bannerman’s heirs were not so much his own flesh and blood as the Army-Navy stores in every town. Francis Bannerman’s Sons played no role in the disposition of World War II surplus comparable to that of the Spanish-American War. The firm could no longer provide huge quantities of equipment from stock as when, during the Russo-Japanese War, Frank Bannerman had sold the Japanese 100,000 Mausers and 20 million cartridges — all from inventory, all immediately available.

The Broadway store, still considered the greatest private military museum in the world, was now five stories and several subbasements of dusty confusion. Yet, as Harry Wandrus wrote in the June, 1960 issue of Hobbies magazine, a searcher willing to get filthy could still find complete Civil War vintage Springfield and Enfield muskets and the spare parts to maintain them.

Conditions on the island were worse. The buildings had been neglected and much of the materiel in storage damaged beyond recovery. In the January, 1959 issue of Guns magazine, Bill Edwards described visiting the island with Val Forgett, an explosives expert whom Bannerman’s had retained to deactivate the live ammunition: “Noting…two giant 16 inch shells flanking the entrance to the harbor, Val grabbed a wrench and had an assistant boost him up to the nose so he could unscrew the fuse…the fuses were live and the projectiles full of high explosive…” Apparently, their efforts were successful: they survived.

“The Largest Dealers in the World in Military Goods” closed 501 Broadway in 1959. The museum pieces from both the store and the island went to the Smithsonian Institution. In 1967, the Bannermans sold the island to New York State. They supposedly removed the old military merchandise. But the closing was a rush job: Jane Bannerman said, “We always meant to go back to get personal things, like my grandmother’s Irish linen bed sheets.”

The Taconic State Park Commission took possession on July 1, 1968. They offered public tours of the island and planned to preserve the buildings as a park. On the night of August 8, 1969, a great fire reduced the buildings to bare walls. Perhaps not all the munitions had been removed. Some still speculate about what lies under the ruins.

The Bannerman Island Trust, PO Box 843, Glenham NY 12527, telephone (914) 831-6346, has persuaded the State to study reopening the island to the public and works to stabilize the buildings for future restoration. For today, as Lenore Person recently wrote, the island “is covered with poison ivy. And it’s infested with snakes and deer ticks.”

Yet Frank Bannerman’s castle still stirs the imagination, its ruined walls rising from the river mists, as distant, as untouchable, as a dream.