Pocket Change

If you want to find a mirror of a society’s ideal—the image of what it hopes and imagines itself to be—public sculpture is as good a place as any to start, and none is more common or readily available than the public sculpture we carry around with us on the coins in our pockets.

This year will bring some changes to the world’s most common public sculpture, the Lincoln penny. The occasion is the Lincoln bicentennial, and the Mint is happy. Collectors and speculators were glad to pay $8.95 for the two-roll sets of the new coins (worth $1.00) that went on sale on February 12, 2009 and sold out within a month. And so far few have complained about the new reverse design, which represents the Kentucky log cabin at the Abraham Lincoln Birthplace National Historic Site. (Of course, that cabin is itself a representation of someone’s idea of the original structure.)

Three more designs, one to be issued every three months throughout the year, will represent, respectively, Lincoln’s education, his pre-Presidential careers as lawyer and politician, and his Presidency. In 2010 and beyond, the Mint will issue yet another reverse, “emblematic of President Lincoln’s preservation of the United States of America as a single and united country.” So there will be five new designs, each issued by the mints at Philadelphia, Denver, and San Francisco (each mint’s coins bear a distinguishing mint mark: P, D, and S, respectively), creating fifteen new coins for the delectation of collectors in less than thirteen months.

The original Lincoln cent, designed by Victor David Brenner, reflected the genius of the sculptor and of President Theodore Roosevelt, himself an aesthete, who forced change on the Mint bureaucracy of his day because he found the coinage of the United States unworthy of a great republic. It still is. For the most part, the heroes on our coinage and paper money are the men who were considered great half a century ago.

Surely John F. Kennedy’s reputation has undergone re-evaluation since he replaced Benjamin Franklin on the half-dollar in a moment of national grief. Walt Whitman, George Gershwin, George S. Kaufman, Jonas Salk, Earl Warren, Eleanor Roosevelt, Ronald Reagan, Sojourner Truth—from the arts and sciences alone the list of possible alternatives to the present set of political icons on our coins and currency (which date from the New Deal or before) is almost limitless.

All this brings to mind something I thought about while emptying my pockets the other day.

At some time in the last century, I was taken to a Broadway revival of the musical comedy “1776.” In one scene, an actor named Paul Michael Valley, who played Thomas Jefferson, briefly stood in profile, silhouetted against an open door. Some suburban housewife in the next row murmured to her neighbor, “He looks just like the guy on the nickel.” Indeed, he did, which may explain his casting.

Anyway, while putting my pocket change on the dresser, I noticed one of those odd nickels struck by the Mint to commemorate the bicentennials of the Louisiana Purchase and the Lewis and Clark Expedition. The obverse looked like the Man in the Moon. Of course, the head was still old Tom’s, but the image had changed.

The Purchase itself was commemorated in 2004 by adapting the design of Jefferson’s Indian Peace Medal for the reverse of the five cent piece. The Indian Peace Medals, a British tradition continued after Independence, were large, attractive silver medals awarded by the United States to Indian chiefs or other important men on such occasions as major conferences or the signing of treaties. Intriguingly, this custom reflected the European tradition of exchanging decorations at historic moments of concord, which the United States has otherwise never adopted.

In place of the King appeared the current President, and on the reverse, in Jefferson’s case, appeared two clasped hands, the one to the right with a metal wristband of the sort frequently worn by Indian chiefs, and the one to the left with an army officer’s braided cuff, all beneath a crossed hatchet and inverted peace pipe. The medal also bore the words, “Peace and Friendship.”

Many of Jefferson’s medals were given to Indians during Lewis and Clark’s expedition from St. Louis to the Pacific Coast between 1804 and 1806. They are mentioned in the Expedition’s Journal as among the articles taken for presentation to the Indians. On August 1, 1804, the Journal records the gift of a “First Grade” medal and flag to a “Grand Chief,” of medals of the Second Grade to lesser chiefs, and of the Third Grade to inferior chiefs. Certificates to accompany the medals were also issued, such as one surviving in a California collection which refers to “the special confidence reposed by us in the sincere and unalterable attachment of War Charpa the Sticker, a Warrior of the Soues Nation, to the United States…”

Later in 2004, the Mint issued yet another kind of nickel with a reverse featuring one of the Expedition’s flatboats, driven by both sail and poles, like a Mediterranean war galley.

In the spring of 2005, the year of the coin I found in change, the Mint changed the coin twice over. The obverse had Jefferson’s profile, so oddly presented as to leave the coin resembling the Man in the Moon, with the word “Liberty” in Jefferson’s handwriting. The reverse adapted, for the next coin in the series, one of the Mint’s most popular designs, James Earle Fraser’s Buffalo nickel of 1913-1938. Over 1.2 billion Buffalo nickels were struck during that quarter-century. They have now vanished from circulation. But as late as the 1960s, one still found Buffalo nickels in change, with the stern profile of an austere Indian warrior on the obverse and the massive buffalo on the reverse.

The visual economy of Fraser’s design is profoundly moving: without a touch of sentimentality, few accessible works of art so powerfully visualize the nobility of tragedy. And his commanding, virile bison dominates the design of its coin.

But as is often the case in the Mint’s modern adaptations of older designs from a heroic past, the modern buffalo, despite its unequivocal masculinity, lacks confidence. It seems neutered, almost cringing, standing, somehow off-balance, on a small, sloping patch of prairie, fenced in by the words “United States of America.”

In the fall and winter of 2005, the coin was changed yet again: the Man in the Moon obverse was coupled with a new reverse, a view of the Pacific Ocean with a quote from Captain Clark: “Ocean in view! O! the joy!” The minor irritant here is that Clark, whose writings betray a libertarian, if not Shakespearean, attitude toward standardized spelling, had written “Ocian.” While the Oxford English Dictionary includes “ocian” among its citations, the Mint corrected Clark’s usage. According to CNN, when questioned about this, a Mint spokeswoman answered, “We didn’t want to confuse anyone into thinking we couldn’t spell.” Again, a lack of confidence, this time in the reality of Clark’s spelling.

The following year brought the most unusual change in the design. Even as Monticello returned to the reverse, the obverse had Jefferson gazing at the viewer in one of American coinage’s first full-face designs. This is unusual for a technical reason. Coins first show signs of wear on the highest point of the design. The traditional profile tends to wear gracefully. But a full-face design tends to wear out nose first.

The most notorious example of this was the Copper Nose coinage of England’s King Henry VIII. By the 1540s, Henry was running out of money due to his personal and public extravagance. He both raised taxes and debased the coinage, transforming the nominally silver shilling into a copper coin that was dipped into a silver nitrate solution; the reaction deposited a thin wash of silver on the coins.

Instead of a conventional profile, the new coins bore the King’s facing image, executed with surprising candor, bearded and repellently bloated. Even a little wear on the coin’s highest relief – which, with a facing portrait, is the nose – revealed the copper core. As the coating wore off Henry’s nose, it turned reddish brown, and the king acquired the nickname “Old Coppernose.”

Such a sobriquet is unlikely to be attached to Jefferson, who is, after all, nearly two centuries dead. His coin is silver-colored metal all through. But the full face design is off-putting and unattractive, and one hopes the Mint will return to the customary usage in its future coinage.

Another Last Hurrah?

As becomes a citizen, I have occasionally run for public office. As Édouard Herriot, four times Prime Minister under the Third Republic, said whenever he was running for anything, from conseiller municipal to Président de la République, “I have placed myself at the disposal of my friends and the service of the Republic.” In my case, I was simply doing my best to sabotage and annoy the office-holding element among The Wicked Who Prosper.

Most recently, I stood for Vice President of the United States in the New Hampshire primaries, which I wrote about in 2000.  It was all in good fun, and when—much to my surprise and delight—I won, I found myself with yet another anecdote for dinner conversation.

So I was saddened when the May 1, 2009 issue of Richard Winger’s indispensable Ballot Access News reported that on April 22 the New Hampshire legislature passed a bill to eliminate the vice-presidential primary.  As Mr. Winger notes, “No other state has a vice-presidential primary.”

He goes on to point out that “Generally, no one who really has a chance to be chosen by a major party for vice-president ever files in this primary.” There is, in a sense, a reason for this. Since New Hampshire instituted the vice-presidential primary over fifty years ago, the contest has developed a laudable purpose uniquely its own: the potential embarrassment of an incumbent vice-president.  If, like me, you believe that politicians are fair game, then the vice-presidential primary is simply a happy hunting ground.  Besides, until fairly recently, the vice-presidency was an absurdly empty job with its occupants as worthy of respect as the hapless Alexander Throttlebottom, the vice-president in Gershwin’s 1931 musical Of Thee I Sing.

All is not yet lost: the Legislature’s website indicates that the bill has not yet gone to the governor for signature. Though my stumping days are behind me, I for one fervently hope that it does not. While only one incumbent has actually been defeated (Dan Quayle, 1992, whose name did not appear on the ballot, who did not wage a sub rosa campaign for votes as most incumbents do, and who was overwhelmed by an unknown candidate who paid his filing fee, appeared on the ballot, and was never heard from again), the possibilities presented in 2012 by the loose-lipped Joe Biden seem limitless and irresistible.  It would be a pity if New Hampshire were to spare him that potential humiliation.

Aurelia Greene, Evergreen

Last week, a special election was held for Bronx Borough President, a job which, since the Charter reforms of the early 1990s, is largely ceremonial. The marvelously named incumbent, Adolfo Carrion, had resigned office to accept appointment as Director of the White House Office of Urban Affairs.

Even before State Assemblyman Ruben Diaz Jr. had won the Borough Presidency, he had announced that he would appoint State Assemblywoman Aurelia Greene his Deputy Borough President. She will be the deputy to a public official who doesn’t have much to do. Her duties are unlikely to be taxing.

The Assemblywoman’s name rang a chime in memory. She had been involved with Bronx Community School Board #9. Both she and her late husband, the Honorable Reverend Dr. Jerome Greene, who favored using all his titles at once, had been elected to the Board repeatedly. The Honorable Reverend Doctor had been its president from time to time, and by coincidence, Board #9 had employed several of Assemblywoman Greene’s relatives while she was serving on the Board.

Pastor of the Bronx Charismatic Prayer Fellowship, a church that met in his family manse, and Founder and President of the Bronx Unity Democratic Club, Dr. Greene had pled guilty in 1991 to larceny when he admitted using City money to pay for cameras, television equipment, and other merchandise purchased for his personal use. He also admitted using Board of Education employees to print political campaign literature at Board of Education expense.

The Greenes had previously been indicted for stealing a piano from Intermediate School 145. They beat the rap on that one, although the piano had ended up in their house, where it was supposedly used in his religious services. Dr. Greene was re-indicted on misdemeanor theft of services charges for using school employees to transport the piano to his house. However, that charge was apparently resolved when Dr. Greene pled to the larceny charge.

Both Greenes were serving on Board #9 when, in 1988, the late Chancellor Richard Green suspended them and the rest of the board amid charges of drug use and drug trafficking, extorting money from teachers, and stealing school equipment. The investigations leading to the Board’s suspension had stemmed from the arrest and subsequent conviction of a Board #9 principal, Matthew Barnwell of P.S. 53, for buying crack.

In addition to Dr. Greene and Mr. Barnwell, eight other people from Board #9 were convicted of crimes that included signing phony invoices, bribery, and defrauding the government. The Chancellor removed the district superintendent, Dr. Annie Wolinsky, for mismanagement. Dr. Wolinsky was unable to explain how, while schools went without basics like chalk and paper, thousands of dollars of uncatalogued supplies were stacked in the district warehouse, or why eight district employees worked only at videotaping the board members. She also couldn’t explain her failure to discipline Mr. Barnwell for, among other things, 142 instances of lateness or absence in the course of a 184-day school year.

However, death pays all debts. Despite his criminal record, Dr. Greene has been immortalized by the New York City Council, which, by enacting Local Law 131 in 2005, renamed part of Teller Avenue in The Bronx as Reverend Jerome A. Greene Place. In his remarks at a December 29, 2005 public hearing before he signed the bill, Mayor Michael Bloomberg stated that the individuals commemorated that day, Dr. Greene among them, were being honored for their lifetime accomplishments.

I gather that one commentator has suggested that Assemblywoman Greene wanted a job that wouldn’t require a commute to Albany. As she trades the State Capitol for Bronx Borough Hall, everything looks pleasant for her: she is trading one well-paid job for another equally remunerative, and as she journeys toward the sunset of life, the downhill road is comfortably paved with city paychecks.

Piracy Then and Now

Now that the drama of the Somali pirates is passed—for the moment, anyway—it seems worth commenting on an op-ed piece about the incident that appeared in the Sunday, April 12, 2009 edition of The New York Times.

By midday of that day, the American merchantman Maersk Alabama was safe in Mombasa, Kenya. Captain Richard Phillips, who’d been held for five days in a 28-foot lifeboat, had been freed.  Three of the four pirates who’d held him captive had been slain—three shots, three kills—by American sniper fire from the destroyer U.S.S. Bainbridge.

The Times piece began by pointing out that American citizens being held hostage by pirates is nothing new, alluding to the American wars against the Barbary pirates of North Africa at the turn of the 19th century.  It quickly became vague and diffuse, dwelling on no single incident and offering very few specifics about that interlude, save for a brief reference to the destroyer’s namesake, Captain William Bainbridge, who was himself taken by pirates in 1803 when his command, the frigate U.S.S. Philadelphia, grounded on an uncharted reef off the shores of Tripoli and was rendered defenseless.

Instead, the Times piece dwelt at some length on the loss of nineteen American soldiers in the Battle of Mogadishu on October 3-4, 1993, an incident dramatized in the 2001 movie Black Hawk Down.  There, the Americans had been supporting a United Nations initiative to end yet another in a long series of civil wars in Somalia.  After Somali militia had killed and reportedly mutilated twenty-four Pakistani peacekeepers, the UN Security Council authorized the arrest of those responsible.  A small American force went into Mogadishu to capture the Somali foreign minister and his political adviser and remove them by truck.

We had underestimated the enemy’s capacity for resistance. The Somali militia were, after all, irregulars, semi-trained guys in colored T-shirts and flip-flops, and no one took them seriously despite their AK-47s, rocket-propelled grenades, and proven capacity for urban warfare, or the fact that we were on their turf, in their own country.

The mission went south very quickly.  The Somalis barricaded the streets with jalopies and piles of burning tires and shot down two American helicopters.  Thus a one-hour operation became a fifteen-hour battle, ending only when a column of American, Malaysian, and Pakistani troops, tanks, and armored personnel carriers fought its way into the city and evacuated the mission.

The bodies of several dead American soldiers were dragged through the streets by Somali mobs.  This was bad for American domestic consumption, and so on this basis alone the Clinton administration stopped all American actions against the Somali militia, withdrew all troops within six months, and thereafter avoided the use of boots on the ground to support its foreign policy.

In focusing on a single mismanaged mission kept alive in public memory by a Hollywood blockbuster, the Times editorialists successfully avoided discussing the success of that earlier mission against the North African pirates, long before air support, radio, or steam power.

Captain Bainbridge’s ship had been swarmed by Tripolitanian gunboats and compelled to surrender.  Once Philadelphia had been refloated and brought into Tripoli, the pirates seem to have found the ship—a late 18th century square rigger—a little sophisticated for them.  Nonetheless, her mere existence was a potential threat to Western shipping.  After all, the Tripolitanians might eventually figure out how to sail and fight her.

The commander of the American squadron in the Mediterranean, Commodore Edward Preble, was an old Revolutionary who apparently understood that threatened or actual violence is as integral to effective foreign policy as negotiation.  He chose to eliminate the threat posed by the captured man-of-war by destroying her.

He ordered a mission that would involve sailing into a heavily-fortified enemy harbor, boarding and burning a warship held by pirates, and then, God willing, returning home.  Lieutenant Stephen Decatur, just twenty-five years old, was appointed to lead the operation.  He was given the Mastico, a filthy captured Tripolitanian ketch (a tiny two-masted sailing craft), which was renamed U.S.S. Intrepid (the first of four American warships that have borne that name).

His crew was taken from the schooner U.S.S. Enterprise and Preble’s flagship, U.S.S. Constitution.  With a Maltese pilot who knew the harbor at Tripoli, Decatur sailed from the Sicilian port of Syracuse on February 3, 1804.  Storms delayed him en route, and he did not reach Tripoli until late afternoon on February 16.  Intrepid was disguised as a Maltese trading vessel under British colors, the United Kingdom having maintained good relations with the pirates by paying them tribute, which we might today call protection money.

Around 7:00 pm, navigating by moonlight, Decatur sailed into the harbor and, claiming to have lost his anchors, requested and received permission to tie up alongside Philadelphia.  As Intrepid came alongside, with some of her crew tossing lines to the frigate, Decatur and the others were huddled along the bulwarks, ready for action.  A guard aboard Philadelphia saw something and shouted, “Amerikanos!”  Decatur gave the order to board and, cutlass in hand, led sixty men up the frigate’s side.  The pirates, caught by surprise, did not resist; most dove overboard and swam for shore.  Within twenty minutes, Philadelphia was ablaze.  Decatur got his men back to Intrepid, which, pursued by the fire of 141 guns, escaped the harbor.  None of his men were killed; one was injured; Philadelphia burned to the waterline and sank.

Commodore Preble then blockaded the Barbary ports, bombarded their cities, and sank their ships.  His successor, Samuel Barron, sent the Marines ashore.  They captured the city of Derne and defeated the Tripolitanian armies in two land battles. (The Marines commemorate this victory in the line of the Marines’ Hymn, “…to the shores of Tripoli,” and in their officers’ dress swords, which are patterned after one given to a Marine lieutenant by an Ottoman prince to honor the American’s valor at Derne.) In 1805, the treaty of peace between the United States and Tripoli, Tunisia and Algeria, “negotiated at the cannon’s mouth,” was signed aboard U.S.S. Constitution.

Admiral Horatio Lord Nelson called the Intrepid mission “the most bold and daring act of the age.”  Perhaps it exemplifies what American audacity and imagination could do when Washington was three months away by sail and an officer had to improvise the execution of his mission on the spot.

Why does the American use of force to suppress pirates – seagoing extortionists, after all – seem to so frighten the Times? One can only speculate. Perhaps there is something virile in controlled violence that makes the editors nervous.

Most peoples get the governments they deserve.  Somalia has nearly no government at all, numberless private armies, police forces, and religious militias, one of the world’s highest infant mortality rates, and little public access to potable water.

Some free market types argue that Somalia is nearly a libertarian paradise, with a highly efficient and competitive service sector (including crystal-clear cell phones, privately owned, managed, and policed airports, and electric power service in most major cities), and a completely free press.  That seems irrelevant in a country where two-thirds of the population doesn’t live in cities and nearly two-thirds of the adults can’t read.

The so-called Transitional Federal Government, which occupies Baidoa, the country’s third largest city, is recognized by some foreign powers as the government of Somalia.  Its authority nominally extends over the country.  Its effective power doesn’t quite make it past the Baidoa city line and its attempts to move into Mogadishu, the national capital, have not yet succeeded.  The TFG hasn’t yet been able to collect taxes or establish any stable revenue and is entirely dependent upon foreign aid.  Outside Baidoa, Somalia is effectively divided into numerous petty states, ruled by clan-based warlords, local chieftains, and Islamists, a condition akin to that prevailing before its colonization by Italy during the late 19th century.  One might say that Somalia, after enjoying the benefits of Western civilization for nearly a century, has simply returned to its former condition.

Now, in the last two years, by turning to piracy, many Somalis have unwittingly exported their violent culture into the mainstream of commerce.  Perhaps they haven’t understood that, once they began preying on Americans, they encountered another culture—of instant news, the sound bite, the blog, and the politicians who live in it.  The result has not been pretty.

Since Julius Caesar and Pompey the Great, piracy has been eradicated only by force: the ordered use of violence to eliminate the threat of armed robbers to peaceful commerce. The Americans have shown they are still willing to use it.

If the Somalis understood our history, they would realize that the United States has always been ready to use violence to suppress pirates, bandits, and others who prey upon businessmen going about their business.

In 1913, after the Titanic disaster, the nations of the trading world established the International Ice Patrol to monitor icebergs and report their presence in the sea lanes.  The Patrol is conducted by the United States Coast Guard: its costs are apportioned among the nations whose ships travel the North Atlantic.  Perhaps the trading nations should give similar treatment to the suppression of Somali piracy, as a cost of maintaining the seas as a medium of commerce.

The Gray Chrysanthemum

In his eight decades, Sadakichi Hartmann fried eggs with Walt Whitman, discussed verse with Stéphane Mallarmé, and drank with John Barrymore, who once described him as “a living freak presumably sired by Mephistopheles out of Madame Butterfly.”

Critic, poet, novelist, playwright, dancer, actor, and swaggering egotist, Hartmann might lift your watch (he was a superb amateur pickpocket) but his opinion was not for sale. Such a man evoked diverse opinions. Gertrude Stein said, “Sadakichi is singular, never plural.” Augustus Saint-Gaudens (whose equestrian statue of Sherman stands near the Plaza) wrote to him, “What you think of matters of art I consider of high value.” W.C. Fields said, “He is a no-good bum.”

Sadakichi Hartmann was born November 8, 1869 on Deshima Island, Nagasaki, Japan. His father was a German merchant, his mother Japanese. His father disowned him at fourteen, shipping him to a great-uncle in Philadelphia with three dollars in his pocket. Sadakichi observed, “Events like these are not apt to foster filial piety.”

While working at menial jobs, he educated himself at the Philadelphia Mercantile Library and published articles, poems, and short stories in Boston and New York newspapers. He crossed the Delaware to Camden to introduce himself to Walt Whitman. In 1887, he published a New York Herald article quoting Whitman’s opinions about other writers, which Whitman denounced for misquoting him. Undaunted, Sadakichi expanded the article to a pamphlet, Conversations with Walt Whitman. He later studied in Europe, meeting Liszt, Whistler, Mallarmé, and Verlaine, glimpsing Ibsen and exchanging letters with Anatole France. (He later sold France’s letter to an autograph hunter, exacting dinner for two at Maxim’s with two bottles of Pol Roger).

At twenty-three, he wrote his first play, Christ, drawn from Hartmann’s private non-historical and non-Biblical ideas, particularly a sexual relationship between Jesus and Mary Magdalene. When Christ, which he claimed had been compared to Shakespeare’s Titus Andronicus, was published in 1893, James Gibbons Huneker, the American aesthete, called it “the most daring of all decadent productions.” It was banned in Boston: the censors burned nearly every copy and jugged Sadakichi in the Charles Street Jail.

While working as a clerk for McKim, Mead & White, he described Stanford White’s drawings as “Rococo in excelsis. To be improved upon only by the pigeons, after the drawings become buildings.” White dispensed with his services. Thereafter, Sadakichi kept the pot boiling with hundreds of German-language essays for the New Yorker Staats-Zeitung.

In 1901, he published a two-volume History of American Art, which became a standard textbook. The History remains worth reading as an intelligent evaluation of American art’s first four centuries, marked by Hartmann’s insight into the modernist movements. His judgments are nearly clairvoyant: he analyzes the work of Thomas Eakins, Winslow Homer, and Albert Pinkham Ryder, all then nearly unknown. He also discusses Alfred Stieglitz as a photographer. Sadakichi was the first American critic to seriously discuss photography as an art form. Hartmann later contributed his most incisive writing to Stieglitz’s Camera Notes and Camera Work.

By 1906, he was famous, in Huneker’s words, as “the man with the Hokusai profile and broad Teutonic culture.” He stole a taxicab and somehow won acquittal in Jefferson Market Night Court by proving he did not know how to drive. When Moriz Rosenthal, a pianist who had studied under Liszt, enriched the Hungarian Rhapsodies by improvising a series of rapid scales during a Carnegie Hall concert, Sadakichi roared from the gallery, “Is this necessary?” As the ushers tossed him out, Hartmann shouted, “I am a man needed but not wanted!”

From 1907 to 1914, he lived intermittently at the Roycroft Colony, an artistic commune at East Aurora, New York, where he ghostwrote books for its founder, Elbert Hubbard, a soap salesman who had become rich by marrying the boss’ daughter. Hubbard had fallen in love with William Morris’s Kelmscott Press. Though utterly untalented, he saw himself as a Renaissance man just like Morris and attempted to recreate the Morris enterprises at Roycroft by spending money. Hubbard rationalized his ghostwritten books by arguing that all great art was collaboration.

In 1915, Sadakichi entered 58 Washington Square South, then known as Bruno’s Garret. Guido Bruno, its proprietor, was an exotically mustached émigré whose bluff, plausible manner and florid speech unsuccessfully concealed an instinct for the main chance. Bruno realized tourists would flock to gape at bohemians in their search for what some called Greenwich Thrillage. He promoted his Garret through flamboyant magazines, all featuring his name in the title: Bruno’s Weekly, Bruno’s Scrap Book, Bruno’s Review of Two Worlds, Bruno’s Bohemia, Bruno’s Chap Book, Bruno’s Review of Life, Love, and Literature and, simply, Bruno’s. In The Improper Bohemians, Allen Churchill described Bruno’s Garret as a layman’s dream of the artist’s life, where “artists’-model types of girls and hot-eyed young men who declared themselves poets, writers, and painters…behaved during working hours like artistic freaks,” declaiming verse and splattering canvases before tourists, herded from the Fifth Avenue bus, while Bruno collected admissions at the door.

The impresario proclaimed Sadakichi a genius. In Bruno’s Chap Book for June 1915, nine years before the Nouvelle Revue Française organized the first Western haiku competition, Sadakichi published “Tanka, Haikai: Fourteen Japanese Rhythms.” Five months later, Bruno proclaimed Sadakichi king of the Bohemians.

In 1916, Sadakichi went west. He revisited his first play’s subject in a novel, The Last Thirty Days of Christ, which envisioned Jesus as a mystic and philosopher misunderstood even by his disciples (anticipating Nikos Kazantzakis’ The Last Temptation of Christ by a generation). It was praised by Ezra Pound and eulogized by Benjamin De Casseres as one of the most strikingly original works of American literature. He played the Court Magician in Douglas Fairbanks Sr.’s Arabian Nights fantasy The Thief of Bagdad (1924) for $250 in cash and a case of whiskey every week.

In 1938, Sadakichi moved to a shack on an Indian reservation in Banning, California. He still swaggered despite age, asthma, and a bad liver. (“My ailments are exceeded only by my debts.”) After Sadakichi collapsed on a bus, a doctor asked his symptoms. Hartmann replied, “I have symptoms of immortality.”

Though he still contributed to Art Digest and several European reviews, he generally lived on handouts cadged from admirers as tribute to his increasingly outrageous personality. John Barrymore, when asked why women still found Sadakichi, “this fugitive from an embalming table,” so attractive, replied, “Because he looks like a sick cat. Women are nuts about sick cats.”

Sadakichi’s friends in Los Angeles centered around the Bundy Drive studio of John Decker. They included Barrymore and Hearst editor Gene Fowler, then enjoying a wildly successful stint as a script doctor. This audience welcomed Sadakichi’s stories, all told with sly self-mockery and perfect timing. Fowler, according to his valentine, Minutes of the Last Meeting, first spoke with Hartmann over the telephone in 1939. A few days later, while Fowler was at his office, a studio policeman said a crazy old man was asking for him. “When I told him he smelled of whisky,” the guard reported to Fowler, “he said I ought to be smelling his genius.”

Fowler went outside. The old man stood nearly six feet tall and weighed 132 pounds. Despite the heat, he wore a threadbare, baggy gray overcoat with an old sweat-stained fedora pushed back on his shock of gray hair, which had inspired Barrymore to nickname him the Gray Chrysanthemum.

He announced, “Where I come from and where I go doesn’t matter. For I am Sadakichi Hartmann.” (I’m condensing Fowler’s version here.) Fowler extended his hand, saying, “Sadakichi, I am happy to know you.” The critic replied, “Is that remark a sample of your brilliance? You may live another century, Fowler, but you will never meet another son of the gods like me. You have something to drink?”

Fowler stammered, “As a matter of fact, I’m not drinking and—”

“What have your personal habits to do with my destiny?”

“I hadn’t expected a thirsty guest,” Fowler explained.

“You should always have something on hand to offset the stupidity of this place.” As Sadakichi helped himself to Fowler’s scotch, he said, “Be careful that you do not fall in love with your subject—in love with my wonderful character and genius. It will blind you, and your writing will suffer.”

Later, when an automobile accident interrupted Fowler’s work on the biography—it left him with “two split vertebrae, three cracked ribs, a skull injury, and wrenched knees. Otherwise I was as good as new”—Sadakichi complained that Fowler was malingering so as “to avoid becoming famous.” (“He suddenly realizes that I am much too big a subject for his limited talents.”)

Once, Fowler took Sadakichi to the Los Angeles Museum of History, Science and Art. At the entrance, Sadakichi saw a wheelchair. He sat in it and motioned Fowler to push him about the gallery. He commented loudly on the paintings: van Eyck’s Virgin and Child (“Painted on his day off”) and a Rembrandt portrait of his wife Saskia (“Second-rate…he had begun to lose his mind when he painted it”). A curator bustled up, and Sadakichi asked him where the washroom was. The curator gave him directions. “Bring it to me,” Hartmann commanded.

Fowler recalled an attempt to have Hartmann examined by a physician. The doctor recommended that he be operated on for his hernia. Sadakichi resented this proposal—understandably, since in those days such ruptures could be repaired only by first removing the testicles. Decker urged the old man to bid the glands farewell. (“They have served their purpose and undoubtedly merit an honorable retirement.”) “Ghouls!” Sadakichi cried and turned his rage on the doctor. “Why don’t you men of medicine do something worthwhile instead of castrating a genius!” (Barrymore, weighing in, sustained Hartmann’s objections. “After all,” said he, “it is hard to cast aside comrades of happier times.”)

“Other people,” said Sadakichi, “talk and talk about dying. I’m doing it!” So he did, on November 21, 1944.

New York Press, May 1, 2001

Dagger John and the Triumph of the Irish

From New York Press, March 25, 2003

Among the publishing sensations of 1836 was a book by one Maria Monk entitled Awful Disclosures, which purported to be her memoir of life in a Montreal nunnery. Hot stuff by early 19th-century standards, Monk’s book claimed that all nuns were forced to have sex with priests and that the “fruit of priestly lusts” were baptized, murdered, and carried away for secret burial in purple velvet sacks. Nuns who tried to leave the convent were whipped, beaten, gagged, imprisoned, or secretly murdered. Maria claimed to have escaped with her unborn child.

In fact, Maria had never been a nun. She was a runaway from a Catholic home for delinquent girls, and her child’s father was no priest, but merely the boyfriend who had helped her escape. Nevertheless, Awful Disclosures became an overnight bestseller, echoing as it did the most popular anti-Catholic slanders of the day and reflecting the savage hatred of the Irish with which they went hand in hand. It was partly this cultural climate that made John Joseph Hughes, fourth bishop and first archbishop of New York, what one reporter called “the best known, if not exactly the best loved, Catholic bishop in the country.”

John Hughes was an Irishman, an immigrant and a poor farmer’s son. Though intelligent and literate, he had little formal education before he entered the seminary. He was complicated: warm, impulsively charitable, vain (he wore a wig) and combative (he once admitted to “a certain pungency of style” in argument). No man accused him of sainthood; many found him touched with greatness. He built St. Patrick’s Cathedral and founded America’s system of parochial education; he once threatened to burn New York to the ground. Like all archbishops and bishops, Hughes placed a cross in his signature. Some felt it more resembled a knife than the symbol of the redemption of the world, and so the gutter press nicknamed him “Dagger John.” He probably loved it.

Born on June 24, 1797 in Annaloghan, County Tyrone, Hughes later observed that he’d lived the first five days of his life on terms of “social and civil equality with the most favored subjects of the British Empire.” Then he was baptized a Catholic. British law forbade Catholics to own a house worth more than five pounds, hold the King’s commission in the military or receive a Catholic education. It also forbade Roman Catholic priests to preside at Catholic burials, so that—as William J. Stern noted in a 1997 article in City Journal—when Hughes’s younger sister Mary died in 1812, “the best [the priest] could do was scoop up a handful of dirt, bless it, and hand it to Hughes to sprinkle on the grave.”

In 1817, Hughes emigrated to America. He was hired as a gardener and stonemason by the Reverend John Dubois, rector of St. Mary’s College and Seminary in Emmitsburg, Maryland. Believing himself called to the priesthood, Hughes asked to be admitted to the seminary. Father Dubois rejected him as lacking a proper education.

Hughes had met Mother Elizabeth Ann Bayley Seton, a convert to Catholicism who had become a nun after her husband’s death and occasionally visited St. Mary’s. She saw something in the Irishman that Dubois had not and asked the rector to reconsider. So Hughes began his studies in September 1820, graduating and receiving ordination to the priesthood in 1826. He was first assigned to the diocese of Philadelphia.

Anti-Catholic propaganda was everywhere in the City of Brotherly Love. Hughes’s temperament favored the raised fist more than the turned cheek. So when, in 1829, a Protestant newspaper attacked “traitorous popery,” Hughes denounced its editorial board of Protestant ministers as “clerical scum.” And after scores of Protestant ministers fled the 1834 cholera epidemic, which Nativists blamed on the Irish, Hughes ridiculed the ministers—”remarkable for their pastoral solicitude, as long as the flock is healthy….”

In 1835, Hughes won national fame when he debated John Breckenridge, a prominent Protestant clergyman from New York. Breckenridge conjured up the Inquisition, proclaiming that Americans wanted no such popery, no loss of individual liberty. Hughes described the Protestant tyranny over Catholic Ireland and the scene at his sister’s grave. He said he was “an American by choice, not by chance…born under the scourge of Protestant persecution” and that he knew “the value of that civil and religious liberty, which our…government secures for all.” The debate received enormous publicity, making Hughes a hero among many American Catholics. It was noted in Rome.

Dubois, who had left St. Mary’s to become bishop of New York, suffered a series of blows to his health. Hughes was barely forty. Nevertheless, in January 1838, he was appointed co-adjutor bishop—assuring him the succession to Dubois—and was consecrated in the old St. Patrick’s Cathedral on Mott Street. To the older man, it was a terrible humiliation to see a man he had deemed unqualified for the priesthood succeed him. When Dubois died, in 1842, he was buried at his request beneath the doorstep of Old St. Patrick’s Cathedral so the Catholics of New York might step on him in death as they had in life.

Hughes’s first order of business was to gain control of his own diocese. Under state law, most Catholic churches and colleges were owned and governed by boards of trustees—laymen, elected by a handful of wealthy pew holders (parishioners who couldn’t afford pew rents couldn’t vote), who bought the property and built the church. When, in 1839, the trustees of Old St. Patrick’s Cathedral had the police remove from the premises a new Sunday school teacher whom Dubois had appointed, Hughes called a mass meeting of the parish. He likened the trustees to the British oppressors of the Irish, thundering that the “sainted spirits” of their forebears would “disavow and disown them, if…they allowed pygmies among themselves to filch away rights of the Church which their glorious ancestors would not yield but with their lives to the persecuting giant of the British Empire.” He later said that by the time he had finished speaking, many in the audience were weeping like children. He added, “I was not far from it myself.”

The public schools were then operated by the Public School Society, a publicly funded but privately managed committee. It favored “non-denominational” moral instruction, reflecting a serene assumption that Protestantism was a fundamental moral code and the basis of the common culture. In fact, as Hughes biographer Father Richard Shaw pointed out, “the entire slant of the teaching was very much anti-Irish and very much anti-Catholic.” The curriculum referred to deceitful Catholics, murderous inquisitions, vile popery, Church corruption, conniving Jesuits and the pope as the anti-Christ of Revelations.

Bishop Dubois had advised Catholic parents to keep their children out of the public schools to protect their immortal souls. But Hughes understood the need for formal education among the poor. He demanded that the Public School Society allocate funds for Catholic schools: “We hold…the same idea of our rights that you hold of yours. We wish not to diminish yours, but only to secure and enjoy our own.” He concluded by warning that should the rights of Catholics be infringed, “the experiment may be repeated to-morrow on some other.”

On October 29, 1840, a public hearing was held at City Hall, with numerous lawyers and clergymen representing the Protestant establishment and Hughes the Catholics. Hughes opened with a three-and-a-half-hour spellbinder. The Protestants spent the next day and a half insulting Hughes as an ignorant ploughboy and demonizing Catholics “as irreligious idol worshippers, bent on the murder of all Protestants and the subjugation of all democracies,” according to historian Ray Allen Billington. The City Council denied his request.

With elections less than a month away, Hughes created his own party, Carroll Hall, named for the only Catholic signer of the Declaration of Independence. He ran a slate of candidates to split the Democratic vote, thereby punishing the Democrats for opposing him. The Democrats lost by 290 votes. Carroll Hall had polled 2,200.

In April 1842 the Legislature replaced the Public School Society with elected school boards and forbade sectarian religious instruction. When the Whigs and Nativists had the King James version declared a non-sectarian book, Hughes set about establishing what has become the nation’s major alternative to public education, a privately funded Catholic school system. He would create more than 100 grammar and high schools and help found Fordham University and Manhattan, Manhattanville and Mount St. Vincent Colleges.

Anti-Catholicism had gained legitimacy by the 1840s. Now the Nativist movement included not only Protestant fundamentalists who saw Catholicism as Satan’s handiwork, but also intellectuals—like Mayor James Harper, of the Harper publishing house—who considered Catholicism incompatible with democracy. All hated the Irish. Harper’s described the “Celtic physiognomy” as “simian-like, with protruding teeth and short upturned noses.” Their cartoonist, Thomas Nast, caricatured the Irish accordingly.

Between May and July of 1844, Nativist mobs in Philadelphia, summoned to “defend themselves against the bloody hand [of the Pope],” ransacked and leveled at least three churches, a seminary and nearly the entire Catholic residential neighborhood of Kensington. When Hughes learned that a similar pogrom, beginning with an assault upon Old St. Patrick’s Cathedral, was planned in New York, he called upon “the Catholic manhood of New York” to rise to the defense of their churches, and he armed them. A mob that stoned the stained glass windows of the cathedral found the building full of riflemen, and the violence went no further. Hughes later wrote that there had not been “a [Catholic] church in the city…not protected with an average force of one to two thousand men, cool, collected, armed to the teeth….”

Invoking the conflagration that kept Napoleon from using Moscow as his army’s winter quarters, Hughes warned Mayor Harper that if one church were attacked, “should one Catholic come to harm, or should one Catholic business be molested, we shall turn this city into a second Moscow.” New York’s buildings were largely wooden, and the city had burned twice in the previous century. There were no riots.

On July 19, 1850, Pope Pius IX created the archdiocese of New York, a development reflecting the growth of both the city’s Catholic population and the influence of Hughes himself. Having received the white woolen band of an archbishop from the hands of the Supreme Pontiff, Hughes embarked on a new project, “…a cathedral…worthy of our increasing numbers, intelligence and wealth as a religious community.” On August 15, 1858, before a crowd of 100,000, he laid the cornerstone of the new St. Patrick’s Cathedral at 5th Avenue and 51st Street. He would not see it finished. On January 3, 1864, death came for the archbishop.

After Maria Monk gave birth to a second illegitimate child, her Protestant champions quietly abandoned her. She became a prostitute, was arrested for pickpocketing and died in prison. Her book is still in print.

The Conservative Case Against George W. Bush

Theodore Roosevelt, that most virile of presidents, insisted that, “To announce that there should be no criticism of the president, or that we are to stand by the president, right or wrong, is not only unpatriotic and servile, but is morally treasonable to the American people.” With that in mind, I say: George W. Bush is no conservative, and his unprincipled abandonment of conservatism under the pressure of events is no statesmanship. The Republic would be well-served by his defeat this November.

William F. Buckley’s recent retirement from the National Review, nearly half a century after he founded it, led me to reflect on American conservatism’s first principles, which Buckley helped define for our time. Beneath Buckley’s scintillating phrases and rapier wit lay, as Churchill wrote of Lord Birkenhead, “settled and somewhat somber conclusions upon… questions about which many people are content to remain in placid suspense”: that political and economic liberty were indivisible; that government’s purpose was protecting those liberties; that the Constitution empowered government to fulfill its proper role while restraining it from the concentration and abuse of power; and that its genius lay in the Tenth Amendment, which makes explicit that the powers not delegated to government are reserved to the states or to the people.

More generally, American conservatives seek what Lord Acton called the highest political good: to secure liberty, which is the freedom to obey one’s own will and conscience rather than the will and conscience of others. Any government, of any political shade, that erodes personal liberty in the name of social and economic progress must face a conservative’s reasoned dissent; for allowing one to choose between right and wrong, between wisdom and foolishness, is the essential condition of human progress. Although sometimes the State has a duty to impose restrictions, such curbs on the liberty of the individual are analogous to a brace, crutch, or bandage. However necessary in the moment, they are best removed as soon as possible, as they tend to weaken and to cramp. Thus American conservative politics championed private property, an institution sacred in itself and vital to the well-being of society. It favored limited government, balanced budgets, fiscal prudence, and avoidance of foreign entanglements.

More subtly, American conservatism viewed human society as something of an organism in itself. This sense of society’s organic character urged the necessity of continuity with the past, with change implemented gradually and with as little disruption as possible. Thus, conservatism emphasized the “civil society”—the private voluntary institutions developed over time by passing the reality test (i.e., because they work) such as families, private property, religious congregations and neighborhoods—rather than the State. In nearly every sense, these institutions were much closer to the individuals who composed them than the State could ever be. They had the incidental and beneficial effect of protecting one’s personal liberty against undue intrusion from governments controlled by fanatics and busybodies—the phenomenon Edmund Burke presciently termed “armed ideologies”—and thus upheld our way of life as flying buttresses supported a Gothic cathedral.

But the policies of this administration, self-labeled “conservative,” have little to do with tradition. Rather, they tend to centralize power in the hands of the government under the guise of patriotism. If nothing else, the Bush administration has thrown into question what being a conservative in America actually means.

Forty years ago, when Lyndon Johnson believed the United States could afford both the Great Society and the Vietnam War, conservatives attacked his fiscal policies as extravagant and reckless. Ten years ago, the Republican Party regained control of Congress with the Contract with America, which included a balanced-budget amendment to restore fiscal responsibility. But today, thanks to tax cuts and massively increased military spending, the Bush administration has transformed, according to the Congressional Budget Office, a ten-year projected surplus of $5.6 trillion into a deficit of $4.4 trillion: a turnaround of $10 trillion in roughly 32 months.

The Bush Administration can’t even pretend to keep an arm’s length from Halliburton, the master of the no-bid government contract. Sugar, grain, cotton, oil, gas, and coal: These industries enjoy increased subsidies and targeted tax breaks not enjoyed by less well-connected industries. The conservative Heritage Foundation blasts the administration’s agricultural subsidies as the nation’s most wasteful corporate welfare program. The libertarian Cato Institute has called the administration’s energy plan “three parts corporate welfare and one part cynical politics…a smorgasbord of handouts and subsidies for virtually every energy lobby in Washington” that “does little but transfer wealth from taxpayers to well-connected energy lobbies.” And the Republican Party’s Medicare drug benefit, the largest single expansion of the welfare state since Johnson’s Great Society, was designed to appeal to senior citizens who, as any competent politician knows, show up at the polls.

None of this is conservative, though it is in keeping with the Bush family’s history. Kevin Phillips, whose 1969 classic The Emerging Republican Majority outlined the policies that would lead to the election of President Reagan, describes in his American Dynasty the Bush family’s rise to wealth and power through crony capitalism: the use of contacts obtained in public service for private profit. Phillips argues that the Bushes don’t disfavor big government as such: merely that part of it which regulates business, maintains the environment, or aids the needy. Subsidizing oil-well drilling through tax breaks, which made George H. W. Bush’s fortune, or bailing out financial institutions, such as Neil Bush’s bankrupt Silverado Savings and Loan, however, is a good thing.

This deficit spending also helps Bush avoid the debate on national priorities we would have if these expenditures were being financed through higher taxes on a pay-as-you-go basis. After all, we’re not paying the bill now; instead, it will come due far in the future, long after today’s policy-makers are out of office. And this debt is being incurred just as the baby boomers are about to retire. In January 2004, Charles Kolb, who served in the Reagan and George H. W. Bush White Houses, testified before Congress that, at a time when demographics project more retirees and fewer workers, projected government debt will rise from 37 percent of the economy today to 69 percent in 2020 and 250 percent in 2040. This is the sort of level one associates with a Third World kleptocracy.

Even worse than this extravagance are the administration’s unprecedented intrusions into our constitutional privacy rights through the Patriot Act. If it does not violate the letter of the Fourth Amendment, it violates its spirit. To cite two examples, the FBI has unchecked authority through the use of National Security Letters to require businesses to reveal “a broad array of sensitive information, including information about the First Amendment activities of ordinary Americans who are not suspected of any wrongdoing.” Despite the Fourth Amendment’s prohibition on unreasonable search and seizure, the government need not show probable cause: It does not need to obtain a warrant from a judge. And who can trust any law enforced by John Ashcroft, who single-handedly transformed a two-bit hubcap thief like José Padilla first into a threat to national security and then, through his insistence that Padilla, an American citizen, could be held without charges, into a Constitutional crisis?

All this stems from Bush’s foreign policy of preemptive war, which encourages war for such vague humanitarian ends as “human rights,” or because the United States believes another country may pose a threat to it. Its champions seem almost joyously to anticipate a succession of wars without visible end, with the invasion of Iraq merely the first fruit: former Bush appointee Richard Perle, judging from his writings on foreign policy, would have us war against nearly every nation he defines as a rogue. The ironic consequence of this policy, intended to stabilize the world, is greater instability. It reminds me of the old FDR jingle from the Daily Worker:

I hate war, and so does Eleanor,
But we won’t feel safe until everybody’s dead.

To be sure, there’s more than enough blame to go around with the Congress’ cowardly surrender to the Executive of its power to declare war. The Founding Fathers, who knew war from personal experience, explicitly placed the war power in the hands of the Congress. As James Madison wrote over 200 years ago:

The Constitution expressly and exclusively vests in the Legislature the power of declaring a state of war… The separation of the power of declaring war from that of conducting it is wisely contrived to exclude the danger of its being declared for the sake of its being conducted.

But since the Korean War (fought as a “police action” to avoid a formal declaration), war has been waged without Congressional declaration. Thus Congressional power atrophies in the face of flag-waving presidents. Perhaps Congress is too preoccupied with swilling from the gravy trough that our politics has become to recall its Constitutional role as a co-equal branch of government, guarding its powers and privileges against executive usurpation. The Congress has forgotten that the men who exacted Magna Carta from King John at sword point instituted Parliament to restrain the executive from its natural tendency to tax, spend, and war.

Moreover, there is nothing conservative about war. As Madison wrote:

Of all the enemies to public liberty war is, perhaps, the most to be dreaded, because it comprises and develops the germ of every other. [There is an] inequality of fortunes, and the opportunities of fraud, growing out of a state of war, and…degeneracy of manners and of morals…No nation could preserve its freedom in the midst of continual warfare.

By contrast, business, commerce, and trade, founded on private property and created by individual initiative, families, and communities, have done far more to move the world forward than war. Yet faith in military force and an arrogant belief that American values are universal values still mold our foreign policy nearly a century after Woodrow Wilson, reelected with a promise of keeping America out of World War I, broke faith with the people by engineering a declaration of war within weeks of his second inauguration.

George W. Bush’s 2000 campaign supposedly rejected Wilsonian foreign policy, both by articulating the historic Republican critique of foreign aid and by explicitly criticizing Bill Clinton’s nation-building. Today, the administration insists we can be safe only by compelling other nations to implement its vision of democracy. This used to be called imperialism. Empires don’t come cheap; worse, “global democracy” requires just the kind of big government that conservatives abhor. When the Wall Street Journal praises the use of American tax dollars to provide electricity and water services in Iraq, something we used to call socialism, either conservatism has undergone a tectonic shift or the paper’s editors are being disingenuous.

This neo-conservative policy rejects the traditional conservative notion that American society is rooted in American culture and history—in the gradual development of American institutions over nearly 230 years—and cannot be separated from them. Instead, neo-conservatives profess that American values, which they define as democracy, liberty, free markets, and self-determination, are “universal” rather than particular to us, and insist they can and should be exported to ensure our security.

This is nonsense. The qualities that make American life desirable evolved from our civil society, created by millions of men and women using the freedom created under limited constitutional government. Only a fool would believe they could be spread overnight with bombs and bucks, and only a fool would insist that the values defined by George W. Bush as American are necessarily those for which we should fight any war at all.

Wolfowitz, Perle, and their allies in the Administration claimed the Iraqis would greet our troops with flowers. Somehow, more than a year after the president’s “Mission Accomplished” photo-op, a disciplined body of well-supplied military professionals is still waging war against our troops, their supply lines, and our Iraqi collaborators. Indeed, the regime we have just installed bids fair to become a long-term dependent of the American taxpayer under U.S. military occupation.

The Administration seems incapable of any admission that its pre-war assertions that Iraq possessed weapons of mass destruction were incorrect. Instead, in a sleazy sleight of hand worthy of Lyndon Johnson, the Administration has retrospectively justified its war with Saddam Hussein’s manifold crimes.

First, that is a two-edged sword: If the crimes of a foreign government against its people justify our invasion, there will be no end of fighting. Second, the pre-war assertions were dishonest: Having decided that Iraq possessed weapons of mass destruction, the policymakers suppressed all evidence that it did not. This immorality is thrown into high relief by the war’s effect on Iraqi civilians. We have no serious evidence of any connection between Iraq and 9/11. Dropping 5000-pound bombs on thousands of people who had nothing to do with attacking us is as immoral as launching airplanes at an American office building.

To sum up: Anything beyond the limited powers expressly delegated by the people under the Constitution to their government for certain limited purposes creates the danger of tyranny. We stand there now. For an American conservative, better one lost election than the continued empowerment of cynical men whose abuse of power, unrestrained by principle, is based upon the compromise of conservative beliefs. George W. Bush claims to be conservative. His administration’s unwholesome intrusions into domestic life and personal liberty, and the local governments that imitate them, suggest otherwise. George W. Bush is no friend of limited, constitutional government—and no friend of freedom. The Republic would be better served by his defeat in November.

New York Press, August 4, 2004

The Way of the Perfect Samurai

He wrote near the end that his life was divided into four rivers: writing, theater, body, and action. He memorialized all of it through photographs. Some were conventional. When Yukio Mishima came to New York with his wife for a belated honeymoon in 1960, they were photographed on the Staten Island ferry and before the Manhattan skyline, like any tourist couple.

A bodybuilder for the last two decades of his life, Mishima let his love of self-display cross into exhibitionism. Thus the beautiful, homoerotic photographs: Mishima in a fundoshi, a loincloth, kneeling in new-fallen snow with a dai katana, the great sword of a samurai, or posing as Guido Reni’s St. Sebastian (complete with arrows). He even posed for Barakei (roughly, “Death by Roses”), a magnificently produced luxury book of extraordinary nude photographs, and was somehow disturbed by the letters he consequently received from admirers requesting still bolder portraits—after all, he was a family man with a wife and two children.

Perhaps the four rivers joined in his most famous photograph: Mishima stripped to the waist, his chest bulging with muscle and gleaming with sweat, his brows knotted and eyes glaring, wielding a massive, two-handed, three-foot-long dai katana. It was an elegant weapon, made by the legendary 17th-century swordsmith Seki no Magoroku and kept razor-sharp. About his head was a hachimaki, a white headband bearing the Rising Sun and a medieval samurai slogan, “Serve the Nation for Seven Lives.”

Yukio Mishima first came to New York in 1951 at twenty-five. Within the previous two years, he had published two outstanding novels, Confessions of a Mask and Forbidden Colors. The critics hailed him as a genius. He spent ten days in the city, going to the top of the Empire State Building, seeing Radio City Music Hall and the Museum of Modern Art, catching Call Me Madam and South Pacific. New York did not appeal to him: he found it, according to biographer John Nathan, “like Tokyo five hundred years from now.”

Mishima was born Kimitake Hiraoka, the eldest son of a middle-class family. Before he was two months old, his paternal grandmother took him from his parents and kept him until he was twelve. Her ancestors had been samurai, related by marriage to the Tokugawa, who were shoguns. She was chronically ill and unstable, yet she loved theater and took him to the great classics, such as The 47 Ronin, a magnificent celebration of feudal allegiance, of loyalty and honor even unto death, and perhaps the most stirring Kabuki play.

Through her family connections, Mishima entered the elite Peers’ School, and by fifteen he was publishing in serious literary magazines. He took the pen name Yukio Mishima to escape his father’s persecution (his father, a Confucian, considered fiction mendacity and destroyed his son’s manuscripts whenever possible). In 1944 he graduated as valedictorian and received a silver watch from the Emperor. His luck held: he failed an army induction physical and thus survived the Second World War.

From the beginning, Mishima’s productivity was stunning: in 1948, he published thirteen stories, a first novel, a collection of novellas, two short plays and two critical essays. On November 25, 1948, after retiring from a nine-month career at the Finance Ministry, he began his first major novel, Confessions of a Mask. Mishima brilliantly evokes his closeted protagonist’s awareness of being different and sense of unique shame. Within two years, Mishima revisited this theme in Forbidden Colors, now noting homosexuality’s ubiquity. Spending all that time in gay bars, taking notes, can do that to you. Besides, homosexuality occupies a different place in Japanese culture than it does in ours. During the two centuries before Japan reopened to the West, some of its most flamboyant heroes were bisexual picaros whose panache and courage on the battlefield were equaled by delicacy and endurance in a diversity of intimate situations.

In July 1957, after Alfred A. Knopf published his Five Modern Noh Plays, Mishima returned to New York. (He told his biographer John Nathan that Knopf dressed “like the King in an operetta, or a whiskey trademark.”) Mishima was interviewed by The New York Times, met Christopher Isherwood, Tennessee Williams, and their friends, saw eight Broadway shows, and went several times to the New York City Ballet.

He returned to Japan to find a wife, which was not as easy as one might think. Although marriages were still often arranged, and he was one of Japan’s most distinguished men of letters, Mishima’s affect was apparently not particularly attractive. (A weekly magazine had polled Japan’s young women on the question, “If the Crown Prince and Yukio Mishima were the only men remaining on earth, which would you prefer to marry?” More than half the respondents preferred suicide.) Nevertheless, his marriage to Yoko Sugiyama proved successful. They stopped in New York on their belated honeymoon, where he saw two of his plays performed in English at the cutting-edge Theatre de Lys. They had two children and he was an attentive, devoted father.

The family lived in a house Mishima had ordered built in the Western manner. It has been described as Victorian colonial, perhaps because the language lacks words to better describe it. “For Mishima,” Nathan explains, “the essence of the West was late baroque, clashing colors, garishness…” He describes Mishima assuring “his horrified architect” that “I want to sit on rococo furniture wearing Levi’s and an aloha shirt; that’s my ideal of a lifestyle.”

From 1965 to 1970, he worked on his four-volume epic, The Sea of Fertility (Spring Snow, Runaway Horses, The Temple of Dawn, and The Decay of the Angel). “The title,” he said, “is intended to suggest the arid sea of the moon that belies its name.” It is his masterpiece, as he knew it would be.

At first glance, in taking as his theme the transformation of Japanese society over the past century, Mishima is revisiting the tired, even trite conflict between traditional values and the spiritual sterility of modern life. One might better define this work as a lyric expression of longing, which he apparently believed to be the central force in life: longing leads one to beauty, whose essence is ecstasy, which results in death. His fascination with death is erotic: he was drawn to it as most of us are drawn to the company and the touch of the beloved.

In his essay “Sun and Steel,” he wrote of “a single, healthy apple…at the heart of the apple, shut up within the flesh of the fruit, lurks the core in its wan darkness, tremblingly anxious to find some way to reassure itself that it is a perfect apple. The apple certainly exists, but to the core this existence as yet seems inadequate… Indeed, for the core, the only sure mode of existence is to exist and to see at the same time. There is only one method of solving this contradiction. It is for a knife to be plunged deep into the apple so that it is split open and the core is exposed to the light… Yet then the existence of the cut apple falls to pieces; the core of the apple sacrifices existence for the sake of seeing.”

Mishima stood about five feet, two inches. He glowed with charisma and an undeniable, disturbing sexuality. Every memoir testifies to his extraordinary energy. He was brilliant and witty, even playful. He had self-knowledge and a keen irony, and his own absurdities were often its target. He became politically active on the extreme right and in 1968 organized the Shield Society, which became his elegantly uniformed private army.

Both Japanese and Westerners testified to his extraordinary empathy—his ability to understand and respond to others. Thus his genius for conversation: the man who loved discussing the Japanese classics, Oscar Wilde, or the dozen shades of red differentiated in the Chinese spectrum could also discuss weightlifting or kendo or a thousand other subjects, each gauged to his listener. He could make his companion feel that he or she was the most important person in the world to him, which was a useful gift for a man who understood that he lived behind masks, or in a series of compartments, and that no one knew him whole.

In November of 1970, Yukio Mishima was forty-five. He’d published thirty novels, eighteen plays, twenty volumes of verse, and twenty volumes of essays; he was an actor and director, a swordsman and bodybuilder, a husband and father. He spoke three languages fluently; he had gone around the world seven times, modeled in the nude, flown in a fighter jet, and conducted a symphony orchestra. The evening before his death, he told his mother that he had done nothing in his life that he had wanted to do.

On November 25, twenty-two years to the day from beginning Confessions of a Mask, he led a party of four members of the Shield Society to a meeting with the commanding general of the Eastern Army of the Japanese Self-Defense Force. He had finished revising the manuscript of The Decay of the Angel only that day; it was on a table in the front hall of his house, ready for his publisher’s messenger.

At army headquarters, with only swords and daggers, Mishima and his men took the commanding general hostage. They demanded that the troops be assembled outside the building to hear Mishima speak. A little before noon, with 800 soldiers milling about, Mishima leaped to the parapet of the building, dressed in the Shield Society’s uniform. About his head was the hachimaki. He began speaking, but the police and television helicopters drowned out many of his words. He spoke of the national honor and demanded that the army join him in restoring the nation’s spiritual foundations by returning the Emperor to supreme power.

He had once said, “I come out on stage determined to make the audience weep and instead they burst out laughing.” It held true now: the soldiers shouted that he was a bakayaro, an asshole. After a few minutes, he gave up. He cried out three times, “Tenno Heika Banzai” (“Long Live the Emperor”), and stepped back.

He loved Jocho Yamamoto’s classic Hagakure, an 18th-century instruction manual for the warrior. Jocho states, “The way of the samurai is death…the perfect samurai will, in a fifty-fifty life or death crisis, simply settle it by choosing immediate death.”

Mishima had fantasized about kirijini—going down fighting against overwhelming odds, sword in hand. Now he kicked off his boots and removed his uniform until he wore only a fundoshi. He sat down on the carpet and took a dagger, a yoroidoshi, in his right hand. He inhaled deeply. Then his shoulders hunched as he drove the blade into his abdomen with great force. As his body tried to force out the weapon, he grasped his right hand with his left and continued cutting. The blood soaked the fundoshi. The agony must have been unimaginable. Yet he completed the cut. His head collapsed to the carpet as his entrails spilled from his body.

He had instructed Morita, his most trusted follower, “Do not leave me in agony too long.” Now, Morita struck down with Mishima’s dai katana. He was inept: the beheading required three strokes. Then Morita took his own life.

Mishima’s motives remain the subject of speculation: madness, burnout, or fatal illness. Some whispered that he might have enjoyed the pain. Others suggested he and Morita had committed shinju, a double love-suicide. Some argued esthetics: a reading of Sun and Steel suggests that suicide was the logical completion of his search for beauty. Others took him seriously. Perhaps it was a matter of honor, and his death the most sincere protest he could muster against modern life.

To this day, thousands of Japanese observe the anniversary of his suicide.

New York Press, May 9, 2000

Nassau Street

Nassau Street was named some time before 1696 in honor of William of Nassau, the Dutch prince who became King William III of England in a 1689 coup d’etat. Now largely a pedestrian mall, it winds south from its intersection with Park Row at Printing House Square to Wall Street. Much of it is lined with late-Victorian office buildings, their imposing masonry and cast-iron facades rising almost unnoticed above the frenetic retailing on their ground floors.

“Nassau Street—where stamp collecting began.” (Old advertising slogan of the Subway Stamp Co., formerly of 111 Nassau Street in lower Manhattan)

Nassau Street was named some time before 1696 in honor of William of Nassau, the Dutch prince who became King William III of England in a 1689 coup d’etat. Now largely a pedestrian mall, it winds south from its intersection with Park Row at Printing House Square to Wall Street. Much of it is lined with late-Victorian office buildings, their imposing masonry and cast-iron facades rising almost unnoticed above the frenetic retailing on their ground floors.

For roughly a century, from the 1860s through the 1970s, Nassau Street was the mecca of American philately—postage stamp collecting. Some called the neighborhood the Stamp District. Entire buildings, like the Morton Building at 116 Nassau, were filled with stamp dealers. Sanders Zuckerman, who has been selling stamps in the area for fifty-nine years—the Daily News proclaimed him “a legend in the stamp business”—says collectors came from all over the world to buy and sell stamps.

Stamp collecting was a new fad in the 1860s. The first postage stamp, Great Britain’s one-penny black, had been issued only in 1840; the first known American stamp collector, William H. Faber of Charleston, South Carolina, began collecting in 1855. New York’s first stamp dealers appeared in the early 1860s. They did business along the fences of New York’s City Hall Park, where stamps were pinned up on boards for the delectation of passersby.

Open-air merchants—whether street pharmacists dealing in controlled substances or vendors selling souvenirs from a cart—are marginal people, engaged in what the Marxists call the early stages of capital accumulation. The man who made stamp dealing a business and Nassau Street the center of American philately was John Walter Scott (1845-1919). Scott had dabbled in stamp dealing in his teens while working as a merchant’s clerk in London. He emigrated to New York in the summer of 1863. At first, this did not seem to be a good idea. There were no jobs: the draft riots in early July had devastated much of the city. Scott’s job search was so unsuccessful that he even considered enlisting in the Union army.

One day, according to Edwin P. Hoyt’s One Penny Black, Scott struck up a conversation with an outdoor stamp dealer in City Hall Park. The dealer advanced him about a hundred dollars’ worth of stamps, which Scott agreed to sell as his agent. He was amazingly successful: he was soon earning $30 a month, roughly the wages of a skilled workman, and quite enough for a single man to live on. Scott then wrote to his sister, who began buying and sending stamps to him from England, and he went into business for himself.

In 1868, he opened an office on Nassau Street. He had been issuing one-page monthly price lists since June 1867. In September 1868, Scott issued a paperbound booklet, A Descriptive Catalogue of American and Foreign Postage Stamps, Issued from 1840 to Date. With the knack for self-important publicity that marked or marred him throughout his career, Scott trumpeted the pamphlet as the “16th edition” of his catalog. This was because he was counting each of his one-page lists as a separate earlier edition.

In the same year, he published a stamp album, a book with blank pages on which collectors might affix their stamps. He also started the American Journal of Philately. He was not the first American philatelic journalist: S. Allan “Just-as-Good” Taylor had first published his Stamp Collector’s Record from Montreal in December 1864. (A brilliant counterfeiter, he openly insisted his stamps were “just as good” as the real things.) Scott finessed this fact, as he did most facts that inconvenienced him: his official biography says that he published the first “important” American stamp journal.

Truth presented no barrier to the vaulting imagination of J.W. Scott. He claimed sales of 15,000 albums. There were then probably not 15,000 stamp collectors in the world. His competitors claimed Scott had reduced lying to a science. No one cared.

Like most entrepreneurs, Scott was extraordinarily self-interested. A true child of the Gilded Age, he would turn a blind eye to others’ dishonesty if he could turn an outwardly licit dollar from it. Thus, he often dealt with “stamp finders,” men and women whose nose for rare stamps was often aided by a knack for larceny. Scott never asked where the stamps came from. One of his pet finders, known only as “Mr. McGinnity,” had “entered” the Philadelphia Customs House and raided its records for old stamps; another stamp finder raided the New York Institution for the Blind. He carried off numerous stamps clipped from its old correspondence, promising to return to pay for them. (The Institution is still waiting for the money.)

Scott also lobbied the United States government into cheating collectors by reprinting its old and valuable postage stamps. He even produced what were politely called “representations” of rare stamps, such as the so-called Postmaster stamps issued by individual American post offices before 1847, when the government began issuing its own. Such shenanigans put Scott, in some ways, on a par with “Just-as-Good” Taylor.

Taylor’s boast that his counterfeits were better than the originals was often true. (One scholar characterized Taylor’s forgeries as “fine engravings, totally different from the crude typographic printing” of the real stamps.) By the early 1870s, Taylor was part of the “Boston Gang” of crooked dealers and journalists, specializing in inventing South American issues. Years before El Salvador, Guatemala, Haiti, and Paraguay had released their first stamps, for example, the Boston Gang was printing and selling bogus stamps from these countries, backed by supposedly official documents, which were themselves forgeries. Taylor published equally fictitious articles about these stamps in his magazine, which helped create a market for his product. Only an age that combined slow communications with exploding collector demand for exotic stamps made this possible, and, at the end, only a federal counterfeiting rap brought him down.

Other hustlers were equally artistic, like Sam Singer, the repairman. Torn or mutilated stamps have no value to collectors. According to Hoyt, Singer could take a half-dozen mangled stamps and from them manufacture a composite that fooled most collectors and dealers. Like Taylor, he was proud of his work: he became so good that he sometimes bought stamps that he himself had repaired, not realizing until later that they had been damaged and mended. When the millionaire collector Colonel Edward H.R. Green found himself with one of Sam’s specials, he purchased a magnifier that could enlarge a stamp’s image from one inch to four feet square. It cost him $22,000; the movers had to remove the doorframe to bring it into the Colonel’s townhouse on West 90th Street.

In this century, Nassau Street’s most flamboyant dealer was actually an honest man. Herman “Pat” Herst Jr. (he was born on St. Patrick’s Day, March 17, 1909, which led his friends to nickname him Pat) graduated from Reed College and the University of Oregon in 1932. He then came east by jumping a slow freight and riding the rods. He landed a twelve-dollar-a-week job as a runner for Lebenthal & Co., the municipal bond brokers, that took him into the Stamp District, where he met several Lebenthal clients who collected stamps when not clipping coupons. They rekindled his childhood interest in philately: he began buying and selling stamps as a vest-pocket dealer. By 1936, Lebenthal was paying him $28 a week; his stamp dealings earned twice that, and he left Wall Street for Nassau. His business became so heavy that he welcomed an elevator operators’ strike: it let him catch up on his paperwork.

He published a newsletter, Herst’s Outbursts, from 1940 until 1968. It charmingly combined self-promotion, anecdotes about stamps, and a passion for trivia. (A friend once asked, “Pat, what’s the population of Cincinnati?” Herst replied, “Yesterday or today?”) He also published columns and articles in the philatelic press. Eventually, he recycled his journalism into a series of popular books. Nassau Street, his memoir of stamp dealing in the 1930s and 1940s, has sold more than 100,000 copies in seven editions since 1960.

Herst was among the first dealers to abandon the bustle of Nassau Street. In 1945, he moved his family and his business to Shrub Oak, N.Y., then a hamlet with a population of 674. As he received more than 100,000 pieces of mail a year, the local post office was immediately reclassified from third to second class. However, at that time even a second-class post office did not make household deliveries. From his love of trivia, Herst knew that an 1862 law permitted private posts under just these circumstances. With the help of his children and their German shepherd, Herst’s private post delivered mail door to door for two cents a letter. Naturally, he issued his own stamps, including one depicting the dog. Most went to collectors.

Though now headquartered in Ohio, Scott’s still publishes its annual catalog of stamps of the world. From J.W. Scott’s one-page “first edition” it has grown to six massive paper-bound volumes. Scott’s also publishes numerous stamp albums, including the renowned Scott’s International. Volume 1, which is somewhat thicker than the Manhattan Yellow Pages, houses nearly every stamp issued by every nation in the world between 1840 and 1940. Volume 2 reaches only 1949. Subsequent albums now appear roughly every year to accommodate the gushing flow of stamps from every nation in the world, most meant for sale to collectors rather than for postal use.

Nassau Street is no longer the mecca of American philately. Even Sanders Zuckerman characterizes himself as the last of the dinosaurs. Gentrification, soaring taxes, rising commercial rents, and increasing competition from mail-order dealers operating from low-tax, low-rent states forced most dealers to move or close during the late 1970s. Today, the Verizon Yellow Pages lists only three dealers in the Nassau Street area under “Stamps for Collectors.”

Zuckerman, who operates Harvey Dolin & Company from 111 Fulton Street, usually wearing a necktie with a pattern of postage stamps, also sells coins, baseball cards, and World’s Fair and World War II memorabilia to get by. He says young people don’t collect stamps. When recently asked why he was still in business, the old man shrugged. “I like the place and I like the people,” he said. “I’m not going to retire till they close the lid on me.”

New York Press, November 5, 2002

Education by Degrees

I first heard of John Bear in 1990, when a man from Michigan named Bob Adams told me about the Ethiopian ear-pickers. In 1966, Southern Methodist University gave Bob Hope an honorary doctorate after the entertainer gave it a substantial donation. Up at Michigan State University, John Bear, earning his doctorate the hard way, resented this. He founded the Millard Fillmore Institute to honor

Bears’ Guide to Earning Degrees by Distance Learning, Ten Speed Press, PO Box 7123, Berkeley, CA 94707, $29.95, www.tenspeed.com

I first heard of John Bear in 1990, when a man from Michigan named Bob Adams told me about the Ethiopian ear-pickers. In 1966, Southern Methodist University gave Bob Hope an honorary doctorate after the entertainer gave it a substantial donation. Up at Michigan State University, John Bear was earning his doctorate the hard way. Bear resented this. He knew that President Fillmore refused all honorary doctorates, even from Oxford. Bear then founded the Millard Fillmore Institute to honor the 13th president’s memory. The Institute awarded doctorates with ornately engraved diplomas on genuine imitation parchment that read, “By virtue of powers which we have invented…” granting “the honorary and meretricious” doctorate “magna cum grano salis”—with a big grain of salt.

Six years later, while studying in London, he tried the same thing on a larger scale. He and some friends created the London Institute for Applied Research and ran advertisements in American publications: “Phony honorary doctorates for sale, $25.” Several hundred were sold, presumably keeping the promoters in whiskey and cigars. As Bear wrote, half the world’s academic establishment thought L.I.A.R. was a great gag; the other half felt it threatened life as we knew it. After wearing out the joke, Bear traded the remaining diplomas to a Dutchman for 100 pounds of metal crosses and Ethiopian ear-pickers. (The Dutchman is still selling them—for $100 apiece.)

With this kind of experience, Bear first published Bear’s Guide, his profoundly serious and wildly funny guide to alternative higher education, more than a quarter-century ago. The latest edition, the 14th, crossed my desk last week. This is probably the best available practical guide to obtaining legitimate college degrees without full-time attendance in a conventional college setting, whether through correspondence, independent study, college credit through examination or life-experience learning, or the Internet. As Bear notes, in 1970, if one wanted to earn a degree without sitting in a classroom for three or four years and wanted to remain in North America, one had two choices: the Universities of London and of South Africa. Today, one has more than 1000 options.

I loved my completely traditional undergraduate experience, down to the last mug of beer. But that was a quarter-century ago, when one could pay a year’s tuition with the money one earned over the summer as a dishwasher. That isn’t the case anymore.

Also, American college education is now more about obtaining a credential than inheriting the intellectual legacy of the West. I regret this; so, I sense, does Bear. This is part of a phenomenon that might be called “credentialism”: the pursuit of false objectivity in personnel decisions by substituting credentials—particularly academic diplomas—for the analysis of character, intelligence, and ability, or even for the intelligent exercise of judgment in hiring, firing, and promoting.

Bear argues that an academic degree is more useful to one’s career than practical knowledge. Whether this is good for society is immaterial. He illustrates this point with an anecdote about a telephone call from the man in charge of sawing off tree limbs for a Midwestern city. The city government had decreed that all agency heads must have baccalaureates. The head sawyer didn’t have one. If he didn’t earn a degree within two years, he would lose the job he had competently performed for two decades. The reality of his competence was immaterial to someone else’s need for false objectivity.

We in New York are not immune from this. The city government now requires applicants for the police examinations to have sixty college credits. Yet no one who has attended college would argue that accumulating credits raises barriers to brutality or provides a sure test of intelligence, industry, courage, and character.

To Bear, traditional education awards degrees for time served and credit earned, pursuant to a medieval formula combining generalized and specialized education in a classroom on a campus. The kind of nontraditional education emphasized by his book awards degrees on the basis of “competencies” and “performance skills,” using “methodologies” that cultivate self-direction and independence through planned independent study, generally off campus.

Granted, nontraditional routes are now radically less expensive. One can obtain a bachelor’s degree from New York’s Excelsior College (formerly Regents College) or New Jersey’s Thomas Edison State College without stepping into a classroom. For example, Excelsior awards degrees to persons who have accumulated sufficient credits through various means, including noncollege learning experience such as corporate training programs, military training, and professional licenses; equivalency examinations such as the College-Level Examination Program (CLEP), the Defense Activity for Non-Traditional Education Support (DANTES), and the Graduate Record Examinations (GRE); its own nationally recognized examination program; and even educational portfolios evaluated through its partnerships with other institutions, such as Ohio University.

However, in a world that cheapens the humanities to a mere credential and refuses to evaluate intelligence, experience, and common sense, it’s a short step to advancing one’s career through exaggeration and even downright deceit. Remember that a diploma is merely a document evidencing the holder’s completion of a particular course of study.

Even the once-sacred transcript, the official record of the work one has done to earn a degree, is no longer written in stone. Creative use has been made of color copiers and laser printers to alter records; college computer systems have been hacked into, in some instances for fun and in others to alter records for profit.

Actually, it would seem that finagling has always been part of the American doctoral tradition. Bear reports that the first American doctorate came about in the following way.

In the beginning, only someone with a doctorate could bestow one on another person. At the end of the 17th century, however, Harvard had no instructors with doctorates. Its president, Increase Mather, belonged to a religious sect anathema to the Church of England, and so he was legally ineligible to receive a doctorate from any English university. Harvard’s faculty, which then consisted of two people, solved this problem by unanimously agreeing to award Mather an honorary doctorate. Mather, in turn, conferred doctorates upon his instructors. And they began doctoring their students.

Yale awarded America’s first professional doctorate when Daniel Turner, a British physician, gave the college some fifty medical textbooks. Yale awarded him an M.D. in absentia. (Turner never set foot in America.) Some, according to Bear, suggested that the M.D. must stand for multum donavit (“he gave a lot”).

As one might expect, Bear also discusses the anomaly of the honorary degree. In a country whose government is forbidden from granting titles of nobility, higher education fills the gap with honorary doctorates, which are simply titles bestowed for various reasons upon various individuals. Bear suggests an analogy to an army granting the honorary rank of general to a civilian who may then use it in everyday life.

Of course there are doctorates and there are doctorates. My alma mater grants honorary doctorates to a few distinguished men and women every year. Among them, invariably, is the chief executive of some corporation whose foundation has made a substantial contribution to the college’s endowment. The Rev. Kirby Hensley’s renowned Universal Life Church, which awards an honorary Doctor of Divinity degree to anyone who ponies up $30 (it used to be only $5), merely takes this to its logical extreme.

My favorite chapters in Bear’s book discuss phony degrees and diploma mills, some of which operate wildly beyond the law. In 1978, one diploma mill proprietor was arrested as Mike Wallace was interviewing him for 60 Minutes. Usually unaccredited, usually operating in one of the handful of states that barely regulate private higher education (currently Hawaii seems the happy hunting ground of the degree mill), such institutions flourish because people want to avoid the work involved in getting a real degree. After 60 Minutes aired its program, the network received thousands of telephone calls and letters from people who wanted the addresses and telephone numbers of the diploma mills exposed by the program.

And who can blame them? In some states, a doctorate from a one-room Bible school is sufficient to set up practice as a marriage counselor and psychotherapist. At least one major figure in the New York City Parking Violations Bureau scandals had been a marriage counselor on the strength of his advanced degrees from the College of St. Thomas in Montreal, Canada. This was a theological seminary sponsored by an Old Catholic church whose archbishop, a retired plumber (I met him once: his weakness for lace on his episcopal finery left me cold), operated the college from His Excellency’s apartment. Quebec did not regulate religious seminaries, and this allowed the archbishop to claim—accurately—that the degrees were lawful and valid. They were also worthless.

As Bear notes, in Hawaii and Louisiana the one-man church founded yesterday may sponsor a university today that will grant a doctorate in nuclear physics tomorrow. One Louisiana diploma mill successfully argued that as God created everything, all subjects were the study of God and therefore a religious degree. This may be theologically sound, but if I learned my physician held his M.D. from this school, I would get a referral.

As long as people value others more for whatever pieces of paper they can produce than for their qualities of mind and character, the diploma mill will flourish. But the intelligent careerist will use common sense and the guides of John Bear.

New York Press, September 24, 2002