Diploma Mill Redux

I have a long-time love affair with the underworld of diploma mills which, in a society overawed by credentials, is an unending source of amusement and entertaining copy.

So I was unsurprised to learn that, as recently as February 2007, the New York City Department of Investigation reported that fourteen city firefighters had used bogus diplomas, purchased from St. Regis University (an on-line institution supposedly located in Liberia), to seek promotion to officer positions such as deputy chief, battalion chief, and captain. This stemmed from a relatively new Departmental policy requiring college credits as well as practical firefighting experience to gain promotion.

Four of the fourteen were actually promoted on the strength of these counterfeit credentials, including Daniel O’Gara, who was advanced to Battalion Chief after obtaining a St. Regis baccalaureate for $550.

At the end of the scandal, the fourteen paid fines totaling some $136,000.  According to the New York Post, those who had been promoted kept their jobs because they had all, since their promotions, obtained enough legitimate credits to qualify for their new jobs.

Requiescat L.D. Knox

On May 29, 2009, The New York Times published the obituary of L. D. “None of the Above” Knox, 80, a farmer and politician from Winnsboro, Louisiana who had crusaded for over forty years to make “None of the Above” an option on the Pelican State’s ballots.

In 1979, he went so far as to make “None of the Above” his additional middle name and used it thereafter whenever he ran for office.  The Times states:

His aim—allowing voters to call for a new election with new candidates by voting for “none of the above”—remained his main plank in subsequent elections.

“The people of this country have never had a free election,” he said in 1991.  “We don’t have a right to reject candidates.  We have to take the lesser of the evils.”

From his notices in papers across Louisiana, Mr. Knox seems to have been well-liked and respected, although most of his electoral defeats were one-sided blowouts.

Yet, as I argued in 2004, the “None of the Above” option has increasing appeal when many elections are effectively uncontested—as in the case of the upcoming New York City mayoralty, where billionaire incumbent Michael Bloomberg’s unlimited funds effectively push his opponents completely out of the public eye.

Just Say “NOTA”

From New York Press, January 21, 2004

One of my New Year’s resolutions was to throw out the old papers piled up on my desk. I’m not a pack rat like the Bronx guy who spent two days trapped in his apartment under an avalanche of his own magazines and newspapers, but I’ve a weakness for letting interesting documents accumulate. So, late on the afternoon of January 1, 2004, I went to work.

One thing I turned up was the New York City Campaign Finance Board’s Voter Guide for the General Election of November 4, 2003. I kept it for laughs after reading the statements published in it by the candidates in my city council district. Although the Democratic incumbent, an affable party hack, seemed pleasant enough, his pompous,  jargon-ridden prose indicted him for bad thinking. His sole opponent, a Democrat who, having lost his party’s primary in September, had been nominated by the Republicans, was more interested in advertising his East Harlem restaurant—complete with directions—than public policy. I don’t know about his food, but publicizing one’s business with taxpayers’ money seems to betray bad taste, if not bad ethics. Neither man would have satisfied James Madison’s hope that our elections should feature candidates “who possess the most attractive merit.”

What to do about such losers? In New York, nothing. One of these guys was going to be elected. Furthermore, as we now know, despite their assaults on liberty and property (tax hikes of 18 percent on residential property and nearly 2000 percent on cigarettes, banning smoking in bars, and laws penalizing this newspaper’s street boxes), every incumbent city council member seeking re-election in November was returned to City Hall. Much of the reason wasn’t apathy. The incumbents were mostly unopposed at the general election, or opposed only by characters you would vote for only as a joke. The same was true of last year’s judicial elections in Manhattan. Nor is this a phenomenon peculiar to New York: in 2002, seventy-eight of the 435 seats in the U.S. House of Representatives were uncontested by one of the two major parties, which usually meant no contest at all.

Mere elections—even honest elections—are no symptom of democracy. For example, a generation ago in the Philippines, Ferdinand Marcos’s dictatorship held regular and contested elections. However, the only opposition candidates allowed on the ballot were wackos: the Filipino equivalent of our Prohibitionists, Greenbackers, and Lyndon LaRouche. Electable opponents had an odd way of being bumped off. And we all know now that Yugoslavia’s Milosevic repeatedly won freely contested elections on the road to ethnic cleansing. Democracy is more than a ritual we observe every November. At any rate, it should be.

A workable alternative to New York’s system of bad choice/no choice has been used in Nevada for a generation. One of the Silver State’s attractions (besides the absence of a state income tax) is its voters’ right to vote against all candidates. Since 1975, under Nevada Revised Statutes 293.269, ballots for statewide office or for president and vice president must always include “None of these Candidates.”

Consequently, sometimes you can beat somebody with nobody.  In 1976, “None of these Candidates” won the Republican congressional primary with 47.3 percent of the vote, much to the embarrassment of the hack perennial candidates left far behind. Two years later, it won the Democratic  congressional primary. In the 1980 Democratic presidential primary, it outpolled Senator Edward Kennedy and nearly defeated President Carter. It won the 1986 Democratic primary for state treasurer, beating five real candidates, and beat Ralph Nader  in the 1996 Green presidential primary. Two years ago, in the Democratic gubernatorial primary, “None of these Candidates” embarrassed the machine candidate (who openly favored raising taxes) by polling 24 percent of the vote to his 35 percent (a topless dancer came in third with 21 percent).

But Nevada law still lets the candidate with the most votes be elected or nominated, even if “None of these Candidates” wins. It’s a safety valve, not a barrier to the hacks. A better option would be to require a new election if “None of these Candidates” outpolled the candidates, with the losers barred from the ballot. This is the practice in Russia and a few Eastern European countries, where voters may simply reject all the candidates and try again.

The option of voting for “None of these Candidates” or “None of the Above”—NOTA for short—enjoys support on both left and right. The Wall Street Journal endorsed NOTA in 1996, after Representative Wes Cooley of Oregon was re-nominated despite being exposed as both a fraud and a phony war hero. Although unopposed at the Republican primary, Cooley received only 23,000 votes while 31,000 voters cast blank ballots or various write-ins. The voters had no effective way to deny his re-nomination. This was two years after Representative Mel Reynolds of Illinois was re-elected unopposed following his post-primary indictment for raping a minor, possession of child pornography, and obstruction of justice. (Later convicted, forced to resign, and imprisoned, Reynolds was pardoned by President Clinton on his last day in office so he could work as a youth counselor for the Reverend Jesse Jackson).

And the hits just keep on coming. Some of us remember the 1991 Louisiana gubernatorial runoff between Klansman/hustler/racist agitator David Duke (now imprisoned for mail fraud) and the flamboyantly dishonest Edwin Edwards (now imprisoned for fraud, racketeering, and extortion). Then, it was a choice between vulgarity and obscenity. Governor Edwards, whose supporters proclaimed, “Vote for the crook. It’s important,” later quipped that the only folks who didn’t vote for him were one-armed people: they couldn’t hold their noses and pull the lever by his name at the same time.

Others may recall some local elections in New York, such as the 1987 Bronx district attorney’s race in which the effectively unopposed incumbent died before Election Day, requiring voters to elect a corpse; or the death of West Side Representative Ted Weiss in September 1992, which permitted the Democratic machine to anoint a loyalist assemblyman as his successor and then a party district leader as the assemblyman’s successor, all without a single primary.

At present, then, New Yorkers have three empty options in elections such as the one in my city council district last November: voting for one or another empty suit; writing in someone’s name (which will not be counted); or not voting. None of them matters. But adopting NOTA would let voters simply reject unacceptable candidates and try again. Even an unopposed candidate might lose if the voters found him unworthy, or felt they just didn’t know enough to make an informed decision.

Some oppose NOTA because holding a new election every time “None of these Candidates” won might be expensive. Against this is the possibility of defeating unqualified hacks who merely know how to game the system and whose incompetence would probably lead to inefficiency and waste at the taxpayers’ expense. More absurd is the possibility of a series of elections in which “None of these Candidates” wins, instead of allowing on the ballot better candidates who actually wage informative campaigns on relevant issues. (And some political puritans argue that the Nevada option would let voters avoid making hard choices: as if most Americans weren’t already avoiding such decisions by simply not voting at all.)

Of course we all understand that most of these arguments are mere eyewash to conceal the hacks’ self-interest. Having spent nearly fifteen years in City Hall, I speak from personal experience in suggesting that most politicians only pay lip service to democracy. For them, the paramount issue is controlling the system.  Anything that weakens that control is unacceptable. They don’t have to read Machiavelli to understand that.

Yet, as Dr. John Pitney suggests in “The Right to Vote No,” NOTA really may come down to first principles. If free government is really based on the consent of the governed, the people should have a clear way of effectively withholding their consent from candidates who are unworthy, unknown, or unopposed. Otherwise, they simply may passively withdraw their consent. Perhaps, by not voting, they’re doing that already.

Confessions of a Theme Whore

It’s a big relief to me that the television season has drawn to a close—particularly that there will be no new episodes of House to miss. I’d been having a terrible time since the show moved to Monday nights. I guess I’m not television conscious that early in the week. I did my best. Nearly every Monday a point in the day would come when it would occur to me that it was Monday, and that I had to remember to watch House that night. But something would always come up—even if I knew perfectly well at six o’clock or at seven that it would behoove me to finish whatever I was doing by eight, it would always slip my mind, and suddenly I’d look up and it would be twenty or ten or four or seventeen past the hour.

It’s not as though it wasn’t important to me. I like the way House episodes begin the same way I used to get a kick out of the old teasers for Law & Order. Remember those? The pleasant anticipation of waiting first for the gruesome moment when someone would stumble across a corpse and then for the tasteless crack that Jerry Orbach always made over the deceased? House teasers are formulaic in the same way. A knowing snippet of contemporary life leads us to expect that a particular person is about to fall writhing to the floor. Then someone else entirely falls writhing to the floor.

I don’t think I caught a single House teaser this season. Instead, I seemed to keep coming in on scenes between Chase and Cameron, and they bore me. Well, she bores me. Well, she bores me now. So I’d swear and sigh and promise myself that the next week would be different. And then the following week it would happen all over again. It reminded me of a phase I went through where I was trying to develop an alcohol addiction. I’d pour myself a drink and then put it down somewhere and forget all about it. It wasn’t that I wasn’t motivated. I just couldn’t focus. I couldn’t commit.

The season finale of House, as it happened, was about precisely this phenomenon—about a guy at war with himself because the right and left sides of his brain couldn’t communicate with each other. Of course, tuning in late, I didn’t know this; and having missed the show the week before, I had no idea that House and Cuddy had had sex in the previous episode after she’d helped him through a grueling night of Vicodin withdrawal. So the revelation that Cuddy and House had not actually had sex, and that all of it—including the detox session—had been an opiate-induced hallucination didn’t have the impact on me that it seemed to have on everyone else.

I’m not sure it would have in any case. More than any television series I can remember, House seems to me not to be about what happens in the story. Or maybe it’s just that what characters have to say about what happens in the story is always much more interesting—just as the metaphor, analogue, or association that ultimately leads to House’s diagnostic epiphany is always more interesting than the actual solution to whatever medical mystery is haunting him and his team.

Anyway, catching up on the series later, I couldn’t help noticing how much of the season past seemed to be taken up with this notion of The Divided Self.

At this point, I should really come clean about a condition I suffer from. It’s probably congenital—I know I’ve had it since I was a small child: an impulse toward over-interpretation. It’s really more of a compulsion. I see patterns and motifs everywhere. I can’t stop seeing them, and even though I know they’re probably not there, probably don’t exist, I can’t stop finding them. I guess it’s like any other addiction: the truth is I don’t really want to stop.

Which is why, when I headed over to the online episode guides for House and began reading synopses to see where my viewing deficiencies lay, I seemed to see The Divided Self everywhere. I saw it in parts of the series I’d caught all or part of—in the episode about the teenage mother who changes her mind about letting Cuddy adopt her baby, and the one about Cuddy’s mixed feelings over the baby she does eventually adopt, and the one about her ambivalence over the continued need for her presence at the hospital.

Putting away the winter clothes, I caught up on episodes I’d missed and continued to see elements of psychomachia in various forms: in Cuddy’s attempt to have Cameron become more like her, taking over Cuddy’s job; in the incessant mood-swings the series seemed to be having over Chase and Cameron’s impending marriage; in the episode about the guy with “locked-in” syndrome, who was brain-dead to all outward appearances but very much alive and alert, albeit unable to communicate. I saw it in the episodes where House began hanging out with an apparition of Wilson’s dead girlfriend, Amber, which turned out to be the embodiment of his unconscious. And, of course, when I finally saw the episode that began with the suicide of the character played by the actor Kal Penn, I saw The Divided Self there, too—particularly given the way the script seemed to harp on no one’s having had any idea there had been anything wrong.

Some of this, of course, is legitimate; I understand that. (I saw there was even an episode called “House Divided.”) But some of the things that ran through my mind were just plumb crazy—certifiable. Like the thought I had during the scene after Kutner’s funeral and cremation, when everyone was standing around watching the smoke rise into the sky: I had the fleeting notion that the show had artfully managed to induce a state of schizophrenia in us, because we were simultaneously sad (well, sad-ish) and amused, knowing that the real-life circumstance necessitating Mr. Penn’s departure from the series had been his well-publicized decision to take a job in the Obama administration.

So, as you can see—and here we are back at The Divided Self—I’m of two minds about all this. There’s a part of me that wants to point out that the series creators could have chosen any number of ways to get rid of Kutner. They didn’t have to engineer it in such a way as to raise the specter of inner conflict. But there’s that other, more rational, side of me that knows that a television series like House is written by a committee of people, some working on one episode and some on another. And I ask myself how likely it is that they sit around plotting ways of making me smile and tear up at the same time. Or structuring a season so that a bunch of episodes that one character spends talking to a figure who responds but isn’t really there are balanced by an episode that a bunch of characters spend talking to someone who is there, even though he can’t respond.

Another example of how I can go haywire over a theme was what happened with The Sopranos. Early on, I’d taken it into my head that the series was about art on some very profound and interesting level—or about Tony’s relationship with art—and I wrote about this around the time the second season was about to air.

There was a certain limited validity to this. Tony certainly had issues with the artwork in Melfi’s office, for instance. The opening shot of the series showed him sitting in her waiting room framed between the feet of a life-size bronze—a nude that he was eying with hostility and suspicion. In another episode he took umbrage at a harmless painting on one of her walls—a landscape dominated by a large red barn. Catching sight of it, he frowned, walked over, and examined it more closely, zeroing in on a darkened doorway that, when the camera zoomed in, seemed to yawn eerily. In the next scene, he accused Melfi of having “a trick picture” in her office.

There was also an early sequence where Tony dragged his bratty daughter into a church and waxed sentimental over the fact that his grandfather and great uncle had built it. When she was skeptical about their having built it alone, just the two of them, Tony was patient. No, they had built it with “a crew of laborers,” but the point was they’d known how to do it. In the next scene, we watched one of Tony’s “crew” blow up a restaurant.

I became a little obsessed with the theme of Art in The Sopranos. Then I decided that there was something going on with nature, too. Tony and his pals seemed to have a difficult time with art and nature both. They had artistic and idyllic yearnings, but whenever they got involved with art or nature, things seemed to go badly.

The fifth season included a sequence in which Tony commissioned a rather vulgar portrait of himself posed with a horse he had acquired a financial interest in. When the horse came to a bad end, Tony had asked Paulie, another subordinate, to destroy the painting, but instead Paulie had kept it and had the figure of Tony touched up to resemble Napoleon. Like everyone else, I thought that was hilarious, but I also pondered on how it figured in the art-and-nature schema.

Later that season, the painting came up again when Tony paid a visit to Paulie and saw it hanging in his living-room. He proceeded to have a rather complicated series of reactions: rage, indignation, bewilderment, something verging on awe, and finally a sort of lingering nostalgia as he gazed at the painting a last time before leaving it in the trash.

That was the episode in which Tony decided that his favorite cousin (played by Steve Buscemi) was going to have to get whacked and ended up doing it himself. In fact, in the very next scene—right after the one with the portrait and right before the one where Tony blew his cousin away with a shotgun—we saw Buscemi drive onto a property dominated by a large red barn and into the dark spooky doorway of another barn, which the camera lingered on as it had lingered on the dark doorway of the barn in Melfi’s painting.

Well, I went completely nerts, leaping up and gesticulating, shouting that it was the same scene. And my husband and the friends we were watching the episode with were all very kind. Because, of course, it was nonsense, sheer nonsense. As if David Chase and his crew of writers sat around mapping out complex systems of imagery, saying “We’ll put the scene where Gandolfini whacks Buscemi right after the thing with the portrait of Tony; oh, and let’s have the farm where the hit takes place look just like that painting way back three seasons ago.” I mean, really.

And yet…and yet. Not long ago, we started watching some of those late episodes again. I’d forgotten how unequivocally horrible everyone becomes in that last season. I’d also forgotten how Tony’s relationship with Melfi ends. Her own analyst (played by director and film historian Peter Bogdanovich) shames her into realizing that she has merely been enabling Tony all these years. (He draws her attention to a study suggesting that “the talking cure” simply gives sociopaths a chance to sharpen their skills rather than leading to insight.) Soon after, she terminates Tony’s therapy, offering to refer him to someone else.

Revisiting all this, I began to discern—or thought I did—a solution to some of the questions that the opening season of the series had left me with: about art and its importance in the series, about Tony’s relationship with it, about David Chase’s takes on psychoanalysis and on Melfi’s clinical skills, and even about the connection between psychoanalysis and art.

It seemed to me we ended up with a realization that the self-awareness that “talk therapy” engenders in the ordinary patient has merely offered Tony the tools and material with which to construct a false version of the truth and reinvent his own image of himself, and that this is the only kind of creative endeavor that people like Tony can ever really successfully engage in. And I even found myself wondering whether David Chase might have read that Robert Warshow essay about the movie-gangster’s connection with the city and if there could be some validity to the uncomfortable relationship with nature that I’d wanted to ascribe to Tony and his crowd.

I honestly don’t know what to think anymore. I don’t know which is less likely, that someone as steeped in American popular culture as Peter Bogdanovich would not know the Warshow essay about the gangster-movie genre, and wouldn’t at some point over the years have brought it to David Chase’s attention, or the idea that the twenty-some-odd people it took to write the series could fashion and sustain a thematic structure that complex over a period of six years.

I’m reminded of the only time I ever went for a tarot reading. There was this guy that I couldn’t seem to break up with, and a friend of my mother’s, tired of the situation, finally sent me off to see this psychic she swore by.

I took a taxi down to Mulberry Street. The psychic was waiting for me on his stoop. And right off, while I was still coming up the steps, he started telling me that the moment he’d seen me emerge from the cab he’d had this very strong, very clear feeling…I had an aura…he sensed that I was involved with a guy who was no good, who was trouble…he thought his name began with an R…“Robert”…or “Richard” perhaps…

The session was a disaster: I didn’t respond to the psychic’s inept guesses in a sufficiently helpful manner, and he ended up throwing me out. All the same, afterward I couldn’t decide which was more implausible: that my mother’s friend had actually gone to the trouble of calling him up and tipping him off about my boyfriend’s name, or that the guy really was magic.

Pocket Change

If you want to find a mirror of a society’s ideal—the image of what it hopes and imagines itself to be—public sculpture is as good a place as any to start, and none is more common or readily available than the public sculpture we carry around with us on the coins in our pockets.

This year will bring some changes to the world’s most common public sculpture, the Lincoln penny. The occasion is the Lincoln bicentennial, and the Mint is happy. Collectors and speculators were glad to pay $8.95 for the two-roll sets of the new coins (worth $1.00) that went on sale on February 12, 2009 and sold out within a month. And so far few have complained about the new reverse design, which represents the Kentucky log cabin at the Abraham Lincoln Birthplace National Historic Site. (Of course, that cabin is itself a representation of someone’s idea of the original structure.)

Three more designs, one to be issued every three months throughout the year, will represent respectively Lincoln’s education, his pre-Presidential careers as lawyer and politician, and his Presidency. In 2010 and beyond, the Mint will issue yet another reverse, “emblematic of President Lincoln’s preservation of the United States of America as a single and united country.” So there will be five new designs, each issued by the mints at Philadelphia, Denver, and San Francisco (each mint’s coins bear a special mint mark, P, D, and S, respectively), creating fifteen new coins for the delectation of collectors within less than thirteen months.

The original Lincoln cent, designed by Victor David Brenner, reflected the genius of the sculptor and of President Theodore Roosevelt, himself an aesthete, who forced change on the Mint bureaucracy of his day because he found the coinage of the United States unworthy of a great republic. It still is. For the most part, the heroes on our coinage and paper money depict the men considered great half a century ago.

Surely John F. Kennedy’s reputation has undergone re-evaluation since he replaced Benjamin Franklin on the half-dollar in a moment of national grief. Walt Whitman, George Gershwin, George S. Kaufman, Jonas Salk, Earl Warren, Eleanor Roosevelt, Ronald Reagan, Sojourner Truth—from the arts and sciences alone the list of possible alternatives to the present set of political icons on our coins and currency (which date from the New Deal or before) is almost limitless.

All this brings to mind something I thought about while emptying my pockets the other day.

At some time in the last century, I was taken to a Broadway revival of the musical comedy “1776.” In one scene, an actor named Paul Michael Valley, who played Thomas Jefferson, briefly stood in profile, silhouetted against an open door. Some suburban housewife in the next row murmured to her neighbor, “He looks just like the guy on the nickel.” Indeed, he did, which may explain his casting.

Anyway, while putting my pocket change on the dresser, I noticed one of those odd nickels struck by the Mint to commemorate the bicentennials of the Louisiana Purchase and the Lewis and Clark Expedition. The obverse looked like the Man in the Moon. Of course, the head was still old Tom’s, but the image had changed.

The Purchase itself was commemorated in 2004 by adapting the design of Jefferson’s Indian Peace Medal for the reverse of the five cent piece. The Indian Peace Medals, a British tradition continued after Independence, were large, attractive silver medals awarded by the United States to Indian chiefs or other important men on such occasions as major conferences or the signing of treaties. Intriguingly, this custom reflected the European tradition of exchanging decorations at historic moments of concord, which the United States has otherwise never adopted.

In place of the King appeared the current President, and on the reverse, in Jefferson’s case, appeared two clasped hands, the one to the right with a metal wristband such as was frequently worn by Indian chiefs, and the one to the left with an army officer’s braided cuff, all beneath a crossed hatchet and inverted peace pipe. The medal also bore the words, “Peace and Friendship.”

Many of Jefferson’s medals were given to Indians during Lewis and Clark’s expedition from St. Louis to the Pacific Coast between 1804 and 1806. They are mentioned in the Expedition’s Journal as among the articles taken for presentation to the Indians. On August 1, 1804, the Journal records the gift of a “First Grade” medal and flag to a “Grand Chief,” medals of the Second Grade to lesser chiefs; and of the Third Grade to inferior chiefs. Certificates to accompany the medals were also issued, such as one surviving in a California collection which refers to “the special confidence reposed by us in the sincere and unalterable attachment of War Charpa the Sticker, a Warrior of the Soues Nation, to the United States…”

Later in 2004, the Mint issued yet another kind of nickel with a reverse featuring one of the Expedition’s flatboats, driven by both sail and poles, like a Mediterranean war galley.

In the spring of 2005, the year of the coin I found in change, the Mint had doubly changed the coin. The obverse had Jefferson’s profile, oddly presented so as to leave the coin resembling the Man in the Moon, with the word “Liberty” in Jefferson’s handwriting. The reverse adapted one of the Mint’s most popular designs, James Earle Fraser’s Buffalo nickel of 1913-1938, for the next coin in the series. Over 1.2 billion Buffalo nickels were struck during that quarter-century. They have now vanished from circulation. But as late as the 1960s, one still found Buffalo nickels in change, with the stern profile of an austere Indian warrior on the obverse and the massive buffalo on the reverse.

Fraser’s visual economy in its design is profoundly moving: without a touch of sentimentality, few accessible works of art so powerfully visualize the nobility of tragedy. And his commanding, virile bison dominates the design of its coin.

But as is often the case in the Mint’s modern adaptations of older designs from a heroic past, the modern buffalo, despite its unequivocal masculinity, lacks confidence. It seems neutered, almost cringing, standing, somehow off-balance, on a small, sloping patch of prairie, fenced in by the words “United States of America.”

In the fall and winter of 2005, the coin was changed yet again: the Man in the Moon obverse was coupled with a new reverse, a view of the Pacific Ocean with a quote from Captain Clark: “Ocean in view! O! the joy!” The minor irritant here is that Clark, whose writings betray a libertarian, if not Shakespearean, attitude toward standardized spelling, had written “Ocian.” While the Oxford English Dictionary includes “ocian” among its citations, the Mint corrected Clark’s usage. According to CNN, when questioned about this, a Mint spokeswoman answered, “We didn’t want to confuse anyone into thinking we couldn’t spell.” Again, a lack of confidence, this time in the reality of Clark’s spelling.

The following year showed the most unusual change in the design. Even as Monticello returned to the reverse, the obverse had Jefferson gazing at the viewer in one of American coinage’s first full-face designs. This is unusual for a technical reason. Coins first show signs of wear on the highest point of the design. The traditional profile tends to wear gracefully. But a full face design tends to wear out nose first.

The most notorious example of this was the Copper Nose coinage of England’s King Henry VIII. By the 1540s, Henry was running out of money due to his personal and public extravagance. He both raised taxes and debased the coinage, transforming the nominally silver shilling into a copper coin which was dipped into a silver nitrate solution. Electrolysis left a thin wash of silver on the coins.

Instead of a conventional profile, the new coins bore the King’s facing image, executed with surprising candor, bearded and repellently bloated. Even a little wear on the coin’s highest relief – which with a facing portrait is the nose – revealed its copper core. As the coating wore off the most prominent feature – Henry’s nose – it became reddish brown. Hence, the king acquired the nickname “Old Coppernose.”

Such a sobriquet is unlikely to be attached to Jefferson, who is, after all, nearly two centuries dead. His coin is silver-colored metal all through. But the full face design is off-putting and unattractive, and one hopes the Mint will return to the customary usage in its future coinage.

Another Last Hurrah?

As becomes a citizen, I have occasionally run for public office. As Édouard Herriot, four times Prime Minister under the Third Republic, said whenever he was running for anything, from conseiller municipal to Président de la République, “I have placed myself at the disposal of my friends and the service of the Republic.” In my case, I was simply doing my best to sabotage and annoy the office-holding element among The Wicked Who Prosper.

Most recently, I stood for Vice President of the United States in the New Hampshire primaries, which I wrote about in 2000.  It was all in good fun and as—much to my surprise and delight—I won, I found myself with yet another anecdote for dinner conversation.

So I was saddened when the May 1, 2009 issue of Richard Winger’s indispensable Ballot Access News reported that on April 22 the New Hampshire legislature passed a bill to eliminate the vice-presidential primary.  As Mr. Winger notes, “No other state has a vice-presidential primary.”

He goes on to point out that “Generally, no one who really has a chance to be chosen by a major party for vice-president ever files in this primary.” There is, in a sense, a reason for this. Since New Hampshire instituted the vice-presidential primary over fifty years ago, the contest has developed a laudable purpose uniquely its own: the potential embarrassment of an incumbent vice-president. If, like myself, you believe that politicians are fair game, then the vice-presidential primary is simply a happy hunting ground. Besides, until fairly recently, the vice-presidency was an absurdly empty job with its occupants as worthy of respect as the hapless Alexander Throttlebottom, the vice-president in Gershwin’s 1931 musical Of Thee I Sing.

All is not yet lost: the Legislature’s website indicates that the bill has not yet gone to the governor for signature. Though my stumping days are behind me, I for one fervently hope that it does not. While only one incumbent has actually been defeated (Dan Quayle, 1992, who did not have his name appear on the ballot, did not wage a sub rosa campaign for votes as most incumbents do, and was overwhelmed by the unknown candidate who paid his filing fee and appeared on the ballot, never to be heard from again), the possibilities presented in 2012 by the loose-lipped Joe Biden seem limitless and irresistible. It would be a pity if New Hampshire were to spare him that potential humiliation.

Aurelia Greene, Evergreen

Last week, a special election was held for Bronx Borough President, a job which, since the Charter reforms of the early 1990s, is largely ceremonial. The marvelously named incumbent, Adolfo Carrion, had resigned office to accept appointment as Director of the White House Office of Urban Affairs.

Even before State Assemblyman Ruben Diaz Jr. had won the Borough Presidency, he had announced that he would appoint State Assemblywoman Aurelia Greene his Deputy Borough President. She will be the deputy to a public official who doesn’t have much to do. Her duties are unlikely to be taxing.

The Assemblywoman’s name rang a chime in memory. She had been involved with Bronx Community School Board #9. Both she and her late husband, the Honorable Reverend Dr. Jerome Greene, who favored using all his titles at once, had been elected to the Board repeatedly. The Honorable Reverend Doctor had been its president from time to time, and by coincidence, Board #9 had employed several of Assemblywoman Greene’s relatives while she was serving on the Board.

Pastor of the Bronx Charismatic Prayer Fellowship, a church that met in his family manse, and Founder and President of the Bronx Unity Democratic Club, Dr. Greene had pled guilty in 1991 to larceny when he admitted using City money to pay for cameras, television equipment, and other merchandise purchased for his personal use. He also admitted using Board of Education employees to print political campaign literature at Board of Education expense.

The Greenes had previously been indicted for stealing a piano from Intermediate School 145. They beat the rap on that one, although the piano had ended up in their house, where it was supposedly used in his religious services. Dr. Greene was re-indicted on misdemeanor theft of services charges for using school employees to transport the piano to his house. However, that charge was apparently resolved when Dr. Greene pled to the larceny charge.

Both Greenes were serving on Board #9 when, in 1988, the late Chancellor Richard Green suspended them and the rest of the board amid charges of drug use and drug trafficking, extorting money from teachers, and stealing school equipment. The investigations leading to the Board’s suspension had stemmed from the arrest and subsequent conviction of a Board #9 principal, Matthew Barnwell of P.S. 53, for buying crack.

In addition to Dr. Greene and Mr. Barnwell, eight other people from Board #9 were convicted of crimes that included signing phony invoices, bribery, and defrauding the government. The Chancellor removed the district superintendent, Dr. Annie Wolinsky, for mismanagement. Dr. Wolinsky was unable to explain how, while schools went without basics like chalk and paper, thousands of dollars of uncatalogued supplies were stacked in the district warehouse, or why eight district employees worked only at videotaping the board members. She also couldn’t explain her failure to discipline Mr. Barnwell for, among other things, 142 instances of lateness or absence in the course of a 184-day school year.

However, death pays all debts. Despite his criminal record, Dr. Greene has been immortalized by the New York City Council which, by enacting Local Law 131 in 2005, renamed part of Teller Avenue in The Bronx as Reverend Jerome A. Greene Place. In his remarks at a December 29, 2005 public hearing before he signed the bill, Mayor Michael Bloomberg stated that the individuals commemorated that day, Dr. Greene among them, were being honored for their lifetime accomplishments.

I gather that one commentator has suggested that Assemblywoman Greene wanted a job that wouldn’t require a commute to Albany. As she trades the State Capitol for Bronx Borough Hall, everything looks pleasant for her: she is trading one well-paid job for another equally remunerative, and as she journeys toward the sunset of life, the downhill road is comfortably paved with city paychecks.

Susan Boyle and the Tigers of the Night

I’ve read several Google pages’ worth of commentaries on Susan Boyle, the middle-aged Scottish woman whose April 11 performance of a song from Les Miserables on “Britain’s Got Talent,” the U.K. version of “American Idol,” has been watched by millions on YouTube. So far, though, I’ve yet to encounter an article, essay, or blog post that touches on the aspect of the phenomenon that I found most fascinating and moving.

No one so far has commented on the nature of the number Boyle performed—no one has pointed out that it’s a lousy song. This is perfectly understandable. To do so would have seemed nasty or mean-spirited, and the whole point of the Susan Boyle phenomenon—at first glance, certainly—was the conquest of nastiness, the silencing of the urbane, supercilious stance embodied by Simon Cowell in the television persona he has created for himself. But I think it’s important that Boyle wowed the world with a dreadful, inept piece of sentimental tripe. In fact, I’d argue it’s the key to the whole phenomenon. Because part of what was galvanizing about Boyle’s performance of “I Dreamed A Dream” was that she made it a good song.

In order to appreciate the consummate badness of “I Dreamed A Dream” you don’t necessarily have to have the lyrics in front of you, but it helps. Like many of the songs in Les Miserables, it’s a concatenation of hackneyed tropes and verbal clichés—“empty songs with empty lyrics” was how the parodist Gerard Alessandrini put it in his send-up of one of the show’s more unspeakable numbers (“Empty Chairs At Empty Tables”). These are songs that bank on the idea that if you make something vague enough and general enough it will appeal to everyone—which is a kind of perversion of universality. (Good songs discover a particular or idiosyncratic truth that we can generalize ourselves.)

I dreamed a dream in time gone by
When hope was high and life worth living,
I dreamed that love would never die,
I dreamed that God would be forgiving.

Of course, there are bad songs and bad songs. If this one seemed particularly laughable in the context of the show it was because the character who sang it had appeared for the first time only seconds before and died virtually seconds later. Actually, the song is worse even than I’d remembered. Boyle left out a verse that contains the lines “He slept a summer by my side/ He filled my days with endless wonder;/ He took my childhood in his stride…” The collision of euphemism with cliché doesn’t often get more infelicitous than that.

Plenty of die-hard fans of Les Miserables would go to the barricades to defend it; but even folks who love the show, or who have a love-hate relationship with it (like me), tend to admit to the transparent awfulness of “I Dreamed A Dream.” Indeed, the song became briefly famous on the musical-theater-writing circuit, when the show first opened in New York, for a lyric so inept that no one has ever been able to make even the remotest guess at what it’s supposed to mean. It comes in the first two lines of the bridge but is hard to hear in the YouTube video, either because Boyle was tastefully downplaying that part of the song or because it dips suddenly to a place slightly below her natural register.

But the tigers come at night,
With their voices soft as thunder,
As they tear your hope apart,
As they turn your dream to shame.

No one has ever been able to figure out where the tigers come from or what they have to do with anything.

There are moments in popular culture when a work of art undergoes a transformation because something in real life creates a context where there was a vacuum before. Something like that happened around fifteen years ago with the revival of the Broadway musical Chicago. If you talk to people who saw the original 1975 production, which was coolly received by audiences and critics and didn’t last long, you hear a lot about hostility toward the audience, how the murdering showgirls who become stars called you “suckers” and threw roses out into the house at the end, claiming to be living examples of what makes America great and what it stands for. Two decades later, the same finale was greeted with amusement and delight.

What had changed? Very little about the show. But Court TV had come along, and the world had watched the fiasco of the O.J. Simpson case—with its mountebanks and cheap theatrics—lead to a lot of new-made celebrities and no conviction. Suddenly a show that reveled cynically in the hypocrisies of a society that makes stars out of lawyers and murderers had a lot of meaning to a lot of people.

A 1970s audience had plenty to be cynical about, but in the post-Watergate era the media were not the villains. With no specific target for its satire, the show was (rightly I think) interpreted to be excoriating the audience. Twenty years later, events in the real world had given the show moral heft, and an audience could join in its contempt for something outside of the theater.

Something analogous, I think, was happening last week when we all watched that video of Susan Boyle performing that silly song. In this case what created a reference point was Boyle herself. There were things we knew about her that gave the lyrics of the song a context they’ve probably never had before and may never have again. Some of them she had told us herself in the little sound bites before her performance, but most of them were things we intuited from her manner, affect, and bearing—from the way she laughed when she laughed and clowned when she clowned and cringed when she cringed and stood her ground with what seemed like an obliviousness to how she was coming off.

When she began to sing and everyone was blown away, two factors were at work. One was the incongruity between her singing and the persona she projected. She didn’t sing the way we expected someone to sing who came off the way she did—awkward and gormless. The second thing that blew us away was the lyrics to the song. It wasn’t that she made them seem good or true or meaningful; it was that they were true (and therefore good and meaningful) when she sang them. We knew that because we knew that some of those things we’d guessed about Susan Boyle were true. So when she sang about a lonely, sad, and disappointed life, a song made mute and silent by its own anonymity became eloquent through her idiosyncrasy.

Everyone keeps going on about how satisfying it was to see her “impress” the judges on “Britain’s Got Talent.” But that’s not primarily what you see when you watch the judges in the YouTube video. You’re not watching people who are impressed so much as people who are strangely moved by something they weren’t expecting to encounter. (I’m talking about the two men, really; the woman was obviously performing.) They’re aware of the triteness and vacuity of the lyrics, but they’re thinking about them all the same.

And that, I think, more than Boyle’s triumph, is what moved us when we watched the video, the barely perceptible changes that happened in the faces of the judges—the way this one swallowed or that one found himself having to turn an involuntary sigh into a smile. Oh sure, it was satisfying to see Boyle “wipe”—as one of the articles I read put it—“the smirk off Simon Cowell’s face,” and thrilling to get caught up in the audience’s wild excitement. But the moments in the video that made you and me and all the journalists and bloggers who wrote about the experience weep come a little later; they offer a fleeting glimpse of what’s going on in the mind of a listener as the phony, empty lyrics become filled with a possibility of meaning—because it’s a song about someone who had envisaged something else from life and Susan Boyle had walked on stage as one who couldn’t possibly envision another life for herself.

In the end, it was a kind of relay effect: the judges were moved by something they saw in Susan Boyle, and we were moved by something we saw in them. For us, as for them, it had to do with a suggestion of the unknown aspects of another soul—unguessed passions, thoughts, and experiences; the idiosyncratic and unpredictable; the tigers that come at night from out of nowhere and don’t belong in the song.

Piracy Then and Now

Now that the drama of the Somali pirates has passed—for the moment, anyway—it seems worth commenting on an op-ed piece about the incident that appeared in the Sunday, April 12, 2009 edition of The New York Times.

By midday of that day the American merchantman Maersk Alabama was safe in Mombasa, Kenya. Captain Richard Phillips, who’d been held for five days in a 28-foot lifeboat, had been freed. Three of the four pirates who’d held him captive had been slain—three shots, three kills—by American sniper fire from the destroyer U.S.S. Bainbridge.

The Times piece started out by pointing out that American citizens being held hostage by pirates is nothing new, alluding to the American wars against the Barbary pirates of North Africa at the turn of the 19th century.  It quickly became vague and diffuse, dwelling on no single incident and offering very few specifics about that interlude, save for a brief reference to the destroyer’s namesake, Captain William Bainbridge, who was himself taken by pirates in 1804 when his command, the frigate U.S.S. Philadelphia, grounded on an uncharted reef off the shores of Tripoli and was rendered defenseless.

Instead, the Times piece dwelt at some length on the loss of nineteen American soldiers in the Battle of Mogadishu on October 3-4, 1993, an incident dramatized in the 2001 movie Black Hawk Down.  There, the Americans had been supporting a United Nations initiative to end yet another in a long series of civil wars in Somalia.  After Somali militia had killed and reportedly mutilated twenty-four Pakistani peacekeepers, the UN Security Council authorized the arrest of those responsible.  A small American force went into Mogadishu to capture the Somali foreign minister and his political adviser and remove them by truck.

We had underestimated the enemy’s capacity for resistance. The Somali militia were, after all, irregulars, semi-trained guys in colored T-shirts and flip-flops, and no one took them seriously despite their AK-47s, rocket-propelled grenades, and proven capacity for urban warfare, to say nothing of the fact that we were on their turf, in their country.

The mission went south very quickly. The Somalis barricaded the streets with jalopies and piles of burning tires and shot down two American helicopters. Thus a one-hour operation became a fifteen-hour battle, ending only when a column of American, Malaysian, and Pakistani troops, tanks, and armored personnel carriers fought its way into the city and evacuated the mission.

The bodies of several dead American soldiers were dragged through the streets by Somali mobs.  This was bad for American domestic consumption, and so on this basis alone the Clinton administration stopped all American actions against the Somali militia, withdrew all troops within six months, and thereafter avoided the use of boots on the ground to support its foreign policy.

In focusing on a single mismanaged mission kept alive in public memory by a Hollywood blockbuster, the Times editorialists successfully avoided discussing the success of that earlier mission against the North African pirates, long before air support, radio, or steam power.

Captain Bainbridge’s ship had been swarmed by Tripolitanian gunboats and compelled to surrender.  Once Philadelphia had been refloated and brought into Tripoli, the pirates seem to have found the ship—a late 18th century square rigger—a little sophisticated for them.  Nonetheless, her mere existence was a potential threat to Western shipping.  After all, the Tripolitanians might eventually figure out how to sail and fight her.

The commander of the American squadron in the Mediterranean, Commodore Edward Preble, was an old Revolutionary who apparently understood that threatened or actual violence is as integral to effective foreign policy as negotiation.  He chose to eliminate the threat posed by the captured man-of-war by destroying her.

He ordered a mission that would involve sailing into a heavily-fortified enemy harbor, boarding and burning a warship held by pirates, and then, God willing, returning home.  Lieutenant Stephen Decatur, just twenty-five years old, was appointed to lead the operation.  He was given the Mastico, a filthy captured Tripolitanian ketch (a tiny two-masted sailing craft), which was renamed U.S.S. Intrepid (the first of four American warships that have borne that name).

His crew was taken from the schooner U.S.S. Enterprise and Preble’s flagship, U.S.S. Constitution.  With a Maltese pilot who knew the harbor at Tripoli, Decatur sailed from the Sicilian port of Syracuse on February 3, 1804.  Storms delayed him en route, and he did not reach Tripoli until late afternoon on February 16.  Intrepid was disguised as a Maltese trading vessel under British colors, the United Kingdom having maintained good relations with the pirates by paying them tribute, which we might today call protection money.

Around 7:00 pm, navigating by moonlight, Decatur sailed into the harbor and, claiming to have lost his anchors, requested and received permission to tie up alongside Philadelphia.  As Intrepid came alongside, with some of her crew tossing lines to the frigate, Decatur and the others were huddled along the bulwarks, ready for action.  A guard aboard Philadelphia saw something and shouted, “Amerikanos!”  Decatur gave the order to board and, cutlass in hand, led sixty men up the frigate’s side.  The pirates, caught by surprise, did not resist; most dove overboard and swam for shore.  Within twenty minutes, Philadelphia was ablaze.  Decatur got his men back to Intrepid, which, pursued by the fire of 141 guns, escaped the harbor.  None of his men were killed; one was injured; Philadelphia burned to the waterline and sank.

Commodore Preble then blockaded the Barbary ports, bombarded their cities, and sank their ships. His successor, Samuel Barron, sent the Marines ashore. They captured the city of Derne and defeated the Tripolitanian armies in two land battles. (The Marines commemorate this victory in the line in the Marines’ Hymn, “…to the shores of Tripoli,” and in their officers’ dress swords, which are patterned after one given to a Marine lieutenant by an Ottoman prince to honor the American’s valor at Derne). In 1805, the treaty of peace between the United States and Tripoli, Tunisia and Algeria, “negotiated at the cannon’s mouth,” was signed aboard U.S.S. Constitution.

Admiral Horatio Lord Nelson called the Intrepid mission “the most bold and daring act of the age.”  Perhaps it exemplifies what American audacity and imagination could do when Washington was three months away by sail and an officer had to improvise the execution of his mission on the spot.

Why does the American use of force to suppress pirates – seagoing extortionists, after all – seem to so frighten the Times? One can only speculate. Perhaps there is something virile in controlled violence that makes the editors nervous.

Most peoples get the governments they deserve.  Somalia has nearly no government at all, numberless private armies, police forces, and religious militias, one of the world’s highest infant mortality rates, and little public access to potable water.

Some free market types argue that Somalia is nearly a libertarian paradise, with a highly efficient and competitive service sector (including crystal clear cell phones, privately-owned, managed, and policed airports, and electric power service in most major cities), and a completely free press.  That seems irrelevant in a country where two-thirds of the population doesn’t live in cities and nearly two-thirds of the adults can’t read.

The so-called Transitional Federal Government, which occupies Baidoa, the country’s third largest city, is recognized by some foreign powers as the government of Somalia.  Its authority nominally extends over the country.  Its effective power doesn’t quite make it past the Baidoa city line and its attempts to move into Mogadishu, the national capital, have not yet succeeded.  The TFG hasn’t yet been able to collect taxes or establish any stable revenue and is entirely dependent upon foreign aid.  Outside Baidoa, Somalia is effectively divided into numerous petty states, ruled by clan-based warlords, local chieftains, and Islamists, a condition akin to that prevailing before its colonization by Italy during the late 19th century.  One might say that Somalia, after enjoying the benefits of Western civilization for nearly a century, has simply returned to its former condition.

Now, in the last two years, by turning to piracy, many Somalis have unwittingly exported their violent culture into the mainstream of commerce.  Perhaps they haven’t understood that, once they began preying on Americans, they encountered another culture—of instant news, the sound bite, the blog, and the politicians who live in it.  The result has not been pretty.

Since Julius Caesar and Pompey the Great, piracy has been eradicated only by force: the ordered use of violence to eliminate the threat of armed robbers to peaceful commerce. The Americans have shown they are still willing to use it.

If the Somalis understood our history, they would realize that the United States has always been ready to use violence to suppress pirates, bandits, and others who prey upon businessmen going about their business.

In 1913, after the Titanic disaster, the nations of the trading world established the International Ice Patrol to monitor icebergs and report their presence in the sea lanes.  The Patrol is conducted by the United States Coast Guard: its costs are apportioned among the nations whose ships travel the North Atlantic.  Perhaps the trading nations should give similar treatment to the suppression of Somali piracy, as a cost of maintaining the seas as a medium of commerce.

The Gray Chrysanthemum

In his eight decades, Sadakichi Hartmann fried eggs with Walt Whitman, discussed verse with Stéphane Mallarmé, and drank with John Barrymore, who once described him as “a living freak presumably sired by Mephistopheles out of Madame Butterfly.”

Critic, poet, novelist, playwright, dancer, actor, and swaggering egotist, Hartmann might lift your watch (he was a superb amateur pickpocket) but his opinion was not for sale. Such a man evoked diverse opinions. Gertrude Stein said, “Sadakichi is singular, never plural.” Augustus Saint-Gaudens (whose equestrian statue of Sherman stands near the Plaza) wrote to him, “What you think of matters of art I consider of high value.” W.C. Fields said, “He is a no-good bum.”

Sadakichi Hartmann was born November 8, 1869 on Deshima Island, Nagasaki, Japan. His father was a German merchant, his mother Japanese. His father disowned him at fourteen, shipping him to a great-uncle in Philadelphia with three dollars in his pocket. Sadakichi observed, “Events like these are not apt to foster filial piety.”

While working at menial jobs, he educated himself at the Philadelphia Mercantile Library and published articles, poems, and short stories in Boston and New York newspapers. He crossed the Delaware to Camden to introduce himself to Walt Whitman. In 1887, he published a New York Herald article quoting Whitman’s opinions about other writers, which Whitman denounced for misquoting him. Undaunted, Sadakichi expanded the article to a pamphlet, Conversations with Walt Whitman. He later studied in Europe, meeting Liszt, Whistler, Mallarmé, and Verlaine, glimpsing Ibsen and exchanging letters with Anatole France. (He later sold France’s letter to an autograph hunter, exacting dinner for two at Maxim’s with two bottles of Pol Roger).

At twenty-three, he wrote his first play, Christ, drawn from Hartmann’s private non-historical and non-Biblical ideas, particularly a sexual relationship between Jesus and Mary Magdalene. When Christ, which he claimed had been compared to Shakespeare’s Titus Andronicus, was published in 1893, James Gibbons Huneker, the American aesthete, called it “the most daring of all decadent productions.” It was banned in Boston: the censors burned nearly every copy and jugged Sadakichi in the Charles Street Jail.

While working as a clerk for McKim, Mead & White, he described Stanford White’s drawings as “Rococo in excelsis. To be improved upon only by the pigeons, after the drawings become buildings.” White dispensed with his services. Thereafter, Sadakichi kept the pot boiling with hundreds of German-language essays for the New Yorker Staats-Zeitung.

In 1901, he published a two-volume History of American Art, which became a standard textbook. The History remains worth reading as an intelligent evaluation of American art’s first four centuries, marked by Hartmann’s insight into the modernist movements. His judgments are nearly clairvoyant: he analyzes the work of Thomas Eakins, Winslow Homer, and Albert Pinkham Ryder, all then nearly unknown. He also discusses Alfred Stieglitz as a photographer. Sadakichi was the first American critic to seriously discuss photography as an art form. Hartmann later contributed his most incisive writing to Stieglitz’s Camera Notes and Camera Work.

By 1906, he was famous, in Huneker’s words, as “the man with the Hokusai profile and broad Teutonic culture.” He stole a taxicab and somehow won acquittal in Jefferson Market Night Court by proving he did not know how to drive. When Moriz Rosenthal, a pianist who had studied under Liszt, enriched the Hungarian Rhapsodies by improvising a series of rapid scales during a Carnegie Hall concert, Sadakichi roared from the gallery, “Is this necessary?” As the ushers tossed him out, Hartmann shouted, “I am a man needed but not wanted!”

From 1907 to 1914, he lived intermittently at the Roycroft Colony, an artistic commune at East Aurora, New York, where he ghostwrote books for its founder, Elbert Hubbard, a soap salesman who had become rich by marrying the boss’ daughter. Hubbard had fallen in love with William Morris’s Kelmscott Press. Though utterly untalented, he saw himself as a Renaissance man just like Morris and attempted to recreate the Morris enterprises at Roycroft by spending money. Hubbard rationalized his ghostwritten books by arguing that all great art was collaboration.

In 1915, Sadakichi entered 58 Washington Square South, then known as Bruno’s Garret. Guido Bruno, its proprietor, was an exotically mustached émigré whose bluff, plausible manner and florid speech unsuccessfully concealed an instinct for the main chance. Bruno realized tourists would flock to gape at bohemians in their search for what some called Greenwich Thrillage. He promoted his Garret through flamboyant magazines, all featuring his name in the title: Bruno’s Weekly, Bruno’s Scrap Book, Bruno’s Review of Two Worlds, Bruno’s Bohemia, Bruno’s Chap Book, Bruno’s Review of Life, Love, and Literature and, simply, Bruno’s. In The Improper Bohemians, Allen Churchill described Bruno’s Garret as a layman’s dream of the artist’s life, where “artists’-model types of girls and hot-eyed young men who declared themselves poets, writers, and painters…behaved during working hours like artistic freaks,” declaiming verse and splattering canvases before tourists, herded from the Fifth Avenue bus, while Bruno collected admissions at the door.

The impresario proclaimed Sadakichi a genius. In Bruno’s Chap Book for June 1915, nine years before the Nouvelle Revue Française organized the first Western haiku competition, Sadakichi published “Tanka, Haikai: Fourteen Japanese Rhythms.” Five months later, Bruno proclaimed Sadakichi king of the Bohemians.

In 1916, Sadakichi went west. He revisited his first play’s subject in a novel, The Last Thirty Days of Christ, which envisioned Jesus as a mystic and philosopher misunderstood even by his disciples (anticipating Nikos Kazantzakis’ The Last Temptation of Christ by a generation). It was praised by Ezra Pound and eulogized by Benjamin De Casseres as one of the most strikingly original works of American literature. He played the Court Magician in Douglas Fairbanks Sr.’s Arabian Nights fantasy The Thief of Bagdad (1924) for $250 in cash and a case of whiskey every week.

In 1938, Sadakichi moved to a shack on an Indian reservation in Banning, California. He still swaggered despite age, asthma, and a bad liver. (“My ailments are exceeded only by my debts.”) After Sadakichi collapsed on a bus, a doctor asked his symptoms. Hartmann replied, “I have symptoms of immortality.”

Though he still contributed to Art Digest and several European reviews, he generally lived on handouts cadged from admirers as tribute to his increasingly outrageous personality. John Barrymore, when asked why women still found Sadakichi, “this fugitive from an embalming table,” so attractive, replied, “Because he looks like a sick cat. Women are nuts about sick cats.”

Sadakichi’s friends in Los Angeles centered around the Bundy Drive studio of John Decker. They included Barrymore and Hearst editor Gene Fowler, then enjoying a wildly successful stint as a script doctor. This audience welcomed Sadakichi’s stories, all told with sly self-mockery and perfect timing. According to his valentine, Minutes of the Last Meeting, Fowler first spoke with Hartmann over the telephone in 1939. A few days later, while Fowler was at his office, a studio policeman said a crazy old man was asking for him. “When I told him he smelled of whisky,” the guard reported to Fowler, “he said I ought to be smelling his genius.”

Fowler went outside. The old man stood nearly six feet tall and weighed 132 pounds. Despite the heat, he wore a threadbare, baggy gray overcoat with an old sweat-stained fedora pushed back on his shock of gray hair, which had inspired Barrymore to nickname him the Gray Chrysanthemum.

He announced, “Where I come from and where I go doesn’t matter. For I am Sadakichi Hartmann.” (I’m condensing Fowler’s version here.) Fowler extended his hand, saying, “Sadakichi, I am happy to know you.” The critic replied, “Is that remark a sample of your brilliance? You may live another century, Fowler, but you will never meet another son of the gods like me. You have something to drink?”

Fowler stammered, “As a matter of fact, I’m not drinking and—”

“What have your personal habits to do with my destiny?”

“I hadn’t expected a thirsty guest,” Fowler explained.

“You should always have something on hand to offset the stupidity of this place.” As Sadakichi helped himself to Fowler’s scotch, he said, “Be careful that you do not fall in love with your subject—in love with my wonderful character and genius. It will blind you, and your writing will suffer.”

Later, when an automobile accident interrupted Fowler’s work on the biography—it left him with “two split vertebrae, three cracked ribs, a skull injury, and wrenched knees. Otherwise I was as good as new”—Sadakichi complained that Fowler was malingering so as “to avoid becoming famous.” (“He suddenly realizes that I am much too big a subject for his limited talents.”)

Once, Fowler took Sadakichi to the Los Angeles Museum of History, Science and Art. At the entrance, Sadakichi saw a wheelchair. He sat in it and motioned Fowler to push him about the gallery. He commented loudly on the paintings: van Eyck’s Virgin and Child (“Painted on his day off”) and a Rembrandt portrait of his wife Saskia (“Second-rate…he had begun to lose his mind when he painted it”). A curator bustled up, and Sadakichi asked him where the washroom was. The curator gave him directions. “Bring it to me,” Hartmann commanded.

Fowler recalled an attempt to have Hartmann examined by a physician. The doctor recommended that he be operated on for his hernia. Sadakichi resented this proposal—understandably, since in those days such ruptures could be repaired only by first removing the testicles. Decker urged the old man to bid the glands farewell. (“They have served their purpose and undoubtedly merit an honorable retirement.”) “Ghouls!” Sadakichi cried and turned his rage on the doctor. “Why don’t you men of medicine do something worthwhile instead of castrating a genius!” (Barrymore, weighing in, sustained Hartmann’s objections. “After all,” said he, “it is hard to cast aside comrades of happier times.”)

“Other people,” said Sadakichi, “talk and talk about dying. I’m doing it!” So he did, on November 21, 1944.

New York Press, May 1, 2001