July 31, 2006

This global conflict is approved for all audiences

This chatter about PBS agonizing over whether to self-censor a documentary series on the veritable family-friendly petting zoo that was World War II to avoid FCC indecency fines for airing explicit dialogue would be patently ridiculous–if, in light of contemporary coverage of the war in Iraq, it weren't so tragically ironic.

Iraqi civilian casualties, now that someone's actually started counting them, are clocking in at record highs of more than 100 each day. Recent events have compelled newspapers to debate the rhetorical distinctions between "escalating sectarian violence" and "civil war" on their opinion pages. The president is sending more U.S. troops into Baghdad to battle the rising unrest (anecdotally, a friend of mine in the Army recently returned for her third tour).

Yet attention spans wane and ribbon magnets fade, and now the fighting between Israel and Lebanon and whoever else is in on this latest firepower free-for-all is the Middle East crisis du jour. And as this is America, most of us probably still have more informed and passionate opinions on Mel Gibson's DUI and alleged accompanying anti-Semitic diatribe.

Indeed, with the situation in Iraq becoming for most Americans a routine recitation of knee-jerk opinions and a perfunctory trickle of casualty counts, amorphous and removed from everyday experience yet still something we know should be important, the story becomes interchangeable with other pet abstractions from gay marriages to distant wars.

The difference, as Frank Rich wrote Sunday in the New York Times, is that those others have coherent plots and recognizable casts:
The Iraqi people, whose collateral damage was so successfully hidden for so long by the Rumsfeld war plan, remain a sentimental abstraction to most Americans. Whether they are seen in agony after another Baghdad bombing or waving their inked fingers after an election or being used as props to frame Mrs. Bush during the State of the Union address, they have little more specificity than movie extras. Chalabi, Allawi, Jaafari, Maliki come and go, all graced with the same indistinguishable praise from the American president, all blurring into an endless loop of instability and crisis. We feel badly ... and change the channel.
It's no coincidence that the most astute reviews of the current political stage come courtesy of a former theatrical critic.

For when its producers continue to insist everything in Iraq is all raves and roses and has been for years now, the professional storytellers in the news media don't get much to work with, leaving any reporting on Iraq beyond the obligatory lip service to be dismissed as pessimistic, damaging, inconvenient and unproductive.

As for sources, instead of even pretending to debate possible solutions, we've got the early-bird '08 front-runners, Hillary Clinton and John McCain, bantering about how they're practically bipartisan drinking buddies, which is fabulous news if you're hoping they'll negotiate one of the presidential debates to be a beer pong battle or a keg-standing showdown at some heartland NASCAR bar, but rather frustrating for those hoping they'll actually elevate the discourse above smoke and mirrors and get something accomplished.

It seems few elites can be bothered to care about Iraq these days unless it's useful for painting a political opponent as unpatriotic or oneself or an ally as tough on terrorism, because it's gotten to the point where that's the only intelligible storyline Iraq fits into anymore.

Often, all the audience can do is fall back on that profound yet endearing human flaw of just wanting the words and pictures before them to make sense.

According to the most recent Harris Poll, the proportions of Americans who believe weapons of mass destruction were found in Iraq and that Saddam Hussein was involved in Sept. 11 have actually increased to their highest levels to date, with a solid half of the population now reporting belief in the former.

And unfortunately for everyone involved, if the distinction between truth and fiction gets much more meaningless, residing in the reality-based community may not make much of a difference.

July 26, 2006

Threatening an endangered breed

Perhaps it's officially time to start breeding children's TV personalities in morally pristine isolation labs, as a PBS kids' network has fired the host of one of its programs, Melanie Martinez, because she once participated in a project involving–oh, god, shield the children!–sexual humor.

Here's the network's brilliant rationale:
"PBS Kids Sprout has determined that the dialogue in this video is inappropriate for her role as a preschool program host and may undermine her character's credibility with our audience," said Sandy Wax, network president.

Airing for three hours each evening, "The Good Night Show" airs soothing stories and cartoons designed to get an audience of 2-to-5-year-olds ready for bed.
Yes, apparently drowsy pre-schoolers are differentiating this actress from her character, looking up her prior work, tracking down footage of it, comprehending it, getting offended, re-assessing her credibility and re-evaluating their decisions to watch her current project. Right. And Condoleezza Rice is an enchanted foreign policy fairy who can bring peace to the Middle East with a mere visitation.

Since many kids that age can't comprehend cause and effect and still think the people on TV are actually inside the box, what we have here is plainly another case of alarmist adults crying "protect the children!" as cover for their own unjustified overreaction, to the detriment of the entire enterprise.

When a culture grows so cynical that it's perfectly acceptable to all parties involved to boot an actress off a children's TV show for at one point in her career doing something not pre-emptively sanitized, said culture is in a sorry state.

For though they're not good for much else, kids (the ones who aren't drugged, at least) are starting to look like the only segment of this society allowed to retain and indulge in even a tiny bit of imagination anymore.

It's gotten to the point that whenever we "adults" see anything truly creative or imaginative, at any level, we are woefully wont to write it off as trite or offensive and project a pernicious agenda onto its producers.

One would think an audience of children would be the least likely to let an entertainer's backstage baggage taint his or her present performance, a phenomenon that has no doubt blocked many adult observers from being moved by some fine pieces of work.

Despite what many supposedly educated adults seem to think (just eavesdrop on theatre-goers' commentary after M. Night Shyamalan's latest if you're unconvinced), a piece of creative work is not a reflection or condensation of its creators' past experiences, personal philosophies or moral codes. If it were, art would be terribly, terribly boring.

Besides, people who lead G-rated, well-adjusted, consummately privileged lives rarely have the impetus or inspiration to create even marginally compelling art.

Instead of punishing them, we should be celebrating those who have the guts and the professional integrity to put things out there bearing their signatures without attaching signing statements hedging and disavowing what's contained within.

For in a country where a drunken, delusional draft-dodger can be born-again as a two-term president, should we really be demanding higher standards of our entertainers?

July 24, 2006

A "Quail Hunt" joke would just be too easy

There are some topics that I have consciously placed outside my purview because I just flat out do not understand them enough to comment on them quasi-intelligently, and have on some level made peace with the fact that, barring the sudden influx of infinite time and energy required to untangle and appreciate the breadth of their complexities, I likely never will.

Two of those–the Palestinian-Israeli conflict and the culture of video gaming–are the subjects of this rather interesting New York Times piece about how different entities are using video games to immerse ordinary people in all sides of tangible conflicts ranging from war to genocide to partisan politics.

Now, I don't doubt the epistemological value of interactive models. I remember during the 2004 presidential election when all us political geeks were positively mesmerized by the Los Angeles Times' interactive and maniacally addictive electoral college map. Not only did it serve the "bargaining" stage of the grieving process swimmingly by allowing users to test how just one click could have changed the entire outcome in multiple instances, but it also underscored the mathematical quirkiness of the entire process. (It also serves as a handy prognostication aid in charting 2008 strategy. This business about the Democrats re-arranging the primary process is intriguing, but check out the impact of finding a candidate whom southerners can be unabashedly enamored of–just go click on all the 2004 red states that went to Clinton in 1996.)

In light of that, just imagine how much time could be gleefully yet hopelessly wasted on this one:
Douglas Thomas, a professor at the Annenberg School for Communications, is developing a redistricting game in which players try to gerrymander different states. "The election system is rigged to keep incumbents in, but nobody understands it," he said. His game is intended "to show them how easy it is to game the system. You'll be able to give it to a first-grade class and let them fix Texas. Then you can say, hey, a 6-year-old can do a more fair job."
But oh, even denser pixelated delight tinged with despair awaits:
In 2003 the Howard Dean campaign hired [Ian Bogost's] company, Persuasive Games, to make a game that showed volunteers how the Iowa primary work was organized. Then the Illinois Republicans paid him to devise four games illustrating their major election planks. In one, you have to ferry sick patients through city streets to hospitals until you discover that the hospitals have become overcrowded. The only way to free more money and space is, hilariously, to enact anti-malpractice-suit legislation. In essence the game takes a cherished bit of Republican ideology and renders it into gameplay.
Does this mean next up we'll get a game in which you get to rescue innocent little stem cell-bearing embryos from petri dishes caught in the clutches of the dastardly rogue that is Michael J. Fox (you have to snatch them at just the right moment when he loosens his grip mid-Parkinson's twitch) before he can get to a scientist to harvest them, then find some unwitting woman walking down the street to impregnate with your trusty Turkey Baster of "On-the-Offensive" Pro-Life Citizenship?

Or how about "Campaign Trail," a spinoff of the beloved "Oregon Trail?" You, the budding presidential candidate fulfilling your manifest destiny, get to stock up on endorsements and contributions before you head out on a cross-country journey fraught with perilous town-hall speeches, marauding bands of drifters tossing out attack ads, choleric babies with cholera to kiss and the most feared venomous reptiles in the land, cable news pundits. Just like the pioneering version, even if in the end you reach your destination, much of your party will have been left in shallow graves, with the survivors politically gangrenous or infiltrated by parasites.

I'd even settle for a Bush White House addition to "The Sims"–trying to keep Dubya on task would be much of the battle, as he would probably keep sneaking off to catch a plane to Crawford to clear brush, ride his bike, make out with Condi in a supply closet or just watch cartoons. Karl Rove could even materialize out of a cloud of green ectoplasm at night and run around scaring staffers. And who wouldn't jump at the chance to redecorate Dick Cheney's undisclosed location?

Just imagine the possibilities for getting people interested in politics again via gaming–educating citizens on the finer points of confusing-by-design war on terrorism legalities with "Escape from Gitmo;" enjoying the First Amendment at work with "Evading the Press Corps with Tony Snow" and battling John McCain while stretching the definition of torture to its logical limit with "The Enemy Combatant Interrogation Experience." Hell, the budget might even get balanced or the health care crisis resolved or disasters prepared for in virtual trial runs instead of real ones.

Call it crass if you will, for the question remains whether video games modeled on serious, reality-based scenarios "inherently trivialize" what they portray and allow players to manipulate.

But in the case of politics, it's all a giant game, anyway–it just has history and institutions giving it legitimacy, and most of its supposed players electing to remain sidelined–and war is just a violent variant of politics. All instantiating it in a video game does is reinforce the fact that the real events themselves are often just as arbitrary. There just aren't any do-overs.

Consider this your linguistic "cease and desist"

To all you losers with your idiotic online acronyms, needless misspellings and made-up words, now you've gone and made it personal: I would appreciate it if you would all remove the phrase "OH NOES!" from your catalogues of insipid and grammatically suspect expressions of virtual incredulity. As if I didn't already have enough trouble convincing people my last name isn't pronounced like "no" or "know." You all suck.

July 21, 2006

Desecration is in the eye of the beholder

It looks as though the baby boomers, through popularizing the personalized funeral, might finally be the generation that starts treating these tired rites like the potentially stylish social events they are.

Though we're probably still a ways off from seeing "Mortuary Makeover" or "Pimp my Wake" on TLC, when you think about it, it's only natural for upscale party planners to start including requiems in their repertoires–these days, what besides a funeral can compel the average citizen to dress up?

This is also rather encouraging news for my million-dollar idea/midlife crisis entrepreneurial fantasy: the combination contemporary funeral home/cigar and martini bar.

Funerals are for the living, after all–and if given the choice, who would opt to mill about amid generic, geriatric decor dotted with tacky photo boards, mismatched memorial bouquets and multimedia slideshows set to crappy country songs when they could gather to reminisce over a fine cigar or a spot of gourmet spirits in a stylish lounge space actually crafted by someone with an eye for form and color?

The whole point of modern art and design, responding to the chaos of World War I with its Mondrian grids and Bauhaus lines, is to evoke visual order, elegant simplicity and mathematical harmony to instill a sense of rational, intellectual comfort in its experiencers: wholly forward-looking and bereft of referents to a painful past. What could be more perfect for a funeral?

Of course, it would also be perfect to moonlight as a delightfully morbid hotspot after dark.

So, if any dashing, wealthy young morticians out there want to go into business, let me know. Just be aware you'll have to look good next to me in the Vogue photoshoot titled something snappy like "Haute couture goes six feet under" or "And the corpse wore Dior" that follows when this inevitably becomes freaking huge (and what luck, they wouldn't even have to do my makeup).

For as I mentioned a while back in the now-deceased paper that used to actually pay me to ramble on for 1,500 words about such things, I suspect funeral homes are going to have to evolve, and I want to get in on the subterranean floor.

And because it's relevant, because I'm lazy and because I recently cleaned out my clip file and decided this deserved a resurrection, here's said article, "The death factor," published in Coreweekly Dec. 1, 2005:


Death does strange things to us. Turn on the local news on any given day and you'll find at least one inconsiderate prick who has suddenly, by simple virtue of expiring, transformed into a "really great guy who was always smiling and always helped anyone in need."

We also do strange things to death. Our departed public figures lie in state before lined-up tourists and television cameras, yet photos of our military dead, already sealed in flag-draped coffins, are officially censored and deemed tasteless by many viewers.

American attitudes toward death are as varied as American attitudes toward sex, politics or religion–a key difference being that we actually talk about those other issues, not just hear about them.

Death may be the closest thing to a practical, cultural reality we all share: It frames our news, it scripts our entertainment, it defines our values and it charts the courses of our lives.

Like anything else, death also has its trends. Life expectancy has continually climbed over the years, with the latest Centers for Disease Control figure projecting the average American can expect to live 77.6 years. Death rates are also declining, particularly relative to increasing birth rates–for every American who dies each year, 1.6 are born.

This impacts not only how we define dying and prepare for it, but how accessible we have to make it in our more immediate spheres. Modern medicine allows us to stave off death, but it also draws it out, protracting it into a kind of overlooked gradualism, hidden from view in hospitals and hospices at its end.

At the same time, popular culture is alive with death of every twisted, vivid persuasion. On television, "CSI" makes it a solvable mystery, "Crossing Over" makes it temporary and "Six Feet Under" made it sexy. Death-evocative imagery abounds in advertising, from the humorous to the high fashion. We resurrect dead celebrities to endorse contemporary products, or cast live ones into glib death pools. We strive to simulate death as realistically as possible on film and in binary, even making it interactive.

We create our virtual death, we choose to experience it–and thereby we feel like we can command it. But we don't want the everyday, cold death we cannot ensnare hanging around reminding us of our own fragility. When we see it, we're shocked and repulsed, and we ask whoever is showing it to us to stop, out of respect or good taste. Sometimes. Like so many things in our culture, it's complicated.

One facet of this country's markedly ambivalent aversion to corpses is the rise of cremation relative to traditional burial. The Cremation Association of North America recently released a survey that revealed nearly half of all Americans plan to choose cremation to dispose of their mortal remains. In Wisconsin, roughly a third of those who die are cremated, a hair above the national average.

Cremation's flaring popularity inspires all sorts of meditations on its meaning, from returning ash to ash to re-igniting the symbolic immortality of the spirit. Somehow, it just seems a less morbid, less unsettling alternative–one that many people would be drawn to for reasons grounded in everything from existentialism to environmentalism.

Unfortunately, we may give our fellow mortals a bit too much credit: The most-cited reason for choosing cremation, according to the Cremation Association, is actually saving money. A view of cremation as a simpler, less emotional and more convenient alternative to casket burial comes in a distant second, drawing less than half the votes won by the lower price tag that subtracting the coffin affords.

No discussion of American attitudes on any subject would suffice without a nod to capitalism and the mighty bottom line. Though death is commonly referred to as the grand leveler, studies have repeatedly shown life's material and social inequities transpose onto death. Those on the lower rungs of a society die more frequently, die earlier and are laid to rest under different circumstances.

In the death industry, billions of dollars pump through the funeral business alone each year. According to the National Funeral Directors Association, traditional burial services, not counting cemetery costs, now average in excess of $6,500.

Dying has become a major, planned-for investment, joining ranks with buying a car or a spot in a retirement condo, and is increasingly preceded by years of costly medical treatments and "end-of-life" care.

Accordingly, trust has remained vital to the funeral industry. The average funeral home in America has been in business for more than 60 years, and the vast majority are owned by individuals, families or small, private corporations. Unless Wal-Mart starts offering one-hour embalming and volume-discounted cremations, it's safe to say Americans prefer to trust their treasured carrion to 'Mom and Pop.'

Americans also cling to the funeral tradition itself, even as casket burial declines in popularity. Nine out of 10 people who want to be cremated also want some type of service held in their remembrance, meaning funeral homes' customer base is not likely to desert them any time soon.

The funeral home also remains consistent, a virtual time capsule of interior design. Meant to evoke the familiar comfort of Grandma's house, in actuality it exudes an unnerving, distinctive ambiance far more laden than the sum of its paisley fabrics, muted chandeliers and Thomas Kinkade prints. Funeral homes aim to provide spaces as utterly non-threatening and unobtrusive as possible to mourners. Whether they will one day have to "modernize" in look and feel to accomplish that end remains to be seen.

The old axiom that funerals are for the living still seems to hold–but funerals are increasingly being directed by the dead. Pre-planning funerals has become standard operating procedure for many older Americans (some of whom actually seem to enjoy it), both to alleviate burdens on loved ones and exercise post-mortem control over the circumstances and the economic aftermath of death.

Seeking figurative immortality by leaving some kind of tangible legacy or influence on mortal life has been a dominant theme since the beginnings of Western culture, instantiated in the modern world in everything from organ donation to ghost lore to compulsively journaling or scrapbooking one's memoirs.

Others are making more innovative, some might say more unnerving attempts at immortality. There are actually Web sites out there where you can store messages and multimedia content to be e-mailed to your surviving loved ones upon your death. One company, LifeGem, makes synthetic diamonds out of cremains. Another markets "memory medallions," which are computer chips embedded into ordinary cemetery tombstones that transfer digital data about the deceased to a visitor's PDA or laptop.

The flourishing of death in contemporary life raises questions over whether death and grief have become more public as a result. A notable cultural moment in this regard occurred in March with the Terri Schiavo case, which, during its 15 minutes in the 24-hour news spotlight, thrust end-of-life issues into public discussion.

The Schiavo case demonstrated that death remains intensely personal and intensely unpleasant. Public opinion polls revealed government intervention in Schiavo's death to be among the George W. Bush administration's most substantial public relations misfires, with most Americans viewing it not as a noble attempt to save a life, but as an insulting, meddling and moralizing political stunt.

But following a flurry of news reports on the "right to die" debate and a reported rush to write living wills, death once again dropped out of the national consciousness. It has not, however, gone to sleep.

In October, the Supreme Court took up the Bush administration's challenge to the state of Oregon's "Death With Dignity Act," which allowed doctors to prescribe lethal doses of drugs to competent, terminally ill patients who wished to end their lives. The act was approved twice by state voters and more than 200 residents ended their lives under the act before it was challenged in 2001 by then Attorney General John Ashcroft.

The Court's ruling, expected sometime before June 2006, will have serious implications for how all Americans are allowed to live and die–yet the case barely made news.

The case is also John Roberts' first as the Supreme Court's new chief justice, but during his confirmation process, attention remained focused largely on how he might steer the Court regarding abortion rights. This other life-and-death issue, with the potential for wider reach into the lives of ordinary people, did not even enter into the discourse.

So continues our culture's awkward dance with death, drawing it near yet holding it at arm's length. As we gain increasing control over all arenas of our lives, we naturally seek to have that control over our deaths.

More than ever, death is viewed almost as an inconvenience, disrupting our schedules and disrespecting our feelings for a while, but ultimately receding into the cultural background behind more pressing matters.

It's not economical to think too seriously about death in our day-to-day lives, as any conclusions the living reach are by nature unverifiable.

Of course, that doesn't stop anyone from attempting to give death reason, order and purpose. In the process, we as a culture reveal what we think about life: Whether too short or too hard, it's at least better than nothing.

July 20, 2006

Actions, words and their relative volumes

In ever more enthralling "American Values Agenda" news, the House of Representatives voted Wednesday by a disturbingly large margin of 260-167 in favor of a bill that would bar federal courts from ruling on cases challenging the constitutionality of the Pledge of Allegiance for its inclusion of "under God." The measure next travels to the Senate, which for the moment seems pleasantly saner and more inclined to let it languish.

I bring this up not because of any need to revisit what a load of crap the "we need to protect this sacred expression of our nation's founding Christian heritage from the evil activist judges" argument is (but if you want to revisit it, I'm happy to oblige, twice), but because of this novel and nonsensical argument to come out of the current "debate:"
Rep. Todd Akin, R-Mo., who sponsored the measure, said that denying a child the right to recite the pledge was a form of censorship. "We believe that there is a God who gives basic rights to all people and it is the job of the government to protect those rights."
Yes, because I'm sure scores of kids with patriotic Tourette's are just clamoring to spontaneously recite the Pledge during school hours and are being unjustly silenced by those godless liberal teachers, and Congress needs to immediately ride to their rescue. Next they'll be claiming reciting the Pledge improves test scores.

If this measure goes anywhere, then beyond raising some serious First Amendment issues, the precedent such issue-based cherry-picking would set would render any notion of an independent federal judiciary and a legally sound Congress a complete joke (and knowing this lot, lawmakers would probably bar courts from declaring their own illegal marginalization illegal as well) and leave citizens no recourse beyond state courts on pressing issues of the legislative day, which I suppose is precisely the point.

If current leadership cared half as much about actually upholding the substance of this government's founding principles as they do about offensively defending feel-good, flag-waving rhetoric by eroding the very checks and balances enshrined to uphold pluralistic, democratic freedoms, they just might accidentally do something productive.


And on that note, in honor of President Bush's very first, glimmeringly pro-life veto, which stuck it to scientists, Michael J. Fox and that pesky thing called public opinion in one fell swoop by striking down a bill expanding federal funding for stem cell research, enjoy this favorite Doonesbury strip that, despite being published in April, still sums up the situation quite nicely.

July 19, 2006

Do you want searing existential guilt with that?

Every now and then, there comes an article like this one that activates within me a latent paranoia that one day I'm going to wake up and find all the public sector and customer service employees around me have turned into petulant armchair ethicists.

For instance, what if my cashier at Panera turns out to be a militant vegan and PETA activist who assumes it a moral duty to lecture me on the cruel origins of my Bacon Turkey Bravo or, god forbid, deny it to me, and with it one of my few consistent sources of joy in the world? By what sick and twisted ethos is that cool?

We've all heard about religious kooks withholding medication or emergency care from sinners on ethical grounds (and sorry, deitistical determinists, but there is a "choice," and it's called taking another job), but is what they're doing any more justified because it has the Bible behind it?

Take this fine homegrown "professional:"
Ultrasound technician Donald Grant of New Richmond, Wis., was fired by a Minneapolis clinic in 2002 after he prayed with a patient to try to persuade her not to get an abortion.

"I'm not a rabid pro-lifer, but I know what I believe," said Grant, also a pastor at a small Pentecostal church. "I was not condemning in any way. But I had no choice but to speak my conscience."
Oh, really? Then do I have no choice but to speak my aesthetic conscience and tell a fellow shopper she's several sizes too big to wear a particular style flatteringly and then suggest something else? Unfortunately not, because not only do I want to avoid physical assault, but it's not my place to offer my opinion.

Add unequal power relations to the mix, in many cases without any expert or evidentiary backing, and speaking one's conscience starts to look a lot like unjust usurpation of agency.

Operating in black and white ethics might make you a fine activist or ideologue, but it makes you a pretty piss-poor public servant.

If you sign up for a job that strives for the fabled "objectivity," you have to make internal peace with the fact that you're going to have to deal with people whose choices, ideas and lifestyles you do not personally agree with, and treat them as though they are just as valuable and worthy as anyone else.

As a journalist, quite often you're asked to write about and give exposure to people, groups and ideas you don't personally support. But guess what? You're doing a job and performing a role within a pluralistic society, in which other people exist. And like it or not (it's the price you pay to live in a relatively free country), they matter, too.

Now, you can always argue your case, refuse the assignment, brand yourself an insolent, biased little prima donna and feel all good about yourself, but odds are the editor is just going to pass it along to someone else and it's going to get printed anyway, with any insight or good you could have passed along by performing your job with your skills and your unique perspective from then on being relegated to fluff pieces on the weather and small-town crustacean festivals. Who possibly benefits? Not you and your professional integrity, and certainly not those who are exposed to and affected by your work.

When you sign up for the intermediary position–between doctor and patient, between editor and reader–the reality is that you usually don't get to make the grand ethical decisions. Therefore, you have to pick your battles wisely.

I'm all for standing up for your beliefs, but only when it's done pragmatically. For instance, I will never understand why people who are morally opposed to birth control or certain types of "families" choose to go into medical fields like obstetrics and gynecology, in which they must realize a significant chunk of their practice will fall under the rubric of the ethically untouchable.

That would be like me joining a convent and then refusing to say the prayers, participate in the rituals or wear the symbols because they go against my beliefs.

Perhaps they think they can infiltrate the infidels' den and convert them all from within, who knows. But there are far more effective and positive ways to fight for your cause if you feel that's your calling; and many, many professional positions you can take that allow you to exercise your talents without requiring you to exorcise your ethics.

The EMT in the Post article's lead who refused to transport the abortion patient seems sympathetic and on some level morally admirable, if only for the strength of her conviction–until you pause to reflect that she's probably unknowingly transported everything from rapists to neo-Nazis to TV weathermen, and thereby aided, abetted and tacitly approved of them all in their evildoing by prolonging their healthy animation.

When you work in a public service field like medicine and deal daily with ethical dilemmas over which intelligent people can't move beyond impasse, where do you draw the line, and just what gives you the omnipotence to decide precisely which members of the public deserve your services?

The double standard here is also glaring. Just imagine if some pharmacist tried to deny a prescription to a hyperactive kid with a cold on grounds that he judged it a frivolous, potentially harmful act of over-medication, or if a doctor, feeling it unethical to risk a pregnancy, counseled a woman in favor of abortion because it turned out she was predisposed to some horrible genetic disease. Could a health-conscious cashier at an ice cream shop deny a cone to an obese customer without courting a lawsuit? Could an environmentally minded gas station attendant refuse to fill up a Hummer without being pilloried?

And remember, birth control and abortion are still on the legal side of the glorious if battered membrane separating church and state. What if some store clerk refused to sell someone a gun based on personal moral opposition? The NRA lobby would be on it in an instant, shouting fundamental rights foul, and any such "moral clauses" governing the dispensation of firearms would be stricken in a snap.

Listening to someone, treating someone or otherwise serving someone does not mean you're thereby supporting them and everything they stand for. There's an old saying about the cream always rising to the top–ideally, that's true for the good, the right, the just and the beautiful as well, eventually.

There's also an old saying about letting people hang themselves.

I like that one. Any philosophy echoed in a Guns N' Roses song can't be wrong.

July 16, 2006

Curing the common child

Remember the days when kids were just weird, perhaps a tad hyperactive, or–imagine, for the sake of nostalgia–just being kids? Now, it seems, the typical little Johnny is mentally ill and in need of piles of pills for his "grab bag of mood disorders."

As much as I hate to side with the Scientologists on anything, this society's escalating obsession with mind-altering and a la carte prescription drugs is downright disturbing.

Is there no longer anything to be said for a culture in which being unhealthy is still taboo, personal and discreet, instead of something for even its youngest members to wear like a badge of (post-?) post-modern, world-weary communal hipness?

Granted, of course some kids are truly functionally impaired or pose threats to themselves or others–but it's tough to believe prescription trend data suggesting that in just ten years the number of genuinely psychotic children has quintupled.

Yet the pharmaceutical free-for-all is most definitely on. For instance, according to the first Times article linked to above, summer camps, now facing a "proliferation of children on stimulants for attention deficit disorder, antidepressants or antipsychotic drugs – or on cocktails of all three," plus prescription allergy medicines whose seasons sometimes overlap, are having to set up systems for dispensing drugs, sometimes turning to businesses that have sprung up to fill this privileged niche.

Some dispense campers' capsules at meal times, making it as much a part of the regular routine as macaroni art and "Kum-Ba-Ya:"
Many parents welcome the anonymity that comes when a lot of children take this, that or the other drug, so none stand out from the crowd.

"It's nobody's business who's taking what," said one parent of an Echo camper whose child is medicated for A.D.D. and who asked not to be named for privacy reasons. "It could be an allergy pill. The way they do it now, he feels comfortable. He just goes up with everybody else, gets it and then carries on with his day."
For when everyone's getting drugged, no one's precious self-esteem is threatened–just in case it and all other emotions haven't already been smothered in a pharmacological haze.
Other camps prefer the infirmary, to provide more privacy. Camp Pontiac in Copake, N.Y., built a special medication wing with its own entrance and a porch where campers wait their turn.
Does this remind anyone else of the second "Addams Family" movie, in which Wednesday and Pugsley got sent to camp and tossed in the "Harmony Hut" until they emerged with their negative attitudes inverted by sappy musicals and inspirational posters? How creepily Orwellian.

And I understand there are regulations and all, but if you're going to drug the little monkeys, shouldn't you also be teaching them to be responsible for their own, so allegedly vital medications? When I was younger, for several years I took a daily pill to prevent migraines. I didn't list it when I went on school trips because I didn't want to be branded as defective, nor did I fancy some scattered and inexpert chaperone playing gatekeeper where debilitating headaches were concerned. And, you know, I could be trusted not to suddenly swallow the whole bottle like they were delicious, if muted, Altoids.

Part of what's so irritating about this trend is that it seems to stem not so much from evidence-backed concerns over health and well-being–I don't know about contemporary campers, but most of my cohorts and I managed to survive playing outside as children without an arsenal of antihistamines to defend against every other particle in the air–as from a desire for complete control and institutional reinforcement that everyone's choices are correct.

Take the recent uproar over the government-funded ad campaign touting breast-feeding by implying alternatives are risky and detrimental to babies' health. Its "negative framing" is making its point, but upsetting mothers who formula-feed. Apparently, even if the science says breast-feeding is healthier, to shield the self-esteem of all the validation-seeking mothers who opted for different methods, officials are supposed to hand everyone a figurative cookie and report in the same breath that not breast-feeding is just as good.

Though medicine has long been deemed both an art and a science, the classical humanities hold little sway relative to the vocal contemporary culture of entitlement and control freakdom when it comes to determining when and how to interfere with the inner machinery of body and mind.

And though scenes change and tones shift, classics tend to stay classic for a reason. And until someone invents a drug to combat "reckless procreation" or silence its wailing spawn, I'm holding my applause.

Making the world safe for democracy, again

Upon reading this opinion piece in the New York Times, I got to thinking–how much sense does spreading democracy really make as a means of fighting terrorism from rogue states or fanatical free agents?

(Not to mention the home-grown nuts it inspires to send faux anthrax to newspapers exercising their democratic rights.)

If you're a terrorist and you're angry at a democratic country's leadership, striking its civilians is perfectly legitimate. For as democratic rulers are so fond of proclaiming, probability holds that the majority of citizens you hit will have voted for that leadership. (Remember those elementary school math problems with the bags of red and blue marbles? How's that for applied arithmetic.)

And for the slow members of the jihadist class, America has provided a simple bicolor map to boot. And mock the vintage John Kerry stickers clinging tenaciously to their smattering of Subaru bumpers all you like, Bush/Cheney '04 gloaters, but perhaps dwindling fuel economy shouldn't be your only international petrol-politics worry.

Now, nobody is going to tell you attacking citizens in place of their government is a morally praiseworthy thing to do (or as productive or entertaining as, say, channeling one's anti-American rage into making subversive YouTube videos). But though Dear Abby has yet to rule on the issue, it seems well within the realm of proper militant etiquette for the post-Sept. 11 world our leaders love to remind us we're living in every time they're caught seeking to restrict rights and freedoms.

Still, the fact remains that amid all this concern about combating stateless terrorists and rewriting the old "laws of war," a little-mentioned side effect of spreading democracy is that more people are brought to share in the violence.

From a purely tactical, preventive standpoint not colored by practical, economic or ideological concerns, we should be installing systems with leadership as concentrated as possible, at least in figurehead form. That way, the boys with the guns can play their games with a series of jolly little coups and counter-coups for the official leadership posts while leaving life for the rest of us in relative peace, security and, you know, aliveness.

Just a thought. And as George Carlin might say, one that demonstrates succinctly why I'm spending my Saturday night alone.

But at least now we know why all those Mule Day Parades and lame community attractions might have been on last week's much-mocked state terrorism target list: high concentrations of red marbles.

July 14, 2006

Week in review (with presidential squirrels)

Damn, I go away for a week to bask in the corruption and deceit permeating our nation's capital and my soon-to-be residence, and what happens? I return to find Time magazine declaring "the end of cowboy diplomacy" while the administration flip-flops and tries to claim the penned-in evildoers at Gitmo have basic human rights under the Geneva Conventions that have been honored all along (by appropriately narrow definitions it's now trying to delineate, of course) and turns all namby-pamby, sharing-is-caring on us by claiming it will allow "limited judicial review" of its eavesdropping programs. Weak.

Also in recent days came the shocking revelations that boys are dumb and Homeland Security is a bad, bad joke. Plus, in the spy scandal that just won't die, the merry band of leakers composed of Dick, Karl and Scooter is now getting sued by outed CIA operative Valerie Plame Wilson.

And President Bush, ever the poised ambassador to foreign lands, seemingly neglected to pack his Ritalin and got far, far too excited over devouring a roasted pig carcass on a diplomatic visit to Germany.

In other livestock news, the biggest story on the local front seems to have been a few heinous acts of cow-tipping targeting the tacky yet happily temporary "cow parade" sculpture installation grazing about downtown.

And on the unnecessary neuphemism front, I've discovered much to my chagrin that I'm not merely a "singleton," but I'm also "pre-pregnant." Lovely.

Now that we're all caught up on the news, don your unimaginative neckties and star-spangled visors, fellow patriots, because it's time for a frivolous Washington DC mini photo safari!

After walking past the Justice Department daydreaming that maybe, just maybe Ashcroft had once done likewise, scaling the steps of the Lincoln Memorial in stilettos and dropping by the well-manicured abode of my bestest pal in the whole wide world, I spotted some fluffy little squirrelly squirrel squirrels scampering about the White House lawn out front, quite plainly plotting a spot of markedly unpatriotic villainy:



A pretentious, artsy shot of a bloom on one of the city's many magnolia trees, this one near the Eisenhower office building:




The sun sets on the grand symbol of oversized phallic power that is the Washington Monument (Later, a full moon rose dramatically adjacent to it, and Dick Cheney turned into a bat and circled its peak gorging on nocturnal insects and hissing at onlookers.):




And unfortunately I couldn't get a picture without causing a fiery pile-up, but I laughed harder than I've had reason to in months when I saw a road sign for the CIA headquarters, symbolically named the "George Bush Center for Intelligence," in honor of H.W. but no less amusing.

July 05, 2006

Don't do anything crazy

And that goes for readers and leaders, for Freak Typography is going on a brief hiatus. I'm going to be busy engaging in an extended binge of drunken debauchery in honor of President Bush's 60th birthday. Not really. But I can dream.

July 03, 2006

Malapropisms are like candy to terrorists

It pains me to say it, but I have something in common with our executive branch in that I'm very, very disappointed in the New York Times for its careless printing of material that threatens some of our most fundamental societal securities.

Not for disclosing "secret" international bank-monitoring programs the officials themselves held press conferences touting earlier in the "war on terror," but for this sentence, which appeared near the end of one of the most-emailed articles over the weekend, a piece on an airfare price-predicting Web site:
If Farecast tells you that fares are going to go down, it would be smart to check everyday until Farecast changes its advice to buy.
If you see nothing wrong with that sentence, look again. If you still see nothing wrong, you are a horrible person who is perpetuating wanton dictional assault, for which I hope you are thoroughly ashamed of yourself.

You see, anyone who's dabbled in copy editing has his or her grammatical and stylistic pet peeves (like how the English language lacks a gender-neutral singular pronoun, for instance)–those small yet grating errors that abound in popular prose that no one else seems to realize are, in fact, errors.

Mine is the use of "everyday" where "every day" is correct (and the rare and elusive but doubly god-awful vice versa).

To clear up this madness once and for all: "Everyday" is an adjective. You know, those words that precede and describe other words, like "crunchy" or "loquacious." It means common, ordinary, usual, run-of-the-mill. "Every day" means, literally, each day.

In sum, if the New York Times were to print such error-riddled drivel every day, the English language would surely suffer as a respected and widely disseminated news source compounds the everyday scourge of poor grammar.

See, it ain't that hard, kiddies. Next time, somebody gets flogged–because freedom dies, just a little, each time you unnaturally merge your syntactical units.

Trimming the rhetorical fat

It appears as though the CDC is considering tossing its "fuzzy language" labeling overweight and obese children as "at risk of overweight" and "overweight," respectively, and applying the traditional adult classifications to the nation's not-so-little ones.

While the child-specific terms had been used to shield tender if tubby ears from negativity, doctors fear the practice has served to downplay a serious health threat to millions of kids for the sake of sparing feelings.

For of course, science must ever be weighed against the all-important self-esteem. And for those who think the "overweight/obese" distinctions are indeed too insensitive, buck up–it could be worse...
Top 10 rejected CDC classifications for obese children:

10. paunchy young patriots

9. gross

8. exceedingly well-fed citizen youth of the greatest, most prosperous nation on Earth

7. fat sack-of-craplets

6. distended with freedom

5. Karl Rove's extra plump, extra special main course

4. real kids

3. Cheneys-in-training

2. husky moppets

1. unless they shape up, no good to the armed forces