Thursday, January 19, 2012

"Cuffs With Pleats? Really?"

You would think that having a tailor replace some blazer buttons and hem a pair of trousers wouldn't require arguing with the tailor, but you'd be wrong.

As part of a recent and long overdue clothes buying spree, I bought a new blue blazer and a pair of gray flannel trousers. I have a set of blazer buttons I wanted taken off my old blazer and put on the new one. But the new blazer has three buttons on each sleeve and I have only two buttons per sleeve. I tell the guy where I want the two buttons sewn on each sleeve and he starts arguing with me, says it won't look right (never mind that's how they were on the old jacket!) and I pretty much have to promise not to tell anyone I meet who sewed them on the way I want them lest his reputation as the finest tailor along this dilapidated strip mall in a seedy part of town suffer.

Then I have to tell him that, yes, I do want the original blazer buttons back, as the day will come when I retire this blazer, too, and he finds the whole idea extremely odd, as though I'd asked for the shreds of thread they'd been sewn on with back as well.

On to the slacks. They're a 36" waist which is a bit tight on me but a 38" is too loose. Maybe I should have ordered them in a 38" to be taken in but I'd prefer to think I'll be losing weight this year, not gaining any more, so slight snugness is part of my dieting strategy. He seems dubious about this strategy and on this point I can't say I blame him.

So I'm standing on the platform (why do tailors need a platform barely four inches higher than the floor?) and without asking he starts to mark them with no break and no cuffs. I tell him I want a slight break and 1 1/4" cuffs as God intended man to wear dress slacks. He says "But the pants don't have pleats." Relieved, since I didn't order pleated slacks, I agree with him but reiterate that I want cuffs. Apparently, that's not the current fashion. So again I assume the risk of being unfashionable, reassure him that I'll never ever mention the name of Hana Tailor on Garland Avenue to a living soul and we're done.

I tell you, buying off-the-rack clothes can be sheer hell!

Monday, January 16, 2012

Twitter!

I won't go so far as to say I've abandoned this blog entirely, but a cursory perusal of post dates should suffice to demonstrate that it's no longer the passion it once was.

Having participated as a blog commenter and as a member of several group blogs, most of which ultimately ended because of technical catastrophes (but one of which, still flourishing, I was summarily fired from before I managed to make more than a couple of posts), I also tried the solo blog route for a while. Some people can maintain a blog and not care how often or, from what I can gather, what they post. I found it was either too much work to make it the way I wanted it to be or not worth bothering with at all if I was only going to post infrequently.

Still, the damned thing remains up and I'm not entirely ashamed of everything I've written here, so I've linked it to my new Twitter account should anyone care to read some of my longer-than-140-character scribblings. Should you have found your way here recently, be welcome, enjoy poking around but don't expect much new content.

As for the Twitter account, well, we'll see. My objective there for the time being is to post the occasional aphorism of the sort Oscar Wilde or Dorothy Parker might have written if they were, you know, alive.

And talentless.

Sunday, December 18, 2011

Next Up, Fearless Leader?

And just in time for Christmas, too! Dictator Kim Jong "I'm So Ronery" Il is, if there's a just God, already roasting in Hell, dead at 69 and not one moment too soon.

The Associated Press reports:

Kim Jong Il inherited power after his father, revered North Korean founder Kim Il Sung, died in 1994. He had been groomed for 20 years to lead the communist nation founded by his guerrilla fighter-turned-politician father and built according to the principle of "juche," or self-reliance.

In September 2010, Kim Jong Il unveiled his third son, the twenty-something Kim Jong Un, as his successor, putting him in high-ranking posts.

From the "Great Leader" to the "Dear Leader" to... well, you get the joke.

Friday, December 2, 2011

How Many Bloggers to Change a Light Bulb?

My daily driver is a 2005 Honda Accord. The driver's side low beam headlight was out. I checked the fuse, no such luck. Went to the auto parts store and bought a new bulb. Came home, pulled out the owner's manual and found the section about changing said bulb. Turns out you must turn the front wheels as far left as possible so that you can (1) remove two plastic rivets that hold the fender's plastic inner lining, (2) pull the lining out but not entirely off (because it doesn't come off entirely and if it did you would almost certainly never be able to get it back on), just open enough to locate the bulb socket, (3) reach up some ten to twelve inches with your forearm, scraping same to the point of drawing blood along various rough-edged steel obstacles between you and said socket, (4) turn said socket 1/4 turn counterclockwise, (5) replace bulb, being careful not to touch the glass of the new bulb lest it burn out prematurely, (6) reinsert socket, turning 1/4 turn clockwise to secure it, (7) remove arm, further scratching same, tuck plastic lining back and secure with plastic rivets.

This procedure worked splendidly, especially the part that wasn't in the manual about scratching your arm all to hell. Except for the part about getting the socket out to replace the bulb. No such luck again. No matter how hard I tried, I couldn't turn it one iota, let alone the multiple iotas necessary to turn it to nine o'clock. Nor, of course, were the plastic rivets of any actual use when I finally gave up in frustration and tried to secure the inner liner as instructed.

I wondered if my fellow Accord owners had experienced similar frustrations and so I did a bit of internet surfing until I came to a forum where an alternative method of removing the bulb was described. Unfortunately, this method required removing the battery first so you could reach the bulb socket, this time reaching down from the open hood instead of up from the tire well. Removing the battery results in three bits of consequential annoyance. First, the four digit navigation system code must be entered after you've put the battery back in. Second, you must enter the radio anti-theft code, a different four digit code even though the audio system and navigation system are essentially part of the same control panel. Third, you must reset your AM, FM and XM radio channels because holding such information in a flash memory chip or some such would have cost Honda at least a quarter or two more to build the car.

These codes, by the way, are written on one piece of paper that came with the car. Fortunately, I still have that piece of paper in my files. Knowing what the codes are, however, does not suffice because the method of entering the navigation code is different from the method of entering the radio code. Instructions for each are in the owner's manual.

Well, I figured, that's a bit of hassle, but better than taking the car to those blood-sucking, thieving bastards otherwise known as the nearest Honda dealership to have a light bulb changed. So I take the battery out and, sure enough, although reaching the bulb socket is still awkward, it is comparatively easier. The only problem is that the socket will still not turn. I could not turn it by hand and I could not turn it by wrench, at least with as much torque as I thought I could get away with before the plastic socket -- had I mentioned it was plastic? -- might well just break off. So I put everything back together, closed the hood, started the car and spent however long it took to enter the navigation system code, the radio code and reset the six AM stations, the twelve FM stations and the twelve XM stations.

I surrendered to the bitter recognition that I was not going to be able to change a light bulb by myself and that I would have to take the car to a shop presumably equipped with a hydraulic lift and whatever other tools and such the job apparently required. (Not, I should add, to the blood-sucking, thieving bastard Honda dealership but to a local mechanic we've had reasonably good service from.) Which I did yesterday evening. And, lo and behold, the shop called late this morning with the 'good news' that the light bulb had been replaced. Total cost: sixty dollars.

I go to pay the bill and pick up the car and as I'm leaving the shop the mechanic says, "Oh, by the way, we had to take the battery out."

Wednesday, May 11, 2011

John Hannah (JsubD), RIP

John D. Hannah, known in various online libertarian circles as JsubD, died several days ago. He was fifty-five years old. A retired Navy Master Chief Petty Officer and a widower who years later still mourned the death of his wife, John lived in the Detroit area and was reportedly homeless and living in shelters the last few months of his life. He would, according to one shelter operator, walk over to Wayne State University library every day, presumably to use their computers.

Detroit Free Press columnist Mitch Albom wrote of John's death here. Ironically, had he not decided to write about John and "the death of being forgotten," those of us who knew John as well as one can know online friends, who knew of his terminal illness, and who were distraught by his failure over the last several months to tell us of his status might never have learned of his death. One member of the discussion forum we all frequented, a woman known among her friends for her relentless tenacity and in general for her feral genius, found the Albom column. From there we had only to perform the sad task of correlating what Albom wrote with what we knew of JsubD to confirm it was one and the same person.

John may have died homeless or, in any case, having chosen to live at the shelter for several months prior to his death. He was, however, a highly intelligent, knowledgeable and articulate man who was also clearly aware of his situation and the options available to him when we heard from him last on that discussion forum. I have every reason to believe, in other words, that the circumstances of his final months were significantly of his own choosing.

Far from forgotten, John had simply become temporarily disconnected from his family and his many online friends, one of the inevitable risks of exclusively internet friendships. I counted him as a friend, I mourn his passing and I will remember him.

Thursday, December 31, 2009

The Decade In Review

2000 (UN International Year for Declaring Peace Is Better Than War) – Stunned by the near collapse of civilization caused by the Y2K disaster, the dot.com bubble bursts, as millions of investors lose billions of dollars in what turn out to be essentially worthless assets. Fortunately, however, this sobering experience teaches both personal investors and the financial community a lasting lesson in the constant need for due diligence and sound investment in real and lasting value, and the financial markets have grown steadily ever since, except for a minor downturn in 2007. Hillary Rodham Clinton furthers the cause of feminism by being elected a senator from New York on the strength of her personal qualifications alone. The Jamaican luge team mistakenly arrives for the Summer Olympics.

2001 (UN International Year of Kubrick Films) – Outraged at the discovery of a towering obsidian slab appearing in the midst of one of their millions of holy sites, a handful of Middle Eastern religious fanatics with financial and ideological ties to Saudi Arabia commandeer several commercial airplanes and crash them into the Pentagon and New York's World Trade Center, killing thousands. An even more outraged America supports President George W. Bush's decision to retaliate by laying plans to invade several neighboring Middle Eastern countries lacking the Bush family's close personal ties to the Saudi royal family. Bush also explains to the American people that the terrorists “hate us for our freedoms” and that the only way to stop future acts of terrorism is for the U.S. government to take away as many of those freedoms as possible. Following emergency legislation by Congress, American pundits are required by law to write at least one version of the “bombing Afghanistan into the stone age would be an upgrade” joke as a prelude to the invasion. The U.S. then invades Afghanistan, a nation that had fought off the Soviet Union for a decade, and defeats it in a war lasting approximately thirty-seven minutes. The Taliban is destroyed and never heard from again.

2002 (UN International Year of Ecotourists Wearing Crocs) – Twelve member states of the European Union convert their national currencies to the Euro, replacing portraits of national rulers and nationalist symbols with astrological symbols and a picture of Orangina on one side and a portrait of Bono on the other. Tourism booms as Americans discover they now need to convert only one foreign currency into “real money.” Except, of course, in England which retains the Cadbury bar as its official unit of currency. The No Child Left Behind Act is signed into law by President Bush. Parochial schools briefly consider but decide against a similar policy entitled No Child's Behind Left. Citing the lack of preparedness of most high school graduates, the School of Hard Knocks founds the Light Slaps Preparatory Academy. John Allen Muhammad (né Williams), aka The Beltway Sniper, stages a seemingly random series of attacks in the Washington D.C. area, eventually killing at least ten people. Upon his capture, white people are relieved to discover that finally, at long last, a serial killer turns out not to be one of them.

2003 (UN International Year of Fresh Water Fishing) – Hu Jintao becomes president of the People's Republic of China, running on the campaign slogan “Let a million factories supplying Wal-Mart bloom.” Secretary of State Colin Powell explains to the United Nations that the U.S. has irrefutable proof that former President George H.W. Bush mislaid his Skull & Bones key ring somewhere in Iraq and urges the UN to authorize his son to “send in a few troops to look for it.” As it turns out, the key ring turns up several years later in a desk drawer in the Oval Office inside a box of stale cigars with a cryptic Post-it note reading “To the Big Guy, Love, Monica.” While they're there, though, U.S. troops stumble onto Osama bin Laden in a hidden bunker playing pinochle with his best friend, Saddam Hussein, and a vast store of weapons of mass destruction including the collected works of Pauly Shore. Despite Hussein's former close personal ties to the Bush family, he meets an untimely, in the sense of overdue, death in a failed bungee jump attempt. After a stern talking-to, Bin Laden admits that “this whole terrorist business has gotten out of hand” and retires to one of his family's summer caves on the Pakistani border. Al-Qaeda is disbanded and never heard from again. Gasoline prices plummet after Iraq enthusiastically embraces democracy and have hovered around 25 cents a gallon ever since.

2004 (UN International Year of Rice-A-Roni, “The San Francisco Treat!”) – Ronald Reagan dies. As the six-day-long state funeral generates the highest Nielsen ratings of his career, Congress narrowly fails to pass legislation renaming America the United States of Reagan. George W. Bush is re-elected or, if you prefer, finally elected president with an overwhelming mandate of 50.7% of the popular vote. 230,000 people are killed when a tsunami hits Indonesia, Sri Lanka and other states in the Indian Ocean. Fortunately, most of the survivors have federal flood insurance policies and quickly rebuild. The Vatican gains full UN membership except voting rights and immediately introduces a resolution condemning Israel. The Montreal Expos move to Washington, D.C., restoring the city's proud tradition of a perennial cellar-dwelling Major League Baseball team.

2005 (End of the UN International Decade of the World's Indigenous People, “You're on your own now, natives!”) – The Kyoto Protocol to combat global warming is established. Eventually, 187 nations ratify the agreement, none of which then proceed to comply with its terms. Hurricane Katrina strikes coastal areas from Louisiana to Alabama, killing over 1,800 people. Outraged Americans, most of whom were completely oblivious to last year's tsunami, accuse the federal government of failing to meet its constitutional responsibility to provide good weather. President Bush flies over the flooded area in Air Force One to observe the damage. Later it is discovered the flooded area just happened to be on the flight path to his Texas ranch, anyway. Controversial drawings of Muhammad playing poker with Jesus, Moses, Buddha and Elvis lead to protests across the Muslim world and at Graceland.

2006 (UN International Year of Tasty Desserts) – North Korea (motto: Powerful and Prosperous Nation) performs its first successful nuclear test. North Korean President Kim Jong-il explains that the weapons are necessary to protect North Korea from foreign enemies who “hate us for our freedoms.” Hugo Chavez is re-elected president of Venezuela, vowing to continue the socialist struggle against the imperialist powers who “hate us for our freedoms.” Despite cries of outrage from major stockholders in the Disney Company, Pluto is demoted to “dwarf planet” status. Millions of elementary school science tests are retroactively downgraded. Vice President Dick Cheney officially opens the Lawyer Season in southern Texas.

2007 (UN International Year of Flipper) – The United States population reaches the 300 million mark. 350 million if illegal aliens are counted. The U.S. also experiences a minor economic downturn, apparently the result of a small number of residential mortgages for which the lenders failed to confirm that the borrowers had, in fact, finished the basements and remodeled the kitchens as they said on their applications. Following news of the death of Marcel Marceau, millions pay their respects with a moment of silence. While President Bush undergoes a colonoscopy, Dick Cheney serves for 2 ½ hours as Acting President, during which time he orders a preemptive nuclear strike on North Korea, France and Massachusetts. Luckily, however, the launch codes had been misplaced and are only later discovered in the course of the president's colonoscopy.

2008 (UN International Year of Mr. Potato-Head) – Barack Obama is elected president of the United States, proving once and for all that a modern, Harvard-educated, charismatic son of a white woman could be elected president even if his name wasn't Kennedy. Fidel Castro resigns as president of Cuba, declaring in a 3 ½ hour speech that the time has come for a younger man named Castro to take over the helm and continue the socialist struggle against the imperialist powers who “hate us for our freedoms.” Iceland goes bankrupt and considers renaming itself Greenland II to promote tourism. The Channel Island of Fark officially abolishes feudalism. As the Writers Guild of America strike continues, the Golden Globe Awards are canceled because no one could be found to write the winners' names on those little cards. New York Governor Eliot Spitzer announces his resignation so he can “spend more time with his rented family.”

2009 (UN International Year of Natural Fibers (No, really!)) – Following a botched initial attempt by Chief Justice Roberts to read a short paragraph of simple English correctly, Barack Obama is finally inaugurated as president of General Motors. A member of Congress accuses the president of lying, a situation roughly analogous to a streetwalker accusing a call girl of promiscuity. Bolivia becomes the first South American country to declare the right of indigenous people to govern themselves, following which Bolivian natives open the largest gambling casino in South America. Alaska Governor Sarah Palin resigns “to spend more time with Eliot Spitzer's family.” Late Show host David Letterman reveals an extortion plot “threatening to disclose he'd been spending time with Sarah Palin's family.” After an eight-month-long election contest, Al Franken is declared the winner of Minnesota Idol and takes his seat in the World's Greatest Debilitative Body. Michael Jackson dies, prompting resurgent sales of Jackson's “Thriller,” almost catching up with the Beatles' “1” as the best-selling album of the decade and further proving the pointlessness of any popular music recorded less than 25 years ago. Avatar proves to be the most annoyingly unwatchable movie of the decade, edging out The Hottie & the Nottie.

The decade ends, thankfully and at long last, with a Blue Moon.

Thursday, September 18, 2008

And Just Where Is The Constitutional Authority For FEMA, Congressman Paul?

My statement back during the time of Katrina, which was a rather risky political statement: why do the people of Arizona have to pay for me to take my risk... less people will be exposed to danger if you don't subsidize risky behavior... I think it's a very serious mistake to think that central economic planning and forcibly transferring wealth from people who don't take risks to people who take risks is a proper way to go. -- Ron Paul, The Charles Goyette Show, March 30, 2007

Herewith, a notice from Congressman Ron Paul's office assuring constituents in the Texas 14th Congressional District, which by the way includes Galveston, that "getting help to everyone affected [by Hurricane Ike] is his utmost priority."
The Congressman’s office is acting as a liaison between Federal agencies and constituents to ensure that available assistance is as accessible as possible, and that FEMA and other government agency activities are appropriate, efficient and helpful to Texans.

You can, of course, make an argument even as a libertarian -- I know this because I make it, myself, from time to time -- that standing on principle is sometimes simply foolish. I, for one, will gladly accept any federal largess that comes my way, too. I'm a libertarian, not an idiot.

Even so, it might be amusing to hear the good congressman explain the differences between federal aid following Katrina and federal aid following Ike.

Saturday, September 13, 2008

Anarchy, State and Ignorance - Part II

The whole point of certified public accountancy is the notion that a business cannot be expected or trusted to perform an objective accounting of its performance, at least not sufficiently free of the risk of conflict of interests to satisfy current or potential investors or creditors. The hallmark of a just judiciary is disinterested objectivity. People trust the compliance certification services of Underwriters Laboratories and give greater weight to product reviews and comparisons from Consumer Reports because they understand that the very raison d'être of these organizations is their objectivity and lack of conflicts of interest.

That is not to say that any of these organizations or activities are perfectly or completely bias free. Rather, insofar as the absence of bias is an ideal objective, it is merely the case that they approach it far better, on average, than organizations and institutions that are trusted not only to provide a product or service but also to self-certify the quality of their product or performance.

If you want a diverse, competitive market collectively striving for excellence in education at all levels, separate teaching from testing.

If you want the testing and certifications of academic achievement as free from bias and conflict of interests as possible, separate the testing and certifying function not only from the teaching function, itself, but also from government at all levels.

I doubt I’ll get any serious argument on this blog when I merely assert without arguing that the U.S. Department of Education is a captive regulator to all intents and purposes controlled by the education industry, specifically including state departments of education, university schools of education and, of course, the public teachers’ unions. Similarly, state and local public school systems and individual school PTAs and such are to all intents and purposes controlled by the very personnel they are supposed to be governing or monitoring. If you want to argue against these assertions, feel free. But I take them as a given.

(It must be said, however, that state departments of education have not always been entirely captive regulators. Indeed, I’m no economist or political scientist but my best guess is that many if not most governmental regulatory agencies, the politics motivating their creation aside, began as relatively disinterested organizations. Corruption typically takes time; however, I believe it will eventually, inevitably occur.)

Anyway, say what you will about the No Child Left Behind program (and I’ll gladly join you in various criticisms), every time I hear a teacher, any teacher (including the good ones) complain about “teaching to the test” I want to jump up and down shouting for joy. Sure, standardized tests have all sorts of problems and, yes, deciding what should constitute the core curriculum in many subjects is a contentious and ultimately subjective matter. I might prefer that every high school graduate read, say, Hamlet and Twelfth Night rather than Macbeth and The Tempest, but I’d sure as hell prefer that they have read one or the other rather than neither.

If we looked not to diplomas and degrees from schools that have, to put it mildly, all sorts of conflicts of interest but to independent testing agencies, different in important ways from and yet similar to the organizations that administer standardized college and professional school exams now, we would go a long way toward creating an entirely different sort of educational system. Such a system would be largely indifferent to how you learned (or how much time you spent learning) algebra or, yes, let’s get it out and be done with it, biology, English literature or conversational Spanish, focusing only on whether you passed whatever standard (and therefore admittedly somewhat arbitrary) benchmark was involved. It wouldn’t matter whether you were home schooled, publicly educated or attended the toniest of upper-class prep schools. Oh, and I’ll save the argument for another day, but I’d say roughly the same sort of system should apply to higher education, as well.

I continue to believe in a system of tax funded, voucher supported, primarily privately operated schools, contra what appears to be at least one of my co-bloggers’ position on the subject. To be sure, we are all here capable of educating our own children or, at least, of paying for someone else to do it, but it isn’t the fault of children born in the inner city or in squalid, rural trailer parks or, for that matter, of legal immigrants who will eventually join the middle class or better but whose children need an education today, that their families cannot yet do the same. I would no more condemn them to ignorance than deny them food, shelter or medical attention simply because they are unfortunate enough to have parents who cannot or will not provide better.

On the other hand, I also firmly believe that the overwhelming majority of parents want the best education for their children they are capable of receiving and that, given even the minimal required resources to do so, that self-same overwhelming majority are best situated to determine how best to accomplish that. It doesn’t bother me in the slightest that many will opt to include rigorous religious education as part of their children’s overall education, nor that I would disagree with much of that religious education, nor that some of it might well conflict with evolutionary theory. You want certification that you have studied introductory biology? Take and pass the test. (Or one of several available tests in a market similar in that sense to the alternative availability of the ACT and SAT.) Potential employers, universities, etc. could and would establish their own standards based on such test results for purposes of employment, admissions, etc. Indeed, employers and schools would have good reason to care about the integrity and independence of the testing agencies and the rigor of their tests and the market pressures to maintain and improve that objectivity and rigor would tend to prevent educators’ inevitable attempts to co-opt the tests.

I may write a third post providing some more detail of the system I envision. By way of shortstopping certain sorts of criticism for now, let me just say that I don’t see this as a panacea but merely as a preferable system to the one we now have. There are, no doubt, all sorts of details to be worked out and problems obvious even to me in this alternative approach. Feel free to name them if you wish. What I would be particularly interested in reading, however, is anyone who wishes to argue that the present system, the one we have now, is preferable, and why they believe that is so.

Thursday, September 11, 2008

9/11 Remembered

I have told this story before, but I was in the Pentagon at the time of the attack. As it happens, I was far enough away from the site of the crash that I couldn't say for sure that I actually heard or felt anything at the moment of impact. A few minutes earlier, although there wasn't a television set or radio handy, rumors of the attack at the World Trade Center were already circulating throughout the building and we were trying to get more information through the internet.

What I did finally hear and pay attention to only moments later was the sound of other people rushing down the corridor, heading for the nearest exit. I still didn't know what had happened, but if they all thought leaving the building was a good idea, well, you know. I joined the crowd and in literally less than two minutes I was out in the South Parking lot, walking rapidly away from the building.

The South Parking side of the Pentagon is to the south of the Heliport side where the airplane hit. I couldn't see anything over there except a huge and rapidly growing plume of jet black smoke. The most likely inference at that point was a helicopter crash causing a fire, which was what I assumed. As people continued to pour out of the Pentagon, however, it also became clear that it would probably take at least an hour or two before the "all clear" signal was given and the crowd of some 25,000 people could re-enter the building. My car was parked not far away, so I simply kept walking to it and then drove off.

It was only when I turned on the car radio as I pulled out of the parking lot that I discovered what had happened. In fact, as I took the ramp exit to I 395 South / Washington Blvd., I could finally see the burning crater in the side of the Pentagon where the airplane hit. I could hear sirens approaching from every direction as I drove away in the opposite direction.

Not that it would have done me any good, but I didn't have a cell phone on September 11, 2001. (I own one now, at my wife's insistence, and that is frankly one more thing I hold against the terrorists, trivial as that is.) I drove to my wife's office and we decided, since we had no idea how extensive the attacks were or whether there would be more, to pull our children from school and then determine from there whether to leave the immediate Washington, D.C. vicinity. As it happened, we remained at home glued to the television. I would do exactly the same thing if the same situation were to occur again.

Obviously, the situation at the World Trade Center was vastly worse. Still, I went back to the Pentagon the next day and entered long enough to witness the incredible smoke damage even as far away from the point of attack as I had been the previous morning. While none of the victims were personal friends, a number were people with whom I had done business over the years.

Mine isn't, therefore, a particularly dramatic, let alone tragic, story. More like a brush with history, actually. It's worth remembering, though, how much the U.S. has changed since and because of 9/11. Normal is whatever you grow up with or grow used to. America's continuing psychological sense of siege in what increasingly seems to be not merely a long but a perpetual war against terrorism feels more and more "normal" all the time. Surely, that is a far greater harm than even the terrible death and destruction of seven years ago.

Wednesday, September 10, 2008

Anarchy, State and Ignorance

Your children are not your property. They’re not mine, either, thank Gawd, and just as important, they’re not the state’s property, either.

One of the problems of framing political theory in terms of fundamental or natural property rights (the naturalist fallacy aside) is that once we begin thinking of a person as having property rights in himself, it’s a small leap to thinking that one person can have some sorts of property rights in someone else. (Yes, I know, there are ways around this, but that doesn’t make it any less a problem, and an entirely avoidable one, at that, if we just abandoned the notion of property existing outside a legal system, itself a function of the ideally minimal state. But that’s another rant for another thread.)

Positive Liberty readers will have noted a certain amount of crankiness lately when it comes to schooling, education, creationism, Intelligent Design theory, Darwinian evolutionary theory, home schooling, etc. People do care about what is taught in schools and people do care about their children’s education and want excellent schools. Tempers flare, intemperate statements are made, feelings get hurt, my jokes get even dumber than usual, and so on.

Of course, when I say “people” I don’t mean everyone. There are many people who really don’t give a damn about excellent schools (we call these people NEA members) and there really are parents who don’t give a damn about their children's education.

There are people who believe that the Bible is the inerrant word of God not only about matters spiritual but matters historical, too, including natural history. And there are people who believe that with the empirical sciences in one hand and Occam’s straight razor wielded deftly enough in the other they can whittle down language and the reality to which it ideally relates to a tidy little material ontology with a surprisingly handy analytic framework undergirding and making sense of both. We call the first sort fundamentalists and we call the second sort Richard Dawkins. They have much in common, not the least of which is an almost invincible ignorance of each other’s area of interest and expertise. But that’s another rant for another thread.

Friday, September 5, 2008

Suzanne Scholte Wins Seoul Peace Prize

I’m very pleased to report here that Suzanne Scholte, a friend, fellow William & Mary graduate and the wife of my college roommate, has been chosen as the ninth winner of the biennial Seoul Peace Prize. As the linked article notes, several former winners have subsequently been selected to receive the Nobel Peace Prize as well. My heartfelt congratulations to Suzanne and to her family.

And... They're Off!

Based on what little of the Republican National Infomercial I managed to catch (read: failed to avoid), their message is strong and clear: America needs a president whom only the Republicans can provide – a man who can make America once again safe, secure, prosperous and free after eight years of a disastrous and failed, um, Republican presidency. While not quite rising to the remorseful, tear-soaked morning-after promises thuggish husbands tell their battered wives, there’s nonetheless something that’s almost as thrillingly brazen as it is breathtakingly desperate about this gambit.

And it just might work.

Mind you, as far as I can tell, Barack Obama is an empty vessel with paper-thin qualifications (if any are really necessary, which I doubt) into which voters foolish enough to expect good things from government can pour their hopes and dreams. He’s a smooth talkin’ son-of-a-gun and mighty good lookin’, too. Just the sort of guy for the nation to get its next teenage girl crush on. And just as likely to end in heartbreak as all the others before him, too, but never mind all that! The guy’s a dreamboat!

In fact, Obama’s major qualification as a candidate is precisely that he is (still!) an unknown. (Libertarian Party VP candidate Wayne Allyn Root is the sort of guy who gives the LP the reputation it so richly deserves, but this is both funny and weirdly significant.) Hey, even if it does turn out that there really isn’t that much there there, that to hardly know him is to know him well, well, better the devil you don’t know, sometimes. After all, that’s how we got Bill Clinton and does anyone honestly think he wouldn’t still be in office but for that pesky 22nd Amendment? (My guess is that at this point we'd not only welcome him back but lure him with a lifetime supply of kneeling interns if that's what it took.)

Meanwhile, did anyone even so much as mention George W. Bush at the Republican bash? I don’t know, I really didn’t follow it all that much, but it felt like being at a family reunion where, on the one hand, everyone avoids mentioning Uncle Fred ever since his NAMBLA membership became public knowledge but, on the other, everyone feels a bit of silent relief they no longer have to pretend he really isn’t a pervert. (And let’s not even get started about Vice President "He-Who-Must-Not-Be-Named.")

Back to the Republicans’ message, though: The world is a dangerous place (and McCain intends to see to it that it stays that way), taxes are too high (most Americans are so crippled by their tax burden that they can actually remember how many homes they own, or used to), federal programs are too intrusive and expansive (except maybe when it comes to money pouring into Alaska and restrictions on the funding of political speech), all life is sacred (at least until it’s born), borders should be open to the free flow of goods (but not people) and the rest of the world deserves American style democracy and John McCain is just the sort of guy to see to it that they get it, good and hard.

Meanwhile, I did tune in the other night to watch the rollout of their new 2008 Palin. Okay, so there wasn’t as much research, development or testing, either in the lab or the field, of this new major Republican brand as the federal government would require of something more dangerous than a Vice President like, say, a hair dryer or a child’s toy. But I disagree with some of my co-bloggers here and think the unveiling and initial product pitch went very well.

And then there’s John McCain, himself. The man’s a hero, there’s no question about that. He’s exactly like John Wayne was if only John Wayne really had been a hero and John McCain really could act. (Okay, so John Wayne really couldn’t act, either. But he did the best John Wayne in the business, and that’s pretty close to acting.) And so what if according to every single insider source McCain really does have the fly-off-the-handle temper problem of an abusive husband around staff and just about everyone else when the cameras aren't rolling? It isn’t like either the Republicans or the Democrats in Congress would just roll over and let the president go around, oh, say, invading other nations just because of a handful of bearded guys living in caves, is it?

I have no idea how Sarah Palin will play out over the next two months, but two months isn’t a long time. I remain frankly amazed that McCain hasn’t yet revealed his own darker side, so what do I know? So, too, I’d be among the first to acknowledge that Obama has some (Bill) Clintonesque charm and rhetorical skills that may dazzle come “debate” time. Biden? *shrug* I doubt he’ll help Obama all that much or hurt him much, either. Based on her acceptance speech, however, Palin’s addition to the McCain ticket raises the stakes on the vice presidential debate dramatically. As matters stand today, that might prove to be the pivotal campaign event. *yet another shrug* We’ll see.

Tuesday, September 2, 2008

Juneau

Todd “First Dude” Palin: Gov, Honey, I think it's best to just tell 'em.
Sarah “The Gov” Palin: I'm pregnant.
Bristol Palin: Oh, God.
Sarah Palin: But, uh ah, I'm not going to give it up for adoption and I'm certainly not going to get an abortion. After all, I'm only in my mid-forties and the First Dude and I are the perfect couple. Just look at how well you two turned out. Besides, if I play my cards right with the Geezer, pretty soon the federal government will be paying for the medical expenses and everything. And, and in, what, um, 50 or so odd years when your dad and I are both dead you can just pretend that this never happened.
Track Palin: You're pregnant?
Sarah Palin: I'm sorry. I'm sorry... And if it is any consolation I have heartburn that is radiating in my knee caps and I haven't taken a dump since like Wednesday... morning.
Bristol Palin: I didn't even know that you and Dad were still sexually active.
Sarah Palin: I, uh...
Track Palin: Who is the kid?
Sarah Palin: The-the baby? I don't really know much about it other than, I mean, it has fingernails, allegedly.
Bristol Palin: Nails, really?
Sarah Palin: Yeah!
Track Palin: No, I know. I mean what’s its name going to be?
Sarah Palin: Umm... We haven't decided on a boy's name yet, but if it's a girl, it's going to be Juneau Palin.
Track Palin: Juneau Palin?
Sarah Palin: What?
Track Palin: God, can’t you people ever come up with, like, a normal name?
Todd “First Dude” Palin: Huh?
Bristol Palin: Anyway, Mom... Dad... while we’re on the topic of shenanigans....

Saturday, August 30, 2008

Constant Viewer: Traitor and Babylon A.D.

Traitor is a slightly better than average suspense thriller with a significantly better than average performance by Don Cheadle in the lead role. Sadly, however, the same cannot be said of his co-star, Guy Pearce, whose American accent isn’t too awful until it is revealed through dialog along the way that he’s supposed to be a Southerner, too. Pearce is a good actor, but we might consider going back to those halcyon days when honest-to-goodness American actors, or at least Canadian ringers, were cast in such roles. Constant Viewer knows all about the wonderfully talented Hugh Laurie in House and all that, but enough is enough.

CV suspects Traitor may slip in and out of your local cineplex before you notice it was there, as it was not produced by one of the major studios and received precious little pre-release advertising. As the contemporary crop of Middle Eastern terrorists versus U.S. intelligence agency films go, Traitor is a perfectly respectable entry. If you like such movies but you waited to see it on DVD, though, you wouldn’t miss much at all.

* * * * * * * * * *

If you waited to see Babylon A.D. on DVD you wouldn’t miss much, either. Then again, that’s equally true if you don’t bother seeing it at all. Vin Diesel turns in an acceptable Vin Diesel performance in this hyperactive but unengaging road movie. The road in question leads from Russia over the Bering Strait, across which Diesel’s character must transport a young woman (Mélanie Thierry) and her governess (Michelle Yeoh) from Mongolia to Manhattan. There are nice performances in comparatively small parts here by Charlotte Rampling and Gérard Depardieu, but the plot is so tissue thin and the directing so uneven and distracting that their efforts are largely wasted. As was CV’s time.

In Muted Defense of Gridlock

In Mr. Babka’s “Why I Don’t Want United Government,” reader Jeff Hebert makes some very interesting comments, including the following excerpt:
I find it very surprising that anyone seriously concerned with libertarian issues would support a Republican President this time around. The Bush Administration has had a sustained, hard push over the last eight years to make the “Unitary Executive” doctrine the de facto law of the land. It’s hard for me to imagine anything worse for our liberties than a chief executive with the powers and privileges of a monarch, and yet that’s exactly what Cheney, Bush, Yoo, and company have been working steadily towards.

John McCain has surrounded himself with people who hold the most extreme neo-conservative views in the party. He’s not just going to be four more years of Bush, he’s going to be four more years of the worst parts of Bush. If the idea of “anything’s legal if the President does it” doesn’t scare you way, way worse than universal health care (plenty of other Western countries have it and yet shockingly their nations have not imploded), expanded union power (ditto), and some changes to the way the FCC works, then I would respectfully suggest that your priorities are way out of whack.

We’ve been witness to a full frontal assault on the concept of separation of powers and the enshrinement of a monarchical executive, largely unnoticed by the vast majority of the country. When asked what he would do with his first 100 days in office, Obama said “I would call my attorney general in and review every single executive order issued by George Bush and overturn those laws or executive decisions that I feel violate the constitution.” That’s exactly what I want to hear.

I certainly agree with much of Mr. Hebert’s characterization of both the Bush Administration and of John McCain. If we were discussing Obama versus a third Bush term, I'd be more inclined to agree as well with more of Mr. Hebert’s reasoning. As matters stand, however, and subject to change on a daily basis, Democrats are likely to increase their control of both the House and the Senate, so the question becomes not which candidate successfully pursuing his agenda poses the greater threat but which candidate is least likely to successfully push his agenda.

I'm no McCain fan or supporter. The man is an autocrat and, from just about every insider report I've ever heard, one of his many homes is in Cloud Cuckoo Land. The question is whether giving McCain yet another residence, this time on Pennsylvania Avenue, is more likely to perpetuate or worsen Bush’s imperial presidency versus what sort and how much damage is likely to occur in an Obama Administration.

I continue to believe that what genuinely troubles the Democratic Party is not the immense and increasing power of the presidency but merely the fact that it’s not currently theirs to use. I agree with Mr. Hebert that there are worse things than socialized medicine. Perpetual war, for example, springs to mind. Furthermore, as offensive as a return of the Fairness Doctrine would be, it isn’t exactly like John McCain is a staunch defender of free speech. But, whining from progressives aside, there is absolutely nothing I know about Obama to lead me to believe he would be cautious in his use of executive power once it was given to him or, frankly, that he would not pursue a far more leftist agenda than he has so far proposed. Like McCain, he is not a man who tosses and turns late at night fretting with self-doubt.

Speaking of which, does Mr. Hebert really like to hear a candidate promise to “overturn those laws or executive decisions that I feel violate the constitution”? Feel? Okay, so maybe Obama was speaking somewhat informally or imprecisely. But just how does he plan to go about overturning not merely executive decisions but laws as well? Let's at least hope not by fiat.

The key to winning the presidential election in the U.S. continues to lie in campaigning sufficiently close to whatever the political middle happens to be to wrest away swing state (electoral) votes from your competition. If Obama announced his intention to press for legislation requiring universal “public service” nonmilitary conscription of 18 year olds and a 50% increase in all marginal tax rates, he’d win Massachusetts just the same but he’d lose Virginia for sure and probably Ohio, too. If McCain announced his intention to reinstate the military draft and abolish the Department of Education, he’d still probably win Mississippi and Arizona but lose Virginia and Ohio. Okay, so maybe my examples can be argued, but there are dead certain red states and dead certain blue states and a slowly shifting handful of swing states where the battle will be waged.

But none of that has anything to do with how the winner governs. Politicians all lie. Maybe not all the time but whenever necessary. If McCain really gave a damn about Obama’s lack of experience he sure as hell wouldn’t have picked Palin as his running mate. If Obama really gave a rodent’s hindquarters about change he wouldn’t have picked long-term Washington insider Joe Biden. They’ll both do and say what they believe they need to do and say in order to get elected. Once elected, they’ll do what they bloody well want to do.

Unless another branch of government stops them.

I hesitate to make the next point, but sooner or later it must at least be put on the table. If it is true, and it is, that Obama’s race is a factor in the election, then it is also almost certainly true that Obama’s race would be a factor in Congress’s relationship with his presidency. I don’t know how that would play out and I am not accusing Obama of anything so crass as “playing the race card” either now or should he be elected. I do think, however, that members of Congress will have to weigh one more factor in any decision to oppose or criticize a President Obama and that is whether such criticism or opposition even hints of racial animus.

Perhaps not. Perhaps even raising the issue shows a disconcerting oversensitivity to such matters on my part. Even so, all other factors being equal, I must believe there is a far greater likelihood of a Democratically controlled Congress standing up to a white Republican president than a black Democratic president regardless of the merits of whatever issue is under consideration.

Libertarians aren’t going to get minimal government any time in the foreseeable future, so minimally damaging government is the best they can hope for. Minimally damaging government tends invariably to be government that does the least regardless of how big it already is, so maximal gridlock is the best possible outcome from a libertarian perspective.

But best possible outcomes can be nearly as far removed from ideal outcomes as worst possible outcomes. I haven’t decided to vote for or otherwise support McCain. Far from it, in fact. But I certainly can understand how other libertarians, perhaps reasoning as I have here, might decide that, contra Mr. Hebert’s comments, voting for McCain is the most proactively libertarian thing they can do this time around. Doubtlessly (well, hopefully, anyway), they’ll be holding their noses as they do so.

This much I do know, though. If monopolies and collusive oligopolies really are bad for the general public, then undivided government and political “bipartisanship” ought to be prohibited on antitrust grounds. And that would still be true even if the president and every member of Congress were self-styled “Libertarians.”

Jitters Bugged?

Old joke:

Two Roman Catholic theologians, one a Jesuit and the other one a Dominican, are arguing about prayer and smoking. (Hey, I said it was an old joke. This was before smoking became a secular sin only slightly less heinous than child abuse.)

So, anyway, the Jesuit says there’s nothing wrong with praying and smoking at the same time, while the Dominican is equally adamant that it’s disrespectful to God and thus sinful. The argument goes on and on and finally they decide to submit the question to the Vatican, which they both do.

Months pass as, left to their own devices, months will, and finally the Jesuit and Dominican meet. As they see each other big smiles break out on both their faces. “I told you so!” the Dominican almost shouts. “What are you talking about?” the Jesuit says, “I just got word back from Rome recently that I was right.” “That’s impossible,” the Dominican says. “I just got word back from Rome telling me that I was right.” The two theologians stand there silently and bewildered.

Finally, the Jesuit smiles. “Wait a minute,” he says. “What exactly did you ask?” “I asked exactly what we were arguing about. I asked if it was a sin to smoke while you were praying.” “Ah ha!” the Jesuit exclaimed. “I thought so! That’s the problem. You see, I asked if it was a sin to pray while you were smoking!”

To borrow from Wittgenstein, while we may not constantly be bewitched by language, we are always in danger of being misled by some sort of linguistic stage magic, and this is true even though much of it is unintentional and some is even self inflicted. How we characterize something (e.g., “pro choice” or “pro life”) already inclines us to one sort of judgment versus others.

But that’s not simply to note that words have emotional connotations as well as objective denotations. Wittgenstein, again. “Can one play chess without the queen?” What question is being asked? Certainly not whether one can continue playing chess after one or both queens are captured. What then? Whether one could play a game like chess except without queens? Again, ignoring how good a game it might be, the answer fairly obviously is yes. What about whether such a game still ‘really’ was chess or still ‘should’ be called chess? Is that a factual question? One that perhaps still requires more data to resolve or, as is typically true in philosophical disputes, one that calls more for a decision which, in turn, will depend on how we go about weighing this consideration versus that?

So, also, are performance enhancing drugs in athletic competitions per se unfair? Doesn’t it depend on how and why they enhance performance? Philosopher / physician Carl Elliott raises that question in a current Atlantic piece, arguing that, at the very least, what counts as performance affects our answer to that question. Is the ability to perform in public under intense pressure an integral part of the very athletic ability being judged, or should an otherwise gifted athlete’s greater sensitivity to pressure and higher state of anxiety be considered irrelevant?

Beta-blockers (a common class of anti-hypertension drugs), for example, tend to reduce the physiological effects of anxiety. Not the anxiety, itself, mind you, but only some of its outward effects such as hand tremors. Thus, their use is banned in some competitive sports, but the validity of the rationale for their ban depends on whether we’re talking about smoking while at prayer or praying while having a smoke. Elliott:
Beta blockers are banned in certain sports, like archery and pistol shooting, because they're seen as unfairly improving a user’s skills. But there is another way to see beta blockers—not as improving someone’s skills, but as preventing the effects of anxiety from interfering with their skills. Taking a beta blocker, in other words, won’t turn you into a better violinist, but it will prevent your anxiety from interfering with your public performance. In a music competition, then, a beta blocker can arguably help the best player win..... The question is whether the ability to perform the activity in public is integral to the activity itself.

I have no dog in this fight. (By way of Truth In Bloggistry disclosure, it happens that I take beta blockers for hypertension, but I’m not inclined to public performance anxiety and, besides, there are no performance enhancers of any sort that would make me an athlete. If instead of Dr. Bruce Banner I’d gotten the gamma rays, the Hulk would have been an overgrown but still uncoordinated doofus.) I don’t care whether either amateur or professional athletes are permitted to take beta blockers or, for that matter, any other performance enhancing drugs. My only point here is that how one answers these sorts of questions depends in large measure on how one frames the questions in the first place.

That settled, feel free to take out your prayer beads now and, oh, yeah, smoke ‘em if you’ve got ‘em.

Friday, August 29, 2008

♫ Who are those (not so) tall, (not so) dark strangers there? ♫

Okay, so it isn’t quite official yet, but major news outlets are reporting that McCain has picked Alaska Governor Sarah Palin to be his vice presidential running mate.

I admit, between having an African American presidential candidate and a female vice-presidential candidate who isn’t the laughably inept Geraldine Ferraro, this race suddenly looks more interesting than the average TweedleDeemocrat versus RepubliDumbican contest. (In as much fairness as I'm ever likely to grant Ferraro, if Walter Mondale had picked the Pope as his running mate in 1984, he probably wouldn't have carried the Vatican.) Geez, who’d a thunk the Libertarian Party ticket represented the only traditional offering of two middle-aged white guys?

Palin has next to no experience, making even Obama look like a senior statesman by comparison, but both Carter and Ford proved decades ago and George W has since confirmed that there’s no such thing as minimum required qualifications, the Constitution aside, for serving as president.

Meanwhile, I was amused that some accounts claim Palin is also a self-described “maverick.” I hope James Garner is getting royalties for this.

Monday, August 25, 2008

Just Wonderin'

Mind you, I don't give any credence to the rumors about presumptive Democratic presidential nominee, presumptive president, presumptive messiah and just plain presumptive Barack Obama's citizenship qualifications, but if by any stretch of the imagination it turned out after he won that he wasn't constitutionally a natural born citizen, shouldn't that mean the Republicans can run this guy in 2012?

On With The Show!

Wait a minute! You mean I missed the Olympics? (Who won the prenatal gymnastics medal?) Dayum! And here I was so much looking forward to watching people of every gender, race, creed, color, sexual orientation and nationality vie against one another in a bogus spirit of brotherhood and good will!

Oh, that’s right. I can get the same thing watching the Democratic National Convention, another mostly staged event, this week.

I vaguely remember, no, not the beginning of American political parties, but a time in the 50s and 60s when some honest-to-gawd political business other than marketing was conducted at these conventions. Mind you, much of that business was conducted behind closed doors in (ah, the good old days!) smoke-filled rooms and not on the almost equally smoky convention floor. Still, deals were cut, party platform planks (mostly meaningless even then) were bickered over and sometimes even who the candidates were going to be was decided by multiple ballot. Sadly, however, conventions have shifted from political Super Bowls to World Wrestling Federation championship events. Except, of course, that the WWF has the good sense not to tell the viewers in advance who will win.

A Positive Liberty reader recently commented sarcastically on another thread discussing the legacy of the 1968 Chicago Democratic Convention, saying with his tongue planted firmly in his cheek that “1968 was the pivotal moment in all of human history, past and future.” Speaking on behalf of my terminally self-important Baby Boomer generation, I will note only that America’s major political parties did begin to conduct their business differently after 1968. Not so much because of the protests (“Yippie!”) outside the convention center -- after all, it isn’t like a guy named Richard Daley would be mayor of Chicago forever, is it? -- but because of the resulting McGovern-Fraser Commission and the subsequent shift to state primaries as the method of deciding delegates and, thus, selecting candidates.

Another “lesson” from 1968 was the increasing importance of television and therefore the need to control convention and convention related events as much as possible. I don’t think Nixon beat Humphrey in 1968 simply because of the violence in the streets of Chicago during the convention, but it sure as hell didn’t help Humphrey, either.

Needless to say, I won’t be watching either the Democratic or the Republican National Conventions in real time. Any really juicy gaffes or other “must-see” moments will be on YouTube before the evening wrap-up, so I’ll catch Ted Kennedy’s likely swan song, Hillary’s dagger-eyed stares, McCain being reminded how many homes he owns and where he left the keys, etc. in TiVo time.

Sunday, August 24, 2008

Selfishness, Egoism and Altruistic Libertarianism

It is a cliché among many psychologists and economists that human beings behave self-interestedly. Moreover, since Adam Smith’s somewhat theological, somewhat anthropomorphic “invisible hand” metaphor, it has been almost an article of faith within the latter discipline that the collective, societal result of individual self-interested behavior is ironically salubrious.

It is a faith to which I also subscribe, although like all but the most zealous of religious fanatics I season that faith with the occasional heresy here and there. Crucially, however, it needs to be noted at the outset that not just any sort of self-interested behavior contributes to the common wealth and greater good. Specialization and trade, voluntary association, bargained-for exchanges, common rules and some sort of enforcement mechanism to address rule breaking are all necessary elements for human society to flourish economically, for the invisible hand to prove, as it were, optimally dexterous.

Most importantly, “self-interested” is not synonymous with “selfish.”

Discussions about selfishness elsewhere on this blog got me thinking about these things. I am no Ayn Rand scholar, nor do I purport to be an Objectivist. Undoubtedly, however, Rand’s followers constitute a significant and vocal segment of the libertarian community. (It’s a non-gated community, after all, noted for its lack of zoning regulations, restrictive covenants or entrance requirements.) Anyway, given that Rand published a collection of essays entitled The Virtue of Selfishness: A New Concept of Egoism, it should be clear just from the title’s use of the word “egoism” that she or Nathaniel Branden, as the case may be, intended to give the word “selfishness” a special, technical meaning in the overall context of Rand’s worldview.

But selfishness and egoism are two separate things, a fact I assume Rand understood perfectly well when she deliberately invoked the apparent contradiction of selfishness as a virtue for its rhetorical impact. Whatever Rand’s standing as an intellectual and participant in the history of political philosophy, she was also certainly a polemicist with a particular political agenda in opposition to what she correctly perceived as the 20th century’s greatest threat to humankind; namely, the threat of collectivism. You simply cannot read Rand fairly without bearing that in mind.

The important point is that selfishness is a common language concept, not a technical term. Anyone fluent in English knows what it means and knows, more importantly, that it entails a negative moral judgment. Selfishness is by definition a bad thing. It’s using up all the hot water in the shower when others are waiting, eating up all the cookies instead of sharing them with friends or family, and so forth. (Except, perhaps, at the Ayn Rand School for Tots, although Ms. Sinclair couldn’t have really been much of an Objectivist since the first thing she did was violate Maggie's pacifier property rights.)

Selfishness, moreover, logically entails and presupposes that there is some preexisting community to which the individual belongs and some moral commitment to that specific community. I, for example, live with my family in a household where there is a finite supply of hot water and cookies. If I stand in the shower for an hour shoving one increasingly soggy chocolate chip cookie after another into my mouth until both supplies are exhausted, I am acting selfishly relative to my family. It is less clear that I am being selfish when I buy the last package of cookies at the store, thus depriving the next cookie junkie of his or her fix, or when I purchase the big, heavy-duty water heater for my house. It is less clear, still, that it is properly called selfishness to eat any of those cookies or use any of that hot water knowing that many millions of people across the globe have neither cookies to eat nor any hot water to shower with.

To be sure, there are those who claim that the last is selfish, although the overwhelming majority don’t really believe it based on how they, themselves, actually live. The notion that we as individuals have moral obligations to humanity at large is, to put it mildly, very problematic. The point, in any case, is that we wouldn’t be inclined to call all sorts of behavior like eating a cookie selfish simply because every cookie eaten is, necessarily, a cookie no one else can eat. The morality of sharing does not require splitting my cookie into several billion pieces so everyone can have some.

Egoism, by contrast, is not an ordinary language word or concept. Mothers don’t scold their children for being egoists when they selfishly eat the last cookie. Indeed, if you peruse its Stanford Encyclopedia of Philosophy entry you will discover that there is not even a single technical sense of the term.

We pause now while I grind a philosophical axe for a moment. There is a critical difference between, on the one hand, psychological egoism, the theory that claims it is simply a fact that human beings always and under all circumstances behave self-interestedly, and, on the other, ethical or rational egoism, theories which contend that morally right behavior or rational behavior, respectively, simply is self-interested behavior.

The latter two may be right or wrong and are certainly subject to criticism, but at least they both admit of the possibility of unethical or irrational behavior. That is to say, the ethical egoist acknowledges that people are capable of behaving other than self-interestedly; she simply argues that they shouldn’t. So, too, the rational egoist doesn’t claim that we always act rationally, i.e., self-interestedly, but only that we should, or that it is only when we do that our actions deserve the appellation “rational.”

Psychological egoism, by contrast, obliterates the normative force of self-interested behavior, whether for good or bad. Indeed, it obliterates normative considerations in the same way all strong forms of determinism do: if “ought” implies “can” but one cannot act differently than one does then it is absurd to claim that one ought to have acted differently. Moreover, if all behavior is, by definition, self-interested, then it is a fair question to ask of this non-falsifiable metaphysical theory what sort of substantive claim, if any, it really is making.

Axe grinding concluded, I’m reasonably confident that Rand was an egoist in both the ethical and rational egoism senses. In retrospect, however, it is perhaps unfortunate that she chose to use “selfishness” as a rhetorical device to describe her egoism because it opens both Objectivism in particular and libertarianism in general to the sort of prejudicial criticisms Mr. Hanley recently bemoaned.

In fact, Rand aside, there is nothing at all incompatible about libertarianism and altruism. Not, at least, if altruism is understood not as Rand technically used the term but simply as the opposite of mere selfishness. It is hardly altruistic, in the ordinary sense of the term, to coerce other people to behave in supposedly selfless ways in order to achieve your personal vision of the greater collective good even if that greater good is thereby realized. But it is unarguably immoral to coerce others using that rationale when, in fact, it becomes painfully obvious that the exact opposite results.

Indeed, if we’re looking for a single lesson from the history of the 20th century, we could do much worse than conclude that, no matter how noble their advocates’ intentions may have been, collectivist social and economic orders yield disastrous results. Obviously, therefore, noble intentions are no guarantee of success. Libertarianism has never claimed that in a libertarian world order everyone will win and "all must have prizes." In fact, as far as I know, only utopian collectivists and Lewis Carroll's Dodo have made that claim.

But then Carroll, of course, knew he was talking nonsense.

Saturday, August 23, 2008

Democratic ’08 Ticket: O.- B., But No GYN

Two or three semi-random thoughts on Obama’s selection of Joe Biden. First, my son’s intelligence (read: information, not I.Q.) from working this summer on a “Blue Dog” Democrat’s re-election campaign turned out to be entirely accurate. (Note to Self: Remember to listen to son occasionally in the future.)

Second, given Biden’s solidly liberal record, Obama has determined that he does not need to position himself to appear closer to the political middle in order to win. (Yes, I know there are even more liberal Democrats Obama might have chosen, but a quick perusal of the infallible, inerrant and entirely trustworthy Wikipedia entry leads me to the conclusion that a “moderate liberal” is someone who purports to oppose the Castro regime in Cuba.) It suggests, also, that Obama thinks (I think correctly) that he is vulnerable regarding foreign affairs and that Biden will provide additional credibility.

Most intriguing, however, is that Obama chose a man. Hey, black men got the vote before white women did, too, so he’s just being traditional, right? Seriously, though, and aside from ensuring that Hillary Clinton will now work tirelessly, day and night, to see to it that Obama loses in November, does Obama believe that too much demographic “change we can believe in” is a loser in the general election? Does he believe (I suspect correctly) that liberal white women can be taken for granted come November just as black voters have historically been taken for granted by the Democratic Party? Does he believe that there really aren’t any sufficiently qualified women out there? (Hillary included?)

Finally, does he really believe Joe Biden is the best qualified man not merely to help him win the White House but to serve as Vice President? Nah, whatever else is going on, it sure as hell couldn’t be that. Could it?

Friday, August 22, 2008

Constant Viewer: The House Bunny

Constant Viewer had never seen or at least never noticed Anna Faris before today, and a quick review of her career to date makes it pretty clear why not. CV isn’t exactly part of the target audience for the Scary Movie franchise, after all, and he simply didn’t notice or remember her from Lost In Translation. Apparently, however, she has a loyal and growing fan base, so CV was a bit disappointed today when he saw her performance in The House Bunny. Okay, so the material was predictable, crudely directed and, worst of all, not all that funny for extended periods of time. CV had read, however, that Faris’s performance shines above this otherwise indifferent movie. Perhaps so, but not all that much above and, frankly, that’s damning with very faint praise at best. Comparisons to Reese Witherspoon’s Legally Blonde flicks are pretty much unavoidable in any consideration of The House Bunny, and neither Ms. Faris nor this new movie fares well in that comparison. Still, CV would very much like to see her in something better than this mostly failed effort, the sort of movie that might, at most, be worth a viewing from one of those supermarket $1 video rental booths.

"Be wary of strong drink. It can make you shoot at tax collectors ... and miss." *

There shouldn’t be a minimum legal drinking age, although I probably wouldn’t mind too much if it were set at, oh, say, six. If Mothers Against Drunk Driving and the rest of the Uber-Nannies out there want to keep pre-schoolers from bellying up to the bar, well, okay. After all, it’s for the children.

Syndicated columnist and (inexplicably) frequent reason contributor Steve Chapman offers scraps of arguments against a proposal from an advocacy group called Choose Responsibility to lower the legal drinking age to 18. To date, the proposal has been signed by over 120 college presidents, predictably incurring the irrational wrath of MADD and other quasi-professional scolds.

Chapman’s arguments, such as they are, pretty much boil down to the assertion that many people under the age of 21 are too immature to drink and that more of them will drink and suffer problems as a result. As a corollary, if 18 year olds can buy alcohol, those under the age of 18 are more likely to have more ready access to booze because high school seniors will buy it for sophomores and freshmen, etc.

Here, however, is the money quote from Chapman’s lamentable column:
Why permit 18-year-olds to vote but not drink? Because they have not shown a disproportionate tendency to abuse the franchise, to the peril of innocent bystanders.

Mr. Chapman, if you don’t think 18 year olds who vote for Republican or Democratic candidates are imperiling innocent bystanders like me, you obviously haven’t been paying attention.

Seriously, though, there’s so much wrong with this mindset it’s hard to know where to begin in rebutting it. Here, however, is the principal objection:

The mere fact that something is dangerous or harmful to some members of a group is never sufficient justification to prohibit all members of a group from using or having access to it. The fact that some members of group X will abuse access such that members of the general population are harmed is equally insufficient to prohibit all members of that group from having access.

I accept the fact that institutional rights and privileges, e.g., voting, driving on public roads, necessarily involve some regulation, some of it arbitrary. Moreover, I certainly accept the fact that libertarianism is, for the most part, an NC-17 rated show. Children do require restrictions on their liberty for their own good. The question, however, is whether the default agent responsible for imposing such restrictions should be the state or their parents. Admittedly, some parents sometimes fail in those responsibilities and the state must then intercede. See, however, the immediately prior paragraph as to why that fact alone does not justify depriving all parents of properly parental authority.

Serving your 16-year-old daughter a half glass of wine at Thanksgiving, sharing a beer or two with your 17-year-old son as you both watch the game, or accepting the fact that your 19-year-old college student may well get drunk on campus rather than driving off into the woods with friends specifically to go binge drinking (thus creating an even more dangerous situation) isn’t an abrogation of parental responsibility. Imposing a universal prohibition to reduce abuse by a few, and inadvertently but predictably creating such even more dangerous situations, is.

Moreover, arguing in effect that it should be easier for the typical high school student to buy illegal drugs (never mind that those should be legal, too) than a six-pack of beer is, at best, a fairly odd utilitarian case for why eighteen year olds shouldn't be permitted to drink. If Mr. Chapman doesn't understand these things, I trust the rest of the good folks over at reason do.

(* - Robert Heinlein)

Wednesday, August 20, 2008

Nibble, Nibble, Little Mouse! Who's That Burglaring My House?

Leda Smith heard someone breaking into her home, so she found the revolver kept by her bed, confronted the burglar and forced him at gunpoint to call 911. Then she and the seventeen year old intruder waited until the state police arrived to take him away.

Leda Smith is eighty-five years old.

Thursday, August 7, 2008

Ezra Levant Update

Back in January, I urged readers to check out the blog of Canadian journalist Ezra Levant. Levant was subjected to a year-long investigation by the Alberta Human Rights and Citizenship Commission following a complaint by the Edmonton Council of Muslim Communities over his publication in the Western Standard of the Danish Muhammad cartoons that had so many other publishers cringing in fear. I'm happy to report that the complaint has finally been dismissed and, as a friend at a forum site I frequent said, Canadians are at least tentatively embracing free speech.

As can never be noted too often, speech about which we already approve doesn't need legal protection.

More to the point, I would refer readers again to Mr. Levant's web site and specifically to his taping of the complaint hearing interview available here. I will repeat what I said originally: Levant’s responses to the bureaucrat seated across the table from him during the taped hearing are precisely how free people should deal with government officials under such circumstances.

Congratulations, Mr. Levant.

"Who Can I Sue?"

Soon, you'll be just a mouse click away from the answer!

I have very conflicted feelings about this sort of thing. Feelings, I might point out, that are not widely shared by my fellow libertarians, the majority of whom I believe fail to appreciate the value in principle of a rigorous and easily accessible civil litigation system.

Still, there is no denying that the system as it is currently structured and operated is in dire need of reform. I have no problem with lawyer advertising (its frequent tackiness aside) or with the actual (and actually harmed) plaintiffs acting as unofficial attorneys-general and, when appropriate, winning punitive damages judgments far in excess of their actual damages. I do have a hard time accepting the plaintiffs' bar (aka, trial lawyers) reaping 40% of those judgments, and don't even bother with arguments about how speculative these lawsuits are and how much risk these law firms undertake. Such firms rarely take clients on a contingent-fee basis unless they have already determined that the likelihood of a settlement or judgment in their client's favor is good.

There's gotta be a better way, though I admit to not knowing what it is. Meanwhile, "Who Can I Sue" websites do not strike me as a step in the right direction.

Tuesday, August 5, 2008

Patent Nonsense

One of the things that distinguishes intellectual property from the more intuitively obvious tangible variety is that the very notion of intellectual property requires a justification in the sense that tangible property almost never does. Utopians of one variety or another have tried, almost always with disastrous consequences, to abolish the institution of private property, but as far as I know there has never been a society that has denied the existence or necessity of property rights of any sort at all. Typically, their alternative has been to assert some sort of collectivist or communitarian ownership; but while it may be that the clan or the tribe “own everything in common” or the “people (collectively) own the means of production,” woe be to any rival clan or tribe or people who happen by and start asserting similar property rights in the same stuff. Wars have been known to start that way even in utopia.

The obvious thing about tangible property is that, being stuff, it’s there whether we call it property or not. That is, whether ♫ This land is my land (or) this land is your land ♫, this land is here whether we say so or not, let alone whether ♫ This land was made for you and me. ♫ And so are its flora and fauna and minerals and water running through it or beneath its surface, etc.

How human society has gone from the realization that the world is filled with stuff to the notion that some of it is our stuff (or your stuff or, most importantly, my stuff) is an interesting topic, but not one with which I wish to concern myself here in any detail. If you care, I’ll merely note in passing that I reject all “natural right” theories of property, personally, especially including the so-called Lockean “labor + stuff = property” theory.

Still, I constantly run across fellow self-described libertarians who believe in one sort of natural rights theory or another and a fairly large number of them believe that their theory justifies the notion of tangible personal property (whether, forgive the legalism here, real or chattel) but not intellectual property. Intellectual property – by which I mean here the usual unholy trinity of patents, copyrights and trademarks – is on this account the equivalent of a state enforced and, worse yet, state created monopoly. To which I respond:

Yes, that’s true. Exactly like the state-created and state-enforced monopoly any owner of any sort of property whatsoever enjoys versus any non-owner. To be sure, the land would still be there with or without a state enforced legal system, but it wouldn’t be anyone’s property. Not in anything like the sense we mean by property now, that is. All of our philosophical twaddle about what should or shouldn’t or can or can’t be deemed property aside, the ownership of a patent or trademark is no different from the ownership of an automobile or a condominium. They are all creatures of the state or, more specifically, of a state enforced legal system, one of the principal justifications for which is the sorting out of competing claims over the same resources.

Ah, say my opposition, but land and the stuff we find and trap or kill or take and make new stuff out of on the land (and sea) are quintessential examples of real resources; namely, natural resources. Patents and trademarks and copyrights are mere fictions.

I agree. But they are highly useful fictions, and if my libertarian confrères would get off their pseudo-Kantian high horses about absolute right and wrong and concentrate instead on the far more useful questions of pragmatic good or bad, I think they’d be more inclined to agree with my perspective. Which is as follows:

(1) The state of the law of intellectual property is in need of serious reform, but (2) we would all be better served by, for example, a reformed law of patents than by the entire abolition of patents. For you theorists, I will add (3) there are no serious theoretical reasons, ethical or otherwise, precluding us from, as it were, saving the baby even as we throw out the dirty bathwater here.

By way of giving an example of the sort of unnecessary and counterproductive infanticide I have in mind here, let me quote extensively from a recent Kevin Carson piece over at Art of The Possible. Carson makes his point by quoting a commenter there, and because I am too lazy to edit extensively I will do the same, as follows:
2) Eliminate drug patents. Patents are often justified by the allegedly high cost of developing drugs. But as frequent AoTP commenter quasibill observed, the main source of the expense is not developing the version of the drug that is actually marketed, but gaming the patent system. He challenged the popular misimpression, encouraged by smarmy drug company ads,
that what big pharma is researching is cancer meds. It’s not. In the rare instances that big pharma produces and markets such medicines, it has purchased them from small start-ups that themselves are the result normally of a university laboratory’s work. When big pharma cites to billions of research costs, what it is talking about is the process whereby they literally test millions of very closely related compounds to find out if they have a solid therapeutic window. This type of research is directly related to the patent system, as changing one functional group can get you around most patents, eventually. So you like to bulk up your catalogue and patent all closely related compounds, while choosing only the best among them, or, if you’re second to market, one that hasn’t yet been patented.

This work is incredibly data intensive, and requires many Ph.D’s, assistants, and high powered computers and testing equipment to achieve. But it is hardly necessary in the absence of a patent regime. In the absence of patents, (and of course the FDA), you could just focus on finding a sufficient therapeutic window, and cut out the remaining tests.

Patents also grossly distort the market, leading drug companies to focus most of their research on “me too” drugs that tweak an existing formula just enough to enable it to be repatented, and use it to replace the older version that’s about to go generic. Then the drug reps hit the hospitals and clinics, drop off some free samples and pamphlets, and (most M.D.s relying on drug industry handouts for their information on drugs that come out after they leave med school) the “me, too” drug becomes the new standard form of treatment.
The license cartels and drug patents are two examples of essentially the same phenomenon: First, the government creates a honey pot by enforcing a monopoly and making particular forms of service artificially lucrative. Then the market skews toward where the money is, as practitioners adopt the more lucrative business model and crowd out affordable alternatives.

Okay, so let’s clear the air here a bit. In the first place, whatever may be the truth about the claim that “[p]atents are often justified by the allegedly high cost of developing drugs,” the better question is whether we will have more and better drug research and development with patents or without them regardless of whether those patents go to “big pharma” or to “small start-up firms.” That is, we shouldn’t really care who the incentive of profitable patent rights is spurring on to do research, and that is true whether such research is on cancer drugs or toe fungus drugs.

If Mr. Carson or his commenter believe that there are better ways to encourage such research, they should by all means argue for them. I, however, know of no better incentive than self-interest and until I am shown fairly compelling evidence to the contrary, I am not inclined to believe that removing the profit motive from drug research is likely to produce a better, more readily available or affordable pharmacopeia.

Now, that said, no one bothering to read this far should leave thinking I’m an apologist for the pharmaceutical companies. Their successful efforts some years back to retroactively extend the life of patent protection (and similar so-called “reforms” in copyright for the entertainment industry) constitute nothing more than massive theft and the politicians who voted for such theft should all be horsewhipped. They all created and / or invented whatever they did when the state of the law provided a certain term of proprietary rights and they should enjoy the benefit of that bargain, but nothing more. If the case could be made for patents or copyrights of longer duration, whether for drugs or novels or whatever, fine. But such revised laws should take effect only prospectively. Retroactive extension deprives the public (you and me) of our rightful future expectations with regard to these properties, future expectations we have been paying for throughout the life of the original patents or copyrights. Moreover (okay, go ahead and get back on your Kantian high-horse for a moment here), fair’s fair and a bargain is a bargain.

I don’t deny that the current state of patent law should be extensively reformed (starting with repealing the patent extensions granted “big pharma” in the recent past). It is also true that, to use Mr. Carson’s phrase, patents “distort the market ... [skewing it] toward where the money is.” But, ignoring the emotive connotations of “distort,” it is true of all property schemes that they provide incentives toward certain sorts of behavior and against others.

Perhaps the current system does encourage gaming of sorts which we want to discourage, instead. Perhaps we permit new patents on new drugs that are too closely similar to previously developed drugs. I say perhaps. In fact, I don’t know whether it does or not. The point, however, is that there are all sorts of ways of changing the existing system short of simply abolishing it.

And replacing it with what? The milk of human kindness as a spur to research or, what I fear is the real intended replacement, more massive government control and funding?

Do you want more invention and innovation or less? Do you want more creative works of art or fewer? Those, I think, are the critical questions in any useful discussion of intellectual property. And at the risk of repeating myself, details aside, I know of no better means of getting more of both than by encouraging self-interest through the creation of private property interests in the fruits of such invention and creativity.

Do you?

Sunday, August 3, 2008

Constant Viewer Ponders The Movie Business

Not so very long ago a movie had to gross $100 million to be considered a bona fide summer blockbuster. Today, however, $200 million is the new $100 million and a movie that grosses a mere tenth of a billion doesn’t even hit the top 400 all-time domestic grossing movies. That’s not adjusting for inflation, by the way. Gone With The Wind grossed a mere $198 million, but, hey, they were 1939 dollars and a dollar bought just a teeny bit more back then. (In round inflation adjusted numbers, GWTW grossed around $1.5 billion.)

The summer of 2008 has had its fair share of blockbusters, in any case, even at the new $200 million threshold: Wall-E, Kung Fu Panda, Hancock, Indiana Jones and the Kingdom of the Crystal Skull, Iron Man and The Dark Knight, the last three having already grossed over $300 million each and several, especially including The Dark Knight, still raking in the box office cash.

The interesting question to Constant Viewer at this point is how far The Dark Knight can go. Obviously, it’s got sprinter’s legs, having beaten Mummy III this weekend and stayed in the #1 slot in its third week out. But, let’s face it, Mummy III is probably the weakest of this summer’s big movies. Still, earning so far just $5 million shy of the $400 million mark, The Dark Knight now ranks 8th all-time in domestic gross, probably marking the first time Warner Brothers has had a film in such rarefied company since Bogart. (Okay, CV just made that up. Basically, however, aside from the Harry Potter franchise, WB hasn’t exactly been a major player for a long, long time. And CV has the handful of Time-Warner shares to prove it, too!)

This isn’t going anywhere, in case you were wondering. CV simply finds the business of show business, the industry part of the film industry, interesting in and of itself. So when a movie like The Dark Knight comes along (and CV actually plunks down the purchase price of a ticket twice for it!) he wonders just how big it might end up being.

One thing’s for sure. The Dark Knight is not going to come anywhere within striking range of, oh, say, Titanic. Here’s a Box Office Mojo page devoted to comparing the two, together with Shrek 2 and Star Wars: The Phantom Menace just for good measure. Notice that Titanic (a) didn’t open all that big, but (b) ended up with a domestic gross of over $600 million. That makes it the biggest PG-13 movie and roughly the fifth or sixth highest (inflation adjusted) grossing movie of any sort, period. Why was it so big?

Because it was a romance men didn’t mind going to see. Or it was an action / disaster movie women didn’t mind going to see. Take your pick. But the next huge, history making movie isn’t likely to involve superheroes or animated characters of any sort and it won’t have to be rated PG or G, either. Somewhere in Hollywood someone is studying Titanic and figuring out that romantic adventure, not romantic comedy, is where the money’s at. At least that's Constant Viewer's best guess. Now, if only he could figure out a cleverly tragic, romantic way for the hero to die in front of his lover in the last act of his screenplay!

Saturday, August 2, 2008

Constant Viewer: The Mummy: Tomb of the Dragon Emperor

The Mummy: Tomb of the Dragon Emperor is not, rest assured, a French movie. In fact, it is in many respects an anti-French movie. It’s dumb and it knows it’s dumb. It may even be a little proud of how dumb it is as it revels in over-the-top action scenes and dazzling special effects. None of its characters have anything like an introspective or existential identity crisis or, for that matter, would know it if they did. There’s never a moment when the viewer has any reason to suspect that the writers or director or cast seriously thought “Oh no! We can’t do that! It would be too preposterous. The audiences will never buy it!” Nope, Mummy III knows it's all about the cheap thrills and delivers them up by the pallet load.

Brendan Fraser is the poor man’s Tom Hanks (assuming Hanks were dumb enough to try his hand as an action hero): eminently likable in large measure precisely because he’s an everyman type and not an action hero type. That he’s made a fairly nice film career playing against that obvious fact only goes to prove, as William Goldman so deftly put it, that in Hollywood nobody knows anything.

Jet Li makes a fine bad guy here and the rest of the cast are likewise as plausible as you’re likely to find in so implausible a movie. It’s all Raiders of the Lost Ark meets Lost Horizon meets every CGI battle scene made in the last ten years meets every zombie movie made in the last 20 years, and if the comedic touches sometimes wander into farce territory at least there’s not a single scene where someone languorously smokes a cigarette wondering what it’s all about.

In passing, you might wonder why on earth Mummy III and so many other movies in the last five or ten years have been centered in or at least had a major scene or two shot in China. There are no Chinese mummies, after all. Are there? Well, whether there are or not, this much is clear. There are a whole hell of a lot more Chinese than Egyptians and nowadays, unlike back in the old Red China days, more and more of them go to the movies or rent or buy DVDs. And here you round-eyed devils thought you were still the target audience!

------

In response to a few comments from CV’s loyal readers about his recent evisceration of French filmmaking, it should be noted that CV’s theory of movie reviews is that it’s just practical emotivism. You find a reviewer whom you discover yells "Boo!" at the same movies you dislike and "Hurray!" at the same movies you like or even vice versa and then you've got a fairly reliable guide to help you pick what to see. Of course, it has to be tarted up a bit, but there's really nothing more to it than that.

There've been several mentions of noir, aka film noir, too, which is of course a French critical invention (film criticism being to movie reviews what prescriptivism is to emotivism). Hollywood just thought it was turning out B-movie gangster stories back then. Then again, Hollywood is almost always oblivious about those rare occasions when it accidentally creates art, too.

The thing about film noir is that it almost entirely contradicts the auteur theory if both are taken seriously. In the first place, these were almost all quintessentially studio movies, not directorial statements of any sort. None of the supposed genre’s directors set out to make a noir movie the way others set out, say, to make a screwball comedy or, for that matter, some socialist or communist writers were in fact trying to promote certain political themes in various post-war movies. (N.B., this isn’t an implicit defense of the notorious Hollywood Blacklist but simply an acknowledgment that some of the writers of that era were, in fact, intentionally polemical.)

These movies were all shot in black and white because, well, duh, just about all cheap movies were shot in black and white in the late 40s and 50s. Their cinematographic technique relied heavily on shadows and skewed camera angles because that was discovered to be a (cheap!) way to build psychological suspense and, frankly, just because it was trendy then in the same way those damned "let's swing the camera around the subject three or four times like an orbiting moon" shots are practically required by law in every movie made today.

Sure, there were a few movies of that era in which the female lead was a conniving vixen leading the poor, gullible protagonist to ruin, but you'd be hard pressed to make that claim about many of the most classic noir movies, e.g., Sunset Boulevard or even The Third Man. Finally, two of the greatest ‘noir’ movies of all time – Blade Runner and Chinatown – fit none of the noir theorists' criteria except the most important one: mood.

The fact is that the film noir genre is a garment that fits few movies of the era very well regardless of how many movies it will more or less badly fit here or there. It is, in the end, a hole that is neither round nor square nor any definite shape at all into which very, very few movie pegs can be fitted easily but just about any drama or movie of suspense can be pounded into with a heavy enough rhetorical hammer. So much for French theory, too.