As the summer season begins, site hits have begun to slack off, indicating either that readers have better things to do during the summer than read blogs or that readers are wising up to me and finding better things to do than read this blog regardless of the season. I'm going with "either" for the time being and will do a bit of sloughing off, myself, in July and August, what with family vacations and whatnot.
______
Retired Header ‘Quotes’:
March 2007 – “Pay no attention to that man behind the veil!” – the Wizard of Rawls
April 2007 – “Whereof one cannot speak, thereof one may still post on the internet.” – Ludwig Blogenstein
May 2007 – “Don't follow leaders, watch the blog hit meters!” – Blog Dylan
Saturday, June 30, 2007
Constant Viewer: Ratatouille
Nothing Constant Viewer says about Ratatouille will make any difference to you, Dear Reader, especially if you have children young enough that you look forward to any G-rated movie you think you might actually be able to suffer through, yourself. Rest assured, however, you won't suffer at all through Ratatouille -- you'll enjoy it as much as your kids will. If you don't have kids, go see it anyway.
We are living in a Golden Age of animation, a fact you'd never discover watching Cartoon Network or any of the current Saturday morning nonsense. The reason, quite simply, is Pixar, which has almost single-handedly raised the bar to where the best animated features today are as beautiful as classic Disney features, as funny as classic Warner Bros. Looney Tunes and as sophisticated as classic Max Fleischer cartoons. While Disney has had tremendous commercial success with such PC dreck as The Lion King and Pocahontas, only DreamWorks has given Pixar a serious run for its money in terms of quality; hence, the House the Mouse Built serves these days chiefly as a distributor for Pixar's creative genius. With serious kudos to writer/director Brad Bird, Ratatouille is no exception. It's a gem.
Ratatouille tells the story of a rat named Remy (Patton Oswalt) who yearns to be a great cook and forms a symbiotic relationship with a young man named Linguini (Lou Romano) in a famous Parisian restaurant that has fallen on hard times since its master chef died. Remy's family, suspicious of humans, the scheming new chef trying to capitalize on the restaurant's fading reputation and a deliciously malevolent food critic (wonderfully voiced by Peter O'Toole) provide all the required plot complications.
But the plot here is truly little more than a vehicle for a moving story about family loyalties, friendships, ambitions and dreams. It's accessible enough for small children and sophisticated enough for adults, and it's carried all the while with great comic timing and a brilliant sense of how animation at its best works where a CGI effect in a live-action movie would look absurd.
Before the feature begins, Lifted, a new Pixar short, adds to the fun. CV's only criticism is that, counting the previews and the short, Ratatouille runs a bit long for the attention span of pre-schoolers, several of whom were fidgeting behind CV for most of the last hour. If Constant Viewer had his way, pre-schoolers wouldn't be taken to the movies at all; but he doesn't, so beware of your neighbors.
A few days ago, Constant Viewer wrote in a review of Live Free or Die Hard that if a better summer movie awaited us, it would be one hell of a summer. Of course, the two movies are apples and oranges; nonetheless, it's official: it's one hell of a summer at the movies.
Classic TV Finales, Palestinian Style
Farfour is dead. In the final episode of his Hamas-affiliated Al Aqsa TV show, the Mickey Mouse knockoff who preached Islamic domination was, wait for it... beaten to death by an Israeli Jew.
This whole "beaten to death by Jews" idea for TV series finales could save Hollywood a whole lot of needless time and trouble. Just think, for example, how much easier it would have been to write the final episode of The Sopranos if, by long standing tradition, audiences understood that Tony and his two families would be beaten to death in the end by Hyman Roth's avenging descendants. Of course, such a tradition would have to have started decades ago, so here's what the final episodes of some old TV shows would have gone like if those Farfour writers had been in charge:
Howdy Doody - Clarabell never spoke a word for 13 years until the final minutes of the last show when Buffalo Bob read a note from the clown. "Why, I can't believe it!" Bob exclaimed. "Clarabell can talk! Is this true?" Clarabell nodded. "Well", Bob said, "Go ahead. Say something!" "JEWS!" the clown screamed as Jewish thugs beat the entire cast, crew and kiddie audience to death.
The Fugitive - Just before being stoned to death for killing his wife (ordinarily just a misdemeanor, but she was the Imam's daughter), Dr. Raji Kimble escapes, only to be pursued for years by the relentless police Lt. Mustafa Gerard. Just as Gerard is about to capture Kimble, the One-Armed Jew is discovered lurking in the shadows. Kimble and Gerard catch him and beat him to death, Kimble's name is cleared and the Imam declares a Great Victory and gives Kimble two more of his daughters as a reward.
The Mary Tyler Moore Show - While Mary and the gang at WJM-TV have one final group hug, the phone rings and they discover that their contracts, drawn up by crafty Jewish lawyers, are air-tight and they can't be fired after all. The evil Jewish station owners, outraged, burst into the newsroom and beat them all, except for Ted Baxter, to death.
M*A*S*H: "Goodbye, Farewell and Allahu Akbar" - After the rest of the 4077th bugs out while Jewish North Koreans sweep through the front lines beating to death everyone they encounter, B.J. takes the still recuperating Hawkeye to a waiting helicopter. Once the helicopter is aloft, Hawkeye opens a note from B.J. that reads "My initials stood for ben Judah, you fool!" Realizing he's been duped by evil Jews, Hawkeye is nonetheless too weak to fight back as the evil Jewish helicopter pilot throws him from the chopper to his death. As he plummets, the last thing he sees is the huge Star of David B.J. has formed from stones on the hillside.
Newhart - Bob gets into an altercation with handymen Larry, Darryl and Darryl and they knock him unconscious with a moose head. When he awakens, he discovers he is in his apartment bedroom lying next to his wife, Rhoda. "What's the matter, Bob?" she asks him, "You've been tossing and turning like a meshugener!" He tells her about his dream and she says, "That's the last time you nosh on pastrami before bed. You kept me up all night, you putz!" She then beats him to death.
Seinfeld - For no reason at all, the entire cast beat each other to death.
This whole "beaten to death by Jews" idea for TV series finales could save Hollywood a whole lot of needless time and trouble. Just think, for example, how much easier it would have been to write the final episode of The Sopranos if, by long standing tradition, audiences understood that Tony and his two families would be beaten to death in the end by Hyman Roth's avenging descendants. Of course, such a tradition would have to have started decades ago, so here's what the final episodes of some old TV shows would have gone like if those Farfour writers had been in charge:
Howdy Doody - Clarabell never spoke a word for 13 years until the final minutes of the last show when Buffalo Bob read a note from the clown. "Why, I can't believe it!" Bob exclaimed. "Clarabell can talk! Is this true?" Clarabell nodded. "Well", Bob said, "Go ahead. Say something!" "JEWS!" the clown screamed as Jewish thugs beat the entire cast, crew and kiddie audience to death.
The Fugitive - Just before being stoned to death for killing his wife (ordinarily just a misdemeanor, but she was the Imam's daughter), Dr. Raji Kimble escapes, only to be pursued for years by the relentless police Lt. Mustafa Gerard. Just as Gerard is about to capture Kimble, the One-Armed Jew is discovered lurking in the shadows. Kimble and Gerard catch him and beat him to death, Kimble's name is cleared and the Imam declares a Great Victory and gives Kimble two more of his daughters as a reward.
The Mary Tyler Moore Show - While Mary and the gang at WJM-TV have one final group hug, the phone rings and they discover that their contracts, drawn up by crafty Jewish lawyers, are air-tight and they can't be fired after all. The evil Jewish station owners, outraged, burst into the newsroom and beat them all, except for Ted Baxter, to death.
M*A*S*H: "Goodbye, Farewell and Allahu Akbar" - After the rest of the 4077th bugs out while Jewish North Koreans sweep through the front lines beating to death everyone they encounter, B.J. takes the still recuperating Hawkeye to a waiting helicopter. Once the helicopter is aloft, Hawkeye opens and reads a note from B.J. that reads "My initials stood for ben Judah, you fool!" Realizing he's been duped by evil Jews, Hawkeye is nonetheless too weak to fight back as the evil Jewish helicopter pilot throws him from the chopper to his death. As he plummets, the last thing he sees is where B.J. formed a huge Star of David from stones on the hillside.
Newhart - Bob gets into an altercation with handymen Larry, Darryl and Darryl and they knock him unconscious with a Moose head. When he awakens, he discovers he is in his apartment bedroom lying next to his wife, Rhoda. "What's the matter, Bob?" she asks him, "You've been tossing and turning like a meshugener!" He tells her about his dream and she says, ""That's the last time you nosh on pastrami before bed. You kept me up all night, you putz!" She then beats him to death.
Seinfeld - For no reason at all, the entire cast beat each other to death.
Labels:
Entertainment,
Foreign Affairs,
Parody,
Television
What the Hell? This Blog is "Rated R"?
With a hat tip to Lance over at A Second Hand Conjecture, it turns out that my little blog thingie here is "Rated R" according to a clever marketing ploy by an online dating site. The site explains, in my case:
This rating was determined based on the presence of the following words:
* hell (6x)
* crack (4x)
* cocaine (3x)
* suicide (2x)
* shoot (1x)
So, what they're doing is crawling or spidering or whatever the hip web term is for it over a site, finding instances of certain words and cranking out a rating.
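Out of idle curiosity, here's roughly what that kind of scheme amounts to -- a minimal sketch in Python, and emphatically not the dating site's actual code. The word list is taken from the widget's own output above; the function name, example URL and rating thresholds are made up for illustration.

```python
# Toy version of the "count naughty words, assign a rating" scheme described
# above. Not the rating site's real code; the cutoffs are pure guesswork.
import re
import urllib.request
from collections import Counter

FLAGGED = {"hell", "crack", "cocaine", "suicide", "shoot"}  # from the widget's word list

def rate_page(url):
    """Fetch a page, count flagged words, and map the total to a rating."""
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8", errors="ignore").lower()
    words = re.findall(r"[a-z']+", text)
    counts = Counter(w for w in words if w in FLAGGED)
    total = sum(counts.values())
    # Hypothetical cutoffs -- the real service doesn't publish its own.
    if total == 0:
        rating = "G"
    elif total < 4:
        rating = "PG"
    elif total < 10:
        rating = "PG-13"
    else:
        rating = "R"
    return rating, counts

# Example (hypothetical URL):
# rating, counts = rate_page("http://example.com/")
# print(rating, counts)
```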
That is, I am sure, a small part of how the MPAA goes about its film rating business, but knowing the film industry (even now that Jack Valenti is dead), I'm betting the MPAA comes after the site for trademark infringement. Why? Here's a picture of a "widget" they offered me:

Them movie industry boys are serious about intellectual property rights, and this dating service site didn't even have the sense to remove the MPAA logo from the picture?
Two other points. First, if Live Free or Die Hard is any indication at all, these guys have set the bar way too low for an R rating and I deserve no worse than a PG-13.
Second, as porn sites found it useful to "voluntarily" use or cooperate with filter services like Net Nanny and so forth, I sadly predict it's only a matter of time before all internet web sites do have some sort of rating category assigned to them. (I say this, by the way, as a father of primary school children who have access to the internet.) Slowly, perhaps, but surely nonetheless, the wild, wild west days of the internet are coming to a close. God forbid, after all, that some sixteen year old should land at this site and read words like "hell" and "shoot."
Friday, June 29, 2007
Could The Rule Of Law Be Making A Comeback?
Strange and perhaps even unprecedented doings at the Supreme Court, vacating an earlier order denying review and requesting new briefs from the parties in two Guantanamo Bay detainee cases. SCOTUSblog provides good commentary, but the fact is that there isn't enough public information to get a good sense of what the Court is up to. After his thoroughgoing defeat over immigration reform, there isn't much left for Bush to lose on at this point beyond his sweeping assertion of war powers. Pelosi may be right -- he may not be worth impeaching. His (and Cheney's) continued administration can do nothing at this point but aid the Democratic Party however much the nation may suffer in the meanwhile.
Scientists Confirm Cats Domesticated Humans
No, not really. Humans, after all, aren't entirely domesticated yet and cats, as it turns out, domesticated themselves. That's the thinking, anyway, as reported today in the Washington Post. The short version here, according to researchers from the National Cancer Institute and the University of Oxford, is that a particular species of wildcat (Felis sylvestris) started populating the outskirts of the Near East's Fertile Crescent when large-scale grain agriculture and grain storage began attracting a large rodent population some 12,000 years ago.
Our old friend natural selection gave an adaptive advantage to the feral wildcats who dared the occasional human contact, as humans recognized the symbiotic advantages of the cats in rodent control. These cats thus became the ancestors of all domestic cats today worldwide. Had wild cats been domesticated by humans as was the case with other animals, it supposedly would have been much more likely that, e.g., China's and Europe's different wildcat species would show up at the trunk of the family tree. Not so, however, leading to the single-species self-domesticating thesis.
[BROKEN LINK - LOLCat Picture]
This explains much about our longstanding relationship with cats. Remaining mysteries include where and why they are getting cameras and computers for all those annoying posters on the internet and why, if they're smart enough to do all that, they haven't learned yet how to use the spell checker.
UPDATE: LOLCat idea thanks to Stevo Darkly. (See comments.)
Thursday, June 28, 2007
Constant Viewer: Live Free or Die Hard
Live Free or Die Hard is preposterously, outrageously over the top and wonderfully, satisfyingly entertaining. Constant Viewer doesn't use phrases like "kickin' it old school," but if he did that's exactly the sort of thing he'd say about this movie. The dialog is crisp, witty and, oh by the way, proof that today's PG-13 would have earned an R rating the last time we saw New York City cop John McClane, which was twelve years ago. The acting and directing are brisk and energetic and the plot and special effects, absurd though they both are, rise appropriately to audiences' ever higher expectations for action films. If a better summer movie awaits us, it's going to be one hell of a summer.
Bruce Willis at 52 teaches many younger action heroes a thing or two about how to do it right in Live Free or Die Hard. This is certainly his strongest performance in the franchise since his first outing as John McClane in 1988 brought him instant film stardom. IMDb reports that Willis was the fifth choice for the main character, offers originally going to Arnold Schwarzenegger, then Sylvester Stallone, then Burt Reynolds, then Richard Gere before Willis finally got it. Had he not gotten the role, chances are pretty good he'd be known today as "the guy who played opposite Cybill Shepherd in Moonlighting."
This time around, McClane gets tasked by the FBI to retrieve computer hacker Matt Farrell (Justin Long), suspected of involvement with a computer terrorist plot that begins to shut down the Eastern United States. When McClane's daughter (Mary Elizabeth Winstead) gets taken hostage by principal bad guy Thomas Gabriel (Timothy Olyphant), things get personal. The plot-required relationship between McClane and Farrell could easily have turned the film into a bad buddy movie, but it doesn't. What's more, Winstead gives a bravura performance in her small but significant role as McClane's estranged daughter, and Kevin Smith fans will enjoy his character's small but important role in helping McClane and Farrell track down Gabriel.
Of course, Live Free or Die Hard is completely formulaic and, sure, you've seen variations on its theme and plot and character interaction dozens of times before. But there's a good reason for that. When action movies work, really work, they're just about the best reason there is for going out to see a movie just for the sheer fun of it. Live Free or Die Hard really works and really delivers. Go, see it on the big screen and take the older kids with you. Let 'em see how it was done "old school" before the good guys all wore spandex.
Indoctrinate Who?
I haven't yet seen Indoctrinate U, a supposedly Michael Moore-esque but libertarian documentary about leftist university speech codes and such, but efforts to find a general distributor have been abetted, however unintentionally, by a bit of nonsense over at the New York Times, nonsense shown for what it is at The Volokh Conspiracy, FIRE, Power Line and by Evan Coyne Maloney, the filmmaker himself.
Documenting intolerance of non-leftist ideas or the expression thereof on American college campuses isn't far removed from documenting racial bias in the Klan, except of course that the Kluxers acknowledge that they're racists. That's hardly to say that all or even most university faculty members oppose free speech (though many apparently do) or that university administrations are generally intolerant of conservative or libertarian perspectives (though many apparently are) or even that the vocal majority of leftist organizations on most campuses (the various demographically aggrieved or special interest whiners) oppose free speech -- oh, wait a minute, yes it does.
True, most university faculty members are liberals or leftists of one sort or another. The good news here, though, is that many if not most students pay little attention to their professors beyond listening either for confirmation of their preexisting political prejudices or evidence of deviant speech that might fuel their self-righteous indignation. The quest for diversity has university administrations pretty much cowed, for there are few fates worse than getting stuck with a reputation of being a hostile environment to women and minorities. Hence, however approvingly some faculty may look on such nonsense, much of the hothouse political correctness of the schools these days is self-inflicted by students, themselves, with an assist from administrators who care far more about attracting the right demographics for their freshman class than whatever the students experience or learn once there.
I have no idea whether Indoctrinate U is a good film or whether it will succeed in finding a wider audience. I can predict, however, that any attempt to show it on college campuses themselves is certain to result in howls of protest from the usual suspects on those campuses and that they will be utterly oblivious to the irony of it all.
Wednesday, June 27, 2007
We Can't Pick Our Relatives, But We Can Pick Our Friends
At City Paper, John Leo examines Harvard political scientist Robert Putnam’s latest research on diversity, which is interesting on several grounds. Putnam, author of Bowling Alone, has apparently discovered (surprise, surprise) that societal heterogeneity, a.k.a. “diversity,” is disruptive and destructive of “the social capital, fabric of associations, trust, and neighborliness that create and sustain communities.” Suffice it to say for the moment that this is not the sort of speaking truth to power the folks in Moscow on the Charles had in mind.
Ordinarily, this is where I’d dish up the usual disclaimer about reading too much from a mere review, itself from a mere summary of these results without examining the data directly. Therein, however, lies the second interesting point here – there’s nothing to look at.
Putnam has long been aware that his findings could have a big effect on the immigration debate. Last October, he told the Financial Times that “he had delayed publishing his research until he could develop proposals to compensate for the negative effects of diversity.” He said it “would have been irresponsible to publish without that,” a quote that should raise eyebrows. Academics aren’t supposed to withhold negative data until they can suggest antidotes to their findings.... Nor has Putnam made details of his study available for examination by peers and the public. So far, he has published only an initial summary of his findings....
Now isn't that interesting?
Race, as between African Americans and Americans of European descent, is a special case for all the obvious reasons, but ethnic strife in general is as American as it gets. Little wonder, since the entire history of America has been the New World recapitulation of the ethnic and tribal warfare of humanity since time immemorial. We like people who look and act like we do, tolerate minor differences reasonably well, major differences not well at all. Whatever their evolutionary adaptive advantage may once have been, such lingering but deep-seated suspicions and animosities are rationally irrelevant in contemporary society; but human beings aren’t merely rational animals, so such irrelevance is, itself, irrelevant.
In the Bad Old Days (and here is where African Americans were, involuntarily, the most glaring exception) the American solution to this was assimilation. You want to be an American? Learn English, if you don’t already speak it, forsake huge chunks of whatever culture you physically left behind and get with the program of baseball, hot dogs, apple pie and Chevrolet. Oh, wait a minute. What’s that smell? Can I have a taste of that? Okay, forget the hot dogs and give me the recipe for that. Hey, someday it could become as American as pizza pie.
Without a doubt, assimilation works better from the dominant culture’s perspective than from the assimilatee’s. Typically, it’s a generational thing. Worse yet, it has tended to work best for the assimilating group only when they’ve been here long enough for the next boatload of huddled masses to hit the shores. Still, last time I checked, all the “No Dogs or Irish” signs had been taken down in Boston.
Given the opportunity, people still self-segregate, and that fact has little to do with whether they are living in an open, welcoming society. Racial and ethnic strife is far lower today than it was when I was a teenager, but check out any public high school in America to see self-segregation in action. Or hit a few churches on Sunday morning.
Groups don’t integrate, the efforts of social planners to shuffle them together like a deck of cards notwithstanding. Individuals do that, one person at a time, over time, voluntarily. Of course, on balance immigration is a net benefit to the United States. It isn’t the mere fact of immigration that endangers the social fabric nearly so much as the cognitive dissonance driven theories of how such immigration should be processed. Leo again:
Though Putnam is wary of what right-wing politicians might do with his findings, the data might give pause to those on the left, and in the center as well. If he’s right, heavy immigration will inflict social deterioration for decades to come, harming immigrants as well as the native-born. Putnam is hopeful that eventually America will forge a new solidarity based on a “new, broader sense of we.” The problem is how to do that in an era of multiculturalism and disdain for assimilation.
Putnam can hope all he wants, for whatever little good it will do him personally. The solution to the problem lies obviously and exactly in a rejection of the delusions of our multicultural, anti-assimilation era.
Monday, June 25, 2007
A Farewell to Claws?
With a hat tip to Reason's Katherine Mangu-Ward, the (U.K.) Daily Mail reports:
People could be prosecuted for being cruel to pet spiders, octopuses and restaurant lobsters under animal welfare plans being considered by the Government....
While it is illegal to mistreat a goldfish, there is nothing to stop people mistreating pet tarantulas or lobsters kept in restaurant aquariums....
While [restaurants] would still be able to boil the crustaceans alive to kill them, they would have to make sure they are kept in clean, warm uncrowded tanks up to that point.
Ah, yes, I can see it now ...
From "A Clean, Well-Heated Tank," by Ernest Lemmingway:
"What did he want to kill himself for?"
"How should I know?"
"How did he do it?"
"He plunged into a pot of boiling water."
"Who pulled him out?"
"The cook."
"Why did they do it?"
"Twenty-three dollars a pound."
Thanks Again to Our "Good Friends and Loyal Allies," the Saudis
How is the fight against Radical Islam going?
All things considered, not badly at all. At least not according to an excellent, must-read article by Fareed Zakaria in Newsweek. Zakaria makes a number of telling points about how far the perceived "global threat" of Islamist extremism is from the reality: the majority of such groups are small, disorganized, dispersed, unconnected and increasingly inwardly focused, and, with few exceptions, the very reason they pursue their objectives with violence is that they have no hope of swaying the larger Islamic world to their fanaticism.
Still, almost in a throw-away paragraph toward the end, Zakaria mentions that "[t]he current issue of Britain's Prospect magazine has a deeply illuminating profile of the main suicide bomber in the 7/7 London subway attacks, Mohammed Siddique Khan..."
... who at first glance appeared to be a well-integrated, middle-class Briton. The author, Shiv Malik, spent months in the Leeds suburb where Khan grew up, talked to his relatives and pieced together his past. Khan was not driven to become a suicide bomber by poverty, racism or the Iraq War. His is the story of a young man who found he could not be part of the traditional Pakistani-immigrant community of his parents. He had no memories of their Pakistani life. He spoke their language, Urdu, poorly. He rejected an arranged marriage in favor of a love match. And yet, he was also out of place in modern British culture. Khan was slowly seduced by the simple, powerful and total world view of Wahhabi Islam, conveniently provided in easy-to-read English pamphlets (doubtless funded with Saudi money). The ideology fulfilled a young man's desire for protest and rebellion and at the same time gave him a powerful sense of identity. By 1999—before the Iraq War, before 9/11—he was ready to be a terrorist.
[Emphasis added.]
If one seeks, perhaps not the root cause, but certainly both the financial and ideological life-line of Islamist terrorism, one need look no further than the devil's bargain between the Wahhabi movement and our "good friends and loyal allies," the House of Saud.
Labels:
Foreign Affairs,
Government,
Politics,
Religion
A Lawyer Who Presses His Own Suit Has A Fool For a Launderer
While Supreme Court decisions are further muddling constitutional law left and right today, the big news on the legal front is that D.C. Judge Roy Pearson's $54 million suit against his dry cleaners has resulted in a decision (1) that he is entitled to absolutely no relief whatever and (2) that he is now liable to pay the defendants' court costs. He may soon face legal action to recover their legal fees and may even lose his $96,000 job as a D.C. administrative law judge, to boot.
Good.
"Wii Admitted We Were Powerless Over Video Games -- That Our Lives Had Become Unmanagable."
Given the choice between, oh, say, saving his life by fleeing a burning building or staying a bit longer to reach the next level in whatever video game he was playing at the time, I'm reasonably sure my younger son would, reluctantly, flee. Lower the stakes, however, and the video game would almost certainly win. So, is he addicted?
At least for now, the American Medical Association is saying no. That's the right call, though it's no guarantee that it will remain the AMA's position or that Video Game Addiction won't find itself next to alcoholism and drug addiction in the next edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM) five years from now.
Whatever one thinks about the science underlying the notion of addiction as a medical disease or disorder, it is unlikely that the supposed Video Game Addiction "sufferer" will be affected greatly one way or the other by the medical experts' decision. What would change is that "[s]uch a move would ease the path for insurance coverage of video game addiction." Follow the money, in other words.
With the usual disclaimer that, among the myriad things I am not, "medical expert" ranks high on the list, here's my take. Issues of actual chemical dependence aside (we'll get back to that later), when you strip the concept of addiction of any sort from its negative connotative baggage, it reduces to little more than the notion that people like doing what they like to do and, as a corollary, when they like doing something very, very, very much it is hard for them to refrain from doing it. So hard, in fact, that in many cases those who have decided they would be better off not doing it so much find it much easier not to do it at all rather than trying to do it in moderation.
I'm not arguing that a life spent shooting heroin or smoking crack cocaine or drinking a quart or more of whiskey a day or gorging on food constantly or betting the rent money or even playing video games for six to eight hours a day is a life well spent. It is unquestionably dysfunctional by any reasonable objective standard. Also, some addictions are more dysfunctional than others, posing serious health and even life threatening risks.
We don't manage our own lives by reasonable objective standards, though. We might think we do, but the best we can do is to view ourselves by reasonable subjective standards, the reasonable part being our adaptive success at learning from our own and others' experience and changing, when necessary, accordingly. I can't say how much satisfaction you get from your preferences, and you can't really know how much satisfaction I get from mine. We can, of course, figure out what each other's preferences are, and maybe even their ordinal ranking, by how we in fact behave; but it doesn't follow that your number one preference, say, smoking crack, isn't so far ahead of number two and the rest that you're not making a rational decision by hanging out at the crack house no matter what I may think of your decision.
That said, many people who come to believe that their lives are being ruined by their alcohol or drug habits or whatever, can and do manage to stop. They probably need help doing so in most cases, but I don't know of a single person who stopped what others deemed an addictive habit solely because those others wanted him to do so. Society -- family, friends, employers, etc. -- can raise the stakes, but that's it. As the old psychology joke goes, the light bulb has to want to change.
Chemical dependency, a reality in some addictions despite what some naysayers claim, isn't really the issue. If it were, alcoholics and drug addicts could quite cheaply and quickly be detoxed and that would be that. It is the psychological dependence that is the far tougher nut to crack. No doubt that, too, is physiological. Perhaps the brain gets hard wired with "memories" of how much fun the alcohol or cocaine or, let's not forget, nicotine is. That's why the patient has to want not only to quit but to keep quit. That, too, is why there is some logic to lumping, say, a gambling addiction into the same psychiatric category as alcoholism or drug addiction. The problem is, the logic leads to precisely the wrong conclusion. What we should be learning from our ever expanding list of "addictions" is that, medical assistance with the chemical dependence aspect of certain such addictions during withdrawal aside, they are no different than the rest, which is to say that they're not really diseases or, in the medical sense, disorders at all.
Which, in turn, is why it isn't ongoing (expensive) medical treatment but recourse to Twelve Step programs and their like that tend to be the most effective method for people who want to quit and keep quit of whatever their addiction may be. My title here is a play on the First Step of such programs: substitute "Wii" with "We" and "Video Games" with "Alcohol" and you have the version of the granddaddy of all such programs, Alcoholics Anonymous. What they claim, at least for themselves, is that coming to believe that they really did have a problem and couldn't fix it by themselves was the first, essential step toward its solution. There may be many other ways for people who want to stop drinking or gambling or playing video games incessantly to do so. But at least in this one celebrated and successful (and, sure, highly criticized and controversial) way we're back to the psychologist and the light bulb; that is, it is the individual, himself, who must want to change; otherwise, all is for naught.
Meanwhile, my son is indulging his "addiction" at the moment playing some sort of video game in the other room and temporarily safe from being clinically labeled even as I type this (indulging in my, um, computer "hobby"). Just as we don't set out a big bowl of candy in the living room and tell our kids to help themselves as much as they want, we limit his access to the video games. He plays other, i.e., non-electronic, games and sports, does his homework (grudgingly) during the school year and so forth. I'm sure he'd rather play video games than do his math homework and, left to his own devices, would do precisely that. But he's a child, our home is ruled by (benign) despots and he doesn't get that choice. Not yet. Libertarianism is for grown-ups.
It would be bad enough in the case of adults that we infantilize them by telling them they can't do what harms only themselves because they have a disease that gives us power over them "for their own good," but it's worse than that. It doesn't even work until, ironically, they want to change anyway. Unlike light bulbs, there's little point in screwing with them until the light has already come on.
Saturday, June 23, 2007
What's In A Name?
The naming of things, especially things like people, is an awesome power. Adam got to name all the animals (Genesis 2:19-20), no doubt to the eternal chagrin of the duck-billed platypus. The Gospel of John, the most philosophically informed of the New Testament books, identifies Christ with the "Word," in Greek, Logos, a term far richer in content than a mere linguistic sign or signifier (and, btw, first put to philosophical use by Heraclitus). Fast forward to the opening part of Wittgenstein's Philosophical Investigations, where he takes issue with Augustine's account of how language is learned as a child by the naming of things, a process Wittgenstein contended could be done as Augustine described only by someone who was already a fairly advanced user of language.
Some of us give inanimate objects names (my first guitar was named Maggie), but most of us beyond a certain age confine such doings to the naming of pets and, of course, of our children. Robin Williams once noted that cocaine is God's way of telling you you have too much money; the baby-naming business that has emerged in recent years is God's way of telling you you have far too much time (and money) on your hands.
Denise McCombie, 37, a California mother of two who's expecting a daughter this fall, spent $475 to have a numerologist test her favorite name, Leah Marie, to see if it had positive associations. (It did.) This March, one nervous mom-to-be from Illinois listed her 16 favorite names on a tournament bracket and asked friends, family and people she met at baby showers to fill it out. The winner: Anna Irene.
Sean and Dawn Mistretta from Charlotte, N.C., tossed around possibilities for five months before they hired a pair of consultants -- baby-name book authors who draw up lists of suggestions for $50. During a 30-minute conference call with Mrs. Mistretta, 34, a lawyer, and Mr. Mistretta, 35, a securities trader, the consultants discussed names based on their phonetic elements, popularity, and ethnic and linguistic origins -- then sent a 15-page list of possibilities. When their daughter was born in April, the Mistrettas settled on one of the consultants' suggestions -- Ava -- but only after taking one final straw poll of doctors and nurses at the hospital. While her family complimented the choice, Mrs. Mistretta says, "they think we're a little neurotic."
Karen Markovics, 36, who works for the planning department in Orange County, N.C., spent months reading baby books and scouring Web sites before settling on Nicole Josephine. But now, four years later, Mrs. Markovics says she wishes she'd chosen something less trendy -- and has even considered legally changing her daughter's name to Josephine Marie. "I'm having namer's remorse," she says.
Namer's remorse, indeed!
Names do, after all, signify quite a bit, if not about us at least about our parents and sometimes about our ancestors, too. Charles and John are very popular given names in the Ridgely family tree, the former at least because of a couple of prominent (read: rich) Charles's down at its colonial trunk and, for all I know, a few more along its English roots as well. Years ago, a Jewish friend was surprised to learn we named our first child after my still living father, apparently considered bad luck by Jewish tradition. Anglos would never think of naming their baby boys Jesus, a very popular choice in Latino cultures. I like to think of our Bible-Belt, Mexican border wall builders laboring to keep Jesus out of America. A white stand-up comic working before a black audience quipped "I wish I was black so I could name my baby any damned thing I wanted!" In fact, or so I am told, considerable thought and effort goes into finding unique and pleasing names for African American children these days. Apparently, they're not the only ones.
Celebrities, of course, are a factor here. Once upon a time it was de rigueur for Hollywood studios to rename their actors. Hence, Marion Morrison became John Wayne, Leonard Slye became Roy Rogers, Frances Ethel Gumm became Judy Garland and Archibald Leach became Cary Grant. (Michael Keaton also had to change his name, there already being a Michael Douglas in the business.)
Now, by contrast, actors keep their given names and indulge themselves with colorful (read: tasteless) names for their many out-of-wedlock offspring. Back in the 1960s and 70s, long before odd celebrity child names became so trendy, I looked forward to the day when Frank Zappa's daughter Moon Unit and Grace Slick's son god tied the knot. Surely god and Moon Unit Zappa-Slick would have been a couple for the millennium. Of course, back then George Foreman was still a heavyweight boxer and foe of Muhammad Ali (né Cassius Clay) and not today's multi-millionaire grill-meister and father to George Foreman, George Foreman, George Foreman, George Foreman, George Foreman and, let us not forget, George Foreman. (There is no truth to the rumor his daughters are all named Georgia, though it was probably on his mind.)
Then, too, there are names to be avoided. Germans still shy away from naming their son's Adolf (I'm not sure the same holds in Argentina), and there are any number of old-fashioned names like Bertha, Myrtle and so forth that parents in hopes of grandchildren would probably not opt to give their daughters. Before the rise of the Governator, Arnold was the sort of name destining its bearer to a childhood of playground beatings. Sure, there was golfing legend Arnold Palmer, but Arnold Stang was the better known Arnold of my childhood.
Names are magical, but only because we believe they are. A primitive tribe might worship the Morning Star but curse the Evening Star, unaware that both are Venus. So, too, the thought that the lives of our children are much affected by the names we give them isn't far removed from the notion that, among their other possible perlocutionary functions, words used as names can bless or curse their bearers. Our fate, of course, lies not in our stars or in our names, but, these lyrics aside, in ourselves.
Some of us give inanimate objects names (my first guitar was named Maggie), but most of us beyond a certain age confine such doings to the naming of pets and, of course, of our children. As Robin Williams once noted that cocaine is God's way of telling you you have too much money, the baby naming business that has emerged in recent years is God's way of telling you you have far too much time (and money) on your hands.
Denise McCombie, 37, a California mother of two who's expecting a daughter this fall, spent $475 to have a numerologist test her favorite name, Leah Marie, to see if it had positive associations. (It did.) This March, one nervous mom-to-be from Illinois listed her 16 favorite names on a tournament bracket and asked friends, family and people she met at baby showers to fill it out. The winner: Anna Irene.
Sean and Dawn Mistretta from Charlotte, N.C., tossed around possibilities for five months before they hired a pair of consultants -- baby-name book authors who draw up lists of suggestions for $50. During a 30-minute conference call with Mrs. Mistretta, 34, a lawyer, and Mr. Mistretta, 35, a securities trader, the consultants discussed names based on their phonetic elements, popularity, and ethnic and linguistic origins -- then sent a 15-page list of possibilities. When their daughter was born in April, the Mistrettas settled on one of the consultants' suggestions -- Ava -- but only after taking one final straw poll of doctors and nurses at the hospital. While her family complimented the choice, Mrs. Mistretta says, "they think we're a little neurotic."
Karen Markovics, 36, who works for the planning department in Orange County, N.C., spent months reading baby books and scouring Web sites before settling on Nicole Josephine. But now, four years later, Mrs. Markovics says she wishes she'd chosen something less trendy -- and has even considered legally changing her daughter's name to Josephine Marie. "I'm having namer's remorse," she says.
Namer's remorse, indeed!
Names do, after all, signify quite a bit, if not about us at least about our parents and sometimes about our ancestors, too. Charles and John are very popular given names in the Ridgely family tree, the former at least because of a couple of prominent (read: rich) Charles's down at its colonial trunk and, for all I know, a few more along its English roots as well. Years ago, a Jewish friend was surprised to learn we named our first child after my still living father, apparently considered bad luck by Jewish tradition. Anglos would never think of naming their baby boys Jesus, a very popular choice in Latino cultures. I like to think of our Bible-Belt, Mexican border wall builders laboring to keep Jesus out of America. A white stand-up comic working before a black audience quipped "I wish I was black so I could name my baby any damned thing I wanted!" In fact, or so I am told, considerable thought and effort goes into finding unique and pleasing names for African American children these days. Apparently, they're not the only ones.
Celebrities, of course, are a factor here. Once upon a time it was de rigueur for Hollywood studios to rename their actors. Hence, Marion Morrison became John Wayne, Leonard Slye became Roy Rogers, Frances Ethel Gumm became Judy Garland and Archibald Leach became Cary Grant. (Michael Keaton also had to change his name, there already being a Michael Douglas in the business.)
Now, by contrast, actors keep their given names and indulge themselves with colorful (read: tasteless) names for their many out-of-wedlock offspring. Back in the 1960s and 70s, long before odd celebrity child names became so trendy, I looked forward to the day when Frank Zappa's daughter Moon Unit and Grace Slick's son god tied the knot. Surely god and Moon Unit Zappa-Slick would have been a couple for the millennium. Of course, back then George Foreman was still a heavyweight boxer and foe of Muhammad Ali (né Cassius Clay) and not today's multi-millionaire grill-meister and father to George Foreman, George Foreman, George Foreman, George Foreman, George Foreman and, let us not forget, George Foreman. (There is no truth to the rumor his daughters are all named Georgia, though it was probably on his mind.)
Then, too, there are names to be avoided. Germans still shy away from naming their sons Adolf (I'm not sure the same holds in Argentina), and there are any number of old-fashioned names like Bertha, Myrtle and so forth that parents in hopes of grandchildren would probably not opt to give their daughters. Before the rise of the Governator, Arnold was the sort of name destining its bearer to a childhood of playground beatings. Sure, there was golfing legend Arnold Palmer, but Arnold Stang was the better known Arnold of my childhood.
Names are magical, but only because we believe they are. A primitive tribe might worship the Morning Star but curse the Evening Star, unaware that both are Venus. So, too, the thought that the lives of our children are much affected by the names we give them isn't far removed from the notion that, among their other possible perlocutionary functions, words used as names can bless or curse their bearers. Our fate, of course, lies not in our stars or in our names, but, these lyrics aside, in ourselves.
Friday, June 22, 2007
"Muslim peer compares Rushdie to 9/ll bombers"
Or so the title of the (U.K.) Telegraph reads. The member of the British House of Lords in question, Lord Ahmed of Rotherham, is quoted as follows:
This honour is given in recognition of services rendered to Great Britain.
Salman Rushdie lives in New York. He is controversial man who has insulted Muslim people, Christians and the British. He does not deserve the honour.
Two weeks ago Tony Blair spoke about constructing bridges with Muslims. What hypocrisy.
What would one say if the Saudi or Afghan governments honoured the martyrs of the September 11 attacks on the United States?
Good question, Nazir, old boy.
But here's an even better question: What does one say in response to a member of the British House of Lords who refers to the 9/11 terrorists as "martyrs"?
Shoes News, Part Deux!
Credit where credit is due: however much of a failure Glenn Greenwald believes the Bush Administration to be, if George W. Bush's wearing a pair of Crocs -- complete with snazzy black anklet socks, no less! -- causes sales of these fashion monstrosities to plummet, I shall doff my hat to him in thanks.
Speaking of which -- presidents, sartorial standards and hats, that is -- John F. Kennedy shall always have a place of honor in fashion hell for the death of men's hats in American society.
(Yes, I know Snopes says this is an urban legend, but they fail to account for the way JFK's continued hatlessness accelerated this terrible fashion trend.)
Review: A Tragic Legacy by Glenn Greenwald
A Tragic Legacy: How A Good vs. Evil Mentality Destroyed the Bush Presidency by Glenn Greenwald. Crown, 320 pp.
There is an oratorical tone to Glenn Greenwald’s A Tragic Legacy, the rhythms and word choices of a trial lawyer making his case to the jury from opening statement to presentation of the evidence to closing statement. The defendant here, George W. Bush, is charged with a failed administration both proximately and primarily caused by his unbending Manichean world view, more about which in a moment.
Greenwald was an appellate attorney before turning author by way of political blogger, and appellate briefs do not admit the rhetorical flourishes of a trial; but it is the rare lawyer of any sort who does not at least fantasize himself in command of the courtroom, mesmerizing the rapt jury. (The real aim of law school, after all, is to turn natural-born anal retentives into oral aggressives and vice versa. Learning the law comes later.) In any case, A Tragic Legacy reads neither like the quiet work of a scholar nor the brisk, adjective-starved prose of a professional journalist but, well, like the work of a lawyer who writes more clearly and interestingly than the average lawyer.
Okay, so there is a bit of damning with faint praise in that last, but for those who enjoy current events / political analysis books Greenwald's contribution is at least as worthy as the vast majority of the rest and better than more than a few I've suffered through in recent years. It needs to be said, however, that such books are not my cup of tea, lest the reader here take my somewhat tepid endorsement as more negative than intended. Disclaimer done, back to the book.
Manichaeism is the belief that the world is a battleground between roughly equal forces of Good and Evil, between which there is no ground for compromise. A third century Persian religion, Manichaeism exerted influences on Christianity that were quickly deemed heretical (Satan may indeed exist but is surely no equal to God in orthodox Christian theology), but it is not doctrinal Manichaeism that Greenwald accuses Bush of so much as a Manichaean mind-set. As far as it goes, it strikes me as a fair charge. The question occurs, however, whether Greenwald required over three hundred pages to make his case or whether, more to the point, the reader requires plowing through same to be convinced that Bush’s simplistic moral absolutism has led to disastrous effect.
Here, stripped of its quasi-theological trappings and with a few liberties of my own taken along the way, is a far shorter version of Greenwald’s thesis: Moral ambiguity and nuance are not George W. Bush’s strong suits. Raised in privilege, Bush has never had to suffer the consequences of his bad decisions nor even to abide, let alone compromise with, those who were disloyal or who merely disagreed with him. His conversion at age 40 to evangelical Christianity fitted him out not so much with a moral as with a moralistic lens, processing the world in black and white, good and evil terms that were and still are largely indifferent to such trivialities as political theory or the rule of law. The enormity of 9/11 gave Bush tremendous political popularity and thus political power but, more importantly, it became his blindingly bright focal point in the battle between good and evil with, of course, the Islamist terrorists on the side of evil and America and himself on the side of good. Public opinion, now that it has turned against him, be damned – Bush sees himself on God’s side and will not waver in fighting the good fight.
Greenwald’s secondary agenda is a critique and criticism of contemporary conservatism and especially of what, I think over-broadly, he includes under the rubric of neoconservatism. Such conservatives (neo- or not) both cheered on and, among Bush’s inner circle of advisors, manipulated his policy and decision making at first. Now, however, the pundits, at least, have increasingly abandoned Bush rather as, to keep the religious metaphor going, Peter repeatedly denied Christ once things got ugly.
There is some truth to this, too, though I think rather less than Greenwald would have us believe. To cite, say, National Review’s Rich Lowry endorsing Bush for a second term as evidence of conservatives' belief in Bush’s conservative bona fides is a bit of a stretch. Political rhetoric is political rhetoric, and Bush was the better choice for conservatives in 2004, notwithstanding his manifold sins and transgressions against conservatism in his first term. The lesser of two evils is still the better choice, and it would simply be naïve to expect advocacy journalists not to engage in, well, advocacy, especially in the midst of an election.
I don't recall there ever being a time when Bush wasn't the target of serious and often scathing criticism especially from economic / small government conservatives, nor will it do to conflate all conservatives of any sort who ever supported the war in Iraq as members in good standing of the neoconservative movement of the past few decades. Moreover, people do, after all, change their minds, the occasional disingenuousness in that fact which Greenwald accurately notes among some right-wing writers aside.
Indeed, one of the weaker points of the book is Greenwald’s heavy reliance on block quotes from various conservative pundits, both those who have continued to support Bush publicly (whatever they may believe in private) and those who have changed their public views, to make his case. There is a damned if you do, damned if you don’t quality about Greenwald’s take here, not to mention the very short shrift paid to the genuine differences and ongoing arguments inside what might broadly be called the American Right for far longer than the Bush years.
Though he takes some trouble at the outset to distinguish general conservative theory and principles from the actual policies of self-identified conservative office holders, Greenwald takes too little account of the differences between, say, Burkean or social conservatives and Hayekian or economic conservatives, nor do his occasional and arguably gratuitous swipes at Ronald Reagan’s administration take adequate account of the political realities precluding Reagan from dismantling more of the Great Society he inherited.
Speaking of which, Greenwald’s concluding comparison of Bush to Lyndon Johnson is insightful and, up to a point, quite apt. Johnson’s administration will forever be judged through the prism of the Viet Nam war which, unlike Bush, Johnson did not instigate but did significantly escalate. On the domestic front, his economic policies were doomed to failure because they were bad economics, but Johnson also did what no Kennedy could or Nixon would ever have done. This unlikeliest of civil rights champions pushed passage of civil rights legislation through force of will and a political ruthlessness and singlemindedness that would have made Richard Nixon, let alone George W. Bush, blush. That is Johnson’s real and lasting legacy. So what, then, is Bush's?
Greenwald concludes that Bush’s legacy will forever be not only his failed war in Iraq (and perhaps, worse yet, in Iran) with all the damage to constitutional law and America’s standing in the world it has wrought but also his failure, because of his Manichean obsession with terrorism, to accomplish anything on the domestic front beyond the unintended and tattered remains of the conservative movement in America.
Perhaps. Surely, much damage has been done to the American republic in these past six and a half years. As for Bush’s legacy in terms of his historical standing among other presidents, however, who cares?
For the Judeo-Christian theists among us, there is also a recurring theme in the Old Testament of God’s wrath being visited upon his errant people over and over again, nevertheless always sparing a righteous remnant for a new beginning. Theology aside and using that metaphor in purely political terms, especially for those of us who have always opposed the prospect of American Empire, it is at least worth suggesting that America is far better off now than it would have been had Bush’s holy war met with greater success. The Lord or, if you will, the Zeitgeist works in mysterious ways, after all.
Labels: Blogs, Foreign Affairs, Government, Journalism, Politics
Thursday, June 21, 2007
We Interrupt Rush Limbaugh for This Important Message...
It should come as little surprise that, the likes of Air America having failed so miserably, the "progressive" left is abandoning (yet again) the notion that it can compete in that metaphorical “marketplace of ideas” better known as talk radio. Their solution?
...any effort to encourage more responsive and balanced radio programming will first require steps to increase localism and diversify radio station ownership to better meet local and community needs. We suggest three ways to accomplish this:
-- Restore local and national caps on the ownership of commercial radio stations.
-- Ensure greater local accountability over radio licensing.
-- Require commercial owners who fail to abide by enforceable public interest obligations to pay a fee to support public broadcasting.
So, in a nutshell, the perspective here is that because leftist talk fails to “compete” in one particular news and information medium regardless of how predominant it may be in all the rest, the public interest justifies greater regulation to require its inclusion, the public's actual preferences be damned.
On a personal note, I’m not a big fan of most of the programming from the big radio chains (or, for that matter, conservative talk radio) any more than I am of the boringly identical mall chain stores and restaurants one finds everywhere in America these days. I search out Mom & Pop restaurants when I am traveling and I miss the (often cheesy) programming of independent radio and television stations of my youth. But whether my preferences are optimally served by changes in these markets is irrelevant to whether this poses some sort of public interest crisis requiring action.
But what I find most amusing about these recommendations is the third. Thank you, Center for American Progress, for frankly, albeit indirectly, acknowledging that “public broadcasting” is an essentially left wing enterprise.
Wednesday, June 20, 2007
Fascinating, If True: "The World In America"
I can't vouch for its accuracy (can D.C. really have the same GDP as New Zealand? What the hell does it produce?), but Andrew Sullivan has linked to a map of the U.S. showing how the fifty states and D.C. compare in Gross Domestic Product (GDP) to various nations of the world. (You may find linking from Sullivan's blog to the larger map version helpful. I did.) As with most such things, there is probably both more and less information there than "meets the eye," but it is fascinating, nonetheless.
"Hey Sayyid, Grab Me a Brewski While You're Up!"
Here, in an Oniony nutshell (or a nutty onionskin -- I blog, you decide) is the real long term outcome, regardless of what any future administration does short of surrender, of our "war" against Islamist terrorism.
Tuesday, June 19, 2007
So, Why Would God's Design Include Malaria In The First Place?
University of Chicago biologist Jerry Coyne reviews Lehigh University biochemist Michael J. Behe's latest attempted defense of Intelligent Design, The Edge of Evolution: The Search for the Limits of Darwinism. To no one's surprise, he is unimpressed.
Science, Sanity and the Law
If you have never entertained, however fleetingly, the prospect of killing your children, you're probably not spending enough time with them. Fortunately for the species, few of us ever act on such feelings. So few, in fact, that the rare parent like Andrea Yates, who in 2001 killed her four small children, strikes us immediately as monstrous or insane or both.
Reason's Brian Doherty posts a very fine article today about our struggles as a society with the notions of sanity, responsibility, free will and the law. The legal so-called insanity defense continues to fascinate us precisely because it touches so many deep mysteries about life, typically arising under the most gruesome and horrifying of situations. "Insanity" is a term long ago abandoned by the psychiatric profession, but the relationship between what is, at bottom, a legal defense justified on moral grounds and what purports to be increasing scientific evidence against the notion of free will of any sort continues to lie at the heart of the issues raised.
It is a basic tenet of ethics that ought implies can; that is, that holding someone blameworthy (or praiseworthy) for an act can be meaningful and justifiable only if that person could have done other than he did, in fact, do. Logically, it must also hold that "ought not" implies that the person could have refrained from doing whatever was done. Those who deny the existence of volitional or intentional human agency (i.e., free will) but contend that society must nonetheless indulge in the useful fiction of contending otherwise and holding people 'responsible' for their 'acts' are not, I think, all that far removed from those who hold that belief in God is necessary for there to be any moral order. Of course, by their own theory they are incapable of holding contrary views, so perhaps we can forgive them this conceptually muddled attempt to have their determinist cake and freely eat it, too.
In fairness, one can make a case for the notion that in society at large 'pretending' that criminals could refrain from committing their crimes so as to justify 'punishing' them may well have a general deterrent effect. That is to say, 'punishing' certain acts raises the known consequences of their commission, and people in general respond to such incentives and disincentives, whether freely or not.
But at the fringes of "people in general" lie those whose minds are so deranged (or, if you will, whose brains are so dysfunctional) that the notion of general deterrence breaks down completely. These are, ironically, the people who are the most likely candidates for the insanity defense. Put simply, punishing the truly psychotic is unlikely to have any effect on the behavior of other truly psychotic persons. Indeed, it is almost definitional that such persons do not respond to the world as you and I do.
If we are only play-acting at a belief in free will in our criminal justice system as it deals with ordinary people, then surely we must be indulging in a play within a play when we go through the motions of a criminal trial with such persons, grappling with questions such as the (in)famous M'Naghten test whether "...at the time of the committing of the act, the party accused was laboring under such a defect of reason, arising from a disease of the mind, as not to know the nature and quality of the act he was doing, or, if he did know it, that he did not know what he was doing was wrong."
Society must, of course, remove or restrain those who, for whatever causes or reasons, pose a deadly threat. But what possible difference can knowing what one is doing or knowing it is deemed wrong by others make if one cannot act otherwise anyway?
[EDIT: The first posted version read "scientific evidence of free will" and should have read and now does read "scientific evidence against the notion of free will."]
Sunday, June 17, 2007
My Baby Just Wrote Me A Letter ... Now Where The Hell Is It?
Like the government itself, the U.S. Postal Service rarely misses an opportunity to be both arrogant and inefficient. Recently, as the Washington Post reports, the Post Office (it will always be the Post Office to me) has taken to “delivering mail to communal cluster boxes as a way to keep pace with booming residential growth while controlling labor costs” or, more simply, to keep providing less and less service for more and more money.

"Instead of going from door to door, from lawn to lawn, from driveway to driveway, we have a central location," said Luvenia Hyson, a postal service regional spokeswoman.

But see, Luvenia, that’s the problem. People don’t want a central location, they want their mail delivered right there to their homes, and not, by the way, by having the mailman – he or she will always be a mailman to me – traipsing all over our lawns, dammit! Sure, multiple residence locations have central mail delivery; but, um, you know, that’s one of the down sides of living in an apartment or condominium complex, not one of its friggin’ advantages. (Where do they get these spokesmen – they’ll always be spokesmen to me – from Microsoft, fergawdsakes?)
First class mail, the only sort we still care about at all, is less and less convenient and more and more expensive all the time. Perhaps it simply isn’t possible to run a mail delivery service the size and scope of the postal system efficiently or economically or well enough to satisfy customers. Possible or not, though, it sure as hell isn't going to get done by a government protected monopoly.
But first class mail is also less and less important all the time. There really aren’t any good arguments left why the postal system shouldn’t be entirely privatized and opened to competition. If the wiring infrastructure of telephone service and electricity isn’t a sufficient reason to preclude competition, there sure as hell is no reason why mailboxes should be.
Okay, so this isn’t exactly up there with Iraq, health care or immigration, but the candidate who was willing to say let’s get rid of the postal monopoly once and for all would certainly pick up more than a few votes from those suburbanites who have to go fetch their mail down the block.
Saturday, June 16, 2007
Where's Balko's Pulitzer?
I'm pretty sure there isn't a Pulitzer Prize category for investigative blogging and, for that matter, that real journalist Radley Balko (and I say "real journalist" in the nicest possible way here) probably hasn't hit the radar yet of whoever decides those things. But if Balko's continued coverage at The Agitator and Reason of the Cory Maye travesty and the many other now almost weekly occurrences of some jackbooted police SWAT thugs busting into the wrong address and maiming, killing or otherwise brutalizing entirely innocent people isn't worthy of a Pulitzer, I'll be damned if I can think of what is.
Constant Viewer: Fantastic Four: Rise of the Silver Surfer
If you examine the IMDb entry for Fantastic Four: Rise of the Silver Surfer, you will find it has already been nominated for an award; namely, the highly coveted MTV Movie Award for "Best Summer Movie You Haven't Seen Yet." In that preemptive spirit, Constant Viewer hereby nominates Silver Surfer for the even more highly coveted "Worst Movie You Haven't And, If There's A God, Won't See This Summer or Any Other Time Ever" award. It's only June and the nominations haven't closed yet, but only another Pokemon, Pauly Shore or M. Night Shyamalan movie could possibly come from behind to win at this point.
Comic book movies, by which Constant Viewer means movies based on popular comic books and not Hollywood's usual superficial fare, are all the rage these days. As a former comic book and especially Marvel Comics reader, Constant Viewer is generally pleased by this trend. The Spider-Man franchise, even if the third entry was too long and too complicated, has been a worthy version of the original comic books, as have some of the Batman and Superman films. When they work, they're great popcorn movies and, at least in the case of the first Batman film, sometimes great movies, period. When they don't work, however, as in the case of the Incredible Hulk and now both Fantastic Four films, they really stink up the screen.
The first Fantastic Four movie raked in tons of cash, at least in part because the original comic book heroes were once the flagship product of Marvel in its golden age (which would be the Silver Age of comics, go figure!) and thus bought a great deal of nostalgic good will. But the more realistic, highly flawed personalities of and fights between Johnny Storm and Ben Grimm that so set them apart from the cardboard characters of DC Comics' offerings in the 1960s, however groundbreaking, were really pretty much all they ever had going for them. Let's face it, "Mr. Fantastic" is not only a super-dorky name, his super power is equally dorky and Reed Richards, himself, is the ultimate dorky scientist. Stan Lee may be the Shakespeare of comic books, but the Fantastic Four is now and always has been on a par with Timon of Athens. As with the comic books, themselves, the villains are far more interesting than the heroes here.
Source material problems aside, Silver Surfer is just cheesy in every possible "Made for TV" sort of way. The plot stinks, the directing and editing stinks, the acting stinks, the dialog really stinks and the special effects are so-so at best. Ioan Gruffudd, who really is fantastic in his Horatio Hornblower outings, somehow manages to make Reed Richards an even bigger nerd in Silver Surfer than Stan Lee managed in the comics, and that is not a good thing. Michael Chiklis captures none of Ben Grimm's angst as the Thing and Julian McMahon's reanimated Victor Von Doom (one of the greatest Marvel villains of all time) is just wasted footage. In a particularly cheesy scene, Reed Richards confronts the bullying General Hager, whose subsequent fate is one of the very few satisfying moments in the film. This is supposed to be one of those hero defining, mano a mano moments you've seen hundreds of times in the movies, but here it's simply wince-inducing and made Constant Viewer want to bitch-slap Mr. Fantastic personally.
Another reviewer allowed as how Silver Surfer might be considered a good movie by an eight year old, but surely that is an insult to eight year olds everywhere unless he meant "made by an eight year old." Be a superhero yourself and please, oh please, don't go see this movie -- help save the world from any more of this cinematic bilge.
Friday, June 15, 2007
Be Happy! Pay Taxes!
New "research" indicates (?) we feel better when we pay taxes! Giving to charity makes us even happier! (That much I believe.)
Well, now. This little tidbit made the memeorandum aggregator, so I took a peek. Sooooo.... let's take a look at the experiment here: Nineteen (count 'em, 19!) all-female university students at the University of Oregon? $100 at stake? Yeah, that sounds to me like a tightly controlled experiment, a representative sample and statistically significant results.
That the researchers did detect some correlation between MRI readings and revealed preferences, I don't doubt. Conclusions beyond that?
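For what it's worth, here is a minimal back-of-the-envelope sketch (mine, not the researchers') of why a sample of 19 warrants such skepticism: a plain normal-approximation confidence interval for a proportion measured on 19 subjects is enormous. The numbers below are illustrative assumptions, not figures reported in the study.

import math

# Back-of-the-envelope only: width of a 95% confidence interval for a
# proportion estimated from n = 19 subjects, using the plain
# normal-approximation formula. The p = 0.5 "worst case" is an assumption
# chosen for illustration, not a figure from the Oregon study.
n = 19
p = 0.5
margin = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"95% margin of error with n = {n}: about +/- {margin:.0%}")
# prints roughly +/- 22 percentage points -- hardly a precise estimate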
Geez!
Thursday, June 14, 2007
Checkout Time for Hilton?
A while back I posted a brief comment (amazingly) agreeing with Al Sharpton that it certainly appeared Paris Hilton was receiving preferential treatment by her early release, later rescinded by her sentencing judge. Now the Los Angeles Times reports she may, in fact, be serving more time in jail than some 80% of similar cases. If so, I was wrong (hey, it wasn't the first and won't be the last time!) and she should be dealt with like the others. Having said that, her being among the top (or bottom) 20% of the curve in this case doesn't strike me as especially egregious and, in any case, we're talking a few weeks out of the life of someone who doesn't exactly have a job on the line or a family to feed. Thus, I don't see her remaining sentence as per se unjust. (I assume she's right around the 20% mark, otherwise the report would have been over 85% or 90% or whatever.)
I don't give a rodent's hindquarters about Paris Hilton, but the notion of making an example of her because of her wealth or celebrity is obscene. Arguments about the justice of the laws in question aside, she should be treated no better and no worse than the average person convicted of her offenses. That's what equal justice under law is all about; no more and no less.
"Cowering, ineffectual ninnies"?
Hat tip to Arts & Letters Daily: Matt Taibbi weighs in at Adbusters with a funny, scathing and yet sympathetic look at contemporary American liberalism. Here's the lede paragraph:
The biggest problem with modern American liberalism may be the word itself. There’s just something about the word, liberal, something about the way it sounds – it just hits the ear wrong. If it were an animal it would be something squirming and hairless, something that burrows maybe, with no eyes and too many legs. No child would bring home a wounded liberal and ask to keep it as a pet. More likely he would step on it, or maybe tie it to a bottle-rocket and shoot it over the railroad tracks.
Perhaps its biggest, but certainly not its only problem according to Taibbi. Well worth the read.
Wednesday, June 13, 2007
Sexist Pigs Abandon Katie
Poor Katie Couric is, according to CBS chief executive Leslie Moonves, the victim of sexism.
Ms Couric has managed a 2 per cent increase in women age 18 to 49 since her September debut. However, that has been more than offset by an 11 per cent decline among men over 55, who still constitute the bulk of the evening news’ audience.
Couric's ratings on the CBS Evening News this month hit a 20 year low, pretty amazing when you consider she took over from Dan Rather.
As I have written before, here's the real deal. Every year there are new "men over 55" who used to be men under 55. They haven't been watching network news for years and nothing is going to change that. Years ago, I used to quip that the most frightening sentence in the English language was "More people get their news from ABC News than from any other source." But whether it's from ABC, NBC or CBS, more and more people of all ages are getting whatever news they do get from the many alternatives now available. Why wait for the local weatherman when you can click on the weather any time you want (or click on the Weather Channel on TV)? It doesn't matter who's selling your product if it's a product no longer in high demand because better alternatives exist, and that has nothing to do with Couric's gender or even her modest journalistic skills.
Tuesday, June 12, 2007
Attack of the Flying Fish!
Hectic days recently, so little blogging. (Plus cyber-ineptitude led to deletion of two recent posts!) [sigh...] Still, breaking news of this magnitude must take precedence over all else: A woman was brutally attacked in Florida by a leaping sturgeon!
Sunday, June 10, 2007
"... And in the end you're completely alone with it all. "
I wrote earlier this year about The Sopranos as its final season began, so I might as well bookend the season with a few valedictory remarks. First, no matter what happens to Tony, whatever happened to the ducks? Second, can we all agree that, forgetting all the goombas that came and went, the sexiest thing week after week was Dr. Melfi's legs? Finally, given that this is, after all, show business we're talking about, does anybody this side of those who really believe the Wachowski Bros. had three Matrix movies in mind all along think there won't be a Sopranos movie?
Could Hell Really Be Freezing Over?
My embedded Weather Watcher in Hell, Frosty the Snowman, reports a cooling trend. Radley Balko takes a break from his increasingly “Dog Bites Man” reports of police SWAT Team brutality, incompetence and unaccountability to report on Reason’s "Hit & Run" that Ron Paul’s campaign has experienced a surge of contributions amounting to between three and four million dollars and is closing in on $5 million. (The lengthy comments section of Balko’s post includes various snipes from former Paul staffer Eric Dondero, whose “Great Man” views of a future libertarian America strongly suggest that the great man he has in mind to lead us to the Promised Land has a surname ending in a sounded vowel.)
In the spirit of full disclosure, I’m toying with the idea of sending Paul a few bucks, too. Mark it up to rational irrationality or expressive voting. ("Geez, Ridgely, more Caplan plugs?") I’d still say Paul has the same chances of winning the Republican nomination that Barack Obama has of being the next Imperial Wizard of the KKK. Even so, between the Dean Factor, i.e., the yet unmeasured political power of the internet, and the historical precedent of Barry Goldwater, who knows?
Richard Rorty, R.I.P.
Richard Rorty, one of the preeminent American philosophers of the 20th century, died on June 8 at the age of 75. Obituaries can be found here and here.
Rorty was among the faculty at Princeton with Donald Davidson and others when it was the unquestionably reigning philosophy department in the nation and was a principal figure in the Anglo-American analytic tradition, especially as infused and influenced by American pragmatism. Rorty’s work and later career moved to what both critics and admirers might have called a “post-philosophical” perspective, a view influenced by Wittgenstein and others that there was, if you will, less there than meets the eye in philosophy as traditionally understood and practiced in the academy. However his philosophical views may have shifted over time, he remained committed to a progressive political perspective which nonetheless at least had the salubrious advantage of finding serious fault with the likes of Foucault.
Much as I like to criticize contemporary academic philosophy, to some extent for the same reasons Rorty found the field confused and wanting, I remain convinced that philosophers have shaped human society and even human thought, itself, more than anyone else throughout history. There are no emperors or generals whose influence compares to the influence, for better or worse, of Plato or Aristotle; and in more modern times I continue to find, however bastardized, misunderstood or unacknowledged, the works of Kant, Wittgenstein and a very few others continually influencing our “original” thinkers in virtually every other field of thought. Many contemporary scientists are scornful of philosophy, but philosophy gave birth to science and, my criticisms aside, it is far easier to find a philosophically naïve scientist than a scientifically naïve philosopher.
Few people outside philosophy will ever have heard of Richard Rorty. I didn’t know him but I did meet him once during his days at the University of Virginia. He struck me as someone who had followed Socrates’ admonition that the unexamined life is not worth living and was more than happy with the career and the life that advice led him to pursue. We should all be that fortunate.
The Myth of the Rational Voter: Why Democracies Choose Bad Policies, by Bryan Caplan
Insanity is doing the same thing over and over expecting different results. -- attrib. Albert Einstein
I am fond of writing, even if you are not fond of reading it again, that I know just enough about law, philosophy and economics to be dangerous, mostly to myself. As it happens, I blogged twice about Bryan Caplan’s new book The Myth of the Rational Voter, acknowledging on both occasions that I hadn’t read the book but was responding only to comments written about it.
Of course, that not only doesn’t pass the minimal standards for scholarship, it doesn’t even pass the minimal standards for journalism. My plea in mitigation is that I am neither a scholar nor a journalist, a fact I make abundantly clear all the time by what and how I write here without needing to confess the fact as well. Still, having wasted so much virtual ink on the subject already (admittedly, a sunk cost), I’m happy to report that I have now, oddly enough, read the book.
This would not usually have been the case so quickly because I do not review books professionally and must therefore either buy them or wait for a library copy. I like buying books, but I am cheap and thus usually wait until they have been remaindered, which means I'm always behind the power curve on the cocktail party chat circuit. In this case, however, I received a free copy courtesy of my older son who, serendipitously, just attended a seminar in which Mr. Caplan was one of the speakers. Indeed, he even inscribed the book to me, which was kind of him but which raises a problem. He wrote in it that I am “a rational man in an irrational society.”
This would be puzzling if I took it as more than a gracious sort of inscription to a stranger. Aside from his lack of evidence about my rationality – we’ve never met and I seriously doubt he reads this blog (we’ll get to society’s irrationality in a minute) – there is the definitional problem of rationality, itself, and it is a problem that underlies Caplan’s argument. Rationality, as economists understand it, is tied very closely to their concept of efficiency, the layman’s version of which is getting the maximum bang for your buck. Caplan also imports the notion that truth seeking, perhaps even apart from its usual connection to efficiency, is an integral component of rationality as well. No doubt, under many conditions these are good operational criteria of rationality. But if the bang you are buying with your buck or your vote is your own subjective sense of well being, not only can ignorance be bliss, so can false beliefs. There is, at least, certainly nothing illogical about such a possibility. Indeed, not only the possibility but the fairly high likelihood that I am, by his criteria, at least somewhat irrational is at the heart of Caplan’s contention.
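To put a toy number on that thought (the sketch below is my own illustration in Python, not anything from Caplan's book, and every figure in it is invented), treat beliefs themselves as goods in the utility function: a comforting error pays a psychic dividend, while its material cost only bites on the rare occasion the belief actually governs an outcome.

```python
# Toy sketch, not Caplan's formal model: beliefs as goods in the utility
# function. All numbers below are invented purely for illustration.

def expected_utility(psychic_payoff, material_cost, prob_belief_bites):
    """Expected utility of holding a belief: the comfort it provides minus
    the expected material cost of ever having to act on it."""
    return psychic_payoff - prob_belief_bites * material_cost

# A comforting but objectively false belief about, say, trade policy.
cozy_error = expected_utility(psychic_payoff=10.0, material_cost=500.0,
                              prob_belief_bites=0.001)

# The dreary but objectively correct belief.
dull_truth = expected_utility(psychic_payoff=0.0, material_cost=0.0,
                              prob_belief_bites=0.001)

print(cozy_error, dull_truth)  # 9.5 0.0 -- the cozy error wins
# When the odds that a belief ever costs its holder anything are tiny,
# ignorance, and even outright error, really can be bliss.
```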
I move now to the jargon-laden version: Caplan contends that the collective effect of individuals’ systematic biases resulting from preferences over beliefs, when combined with an understanding of democracy not as a market but as a commons, undermines the Public Choice concept of rational ignorance as a means of understanding the workings of democracy. If I tried to explain all these terms, I would fail, so you should read the book instead. (And since you're not cheap, you should buy a copy now.) Still, I'll take a shot at a concept or two.
The rational ignorance hypothesis is easy enough for even dumb guys like me to understand – acquiring knowledge is time consuming and, at least in that sense, expensive, therefore people tend not to acquire more knowledge than they believe they need. People know, despite idiotic “Your Vote Counts” propaganda campaigns, that even if their votes get counted those votes never really count in the sense of altering the results. Thus, they tend to vote, if at all, largely out of ignorance and, as a result, essentially randomly. If so, this is either good or bad, depending on whether you believe the Miracle of Aggregation (the ignorant masses cancel each other’s votes out while the very few people who know beans from bacon make the real choice) or the Gucci Gulch theory that self-serving special interests step into the void and effectively buy the policies they want.
But Caplan argues that it isn’t rational ignorance but an ironically ‘rational’ sort of irrationality that is the key to understanding why, for example, voters continue to prefer protectionist policies despite the fact that such policies are not in their or the nation’s best interests. Let's call it the "We can't all be that dumb so we must be crazy" thesis. If democracy is really a commons and not a market, then as with most commons situations there is negligible cost to the individual voter resulting from casting his vote as a method of indulging his preferred beliefs over his desire to opt for objectively rational (that is, efficient) economic policies. Moreover, if voters’ motives are altruistic (or, and here’s your vocabulary building word for today, sociotropic), they have that much less motive to correct their (objectively) mistaken beliefs about the relative efficacy of one economic policy versus another.
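The difference between the two stories is easy to see in a crude Monte Carlo sketch (again mine, not Caplan's; the five percent informed share and the five-point tilt are made-up numbers): if the uninformed err randomly, the informed sliver decides the election; give the uninformed even a small shared bias and that sliver is swamped.

```python
import random

def election(n_voters, informed_share, bias, trials=200, seed=0):
    """Fraction of trial elections won by the 'good' policy (option 1).
    Informed voters always pick option 1; each uninformed voter picks it
    with probability 0.5 - bias, so bias > 0 is a shared tilt toward the
    'bad' option."""
    rng = random.Random(seed)
    informed = int(informed_share * n_voters)
    uninformed = n_voters - informed
    wins = 0
    for _ in range(trials):
        votes_for_good = informed + sum(
            rng.random() < (0.5 - bias) for _ in range(uninformed)
        )
        wins += votes_for_good > n_voters / 2
    return wins / trials

# Miracle of Aggregation: random errors cancel and the informed 5% decide.
print(election(n_voters=10001, informed_share=0.05, bias=0.0))    # ~1.0

# Caplan's worry: a shared five-point tilt swamps the informed minority.
print(election(n_voters=10001, informed_share=0.05, bias=0.05))   # ~0.0
```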
Caplan contends that because irrationality and not the prevailing Public Choice hypothesis of rational ignorance better explains these voting patterns and habits, we have “neither well functioning democracies nor democracies hijacked by special interests [but] democracies that fall short because voters get the foolish policies they ask for.” More succinctly, he invokes Mencken’s famous observation that “Democracy is the theory that the common people know what they want, and deserve to get it good and hard.”
As Caplan notes, “recognizing irrationality is typically equated with rejecting economics.” Even so, if it is true that people have preferences over beliefs and that, as a result, they deem irrationality as a good among other goods, then they will in some cases prefer irrationality; that is, they will demand a certain amount of irrationality depending on its price. What he perhaps fails to stress sufficiently is that this is important only in the prescribed sense of rationality economists employ. In fairness, however, he does address the issue:
Many escape my conclusions by redefining the word rational. If silly beliefs make you feel better, maybe the stickler for objectivity is the real fool. But this is why the term rational irrationality is apt. Beliefs that are irrational from the standpoint of truth-seeking are rational from the standpoint of individual utility maximization. More importantly – whatever words you prefer – a world where voters are happily foolish is unlike one where they are calmly logical.
I think this is a critical acknowledgment. Whether it is rational under Caplan’s or most economists’ definition of the term or not, it has been my experience that more people would prefer to be happily foolish than calmly logical.
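Caplan's pricing language invites a back-of-the-envelope comparison, again my own illustration with invented figures rather than anything from the book: the "price" of indulging a false belief is roughly your personal stake times the probability that your own choice settles the matter, and that price differs wildly between the voting booth and, say, your own brokerage account.

```python
# Back-of-the-envelope 'price of irrationality' in two settings.
# My illustration with invented numbers, not figures from Caplan's book.

def price_of_irrationality(personal_stake, prob_choice_decides):
    """Expected material cost of indulging a false belief in a given setting."""
    return personal_stake * prob_choice_decides

# At the ballot box: a sizable stake in, say, trade policy, but a minuscule
# chance that one vote decides the outcome.
ballot = price_of_irrationality(personal_stake=5000, prob_choice_decides=1e-7)

# Managing one's own money: the same false belief costs its holder with certainty.
portfolio = price_of_irrationality(personal_stake=5000, prob_choice_decides=1.0)

print(f"biased vote:       ${ballot:.4f}")      # $0.0005
print(f"biased investment: ${portfolio:.2f}")   # $5000.00
# With the price effectively zero at the polls, plenty of irrationality gets
# 'bought' there; where the buyer foots the whole bill, demand falls off sharply.
```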
Efficiency, while a dandy instrumental good, is neither an intrinsic nor an exclusive good. (I am using the term good here as a normative term and not as an economic term of art.) That is, there is nothing per se irrational (in the sense of being internally contradictory, as opposed to merely being less than optimal in terms of efficiency) about a normative outlook that holds efficiency to be only one of several sometimes competing values. More broadly, I’d simply note that consequentialism is not the only ethical game in town. Say what you will of Kant, you’d be hard pressed to call him irrational.
Caplan makes a few other claims I find questionable; for example, that the Self-Interested Voter Hypothesis is false. It may really be false, but I found his confidence regarding that claim unconvincing and his examples capable of alternative explanations that would still support the basic (and, I think, generally useful) intuition that people, including people acting as voters, act in their own perceived self-interest. However, these sorts of disputes can run a high risk of having the very concept of self-interest turned into mere tautology and, in any case, the point may not be critical to his case. He also invokes his epistemically privileged understanding of his own preferences in evidence against the revealed preference thesis, a sort of introspectionist attack on the nearly universal operational behaviorism of social science. I think there's a good point lurking there, but it needs more development.
Finally, there is the question of who the intended audience is for this book, and I’m guessing it isn’t the likes of me. As mentioned before (and, once again, made abundantly clear), I am not an economist; but I found the claim that democracy is a commons and not a market, once made, both insightful and uncontroversial to the point of being obvious. That is, it is obvious in that "Oh, of course!" way someone else's hard work seems simple once the work is done for you. But Caplan seems to think that both economists and political scientists will find the claim highly controversial, so what do I know? Further, perhaps again because I’m not an insider and simply don’t get it, it isn’t clear to me what practical difference there would be for practicing economists between the rational ignorance hypothesis, tweaked a bit here and there, and Caplan’s alternative account. Then again, I’ve always found the “rational” part of phrases like “rational self-interest,” “rational maximizer,” etc. a bit suspect. On balance, though, this book is an extended theoretical argument aimed at, or at least pitched to, Caplan's fellow theoreticians.
Theory aside, human society orders itself in all sorts of ways. There is, if you will, a market for markets and a market for democracy. In many cases markets and democracy can and perhaps should compete to determine which works best for people. Caplan contends that “democracy overemphasizes citizens’ psychological payoffs at the expense of their material standard of living.” But assuming there are to be tradeoffs at all, what is the right balance?
Generally speaking, I prefer markets to government and so, obviously, does Caplan. However, in my own case my preferences go at least as much to what I believe would be the psychological benefits of the greater freedom of markets as they do to my belief that I would be better off materially, though I happen to think that would usually be the case as well. But the fact is that I probably would rather be happy than right, just as I probably would rather be happier and less well off materially than vice versa.
Indeed, any other choices would strike me as irrational.
Saturday, June 9, 2007
"It's Fun to Eulogize The People You Despise"
Herewith, a bit of nostalgia courtesy of Tom Lehrer's National Brotherhood Week, the relevant verse of which appears below.
One of the byproducts of growing older is the occasionally tedious chore of having to explain dated cultural references to one's children. Hours could be wasted explaining everyone mentioned in Paul Simon's A Simple Desultory Philippic or Billy Joel's We Didn't Start The Fire.
Be that as it may, former Dallas County, Alabama Sheriff Jim Clark died recently and was, if not exactly eulogized, remembered by former civil rights activist and current Georgia Congressman John Lewis.
Clark became the unwitting negative stereotype of the Southern police officer for his generation and an even more unwitting agent of change. The national media attention that followed his (anticipated) overreaction to civil rights demonstrators in what became known as Bloody Sunday actually helped the civil rights movement convince Congress to pass the Voting Rights Act of 1965. Years later, after losing office, Clark was convicted of conspiracy to import marijuana and served nine months in prison. By all accounts, Clark remained unrepentant unto death.
Sheriff Clark was never anything more than a footnote to history. Today, however, one still occasionally hears breathtakingly absurd claims that America is as racist as it was fifty years ago, a claim that cannot possibly be made in good faith by anyone who remembers what the America of fifty years ago was really like. In the story of our collective journey toward a more perfect union, Clark deserves no eulogy, but he needs to be remembered.
Oh, the white folks hate the black folks,
And the black folks hate the white folks;
To hate all but the right folks
Is an old established rule.
But during National Brotherhood Week,
National Brotherhood Week,
Lena Horne and Sheriff Clark are dancing cheek to cheek.
It's fun to eulogize
The people you despise
As long as you don't let 'em in your school.
Friday, June 8, 2007
Constant Viewer: Ocean's Twenty-One
HOLLYWOOD, June 2026 – Principal photography is scheduled to begin this week for Ocean's Twenty-One (working title, O-21: Bingo!), the tenth sequel to the glossy remake of the original glossy Sinatra Rat Pack action / adventure / comedy / romance / paid vacation for middle-aged actors and singers. Once again George Clooney’s Danny Ocean gathers up the usual suspects: Brad Pitt, Elliott Gould, Don Cheadle, Bernie Mac, Casey Affleck, Scott Caan; newcomers Orlando Bloom, Hugh Jackman, Tobey Maguire, Owen Wilson, John Travolta, Johnny Depp; the starting lineup of the Los Angeles Lakers, a CGI performance from the late Prof. Irwin Corey and a special cameo appearance by CBS News anchor Paris Hilton.
This time around also features the return of Matt Damon, missing from the last six sequels since his election ten years ago to the U.S. Senate. Sen. Damon (D-Mass) is ironically reprising his role as "Good" Will Hunting for O-21, now an MIT professor whose mathematical formula to beat the odds at BINGO becomes the film’s McGuffin when Danny and the boys try to clean out every Sunday afternoon BINGO game in Branson, MO in time to make the Early-Bird Special at Denny's. An unnamed aging actress and a pliant ingenue or two round out the cast.
Asked about his new casting choices, Clooney explained from his Palladian villa, Palazzo dei Sequali, “Hey, we’d have gotten Stallone and Willis back if they weren’t busy shooting Rocky Dies Hard IV. But you know, it really doesn’t matter. We plan on milking this cash cow one way or the other until they finally pay us to stop making the damned things."
Clooney surprised television audiences early last year in his first ever return to “E.R.,” then in its thirty-second season, earning an Emmy nomination for his portrayal of “the bleeding guy on the gurney.”
* * * * *
P.S. -- Constant Viewer has seen Ocean's Thirteen and can think of at least thirteen good reasons why you shouldn't, all of which bear portraits of George Washington.
And The Standard Here Would Be... ?
Okay, so by now everyone knows there is a new Creation Museum in Kentucky. Well, it seems they have various videos portraying the Creationist perspective and these videos use actors. Apparently, Eric Linden, the actor who portrays Adam (of "and Eve" fame) in one 40-second spot, has been accused of "participation in projects that don't align with ... biblical standards" and the accusations are now being, um, investigated by Creation Museum personnel.
I have two questions:
(1) If by "biblical standards" is meant anything like how the overwhelming majority of persons described in the Bible (okay, Jesus excepted) actually behaved, how high can those standards be?
And,
(2) Investigate? Since when did these folks care about evidence?
Land of Lincoln by Andrew Ferguson: A Semi-Wised-Up Quasi-Review
Not even Mona could begrudge Andrew Ferguson his gig at The Weekly Standard. Together with Matt Labash, Ferguson keeps me returning to the Neoconservative Magazine of Record despite myself. Them boys can write.
Ferguson recently published Land of Lincoln: Adventures in Abe’s America, a book I heartily recommend despite my general antipathy toward both biography and history. In fact, Land of Lincoln is not so much either history or biography, strictly speaking, as it is the story of Ferguson’s own coming to grips with the Lincoln of both his and our imagination as he road-trips his way from Richmond, Virginia, where a proposed statue of Abe in the Capital of the Confederacy became ‘surprisingly’ controversial, through Springfield, Illinois (home of the Museum of Funeral Customs!) and Gettysburg, Pennsylvania (where you can take the Orphan Tour!), then finally to the National Mall and the foot of the Lincoln Memorial.
I came away from the book with a richer sense of what I brought to it; namely, the notion that Lincoln was a complex and contradictory figure both personally and publicly and that our equally complex and contradictory views of him are as much a mirror, albeit of the funhouse variety, of ourselves as they are of the most important man in American history.
That’s quite a claim, “most important man in American history.” Surely, one could argue that Washington, whom historian James Thomas Flexner called “The Indispensable Man,” has a shot at the claim. Maybe a few others do, too. But the Civil War is as much the defining event of the republic that followed it as the Revolutionary War was the defining event of the republic that preceded it. It has been repeated countless times that prior to the Civil War the standard phrase was “the United States are” while afterward it became “the United States is,” but what we are today as a people and what the United States is today as a nation began not at Lexington and Concord but at Fort Sumter.
Be that as it may, as enigmatic and contradictory and ultimately unknowable as Lincoln undoubtedly was, all these things made him, after all, only human. Philosophies of history bore me almost as much as, I am somewhat ashamed to admit, history itself; so don’t expect any “do great men make history or does history make great men” musings from this quarter. Moreover, the logic of counterfactual conditionals, of “If X (where X is false), then Y,” permits any Y, any not logically impossible conclusion at all. But while reading Land of Lincoln, I remembered the first “alternate history” I ever read, one of those “If the South won the Civil War” novels aiming for the history buff / science fiction fan crossover market. I didn’t much care for it and never tried another alternate history novel.
In a sense, though, we are all authors of alternate histories, fitting or forcing together whatever we think we know about the past through the filters of what we want to believe about it. Lincoln towers over our imagination even as his Memorial statue towers over the tourist who cannot help but feel a frisson of awe at its sight or the engraved words of the Gettysburg Address.
I am as “wised up,” to use Ferguson’s phrase, as the next cynic and as critical of Lincoln as any Southerner or libertarian opponent of strong government can be. But arguing the right and wrong of history, like playing “what if Lincoln had failed,” is, in the pejorative sense, a merely academic pursuit. The United States is what it is today in no small measure because Lincoln did not fail. The important question is, as always, where do we go from here?