Ask me: "how do you keep cynicism at bay?"

>> Sunday, February 28, 2010

Janiece writes:

One of the things I admire about you is that as a public defender, you are constantly exposed to the dregs of society. Yet, you seem to maintain a level of distance from that aspect of your profession, presumably by being dedicated to the process rather than to your clients. I strongly suspect the turn-over at a public defender's office is quite high, and yet you seem dedicated to your chosen field after many years.

I don't think you're a saint, so how do you keep cynicism at bay?


There's actually a lot to unpack there, but perhaps we should start with the very last question: the short answer is that I don't.

I don't know if the public defender office I work at is typical in this respect, but I suspect it is: there's a lot of MASH-unit humor or sensibility in a PD's office, or at least the sort of thing that's imagined to be MASH-unit sensibilities after decades of various incarnations of M*A*S*H at the theatre and on TV (and all those reruns!). You laugh and maintain a certain degree of laid-back-ness in the face of things that, if you didn't laugh and try to stay chilly, you'd just lose your fucking mind, if you even had one to start with (seeing as how it may have been crazy to go into that area of law to start with). Actually, I doubt it's even unique to PD offices--I certainly see it mirrored in the local DA's office. You get used to the fact that people will come back, no matter how much they swear to do good, and that today's victim will be tomorrow's defendant and vice-versa, and that all sorts of terrible things happen in the world almost as a matter of routine. And you put up a bit of a shell and make off-color jokes about awful things in the inner sanctums of the office, and you gird your loins for another day because that's what you do and not because you're sure it matters in the least. And yeah, you get kind of cynical about all of it, and I do mean all of it--not just the clients, but about the whole process and everybody in it, including yourself, other lawyers, cops, judges, probation officers, social workers, and anybody else you might think of.

That's not to say you lose all hope or optimism. Just as was the case when Pandora opened up the Gods' Big Box O' Fear And Evil, Hope is the last thing in the box, or you'd just quit, and a lot of people do (although my office has been pretty stable for a number of years). But "hope" and "cynicism" aren't necessarily exclusive or even opposites. You can certainly hope that a client will stay in rehab or hang onto a job or whatever--while realistically (or cynically) expecting that, no, this won't happen.

Which brings up something else, I guess, which is that cynicism is actually easier and more comfy, you know? It's the clients that you really get hopeful for without the leavening of cynicism who break your heart. But you get over that and move on or you don't and you go into something nice, careerwise.

The other part of your question is sort of based on a wrong-but-understandable presumption: "you are constantly exposed to the dregs of society." I'm not sure that's actually true, or true the way I suspect you mean. I alluded to part of that a moment ago, writing, "today's victim will be tomorrow's defendant...." The fact is that the number of clients I've represented in twelve years who really were, in my opinion, scum of the Earth with no redeeming features whatsoever, I can probably count on one hand; actually, I can think of only one right off, and am just sort of guessing at there being maybe two or three more, a literal handful at best. Meanwhile, the poor innocents that Justice needs to protect are very rarely that at all, and it's not the least bit unusual for the DA to be prosecuting somebody tomorrow who was a chief witness for the State a few weeks back.

A lot of the people I represent never really had a chance to know any better; they never had the familial or social support or education in school to even know what their better choices might be; a lot of them are repeating the mistakes their parents made. A large number are addicted to drugs and/or alcohol, some of them because of bad choices and some of them because they're mentally ill and tried self-medication instead of seeing an actual doctor (something that may have been financially unavailable or, believe-it-or-not, simply not a part of their life skills set--they know that doctors exist but don't realize that one might be able to address their particular problems, or going isn't the logical or obvious thing in their family or community). Increasingly, in a bad economy, you see the "Jean Valjean" scenario--I've represented any number of people arrested for stealing diapers, baby food, children's clothing, and other fairly basic necessities, just as Victor Hugo's character was imprisoned for stealing bread for his sister's children; another common enough crime is to steal something and then attempt to return it to the store it was stolen from for a refund or to pawn it--usually there's a rent check or utility bill that was the impetus. None of these people are "dregs" in any meaningful sense.

On the other hand, I will also say I've had the misfortune of dealing with crooked cops, bitter and vengeful victims who appear to be motivated by nothing but spite, outright liars (North Carolina, incidentally, allows citizens to personally swear out misdemeanor charges before a magistrate, which frequently results in people using "grudge warrants" to exact revenge or inconvenience an enemy), and elitists in business or my own profession who are lacking in any sort of empathy for otherwise decent people who may have made one single awful decision--so there are certainly dregs of the human race I've faced who weren't my clients. Sometimes you deal with that unpleasantness by taking on the mantle of being an even bigger asshole than they are--I confess I've sometimes had a grim satisfaction in making somebody's life unpleasant, particularly when I finished by getting an acquittal out of it so that the bastard paid and got nothing in return. Seriously, it makes you feel kind of like Batman or something, the grim avenger sticking up for the little guy and causing a little righteous pain to the iniquitous.

Call that, if you will, not being a saint but being a sinner on the side of the angels.

The last thing I'd say is that process plays surprisingly little role in day-to-day lawyering with what I do. It's not absent, but there is a sense in which process is a tool in your arsenal and not always an end in itself. "When the facts are against you," an old saying goes, "argue the law, and when the law's against you, argue the facts." Sometimes I am process oriented because I have a tendency to be a rules lawyer because I'm an asshole and that's one of the ways my assholeness manifests (it's probably an insecurity thing, actually, but that's beside the point). E.g. I'm also one of those people who is slightly mortified whenever somebody wants to play Monopoly with a kitty. (It's just wrong, people--fines and fees go to the bank, says so right in the rules, and "Free Parking" is explicitly a space on which nothing happens. Those influxes of random cash are why your games take forever and you think they're boring.) (I'm trying to be less anal-retentive as I grow old; the jury remains out as to my success or lack thereof.)

Thank you for asking. Does that answer the question(s)?


The Wolfman

>> Saturday, February 27, 2010

So I didn't expect a whole helluva lot from The Wolfman, a remake of the 1941 George Waggner film, given the mediocre-to-lousy reviews it's been getting, but I have to confess that I really didn't expect such a complete and utter failure. I'd even go so far as to say that Wolfman is a candidate for worst-ever-werewolf-film, except I haven't seen the assorted sequels to The Howling--it's possible the one with the marsupial werewolves is worse, I can't say.

I don't really want to belabor the point too much. It's just not a good movie, and I have no idea whether that's the fault of director Joe Johnston or studio meddling that resulted in numerous and infamous reshoots that delayed the film's release. Certainly, Wolfman suffers from a fairly amusing visual schizophrenia, with slow scenes that seem to be intended to create psychological tension and atmosphere alternating with an over-reliance on CGI and sped-up effects to convey to slower audience members that time has passed (in one particularly ridiculous scene, Benicio Del Toro or his stand-in crosses a bridge in silhouette while the moon zips past him like a passing freight train--I suppose his character must be walking very slowly). One frankly suspects it's not really Mr. Johnston's fault at all, that somebody at the studio wanted a punchier, faster, more exciting film with cheap jolts and a big finish while Mr. Johnston tried to make a slower, moodier picture (this seems especially likely considering that at least one reshoot apparently was to add a climactic fight scene to the film that, rather than coming across as exciting or intense, made me giggle).

There's also a possibility the fault lies somewhere in Andrew Kevin Walker's and David Self's screenplay, sort of based on the original screenplay by Curt Siodmak, though yet again there are hints that they wrote a different movie from the one that the studio ultimately released. Hard to say--if the climax was a late reshoot, then one of the primary and least-coherent plot threads in the movie may have been a late addition, and the reason it's utterly absurd may be the result of sticking in three, maybe four scenes that seem just sort of glommed into the movie in an attempt to add a dramatic conflict that raises all sorts of plot questions. Then again, some of the additions to and alterations from Mr. Siodmak's original screenplay just don't add anything much or even offer a silly distraction; in particular, while Hugo Weaving is typically marvelous as Chief Inspector Frederick Abberline, the insertion of the historical investigator of the Jack The Ripper killings into a rural village many miles outside his jurisdiction is just silly; I don't even mean wrong or unrealistic, though it is--I can overlook a great deal of movie fantasy police work simply as part of suspension of disbelief--I mean silly, in that one just sits there wondering what this historic person is doing in this movie; it would've made as much sense to have poor Larry Talbot chased by Sherlock Holmes instead, or (if we're going for real historic persons) to have him treated by Sigmund Freud, or to throw in a guest cameo by a twelve-year-old Albert Einstein, on vacation in England and readily available to explain how the moon's gravitational MacGuffining drives a lycanthrope to transform unless he eats those prototypical rolled-corn "flakes" that Harvey Kellogg is hoping to patent in a few years (hey, who says you can't have product placement in a movie set in the 19th Century!).

Speaking of which, the cast really is mostly fine, even Anthony Hopkins, whose performances these days frequently would seem more at home on rye with melted swiss and mustard. Benicio Del Toro brings an interesting mix, actually, of Lon Chaney, Jr.'s pathos and Oliver Reed's swagger--does that make him the best man-Wolf-man ever? I dunno, but he's good. Emily Blunt is agreeable and certainly easy on the eyes, and I already mentioned that Hugo Weaving is great even if his character really has no business showing up. And, while Christina Contes' role as Solana Talbot is minuscule, credit to the filmmakers for having her at all, since her presence avoids the awful miscasting of the 1941 version, in which the idea that the tall, brooding Mr. Chaney, Jr. and tiny, fey Claude Rains are even in the same family tree (much less son-and-father) is actually one of the hardest things to swallow in the whole movie as far as I'm concerned (the exotic beauty Ms. Contes, on the other hand, bridges any gap between Sir Anthony and Del Toro; clearly the boy took after his mother).

A movie like The Wolfman lives and dies by its special effects (indeed, the original The Wolf-Man is more notable for its makeup design and early transformation effects than much of anything else). Sadly, the effects in The Wolfman are largely abysmal beyond some predictably nice makeup work by Rick Baker, who could probably have done this movie in his sleep by now. Aside from that, however, we have quite a lot of unconvincing CGI--I've seen better in videogames I've played, and not even recent ones. Most laughable is probably a patently-unbelievable bear, which unconvincingly stands there until it rears up on its hind legs, growls, and goes back to all fours; unless there is some kind of moratorium on using bears in movies, I can't imagine why they needed a (very badly rendered) digital bear, and, if there is a moratorium on bears, why they couldn't rewrite the movie to use a lion, or tiger, or dog, or perhaps even a midget on a leash (seriously, that's how bad the bear is). I joked on Twitter that they maybe should've used Nicolas Cage in a bear suit: it certainly would have been just as convincing, and perhaps he could have decked the wolfman... or a gypsy... or maybe even have just run around hitting everybody, that certainly would have been a better movie and worth the matinée ticket.

On a completely different subject: I will be answering questions, probably starting tomorrow. Keep on asking in the "Ask me..." thread, if you'd like!


Ask me, ask me, ask me....

>> Friday, February 26, 2010

It seems that this has been a slow week for the old el bloggerino. Not sure why. Of course, I could do yet another '80s music entry, but I don't want to drive it into the ground. And I could post a music vid, but I kind of did that yesterday.

A few things cropped up, but nothing that I felt was worth doing a full-blown blog entry on, and if you follow me on Twitter, you probably know about half of it.

So here's what we're a-gonna do for today's entry: sometime in the ancient murks of Standing On The Shoulders Of Giant Midgets, archived in a place I can't find because I can't remember what I called it closely enough to do a meaningful Google search for it, I opened up the floor to questions from the gallery and tried to answer some or most or all of them--I can't remember that, either. But regulars seemed to enjoy the chance to ask me questions, and it provided good blog fodder. So, why not see if any of you have any other questions you might want to ask yours truly? And if none of you do, don't fret, something blogworthy will happen somewhere in the universe eventually. But if you do have something, post a comment here and I'll try to come up with something appropriate to say sometime next week.

Any takers?


"Everything's gonna be alright..."

>> Thursday, February 25, 2010

And now, because I haven't put anything up today and don't really have anything, and because it was running through my head for some strange reason while I was making dinner, one of my favorite Bob Marley And The Wailers tracks, "No Woman, No Cry."

Two quick things about this. First, more evidence of the Internet's awesomeness: in looking for the song on YouTube, I was fortunate enough to find this clip, which is an extraordinary live version filmed in Boston in 1979. Even if you're--I shudder to write this--not a Marley fan, it's worth a play in spite of some sound glitches that may be a recording artifact or mic trouble; it's hard to say. Second, and I did not know this: when I was checking Wikipedia just to be anal about whether there was a comma in "No Woman, No Cry," I stumbled across this bit of trivia:

Though Bob Marley may have written the song, or may have written the melody, songwriter credits were given to "V. Ford". Vincent Ford was a friend of Marley's who ran a soup kitchen in Trenchtown, the ghetto of Kingston, Jamaica where Marley grew up. The royalty checks received by Ford ensured the survival and continual running of his soup kitchen.

Helluva guy, that Marley.

And now, without further ado (or with a further adieu, heh), Marley And The Wailers:


Son of '80s music didn't suck

>> Wednesday, February 24, 2010

I meant, I really meant to take a break from this "nascent feature." But (a) I feel uninspired and (b) I could list a few hundred records, probably. I hope I'm not beating something to death--unfortunately, those Mount Vernon people seem to have gone home and aren't doing anything funny I heard about. So, anyway, what the Hell--here's five more awesome albums released 1980-1989.

Peter Gabriel, Peter Gabriel (a.k.a. Security) (1982): This album is one of the most intense musical experiences I can think of--on his fourth eponymous/first titled solo album, and first record for Geffen, Gabriel's chosen theme seemed to be human contact or connectedness--which perhaps doesn't seem like much of a theme, except this was a man who spent much of his term as frontman for Genesis wearing various disguises and costumes, and who didn't appear on the covers of his own albums undistorted or unmangled until 1986; on the cover of the 1982 album Gabriel appears on a phosphor-speckled television monitor, swaddled like a mummy or a burn victim.

The disturbing image is only a hint. The opening track, "The Rhythm Of The Heat," a song about losing one's Western identity based on a dream Carl Jung once related, begins with a tribal pulsing and ends with a fusillade of footfalls performed by an African dance troupe. The next track, "San Jacinto," reverses the meme--a Native American losing his cultural identity to Westernization ends the song moaning "Hold the line" in a way that, thanks to Gabriel's tortured delivery, could be defiant or pleading. A political prisoner is told he's not isolated on "Wallflower," but it's not clear whether this is a truth or a dream. And then there's the album single: "Shock The Monkey," a Motown soul track by way of epileptic seizure.

Some albums, like this one, are meant to be played in total darkness.

The Police, Ghost In The Machine (1981): This was, actually, the first Police album I owned, but I think it would be my favorite even if it wasn't.

At some basic level, I have to confess I sort of kind of almost want to hate The Police. Part of that is that Sting has turned into... Sting (what the fuck, man? what the fucking fuck? dude.), although that's more than offset by Stewart Copeland being, maybe, my all-time-favorite percussionist. The man can lay down a rhythm. The thing about The Police was that there was a certain level of contrivance in the whole project--basically, these guys were jazz musicians who decided punk was more salable but somehow ended up playing this super-white reggae half the time instead. The problem with hating them for this is that, fucking hell, they were and are unbelievably talented, beyond talented.

It's possible that part of why I love Ghost is that it's maybe the first Police album where they dropped the contrivance of being "punk" or "sorta reggae" and just played like the unbelievably-talented sons-of-bitches they were, performing something that was too rock and too jazz to really be "fusion." Well, maybe that's not quite true, either--"Every Little Thing She Does Is Magic" has sort of that paleface reggae thing going, but it's such a perfect piece of pop music that calling it "contrived" seems horribly unfair.

It's also the band's darkest record, which works too. The synth-drenched "Invisible Sun" paints the Irish Troubles as a trap between darkness and desperation, an impossible choice between being shot at or being a shooter. And "Rehumanize Yourself" is probably the most giddily scathing lyric Sting's ever penned, and in the noble service of taking the piss out of ultra-right-wing shitheads ("Billy's joined the National Front / He always was a little runt / Got his hands in the air with the other cunts / You gotta humanize yourself.") Hell, even the self-pitying "pity the poor rockstar" track, Copeland's "Darkness," works somehow.

Guns N' Roses, Appetite For Destruction (1987): They could be boorish and vile, but goddamn they were so much fucking fun there's something wrong with you if you can't get into the spirit of a little misogyny or suicidal fixations on boozing, whoring and getting high. The Guns were what the Stones would've been if they'd been born about twenty years later, may the God Of Sex, Drugs And Rock'N'Roll love 'em.

Nothing they ever did after quite compared to the miracle of this first album. Use Your Illusion I & II (1991) could be slammed together into one really great disc if you left off all the shoulda-been-b-sides and, sure, Lies (1988) has "Patience" and "Used To Love Her" (and I don't care what anybody says--that song is funny) but is otherwise pretty inessential; Appetite remains the one and only record where the Guns got into a groove and stuck to it. Part of the reason, of course, is that this was the one record where G'n'R's secret songwriting weapon, Izzy Stradlin, is fully utilized and unleashed (the only track Stradlin didn't write or cowrite is "It's So Easy," and guess what--yeah, it's alright, but it's the one you could hit the "skip" button and not be missing a helluva lot).

Eleven perfect maximum rock'n'roll tracks (and "Easy"), one of which is "Sweet Child O' Mine," and how many rock ballads get better than that one, folks? I could've and maybe should've included this in the first five instead of this batch--I mean, this is to the '80s what Revolver is to the '60s or Exile On Main Street is to the '70s, the kind of album that sort of has to be on your shelf if you're a rock fan or said shelf will never, ever, ever be complete.

Camper Van Beethoven, Our Beloved Revolutionary Sweetheart (1988): So, here's the thing about CVB: the album I should probably put on this list is Telephone Free Landslide Victory (1985), which includes seminal tracks like "The Day That Lassie Went to the Moon" and "Take The Skinheads Bowling," which are so wonderfully literal and abstract at the same time (e.g. you should take skinheads bowling because their heads are just like bowling balls, which is something you either instantly grokked or my "explanation" really didn't help explain why the song is so sublime--notice how, either way, my "explanation" is completely superfluous, like a third nipple, especially if it happened to be on your forehead). In the '80s, if you knew a pretty eclectic record store you could maybe get TFLV if they had a copy in with the bootlegs and local labels; these days your best bet is to see if the Cigarettes & Carrot Juice boxed set is still in print (it's one of the greatest compilations ever, consisting of all of CVB's pre-Sweetheart albums and EPs, along with a really awesome live album, Greatest Hits Played Faster).

That's not to say that Our Beloved Revolutionary Sweetheart, the first really easily available CVB record, and probably their most accessible effort, isn't perfect. Because it is. Perfect and strange, with those Roger McGuinn-ish guitars, with Jonathan Segel's sweet violin set against David Lowery's nasal snarkiness as it should be when all is right in the universe. "She Divines Water" is gorgeous and possibly incomprehensible, "Tania" (the cut that gives the album its title) may be the catchiest song ever written about a brainwashed bankrobbing terrorist, and "Life Is Grand" just goes to prove that even a hardened, bitter cynic can write a cheerful, upbeat, optimistic song solely for the purpose of fucking with people. Also, telling somebody they "look like Grace Slick" has never seemed as appallingly insulting as it does in "Turquoise Jewelry" and probably never will again. Sorry, Ms. Slick.

Cowboy Junkies, The Trinity Session (1988): Remember that old Volkswagen ad where people are driving around in a VW convertible to a party listening to Nick Drake, and then they get to the party and it's a beautiful night and Pink Moon is a beautiful album, so they don't even get out of the car, they just pull out of the parking area and keep driving? Yeah, no offense to the late Mr. Drake, but the album was supposed to be The Trinity Session. Also, we weren't in a convertible, it was an old VW microbus.

Trinity is as much about its ambiance as anything, which isn't a slam; the Junkies recorded the album with a single mike in an old church (hence the album title), and never has so much open mike noise sounded better--the low hiss and natural reverb and sounds up in the church rafters give Trinity a special, hard-to-describe natural presence that most engineers try to get rid of on most records. It's a presence that's well-suited to Margo Timmins' husky vocal deliveries of songs written by the likes of Hank Williams ("I'm So Lonesome I Could Cry") and Lou Reed ("Sweet Jane") or made famous by performers like Patsy Cline (it's an enormous compliment to Timmins that the Junkies' take on "Walking After Midnight" holds its own against Cline's version). It's also worth noting, and high praise too, that the band's original numbers, such as "To Love Is To Bury" (mostly written by Margo Timmins and her brother, Michael, the band's guitarist), fall into place nicely with the classic covers.

You could listen to Trinity Session anywhere, but may I suggest: driving through the country or perhaps in the mountains on a clear spring or summer night, windows down (or top, if you're in a convertible); there are ghosts on roads in the South, you know, and ghosts on this record.


'80s music didn't suck (part II)

>> Tuesday, February 23, 2010

Prince, Purple Rain (1984): There are a few records that maybe everybody heard too many times that could make a list like this. Springsteen's Born In The USA (1984) and Dire Straits' Brothers In Arms (1985) both qualify (especially the latter), but they almost wore out their welcome (especially the former). So there's a temptation to leave the Purple One out of this, or to go with Around The World In A Day (1985) or the underrated Parade (1986). (A string of albums, by the way, that almost makes the case in and of itself that the '80s didn't suck.)

But why deny the sublime? Purple Rain was an awesome record top to bottom, the peak of an era when Prince hadn't become the punchline to a joke nobody got and could put together a flawless record. What's the weak link on this album? "Baby I'm A Star," with its relentless groove? "Computer Blue"? Hey, no song that begins with Wendy and Lisa sounding like they're working a phone sex line is a bad song. As for the rest of the record, there's no "good" songs, there's only great stuff--"Let's Go Crazy," "When Doves Cry," and the title track, which I'd throw out there as an easy candidate for a list of the ten best guitar solos of all time.

Talking Heads, Speaking In Tongues (1983): Speaking of strings of perfect albums: the Heads' '80s studio output consisted of the brutalistic but beautiful Remain In Light (1980), the trippy Little Creatures (1985), the soundtrack from David Byrne's can't-explain-why-I-like-it-but-I-do directing debut True Stories (1986), and the for-the-sake-of-completeness-let's-mention Naked (1988). And, of course, this one. Only one misfire in the batch, and to be fair it was sporting of them to try to make it work one last time before calling it quits.

But enough about that. Speaking In Tongues may be the ass-shakingest disc in this set of ten, and that's on a list with Purple Rain. Here is how you tell if your ass is broken: if you can play "Pull Up The Roots" without causing a shimmy, it's busted, and you need a new ass. To see if your shoulders are working, try "Swamp" or "Girlfriend Is Better." As for the classic "Burning Down The House," the album's first track and the big track in the U.S.--you can argue it's the weakest track on the record. Still, that jittering acoustic guitar fade-in is kinda fucking awesome.

Pixies, Surfer Rosa (1988): Maybe this should've been first on the list--except how do you deny Bowie, who (by the way) faithfully covered "Cactus" on 2002's Heathen, which had to rock for Black Francis?

Surfer Rosa probably defined '90s alternative as much as anything that came out that decade, even Nevermind, which Kurt Cobain openly acknowledged was his attempt to record a Pixies album. It's also one of those '80s records, along with Meat Is Murder, that you hear indie stations playing alongside stuff that came out this year, and "Break My Body" and (of course) "Where Is My Mind?" sound like they might've come out this year. As for the ephemeral "Tony's Theme"... hey, everybody's entitled to a larf now and then.

U2, The Unforgettable Fire (1984): Time has been kind, which is why this one is making the list instead of my personal favorite, War (1983), or that ubiquitous classic, The Joshua Tree (1987). Unforgettable Fire was not a loved album when it came out, getting middling reviews from much of the press and baffling friends who were expecting something clamorous and strident after the Live At Red Rocks EP (1983).

That wasn't this album, with its subdued Anton Corbijn cover photo hinting at the record's wintry churchbell guitars. U2 wouldn't be U2 without anthems--"Bad" and "Pride (In The Name Of Love)" became staples of the band's live shows, but this was more of an atmospheric album, bookended by the spacey jangle of "A Sort Of Homecoming" and the nearly a cappella "MLK," with Bono's keening voice matched only by a haunting, buzzing drone. Even the urgent "Wire" and "Indian Summer Sky" were implosive, convulsive tracks.

But out of all the band's '80s records--and the '80s were inarguably U2's finest decade--this is the one that has aged the most gracefully, getting mellower and deeper like something kept in an oak cask. Whether it was just a special year for the band or whether it was some arcane spell cast by Brian Eno, I can't say.

On the downside, this album marked the beginning of U2's love affair with the concept of America, an obsession that turned into a sort of goofiness by the time it metastasized on Rattle And Hum (1988). Like Bowie, U2 would have to go to Berlin to recalibrate. Unlike Bowie, Bono hasn't seemed to possess the self-awareness to know when he's veered into self-parody; Bowie has stepped back from the brink more than once, but Bono, having fallen over the edge, no longer has that option.

R.E.M., Lifes Rich Pageant (1986): For the first few years, R.E.M. seemed to want to be an antirock rock band, much the same way Pavement eventually would be; the guitar player didn't play solos per se or even straight melodic lines, necessarily, the lead singer mumbled his way through everything, there was a country streak there that wouldn't pass in Nashville and more than a trace of The Velvet Underground's sensibilities without as much hipster artiness (or heroin addictions).

But when they did decide to record what was basically a rock album--well, Pageant rocks. "Begin The Begin" is good guitar crunchiness and feedback, and "Fall On Me" and "I Believe" are as anthemic as you could ask for. "Swan Swan H" is too tender not to mention and "Cuyahoga," a memorial to one of the times the river of the same name caught fire again, too desolate to overlook. And they close with the band's cover of "Superman," which sounds like it was recorded in someone's garage--and that's a compliment.

And that's ten. I could keep doing this, and I might, though early feedback is suggesting I haven't quite made my case. I didn't mention The Police, or Peter Gabriel, or Kate Bush--which also means I haven't even mentioned some of my favorite artists of the era. And a few stalwarts of the '70s continued putting out good records, plus some opening shots from some people who would be titanic in the '90s.

But this was a start.


'80s music didn't suck

>> Monday, February 22, 2010

The other day I was talking to my Dad on the phone, and he asked me what I thought about '80s music. What I told him, actually, was my general theory that we tend to remember the best music of a decade while putting the worst aside--shit sinks and cream floats--and you can find plenty of crap music in the '60s if you look for it, it's just that most people forget, say, how mediocre Herman's Hermits were.

But that's not quite true; I mean, first of all, not all the cream floats, or The Zombies wouldn't be so overlooked all the damn time except for basically two songs ("She's Not There" and "Time Of The Season") that aren't even their best cuts (though they both rock). Meanwhile, even though it's been thirty years since the '70s ended and the '80s began, both decades remain unfairly reviled. In the case of the '70s, it seems everybody thinks "disco" and not, say, Dark Side Of The Moon and Born To Run, not to mention punk. In the case of the '80s... well, let's face it, mainstream '80s radio sucked, plus a lot of then-new studio tools and tricks got abused to the point that even a lot of good '80s songs haven't aged well because of the way they mis-recorded the damn drums or went overboard with a cheesy digital synth.

It's a shame, though, because there was a helluva lot of good music, and I'm not just saying that because I grew up in the '80s. There are records I'd put toe-to-toe against any album of any other era--and, conveniently enough, I have a blog and can do it publicly. So this one's for my Dad, and for the sake of filling a post, and (yes, I know I've said this before) who knows, maybe it could become a feature. Here, in no particular order and reflecting my biases, ten albums released 1980-1989 that you should know or better yet own:

David Bowie, Scary Monsters (And Super Creeps) (1980): Some things are clichés because they're true. Saying Scary Monsters is David Bowie's best album is a cliché, but, y'know, it's true. Bowie left Berlin practically rabid: assaulting what was becoming postpunk in a way that basically laid the floor plans for what postpunk and New Wave would turn into, mad to kill whatever was left of his early-'70s Starman persona, crazy to turn Robert Fripp's jittery guitars loose. Bowie was a man so eager to take names and kick ass he took and kicked his own ("Ashes To Ashes") between digs at fellow musicians ("Teenage Wildlife") and "the scene" ("Fashion"). And ever with that damn perfect musical sensibility that makes anybody who's ever tried to write a song simultaneously want to worship Bowie and pummel him into mud for even existing.

The only bad thing you can say about Scary Monsters, actually, is that it doesn't sound quite as fresh anymore, but that's only because Bauhaus spent about six years copying the record and then so many people were copying Bauhaus. (Hey, I'm not knocking Bauhaus here; if there's a follow-up to this entry it's likely to include The Sky's Gone Out.) I'm just saying--Scary Monsters was a perfect template for what a record could sound like.

Bruce Springsteen, The River (1980): The Boss came out of the '70s at a crossroads between his folksier incarnation as "The Next Dylan" and the Phil Spectorish rocker of Born To Run, depressed after a difficult legal situation with his original management that made it impossible for him to work for several years after Run had put him on the cover of Time, defiant and triumphant after returning with 1978's Darkness On The Edge Of Town. Most musicians would have struggled to choose a possible direction and finally picked one. Springsteen gave up and picked all of them.

The River is a sprawling, epic mess and that may be why there's a good case to be made it's Springsteen's masterpiece. It's all here--blustery anthems ("Two Hearts") and introspective descents into darkness ("Stolen Car"), defiant working-class roars ("Out In The Street") and bitter meditations ("The River"), moments of tenderness ("I Wanna Marry You") and fits of violence ("Point Blank"). Springsteen has recorded a lot of great records--this is the only one you actually have to own.

The Smiths, Meat Is Murder (1985): Happy Valentine's Day. If I'd been more with it last week, I might have noticed that this past Valentine's Day marked the twenty-fifth anniversary of The Smiths' best album--oh sure, The Queen Is Dead is made of awesome and I'll always have a soft spot for Strangeways, Here We Come, but neither of those records has "How Soon Is Now?", a song that's practically become an indie standard (it's possible people need to stop covering it for a while, in fact--except how can I blame them, I'd cover it myself if I were in a band) and could have been written and recorded yesterday. It's a cliché to say "Now?" is a perfect pop song, but some things are clichés because... well, you know.

It's not the only perfect cut on the album. "What She Said" seems a perfect retort/update to The Beatles' "She Said She Said" ("She said, 'I know what it's like to be dead'" becoming a fey, "What she said: 'I smoke 'cos I'm hoping for an early death and I need to cling to something'"). "Barbarism Begins At Home" lets the bass do the heavy work. And then there's "The Headmaster Ritual": the best rip on a brutal educational system ever penned in a nation where no songwriter has ever written a kind word about the educational system--Morrissey managing to leapfrog over the likes of John Lennon, Roger Waters, Roger Hodgson and Ray Davies with serious pokes in the eye. ("Same old suit since 1962"? That's just cold, man.)

Public Enemy, It Takes A Nation Of Millions To Hold Us Back (1988): PE didn't invent hip-hop, or even socially-conscious and political hip-hop. But I don't think anyone--particularly amongst us suburban white folks--can say that PE's arrival wasn't when shit got real. For much of the decade, rap and hip-hop tended to be seen as party music when it was taken seriously at all.

Millions was a technical masterpiece: there are few DJs out there who can match the taste and, I dunno, aggressiveness of Terminator X--I almost hate to use that word given PE's militant image in some quarters, but there's an angry exuberance in a track like "Bring The Noise" that won't be denied; you're either swept up in it or you're not, and if you're not, I frankly think there's something wrong with you. And I think Chuck D's flow is just as unstoppable, his cadences making lines like "They wanted me for their army or whatever / Picture me givin' a damn, I said never / Here is a land that never gave a damn / About a brother like me and myself / Because they never did" in the anti-military "Black Steel In The Hour Of Chaos" viscerally work. Hell, even Flavor Flav's shtick works for the record, something that would wear out before much longer.

Hip-hop isn't to everyone's taste, nor are left-wing politics. But if you want to know if the genre is capable of being powerful and important, and whether it can say something as meaningful as any protest anthem by Joan Baez or Pete Seeger, this is the record to start with.

The Clash, Combat Rock (1982): Speaking of left-wing politics. Again with the clichés: it was said that The Clash were the only band that mattered, and again with the truth. I've gradually come to the conclusion that The Clash might well have been the second-best rock band ever, after The Beatles.

Combat Rock, the band's penultimate studio album and the last with the core, classic lineup, is evidence of both propositions. This was an album, and a band, that embodied Dylan's credo that pop music ought to be about something, hitting hard from the left with cuts like the bitter, jittery "Know Your Rights" ("This is a public service announcement--with guitars!") and the somberly anti-imperialist ballad "Straight To Hell." Which doesn't sound like it would be all that listenable, except that The Clash's other ethos was to try to capture the sound of the neighborhood streets they grew up in--a sea of radio melodies from a dozen genres pouring through open tenement windows, styles ranging from '50s British rock to raga from the East and reggae from the West. Not to mention that Rock's big radio hits were the cheeky "Rock The Casbah" and "Should I Stay Or Should I Go," a song that I suspect is impossible not to sing along to.

To be fair, it's not as strong a record as the band's '70s albums--but those would go in a different blog entry. I wrote, earlier, that there was only one Springsteen record you had to own; I don't think you have that luxury with The Clash, it's all of them or why are we even talking about rock'n'roll? You may, of course, beg to differ.



" may have happened just this way."

>> Sunday, February 21, 2010

Something about Laurel J. Sweet's bombshell revelations about Dr. Amy Bishop's background reminded me of this truly groundbreaking piece of investigative journalism from the 1980s. (On a related note, I'm a little baffled that Alan Moore mysteriously failed to reference it in From Hell--seems like it could've been incorporated into the "Dance of the Gull-Catchers" segment somewhere... oh well, maybe if he ever does a revised version, he and Eddie Campbell can rectify it.)

Hope you're having a good weekend, folks!


That would be a natural "1" on your journalism check...

>> Saturday, February 20, 2010

Tuesday, The Boston Herald reported that alleged campus shooter Amy Bishop was a Dungeons And Dragons player and wondered whether this was a factor in Dr. Bishop's alleged murder of her professional colleagues. An anonymous coward "source" told reporter Laurel J. Sweet:

Bishop, now a University of Alabama professor, and her husband James Anderson met and fell in love in a Dungeons & Dragons club while biology students at Northeastern University in the early 1980s, and were heavily into the fantasy role-playing board game, a source told the Herald.

"They even acted this crap out," the source said.

Sigh. The number of things that are obviously wrong with that paragraph. First, as any informed gamer can tell you, Dungeons And Dragons is a card game, not a board game, one which uses a special customized deck of standard playing cards in which all of the Threes and the Five of Hearts have been removed, the Seven of Spades is folded in half, and the Queen Of Diamonds has a moustache drawn on with a Sharpie (Wizards Of The Coast sells pre-customized decks, but you can easily make your own). Secondly, I refuse to believe that Alabama actually has a university, or--to be more exact--I'm aware that there is an entity called the "University Of Alabama" that fields a football team, thus I am reasonably certain that the first sentence in the passage quoted above should identify Bishop as a coach. "Northeastern University" also sounds vague--next they'll be saying there's some sort of, I don't know, southern... college... something-with-religion, like, maybe some kind of "Southern Methodist University" or some sort of lameass fake name like that. Ha! Sure there is!

Also, isn't a story about a girl playing D&D kind of suspect to start with?

I kid, obviously. Still, it's not hard to imagine where Ms. Sweet stores her head when she's not using it, when she writes lines like, "Some experts have cited the D&D backgrounds of people who were later involved in violent crimes, while others say it [sic] just a game." Vague, on occasion, Ms. Sweet? I understand there are space limitations that your editors insist on (though frankly omitting the apostrophe-"s" in the word "it's" seems a bit stringent on the spacesaving front, not to mention grammatically suspect)--but couldn't you at least mention one expert by name, or one person who says "it just a game"? (I assume the source for the latter quote was a bigfoot or a thawed-out caveman, or possibly even Tarzan: "Me know Jane no want Boy play, but me say it just a game.") Sans accreditation of your sources, y'know, someone might think you were keeping it vague because you were sorta making it up; not out of whole cloth, of course, but just pulling something out of your brain that you remember seeing or hearing somewhere and couldn't really be bothered to check up on.

Regular readers know I'm a gamer and a D&D player; I imagine it's easy to think I'm more offended than amused. Really, I assure you, it's the latter--Ms. Sweet's article is too vague, unsourced, sloppily written, poorly edited and generally thrown together to work up more than a Jack Shaferish O RLY? sort of reaction after seeing the piece mentioned on io9 yesterday.

At any rate, I gleefully look forward to Ms. Sweet's dissection of Dr. Bishop's record collection. Did she listen to heavy metal? Was she influenced by backwards-masked messages? Some experts say there are coded Satanic messages embedded in musical recordings by acts such as Led Zeppelin, Black Sabbath and Judas Priest, while others say goddammit I just totally fucked up my copy of Houses Of The Holy and all I heard was somebody, Bonham, maybe (not sure), burbling about "grudge my wampa tiles" which doesn't even fucking make any sense, dude.


Mount Vernon v. Marbury, or: Two shall enter, one shall leave...

>> Friday, February 19, 2010

The Mount Vernon Statement might be the gift that keeps giving this week. Yesterday, I planned on writing about a minor what-the-hell-were-they-thinking aspect of the Constitution, but ended up writing, instead, about how the American Civil War was the Founding Fathers' fault. So maybe for today's entry I should go back and talk about Article III like I meant to in the first place. What the hell, right? And maybe I can come up with something interesting for the weekend.

One of the funny things about originalists like the Vernonites is that if you go back to the Constitution they think existed and try to read it, you discover that it describes something alien and unrecognizable. Presidents are elected by states, not the people. Senators are chosen by state legislatures. Some people only exist as fractions. States have rights, sort of, as long as they stay in line. And then there's Article III.

Here it is, in all its glory:

Article III

Section 1.
The judicial power of the United States, shall be vested in one Supreme Court, and in such inferior courts as the Congress may from time to time ordain and establish. The judges, both of the supreme and inferior courts, shall hold their offices during good behaviour, and shall, at stated times, receive for their services, a compensation, which shall not be diminished during their continuance in office.

Section 2.
The judicial power shall extend to all cases, in law and equity, arising under this Constitution, the laws of the United States, and treaties made, or which shall be made, under their authority;--to all cases affecting ambassadors, other public ministers and consuls;--to all cases of admiralty and maritime jurisdiction;--to controversies to which the United States shall be a party;--to controversies between two or more states;--between a state and citizens of another state;--between citizens of different states;--between citizens of the same state claiming lands under grants of different states, and between a state, or the citizens thereof, and foreign states, citizens or subjects.

In all cases affecting ambassadors, other public ministers and consuls, and those in which a state shall be party, the Supreme Court shall have original jurisdiction. In all the other cases before mentioned, the Supreme Court shall have appellate jurisdiction, both as to law and fact, with such exceptions, and under such regulations as the Congress shall make.

The trial of all crimes, except in cases of impeachment, shall be by jury; and such trial shall be held in the state where the said crimes shall have been committed; but when not committed within any state, the trial shall be at such place or places as the Congress may by law have directed.

Section 3.
Treason against the United States, shall consist only in levying war against them, or in adhering to their enemies, giving them aid and comfort. No person shall be convicted of treason unless on the testimony of two witnesses to the same overt act, or on confession in open court.

The Congress shall have power to declare the punishment of treason, but no attainder of treason shall work corruption of blood, or forfeiture except during the life of the person attainted.

Nice, isn't it? Succinct, to the point, establishes a branch of government that hasn't existed since 1803.

Those who have heard of Marbury v. Madison know where I'm going with this, and I have to wonder if I should belabor the point. What happened, for those needing the refresher, is that Congress unconstitutionally tried to give the Supreme Court the power to issue something called a Writ Of Mandamus, which is basically an order to do something (in this instance, a Writ directing the Secretary Of State to deliver the previous presidential administration's mail--specifically, William Marbury's undelivered commission as a justice of the peace); however, when asked to execute this power, the Supreme Court responded by unconstitutionally decreeing that it didn't have the power Congress had unconstitutionally extended to the Court.

See, one of those simple cases of two wrongs making a lasting historical and legal precedent that radically altered the relationship between civic institutions.

If you want to understand, sort of, the logic of the Supreme Court's decision in Marbury, you can just scroll up the page, or maybe hit "CTRL-F" in your browser and type "writ of mandamus" (capitalization optional unless you checked the box that makes it not). Scroll through Article III, reproduced above, and read the part where it says "Writ Of Mandamus." See--it's not in there, so the Supreme Court can't issue one, neener neener.

But now for the fun part. Scroll up again, or "CTRL-F" and type "declare act of Congress unconstitutional" and try to find that. Yeah, yeah, you already know--that's not in there, either. Matter of fact, the Article III version of the Supreme Court doesn't actually do much of anything. Okay, that's an exaggeration--the Supreme Court is a court in Admiralty, so they (or such inferior courts as established by Congress) decide shipping disputes and marine salvage claims. And lawsuits between states. And cases involving ambassadors. Alright, so the Article III court actually does a lot, I take it back. But what the Supreme Court Of The United States explicitly does not do in any way, shape or form, according to Article III of the Constitution Of The United States, is the thing that everybody thinks the Supreme Court does, namely, decide whether statutes are unconstitutional or not.

The closest, the very closest, you can come to that is in Section 2, first paragraph: "cases, in law and equity, arising under this Constitution." But note that this doesn't give the judicial branch the power to throw a law out, it just gives the judiciary the power to hear cases. You could (and, if you want to be a strict constructionist, perhaps should) read that to mean that the judiciary has the power to try cases that the Constitution says the judiciary has power to try--i.e. treason. Limited government, indeed.

At this point, I have to admit that I don't know if the Vernonites really believe American law should be rolled back to 1803, or if they want to have-and-eat the cake, or if they're just liars. The last one seems plausible--one can imagine some of the Vernonites believe Marbury was wrongly decided and ought to be reversed (good luck!) and just don't want to tell anybody because, really, most conservatives like getting laws declared unconstitutional as much as anybody. (See also.) Saying that the country has been on the wrong course since Obama's election or the Clinton administration or FDR is one thing--saying American history has gone straight to hell since the beginning of the Jefferson administration is pretty damn hardcore. I have to respect that. Grudgingly. And ask if you own a powdered wig.

Or is it that the Vernonites just don't know what the hell they're saying? Did I hear a bell? Do we have a winner?

Image ©2007 swatjester provided via Wikimedia Commons,
used under a
Creative Commons Attribution-Share Alike 2.0 Generic license.


The Founders wrote the Civil War into the Constitution...

>> Thursday, February 18, 2010

A few more thoughts about the Mount Vernon Statement, which I wrote about yesterday. Specifically, I was thinking about the way the Vernonites fetishize their imaginary version of the Constitution and Jim Wright's piece over at Stonekettle Station (if you haven't hit it yet, it's worth a read, and not because Jim gives yours truly a nice nod).

Jim gives a good bit of discussion to the idea that the Vernonites are Constitutional amateurs--I might use the word "dilettantes," just because it connotes people who are dabbling in something without enough investment to be experts; I agree with him up to a point, though I also have to note that whatever disagreements I might have with Vernonite Edwin Meese, I can't disparage Mr. Meese's professional credentials as a lawyer, including, naturally, his time as the country's Attorney General. I'd have to say Mr. Meese ought to know better and that his view of the Constitution, if sincere (and I think it is), is delusional--but he's no dabbler.

The thought that occurred to me, anyway, is that the Founders themselves were dilettantes--farmers and businessmen and lawyers who were sometimes out of their depth while valiantly trying to cobble together a nation based on their ideals. The only problem with that thought, though, is that it's also delusional, or based on the popular mythology. Whatever else they did in their day jobs, nearly all of the men who wrote the Constitution were, in fact, professional lawmakers prior to and during the Revolution. We have this fanciful collective notion in the United States, I think, that men like Washington, Jefferson and Adams were ordinary folks who dropped their normal routines as hard-working men to lead a nation to liberty, etc.; the truth is that they were wealthy, well-educated men whose business affairs in agriculture, finance, industry, law and other fields, along with the fortunes they inherited and/or married into, left them with lots of free time and disposable income that could be channeled into full-time political careers in state legislatures and the Continental Congress.

So the question that leaves me with, that I thought I'd answered with their imagined amateur status, is how did they make such a botch of the Constitution?

That's a pregnant statement, I realize. Let me say that I love the Constitution, that as a member of a State Bar I took an affirmation to uphold the Constitution which I remain inordinately proud of, that if the Constitution were a woman I would marry it, especially if it had nice gams and dark hair. But the Constitution is a damn mess, is the thing.

How bad is the Constitution Of The United States? Let's start with the cold fact that the Constitution caused the American Civil War.

I talked around this in yesterday's post in discussing the "Constitution of 1868," but hadn't quite crystallized the conclusion. I think I left it at suggesting that by deferring the problem of the slave trade and hacking together some uneasy compromises between non-slaveholding and slaveholding states, the Constitution failed to avert the Civil War. But of course this isn't strong enough--the Constitution of 1789 didn't merely fail to avert the Civil War, it made the War inevitable.

Consider: the thirteen colonies' only real binding tie in 1789 was Anglophobia. Encompassing a vast geographic territory--perhaps a minimum of four geographic regions1--the new "union" remained arbitrarily divided into thirteen oddly-sized and -shaped chunks: some in which slavery was profitable and others in which it wasn't, some possessing a distinctive religious or ethnic character and others more diverse, some on course to be essentially agrarian while others were already diversifying into industry and exchange, and so on. In fear that the British were merely waiting--to catch a second wind, and for the French to lose interest in the New World--before attempting to reclaim their lost colonies, it seemed vital for the delegates of the states to maintain a strong, undivided front for the purpose of national defense in spite of the fact that they otherwise had little else in common. This in turn led to a number of compromises. Some were mostly harmless, like tacking on the Bill Of Rights. Others, as discussed here yesterday, were poison pills leaching out a fatal toxin.

Now, here's the funny thing about those compromises, and one in particular: as part of maintaining a united front, the one thing the Constitution conspicuously lacks is a mechanism for dissolving itself, or more specifically for allowing states to opt out. There is, true, a nod towards states reserving rights for themselves--but absent a method of secession, that's in fact a meaningless phrase. A state might say it has the right to do something, to which the other states collectively say, "Nope, you can't," and what's the remedy then? One supposes that this might be an Article III Constitutional question (I think we'll be revisiting Article III at some point), but should recourse to the Supreme Court fail, what then? To assert a right and be told, in effect, "suck it up," is as if to not have the right at all, no?

And this is how the Constitution caused the Civil War. The absence of secession clauses was meant to be a feature, not a bug--indeed, the Constitution's direct precursor, the Articles Of Confederation, expressly forbade secession. But without a secession clause, the idea of "states' rights" is a fiction, a fig leaf. The only remedy for a state which persists in asserting a right against the will of the federation is to withdraw, a remedy not specifically enumerated in the Constitution and implicitly forbidden, leaving it to the remaining members of the Republic to decide whether to show weakness by tolerating the withdrawal--thereby establishing a precedent that makes the Union meaningless--or to respond by insisting on the supremacy of the Federal government, using force if necessary.

We know how this turned out.

None of this, by the way, is to absolve slavery or remove it from the list of causes of the American Civil War. It is still correct to say the Civil War was caused by slavery, seeing as how the venomous threads the Founders wove into the document pertained almost exclusively to slavery--agreeing to put a freedom-of-conscience clause into the Constitution's appendix to gain a few signatures was a harmless compromise; the agreement to count slaves as three-fifths of a human being for purposes of allocating House and Electoral College seats was not.

As I wrote, I love the Constitution. But if the authors sowed the seeds of their nation's own destruction--and the Civil War was both apocalyptic and transformative, a nation destroyed in oceans of blood and storms of fire and regenerated as a true union--then how much reverence are they really due? Respect, perhaps; admiration for their strengths and regrets for their frailties, undeniably. Yesterday, I believe I mocked the Mount Vernon Statement's second sentence: "Through the Constitution, the Founders created an enduring framework of limited government based on the rule of law." I have to mock it again--not because the Founders were without noble ideals; I revere most of their principles as much as the next post-Enlightenment liberal--but because the Constitution as it was penned was a failure of truly epic proportions.

1At a minimum: North, Mid-Atlantic, South, Appalachia; a strong argument could be made for considering the Ohio River Valley and/or the states along the Canadian border (Vermont, New Hampshire, Maine) as additional regions. Why is this important? Because even then, as now, residents of, for instance, Appalachia have more in common with each other--culturally, ethnically, politically, economically--than with their "fellow residents" in the states Appalachia crosses.


Mount Vernon junkies

>> Wednesday, February 17, 2010

Gabriel Winant at Salon has brought this to my attention: a number of the nation's conservatives were scheduled to gather at Mount Vernon today to pontificate on what's wrong with America these days. Sort of. Not in any specific terms or with anything approaching a plan, but with a manifesto called "The Mount Vernon Statement," which you're free to go sign at that last link if you're into vague libertarian sentimentalism and online petitions.

The problem with this brand of conservatism, as always, is that it hearkens back to a world that hasn't existed for more than two centuries. Indeed, it hearkens back to a Constitution that hasn't existed for almost a century-and-a-half. Legal scholars and serious students of American History will tell you that the United States has gone through at least two substantial Constitutional crises since the Constitution was ratified between 1787 and 1790.

The first of these was the result of the unsustainable compromises of 1787. The fact, however unacceptable to certain American right-wingers and naïve purveyors of schoolhouse mythology, is that the Constitution of the United States was not a perfect or ideal document capable of creating anything even similar to what the Mount Vernon Statement fancifully calls "an enduring framework" for any purpose. Rather, what the Constitution did was cobble together a coalition of radically different demographic, ideological, economic and geographic factions--most broadly and notably the slaveholders of the Southern states and the small, non-slaveholding farmers and nascent industrialists of the Northern states--by a mixture of strange compromises (e.g. the appending of a "Bill Of Rights" as an external-yet-inherent part of the document) and simply tabling other issues for at least twenty years (Article I, section 9), possibly and conveniently outside of the remaining lifetimes of some of the Founding Fathers (how brave). Even the bicameral structure of Congress represented compromises that no longer seem as relevant--between equal representation (Senate) and majority rule (House), between elitism and an acknowledgement of states as entities (Senate, again--remember, direct election of Senators wasn't introduced until the XVIIth Amendment, effective 1913) and acknowledgement of the hoi polloi (House, again).

But the most infamous and least-resolved of the compromises of 1787 was the slavery issue. It wasn't just a moral issue, either--consider the significance of the "three-fifths compromise" in light of what the addition or subtraction of the slave population would've done to apportionment of House seats, allowing slaveholding states to stack the House by dint of large populations with absolutely no say in the political process. The slave issue boiled and brought the United States to the verge of war with itself for nearly eight decades until it resulted in war outright--and settled the question of whether "States' rights" trumped the Union for what one would've thought was once-and-for-all-time at the points of bayonets and along lines and arcs traced by ball and shot.

What could have happened after the war--when the XIIIth, XIVth and XVth Amendments were added to the Constitution between 1865 and 1870--was a second Constitutional convention to ratify what was written in blood, 1861-1865. And perhaps that's what should have happened, in retrospect. But it's not what happened, of course. Instead, what happened was the raising of a pretense--that what had happened and what had been settled was the ascendancy of the Constitution of 1787 over insurrection, the growing pains of a nation and not an outright transformation. That a second convention didn't happen was the result of a perhaps instinctive need for a unity myth that restored the country and salved the wounds of a war in which, yes, the cliché of "brother against brother" was a mere fact. (I am told, incidentally, that I, like so many, had kin on both sides of the American Civil War.)

At that point, anyway, if not before, the Constitution of the Mount Vernon Statement crowd, of the Becks and Norquists and Meeses, was a dead letter, a document that had outrageously and bloodily failed to deliver to posterity what the Founders sought to secure.

The second Constitutional transformation that's commonly recognized was far more subtle and resolved with no blood and much real compromise--and just as much intentional or accidental dishonesty. The Constitution of 1787--and even the "Constitution of 1868"--was (in each instance) the product of a largely agrarian and somewhat backwards society. The authors of those documents could not have possibly foreseen a world of interconnected international economies and global trade, of too-clever-for-their-own-good innovations in speculative investing, of instant communications and rapid transit making isolationism impractical if not impossible, of an economy in which intellectual activities--investment, invention, finance, law--could be more vital than the tangible fruits of physical labor.

As I write this, I notice that the previous paragraph sounds awfully recent. The difference between the crisis of 2008 and the crisis of 1929 is that by 2008 the kinds of things mentioned above were already facts of life and subjects of conversation, whereas the financial, social and political transformations of '29 were something like an earthquake--pressures that had been building between tectonic plates for years finally snapping with cataclysmic effect.

This is recent enough--within the adult lifetimes of a number of Americans who are still active in culture and politics--for conservatives to curse what they correctly see as a Constitutional transformation centralizing power in the Federal government at the expense of the states. What the conservatives fail to see or admit, however, is that the transformation was necessary, because the decentralized model of the 19th Century Constitutions was completely inadequate to deal with national problems in a global setting. (Got that?) Certain Republicans may frame the FDR years as a power grab by Democrats--and in one sense they're right, it was a power grab; the only problem with that analysis is that the grab was necessary to save the Republic.

How necessary? It was so necessary that the United States faced a Constitutional crisis that culminated in FDR's infamous court-packing plan--a crisis that was averted when a Republican, Justice Owen Roberts, "flipped" on a U.S. Supreme Court case involving a minimum-wage law, West Coast Hotel Co. v. Parrish ("The Switch In Time That Saved Nine"). Indeed, this was not the only compromise that conservatives of the era acceded to as it became apparent that their politics had mostly failed either to avert the national crisis or to fix it.

Like the crisis of the 1860s, the crisis of the 1930s deliberately--albeit perhaps unconsciously--tended to ignore the greater truth of what had happened, instead glossing it with the myth that the Constitution of 1787 was still present and working just fine, granting a few Amendments here and there that themselves radically altered the Founders' "enduring framework" (e.g. unified party tickets; direct election of Senators; extension of suffrage to freed slaves, women, and those eligible for military service due to age--regardless of land ownership, vested fiscal interests, literacy or descent). That the United States that emerged from the Depression and World War II was in many ways unrecognizable compared to the United States that emerged from Civil War--which in turn was a radical alteration from the nation created in 1787--was, for the sake of a unifying national mythology, swept under the proverbial carpet.

Again, in retrospect, maybe it would have been better to call a Constitutional convention and hack these things out openly and honestly. Except, of course, it's more than a little hard to do that with South Carolinians shooting at a Federal fort or hordes of displaced Okies wandering around looking for work that doesn't exist. And it's more than fair to say that there were sound reasons for people to tell themselves and each other they weren't doing anything that wasn't in the spirit of the document, letter be damned.

And here we have the crucial point, by the way: whether you're a liberal or conservative in America may depend more than anything on whether you see the Constitution of 1787 as a living document that's grown and evolved--the Constitutions of 1868 and 1937 being natural, organic growths in the spirit of the thing--or whether you see it as a dead hand guiding the tiller from beyond the grave.

The problem with the latter view, which conservatives are enamored of, isn't that it's constrictive; the problem is that it's divorced from reality. It's delusional. Not because what the Founders said might not be interesting or aspirational or even, on rare occasion, relevant, but because the document it hearkens back to hasn't existed since the beginning of the American Civil War. The Mount Vernon Statement might as well allude to the rules of The Round Table or the Jedi Code, or propose modeling our nation after the Kingdom Of Prester John--none of those things ever really existed, either.

It does not bode well for this country that a number of reasonably educated and relatively accomplished men and women like those whose names appear on the Mount Vernon Statement--not to mention a loud and energetic teabagging rabble--are in thrall to a fanciful history of America that has as much to do with reality as the ravings of a junkie.


Weee-arethechampions, my frie-ennnnd--bombombombom!

>> Tuesday, February 16, 2010

Why am I inordinately excited? Why do I feel there's no time for losers? Because after a day-and-a-half of struggling, I finally got the wireless on my netbook to work under Ubuntu Netbook Remix (UNR), after which everything else will be gravy, hell, already is gravy, delicious mushroom gravy made from a red wine stock, oh yeah.

This was, in a way, a potentially suicidal upgrade. And not even a good kind of suicide, like Spock dipping his hands into a reactor core in Star Trek II, where if I ended up cradling my bricked Dell Mini 10 it would, at least, tell me that it had been and always would be my friend. The first thing was that I was actually reasonably happy using the preinstalled flavor of Ubuntu. It was an old build (8.04, Hardy Heron), yes, but all the drivers worked, and all of the proprietary codecs and miscellaneous software that you have to go grey or black to put on an American Linux machine were present and street-legal courtesy of Dell's deal with Fluendo. What's more, although I'm happy with the Dell Mini 10 as a piece of hardware, some initial research into UNR suggested there were two particularly serious hardware-related problems I could be facing.

First, that while the graphics chipset on my generation of Mini 10 (and not, curiously, the Mini 9 or Mini 10v) is branded Intel, it's actually the GMA 500 "Poulsbo" part, built around a licensed PowerVR core, and it apparently doesn't always play nice with Linux (many people were complaining online that their displays were so lagged as to be unusable; I should also note that apparently newer Mini 10s don't use that chipset and don't have this issue). As it turned out, this hasn't been an issue thus far. I haven't tried watching any videos yet--so there may yet be some problems there--but so far the graphics are impressively crisp.

The second issue, however, was driving me a little crazy. The built-in wireless in the Dell Minis is a Broadcom chipset, and it doesn't like playing with Ubuntu at all. Although the installation went smoothly, the Mini would not use the proprietary drivers installed from Canonical's repositories, even though the machine agreed they were there and usable. This is, evidently, a known bug with a number of solutions, none of which appear to work for everybody.
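Incidentally, before applying anybody's fix it's worth confirming the machine really has the Broadcom part. A minimal sketch, assuming lspci-style output (the helper reads stdin so the matching is testable; real use would be `lspci | find_broadcom_wifi`, and the match strings are my assumptions, not gospel):

```shell
# Sketch: filter lspci-style output down to Broadcom wireless controllers.
# The match strings are assumptions; exact lspci wording varies by chip
# revision and pciutils version.
find_broadcom_wifi() {
  grep -i 'network controller' | grep -i 'broadcom'
}
```

On Minis of this vintage the surviving line should look something like `Network controller: Broadcom Corporation BCM4312...` (model number assumed, not verified against my machine).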

What did work, this evening, was two lines in the terminal:

sudo apt-get remove dkms
sudo apt-get install bcmwl-kernel-source

And here I must apologize profusely: I sent myself an e-mail with those lines, but having no idea whether they would work, I failed to send myself the link to the source. It might have been Ubuntu On The Dell Mini, which has been a very useful site during the changeover--only, despite the fact that the author includes these commands in one of his entries, I don't think he was the one I got it from (not to slight him any--again, he's been enormously helpful, and he may have been the ultimate source for the tip). Bottom line: whoever you are, Internet stranger who got my wireless working, thank you!
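For what it's worth, here's a quick way to sanity-check that a driver swap like the one above actually took: the proprietary Broadcom driver normally registers as the `wl` kernel module, so it should show up in `lsmod`. A hedged sketch (the `wl` name is an assumption about this driver generation; the helper reads lsmod-style text on stdin so the logic is testable):

```shell
# Sketch: succeed if the named module appears in the first column of
# lsmod-style output; fail otherwise.
module_loaded() {
  awk -v m="$1" '$1 == m { found = 1 } END { exit !found }'
}

# Real use:  lsmod | module_loaded wl && echo "wl is loaded"
```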

So, as you can see from the screenshot above, I got things working.

But why bother, if I was happy with the existing install?

The main reason, actually, was Ubuntu One. I've been aware of the service for a while (since the last upgrade to my main machine, actually), and just wasn't too terribly interested in cloud computing, but this weekend I started playing with Ubuntu One a little out of curiosity, and I find I do like the idea of being able to sync writing projects between the netbook and main laptop. There are other services for this, of course, but Ubuntu One is nicely integrated into Ubuntu (imagine that). But it's not integrated into 8.xx or earlier versions, so there was reason one for an upgrade.

Secondly, I liked the layout and the promises of improved speed (which I'm already noticing, I think, unless it's all in my head). Third, while 8.xx was adequate and stable, it was also long in the tooth--which sort of brings up reason three-"a" or three-"b" or however you want to denote it: the Dell preinstalled build of Linux was very much tied into Dell's repositories, and who knows how long those will be maintained, or how well. And if I was going to shift away from the Dell repos, I might as well go for a clean install; and if I was going to go for a clean install, why not go for Remix?

So there it is.

So, y'know, thus far it's a win. At least now that I have wireless working. Everything's coming up Milhouse. How are you?



>> Monday, February 15, 2010

And how often do I do that? Because I'm always right, even when I'm not, neener-neener.

No, but seriously....

Last week I had a bit of fun at the expense of the worst state in the country. Today, Slashdot has brought to my attention that apparently the reason the South Carolina anti-subversion statute is in the news is that it's a 1951 law South Carolina is trying to repeal.

Awww. That's not nearly as much fun.

Still, I guess I should give credit where credit is due: repealing a sixty-year-old law is pretty progressive for South Carolina. I imagine this bodes well for the state acknowledging the Apollo 11 moon landing in 2038. Go, South Carolina, go! You guys will be Paleolithic in no time!

As for the shortage of new material--it's nearly 9:00 P.M. EST and I still don't have anything more to offer you fine folks except a correction to last week's entry--what can I say? I just haven't been inspired. I will try, try, try to do better. Or post some music or something if I can't. If I actually install Ubuntu Netbook Remix on the netbook, there may (or may not) be an entry on that. (I was alright with the netbook running an old version of Ubuntu until I discovered I couldn't use Ubuntu One on it; I figured Ubuntu One might be useful from a writing POV since it would facilitate syncing up whatever I was working on. We'll see if that works for me.) UNR looks awesome, but a lot of people have had problems installing it on a Dell Mini 10 (which is what I have) because of the Mini 10's lackluster graphics processor. I'll let you know how it goes.


Love hurts

>> Sunday, February 14, 2010

So, uhm, what better way to honor Valentine's Day than with a, ahhh, "classic" ballad by Carole King and Gerry Goffin, as performed by Hole:

Err.... Like Courtney said about this one when the band did it on Unplugged: "Nice feminist anthem."

Happy Valentine's Day.


Can't get it out of my head

>> Friday, February 12, 2010

Just got Hunky Dory on CD, and like I said yesterday on Twitter, I'm having a hard time listening to the album--because I keep hitting "repeat" during "Life On Mars?"

Here it is live, about six years ago:

If I don't get back around here today--be good to each other, don't wreck up the place, stay warm, and try to have a great Friday, eh?


Okay, I'll admit--sometimes the batshit crazy is kind of fun after all...

>> Thursday, February 11, 2010

I'm one of those people who complains about the low quality of political discourse in our times. Instead of candidates getting up in front of the people and explaining their ideas, we're reduced to cycles of meaningless and dishonest attack ads; I realize, of course, that this is a situation that goes back at least a half-century and that my longing for meaningful debate is probably based on over-romanticizing the Lincoln-Douglas debates, still--

Then again, this is fucking awesome:

This is so awesome, I didn't believe it was a real ad when I saw it at The Onion A/V Club today, paired with the semi-infamous Carly Fiorina Pink Floyd fan video "FCINO" attack ad she's using in her Senate campaign. I didn't necessarily think it was a creation of The Onion itself--while The Onion is satirical, the A/V Club is the humor site's slightly-more-serious arts-and-culture-focused sibling; no, I suspected it was somebody's idea of what an attack ad might look like on some kind of mind-altering substance (possibly something as innocuous as whippits).

But it's not! TPM confirms, through their sources, that the horrorshow video is in fact a real ad for New Orleans Parish Coroner candidate Dr. Dwight McKenna that's been airing on New Orleans TV. The ad evidently references a scandal involving incumbent Dr. Frank Minyard, whose office was accused in the 1990s of removing decedents' corneas and bone specimens and donating them for transplant without the permission of the decedents' families. I'm not clear on the status of the suit--it sounds like maybe a codefendant settled and claims were dropped against the coroner's office; also unclear is whether Dr. Minyard has a disabled Eastern European deputy coroner on staff named "Igor" or "Ygor."

I cannot possibly say that Dr. McKenna's ad isn't toxic politicking--it's salacious, scandalous, apparently misrepresents facts, caters to irrational fears and prejudices, and fails to present a positive image for its candidate... well, okay, I may have to retract that one, since I suppose, "Hey, at least I'm not some kind of crazy body-snatching maniac!" probably counts as a positive message. Sort of. (Although, technically, I must point out that Dr. McKenna's ad doesn't actually say that he isn't a body-snatching maniac who will auction off New Orleans' dead; it's possible that what he means by "Say no to Dr. Minyard" is that he'll pay the coroner's office power bill.) But... how can I resist the sheer gonzo craziness of the thing?

I reluctantly must applaud Dr. McKenna: you may be doing your part to drag American politics into the gutters, sir, but at least you made it an entertaining crawl.


Enormous raging moronic asstard of the day...

>> Wednesday, February 10, 2010

President Bush remains the only president in history to face a foreign attack on the continental mainland of the United States. His responsibility was unlike any other commander in chief in history.
-Jeffrey Scott Shapiro
Bush Billboard a Sign of Hope and Change
Fox News, February 10th, 2010

Sacking of Washington D.C. by British forces, 1814,
during the War Of 1812, Madison Administration.

Wreckage of Columbus, NM, 1916 following
the Battle Of Columbus, Wilson Administration.

(H/T to Jim Wright!)

