Not safe for work? I'm not sure this film is safe for anywhere...

>> Wednesday, June 30, 2010

This short film, "BLUE: An Erotic Life" by Tibo Charroppin, is amazing. It's also, I feel obligated to warn you, not the least bit safe for work, what with all the Play-Doh™ sex, drug abuse, autoerotic asphyxiation, etc. But it is funny as hell. Seriously. If you think a blue blob of Play-Doh™ fucking is funny. Which it totally is.









(H/T to IO9!)


What is he talking about?

>> Tuesday, June 29, 2010

Faith-based atheism? Yes, alas. Atheists display a credulous and childlike faith, worship a certainty as yet unsupported by evidence—the certainty that they can or will be able to explain how and why the universe came into existence. (And some of them can behave as intolerantly to heretics who deviate from their unproven orthodoxy as the most unbending religious Inquisitor.)

Faced with the fundamental question: "Why is there something rather than nothing?" atheists have faith that science will tell us eventually. Most seem never to consider that it may well be a philosophic, logical impossibility for something to create itself from nothing. But the question presents a fundamental mystery that has bedeviled (so to speak) philosophers and theologians from Aristotle to Aquinas. Recently scientists have tried to answer it with theories of "multiverses" and "vacuums filled with quantum potentialities," none of which strikes me as persuasive. (For a review of the centrality, and insolubility so far, of the something-from-nothing question, I recommend this podcast interview with Jim Holt, who is writing a book on the subject.)

-Ron Rosenbaum, "An Agnostic Manifesto,"
Slate, June 28, 2010


'Kaaaaaaaaaay....

I don't even know what he's talking about. No, wait: I do. It's just that what Rosenbaum is talking about is stupid. Profoundly stupid in that special profoundly stupid way an erudite person can be when he has no idea how stupid he's being.

Allow me to answer Mr. Rosenbaum's question, "Why is there something rather than nothing?" the only way I know how:


I don't know, why the fuck not?


I am, as you know, an atheist. Have been for, I dunno, decades now. I am an atheist because I have no evidence satisfactory to myself that a deity exists and because I agree with Laplace and not Lagrange. I can think of no way in which postulating the existence of the divine as a solution to unanswered or unanswerable questions resolves more questions than it raises, and therefore I discard it, ptui!

But can I say there's no deity? No, of course not--no more than I can say a good many other things for which I have no evidence and no particular reason to believe don't exist. To that extent, you can describe my position as a form of agnosticism, although I've sometimes bristled at the term for its connotations of "undecided" or "uncommitted." To say that I don't know whether or not there's a God is a logical position, not a position of uncertainty.

But what of Rosenbaum's oh-so-important question? Back to that--

He's being absurd. I don't know that there are many atheists, if any, who feel that science will answer all questions someday; indeed, I suspect atheists are as likely as anybody else to get into epistemological debates. Moreover, Rosenbaum's accusation that atheists share a faith that science will eventually answer all questions ignores the fact that there is a branch of science devoted precisely to the limits of knowledge--to specific instances in which what is knowable about objects is destroyed, or in which what is knowable is inherently limited. To have faith that scientists will eventually know everything would, in other words, be a paradox, since science has already identified situations in which things cannot be known.

And is the particular philosophical question Rosenbaum asks important as opposed to interesting? I think it's interesting to think of reasons there might be something other than nothing, but as a pragmatic matter we are, apparently, here and things do exist as opposed to not-exist or we wouldn't be having this conversation. But important?

As I think around this, it's obvious that the question is important insofar as if you could know the answer with some certainty, it might have implications for life, the universe and everything, but insofar as you probably can't know, it's merely an interesting question to get drunk over until somebody cracks wise about The Great Green Arkleseizure.

Let me put this another way: if you could prove that God created the universe, then it would mean there was a God and that could be very important if God continued to have anything to do with daily affairs. But since you can't prove God created the universe you're stuck with saying it would be nice to know why things exist but you still have to get up in the morning and go to work either way, unless nothing actually exists but then you'll probably be fired from your imaginary job when your imaginary boss gets angry that you pretended to sleep in.

And then there's this, and it's a good bit of why I agree with Laplace: supposing one suggests that God created the universe, who or what created God? It's a juvenile question, except that if you're arguing God's the Prime Mover, you're arguing that things can exist without creators and therefore it's hard to see what the divine hypothesis adds other than a mysterious extra moving part; and if you're not arguing that God is a Prime Mover, then who or what created God's creator, etc.? It's easier to suppose that the universe created itself or simply happened. Furthermore, one might point out that a God who created the universe and then absented Itself is probably less meaningful than a "deity" that came into existence on the first Tuesday after the universe auto-started but then behaved as a good parent, proffering moral instruction and preventing the occasional extinction--not that there's any evidence, in my view, of such a being, merely that a proactive non-Creator deity would at least be useful unlike whatever kind of creature Rosenbaum might be suggesting to explain why there's something and not nothing, if he's suggesting anything at all.

Rosenbaum says at one point:

In fact, I challenge any atheist, New or old, to send me their answer to the question: "Why is there something rather than nothing?" I can't wait for the evasions to pour forth. Or even the evidence that this question ever could be answered by science and logic.


I imagine he might regard "Why the fuck not?" as an evasion; if so, he'd be an asshole. I mean, seriously, trying to claim that this is the question that somehow dismantles disbelief in the existence of gods is really a dickhead move from a usually-thoughtful writer when you really get right down to it.

I can't move on without one more point about how asinine Rosenbaum's piece is. See, Rosenbaum goes on to present a list of problems he and Australian philosopher of science John Wilkins have with the so-called "New Atheists," among them:

[Atheism] tries to co-opt Agnosticism as a form of "weak" Atheism. I think people have the right to self-identify as they choose, and I am neither an atheist nor a faith-booster, both charges having been made by atheists (sometimes the same atheists).


Of course, if you're going to write a three-page essay about atheism and agnosticism, it might help to familiarize yourself with the terms of the discussion. Referring to agnosticism as a form of "weak atheism" is not a matter of "co-opting" agnosticism, but rather a matter of trying to classify it and work it into an epistemological framework--an effort, actually, you might think someone like Rosenbaum would sympathize with if he wasn't trolling. (And what can I say here, except that obviously I've taken a bite of the chum he scooped over the side.)

Rosenbaum's cluelessness is perhaps summarized near the end of the piece when he writes:

Wilkins' suggestion that there are really two claims agnosticism is concerned with is important: Whether God exists or not is one. Whether we can know the answer is another. Agnosticism is not for the simple-minded and is not as congenial as atheism and theism are.

The courage to admit we don't know and may never know what we don't know is more difficult than saying, sure, we know.


The "two claims" Rosenbaum cites aren't novel questions of epistemology that have just come up in reaction to the so-called "New Atheists" recent prominence; the question of whether God exists and the question of whether we can know is almost certainly as old as theology and in the context of agnosticism goes back at least as far as Thomas Huxley coining the word. I can't really begin to imagine why Rosenbaum wants to trot it out now as a profundity, unless he's perhaps attempting to use the cosmic as misdirection, hoping his reader is so unaware and clueless as to notice the paucity of ideas there, much less originality of the few to be found.






Conan--The Musical!

>> Monday, June 28, 2010

This... this is the best thing ever.








"Devil's Arcade"

I'll be honest, I'm not even sure I understand this song, but it breaks my heart every time; Springsteen, "Devil's Arcade":






"I'm Alright"

>> Sunday, June 27, 2010

I'm alright. How 'bout you?

More than alright--The Stones at The T.A.M.I. Show, 1964, closing the show with some Bo Diddley:






New version of Firefox unveiled at the Smithsonian

>> Saturday, June 26, 2010

©2010 Smithsonian National Zoological Park

Last week, the Smithsonian National Zoological Park's red panda, Shama, gave birth to her first cub.

The original title of this post was going to be something like "Your weekend moment of awwwwwwww!", but how could I avoid the obvious geek joke? At least half of you would have done the same, admit it.

Anyway, if anything's getting you down, it's a nice pic to keep coming back to. I tried to find a bigger version but that was the best I could do. Adorable, aren't they?





The Legend

>> Friday, June 25, 2010

Like every single human being on the planet with an e-mail account, I get lots and lots and lots of e-mails full of vague advice on how I can make my penis larger and please "her" more. No idea who this woman might be, but whatever. I also get an occasional e-mail advising me that my breasts could be larger and containing a link that (of course) I never click; I can only assume that there is some spammer in the world who thinks the entire population of the world consists of underwhelmingly-endowed, small-chested hermaphrodites.

Which, considering the quantities of agricultural hormones and hormone-simulating pollutants that end up in the water supply, may actually prove to eventually be the case. Yet another thing few, if any, Golden Age SF authors got right--no flying cars, no Mars colonies, no city-sized supercomputers, no vestigial hermaphrodites; nice call, Heinlein and Asimov.

But this post isn't about that. Nor, frankly, is it about my penis or man-boobs. You're welcome.

No, this post is about the wonderful caption that appeared in my junkmail folder the other day:


RE: You will be the Legend of the 10inch manhood...


"The Legend Of The 10-inch Manhood." This amuses me, I think because usually that's a phrasing you see in the English titles of Asian martial arts movies like The Legend Of Drunken Master or The Legend of Musashi. Is "The Legend Of The 10-inch Manhood" a mighty warrior who fights with... well, you get the idea. Or perhaps it's an object found in a remote, exotic temple, much like the idol Indiana Jones has to recover with a sandbag in Raiders Of The Lost Ark. (Great, now I'm imagining Alfred Molina yelling, "Throw me the schlong!" That's going to be a mental scene that's easy to get rid of.)

Among the more obscure usages of the word "manhood" is "men collectively," i.e. mankind, and I think this is the best mental scene of all: a Gulliver-esque visitor to a land full of 10" men would probably be a martial arts legend, indeed; he wouldn't even have to be particularly good at any martial art more sophisticated than stomping around a lot. I write this with the caveat that I remember Stephen King's "Battleground," a short story about a man getting his ass whipped by toy soldiers. So I'll acknowledge that, yes, our normal-dude-in-a-land-of-little-men needs to be careful if the minimen are armed.

On a tangent: in writing the above, I also seem to recall a television show in the 1980s blatantly ripping off the King story (which apparently was adapted by Richard Christian Matheson for the Nightmares And Dreamscapes miniseries, which I haven't seen and which, no, isn't what I'm thinking of). As best as I can tell, the "man-attacked-by-toy-soldiers" show I'm thinking of almost has to be "Siege Of 31 August," an episode of a short-lived ABC anthology series from 1981 called Darkroom. I think that has to be it from the scant plot summaries available online; the only clip I could find had been taken down, and who's to say I'd recognize it after, oh crap, twenty-nine years anyway? Indeed, I'd forgotten the TV show Darkroom ever existed in the first place (and it seems to have been a possible long-lost inspiration for Garth Marenghi's Darkplace).

Can anyone confirm my elderly memory?

Anyway, assuming the ten-inch men are armed with... well, nothing (they could be armed with toothpicks and it would hurt like a bitch), the Legend Of The 10-inch Manhood would indeed be a fearsome... well, legend. A nice hard punt and one of those poor petite bastards would fly. The Legend would be revered as a god, which I suppose is what the spam is trying to promise in the first place, though I expect with the great power of being nearly seven times taller than anybody else in the world would come great loneliness. It's hard not to set yourself above everybody you know when... you're above everyone you know.

I have to confess I don't know how to end this post, so we'll leave it at that--there's no end to what you might do with this unoriginal premise, or what's been done already. May the Legend live on, etc.


"Over and over and over again, my friend..."

>> Thursday, June 24, 2010

The good news? He was wrong in 1965.

The bad? He's relevant right now.

Barry McGuire on Australian TV in 2008--you know the tune:





George Will, failings of...

>> Wednesday, June 23, 2010

Today, as it has been for a century, American politics is an argument between two Princetonians -- James Madison, class of 1771, and Woodrow Wilson, class of 1879. Madison was the most profound thinker among the Founders. Wilson, avatar of "progressivism," was the first president critical of the nation's founding. Barack Obama's Wilsonian agenda reflects its namesake's rejection of limited government.

-George Will, "Liberalism, failings of: a history,"
Times Union, June 4th, 2010


Alright, alright, alright. I give up. I surrender. Mea culpa, I'm sorry, forget I said anything ever at all to anyone ever. Toss it in the memory hole and burn it.

It is hypothetically possible that at some point in the past I might have suggested that George Will was something other than a bespectacled submoronic jackass. It's possible I might have insinuated that he was--however wrong he might be in some particular or another--capable of intelligent, coherent thought turned into reasonably well-phrased pieces of published commentary. This might have happened in a parallel universe or alternate timeline, or it might sound like something I might say out of an abundance of reasonableness because, deep down, I'm a genuinely nice guy and wonderful humanitarian soul capable of seeing many sides of a subject and all that crap.

Never happened, none of it. Say it did, and it's a lie. You're a damned liar and I don't want to hear your slanders nor read your libels.

I can categorically deny ever saying anything even vaguely lukewarm about George Will with safety by referencing the quote that leads this post: anybody who would publish anything as mind-numbingly stupid as that cannot possibly be anything more than a drooling degenerate with a college diploma.

Did you know that the past century was a struggle between James Madison and Woodrow Wilson for the hearts and minds of Americans? Did you know that Wilson was the "avatar of 'progressivism'," as opposed, say, to the militaristic racist fucktard who exploited the turn-of-the-20th-Century progressive movement, alienated hardcore progressives like William Jennings Bryan, bombed Veracruz, and tragically had a mostly-shitty reputation rehabilitated by the resumption of hostilities between Germany and the rest of the universe (proving him right about the need for some kind of international organization and the need for the allies to lay off Germany after the armistice instead of demanding pounds of flesh)? Me either. I kind of thought Woodrow Wilson was a pretty scummy President, but evidently he's supposed to be my some-kind-of-god or something because I'm a progressive and Wilson was elected by Progressives. Of course, by that logic, racists ought to adore the S.O.B., right?

Meh.

Michael Lind has a pretty good takedown of the Will piece over at Salon, and of course you can read the Will piece yourself if you're dying for a nosebleed. I won't even pretend to try to be as thorough as Lind; I'll merely note that in addition to the historical revisionism Lind focuses on, Will offers a feast for fans of non sequiturs and way-out-of-left-field statements. Like:

The name "progressivism" implies criticism of the Founding, which we leave behind as we make progress. And the name is tautological: History is progressive because progress is defined as whatever History produces. History guarantees what the Supreme Court has called "evolving standards of decency that mark the progress of a maturing society."


To which even an intelligent conservative might reply, "Whaaaa--" moments before his head explodes like the poor esper trying to pick Michael Ironside's brain in David Cronenberg's Scanners. (Our reader also would make the same fart-smell face and twist his head around like the exploding psychic does before he does his exploding thing, so the simile is perfect.) How, exactly, does "progressivism" imply criticism of the founding? Will doesn't elaborate on the point because of course he doesn't have one: it's something that sounds vaguely clever, but to the extent it actually means anything, what it means isn't actually true, unless one supposes that the budding flower is implicitly criticizing its roots (stoopid nurturing roots, we hates them). (Given the transitory nature of a bud and the enduring nature of a tree, I suppose a tree would be a superior metaphor, but we'll leave it be; choose whatever sessile lifeform makes the prettiest picture in your head, really, it's all good.) And this business about "History is progressive because progress is defined as whatever History produces," who the hell says that other than, perhaps, a once-significant conservative pundit struggling for relevance in the teabagger era who's trying to set up a strawman he can knock over? If a progressive ever actually said that, we'd all be Reaganites like Will because that's certainly what history has produced, or perhaps we'd all be neoconservatives, seeing as how that was recently the hip thing. I mean, there's not a liberal on the planet who would say that a bomb crater or fascist regime was an improvement because the inexorable forces of capital-H History led up to that moment.

It's not even a good strawman: it's too stupid a thing for a conservative to claim a progressive would say. Will would be better off claiming that we progressives are pro-unicorn-rape or something, because, you know, we think unicorns really have it coming with the way they add to rainbow-pollution with those ridiculous farts (and we all know progressives have despised rainbows ever since Woodrow Wilson was attacked by one in 1915). That would actually be a somewhat less-batshit thing for Will to write, but noooooooo....

Moving on, this also seems worth poking:

The liberating--for government--idea is that the Constitution is a "living," evolving document. Wilson's Constitution is an emancipation proclamation for government, empowering it to regulate all human activities in order to treat all human desires as needs and hence as rights. Unlimited power is entailed by what Voegeli [William Voegeli, author of Never Enough: America's Limitless Welfare State] calls government's "right to discover new rights."


It's worth pointing out, I think, that what we have here is one of modern American conservatism's cardinal sins. In the American system, "government" isn't merely a separate, distant, distinct institution the way "The Crown" is in a monarchy or "The Nobility" is in a feudal system or "The Papacy" is in a certain theocratic system that historically claimed for itself immunity from secular laws and self-rule. One of those awful Historical documents we progressives aspire to distance ourselves from and implicitly criticize says in its very first sentence:

We the People of the United States, in Order to form a more perfect Union, establish Justice, insure domestic Tranquility, provide for the common defence, promote the general Welfare, and secure the Blessings of Liberty to ourselves and our Posterity, do ordain and establish this Constitution for the United States of America.


Another terrible document we American progressives yearn to trample in the dust of our passing contains this bit of silliness:

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights, that among these are life, liberty and the pursuit of happiness. That to secure these rights, governments are instituted among men, deriving their just powers from the consent of the governed. That whenever any form of government becomes destructive to these ends, it is the right of the people to alter or to abolish it, and to institute new government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their safety and happiness.


Nearly a century later, somebody said this at a funeral (seems worth mentioning, somehow):

The world will little note, nor long remember what we say here, but it can never forget what they did here. It is for us the living, rather, to be dedicated here to the unfinished work which they who fought here have thus far so nobly advanced. It is rather for us to be here dedicated to the great task remaining before us—that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion—that we here highly resolve that these dead shall not have died in vain—that this nation, under God, shall have a new birth of freedom— and that government of the people, by the people, for the people, shall not perish from the earth.


It is inevitable that social constructs become independent of the people who comprise them, but we should never forget that the whole point of American government is that we are the government. We elect these people from among ourselves, and even the faceless bureaucrats we didn't vote into office are our friends and neighbors who applied for a job the same way anyone would apply to work at a factory or a used car dealership or a doctor's office. Setting aside the practicalities of corruption (something conservatives largely appear to be in favor of, given their typical stands on campaign finance reform), the American system is to have controlled revolutions every few years, putting into office our avatars, or at least the people we hope or expect will represent us. Of course, it will happen in the course of events that sometimes the body politic will elect someone we progressives despise or somebody those conservatives loathe--but the lovely thing about the whole setup, at least on paper, is that we all have the power to try to convince each other that our boy or their woman would have been better for the job and here's how you should vote in the next election cycle.

Alright, I'll admit Reagan and Will weren't the first people to say "government" is the problem and that plenty of liberals have said the same--and just as stupidly. But face it, if you think government is the problem, you're the problem, literally and actually, because it's your government and your government is you. Bitch about it when the totalitarian regime actually really does take over or we somehow adopt an actual caste system in which the us-and-them of citizens-versus-government is born and bred; in the meantime, by all means print fliers and Vote For Doe and so on.

(I also feel obligated to add, as I look over the Gettysburg Address again, that here we have a Republican President talking about progress and setting forth a progressive standard: "It is for us the living, rather, to be dedicated here to the unfinished work which they who fought here have thus far so nobly advanced." This, not obedience to historical forces--I presume Will is mistaking progressivism for Marxian/Hegelian mysticism--is the essence and definition of progressivism, and if I had to sum up progressivism in one sentence I might steal Lincoln's line--it is our task to dedicate ourselves to the unfinished, perhaps unfinishable, noble work that our predecessors have advanced--that is to say, have made progress on, thereby criticizing their predecessors, oh brother! Seriously, though: what is progressivism in a nutshell? What Abe said.)

It lastly has to be pointed out that the other shoe finally drops late in Will's piece when we discover, if "discover" is the right word to apply to something so tediously predictable, that the real gripe here, as ever, is that Will is unhappy paying taxes:

Although progressivism's ever-lengthening list of rights is as limitless as human needs/desires, one right that never makes the list is the right to keep some inviolable portion of one's private wealth or income, "regardless," Voegeli says, "of the lofty purposes social reformers wish to make of it."


What's most confusing here is the way it's put: by "inviolable portion," Voegeli (and Will) clearly mean "all." Given that one of progressivism's cardinal points is that every person should have enough to live, it logically has to follow that progressives believe everyone has a right to an inviolable portion, including people who would otherwise have nothing; were the progressive to say that wealthy Peter should be robbed utterly blind to pay poor Paul, he would be creating the absurd situation in which newly-rich Paul must be robbed to pay now-impoverished Peter, and oh, look, now we have to rob everybody again; clearly this would be stupid and clearly nobody is saying it except certain jackholes looking for strawmen to knock over (and they do it beautifully, I must say, and by "beautifully" I mean "by running around in mad circles wildly waving their arms and frothing like syphilitic lunatics enjoying 'outdoor recreation hour' at an Edwardian-era hospital for the criminally insane").

Progressive taxation is a long and complicated subject, and I don't really want to get into it too deeply now, except to say that a compromise has to be found between your right to have enough and your responsibility to contribute to the society that you continue to benefit from and that made it possible for you to be comfortable in the first place. Honestly, I've lately started to fantasize about taking all the tax-bitches from the upper class and removing them and their estates to a remote tropical isle--we could call it "Galt Island," why not?--and let them fend for themselves Lord Of The Flies-style and see how long it takes for them to start burning each other's mansions down and clubbing one another to death with bones at the watering hole à la the simians in 2001 (eat your heart out, Dennis Miller); I'm willing to wait, really, though I don't think it will take long for somebody to complain that somebody else's pool parties are making too much noise and then it's on. Also, the fact I wouldn't give them any food, just packets of seeds and maybe a few pigs and cows probably wouldn't help. (Or it would totally help, I guess, depending on your preferred outcome in this social experiment.) Naturally, I'm also not opposed to speeding things up by borrowing the rules of Battle Royale, allowing the denizens of Galt Island to cope with the vagaries of an unrestricted and unregulated economy in which some are randomly assigned useless crap and others lethal weapons while everybody gets an explosive radio collar that detonates in randomly-designated areas (fair is fair).

What were we talking about again? Oh yes, Will on taxation. Let me wrap it up with a summary of Mr. Will's position: "Whiiiiiiiiiiiiiiiiiiiiiine." There.

Anyway, any of the nearly-nice things you may have imagined I might have said about Will in the past--they never happened. Never, not ever. How's the weather?




What does Linux look like on a netbook?

>> Tuesday, June 22, 2010

The other day I said I might do a post on my Ubuntu netbook, and since I don't have anything else to write about and the subject might interest some folks....

The first thing to say about netbooks in general--and I pointed some of this out to my Dad on the phone--is what they aren't. A netbook is going to have a low-voltage CPU like Intel's Atom and a minimal amount of RAM (my Dell Mini 10 has a single gig of RAM--tons, if you remember the days when onboard memory was still measured in kilobytes, but only half of what would be considered a minimum amount of RAM on practically any laptop or desktop you'd buy these days); furthermore, the typical netbook is going to have a low-powered onboard graphics processor that barely deserves mention. All this is ample processing power if you want something you can throw under the arm and cart around for surfing the web or doing light tasks like word processing or watching video, but a netbook isn't going to gracefully handle a big, crunchy, CPU/memory intensive task like photo editing or playing a contemporary video game.

This is something you really shouldn't have to say but end up having to: I know people who have bought netbooks as laptop replacements and they've hated them, but frankly that was their own damn fault. A netbook is not a laptop replacement, it's a netbook.

I guess it also shouldn't have to be said but it ends up having to be said: there's been a lot of talk about the iPad being a "netbook killer." It isn't, not really; while there may be some overlap in uses and it's possible that the netbook's proper niche will shrink because of what the iPad does (hopefully not so much that manufacturers abandon the concept), what the netbook has that the iPad doesn't is a keyboard and the power to do true (if light) multitasking. The iPad is possibly a wonderful device for reading books and watching movies, but it's not exactly something you'll be writing your next novel on up at the local coffee shop, and the netbook serves that purpose wonderfully--while also letting you read books and watch movies. I'm not really trying to knock Apple, here, I'm just saying that iPads and netbooks are... well, forgive me, they're Apples and oranges.

The other thing I'd like to say about netbooks before I move on to the OS is that my Dell Mini 10 (named, natch, for its 10" monitor) is 7½" x 10½" and weighs less than three pounds. I think I told my Dad on the phone it was the size of a hardback--by which, I probably should have clarified, I mean a really lightweight hardback like Going Rogue or The Overton Window, not something substantial like Infinite Jest, let's say. "Trade paperback" might have been more descriptive, though a netbook obviously doesn't have floppy covers. At least not yet. Give it a few years, I suppose.

When I bought the netbook, I ordered it with Ubuntu pre-installed; Windows XP was also an option, but I can't think of any really good reason for getting a netbook with Windows unless you're just that afraid you'll get lost using a different interface. I'm not trying to knock Microsoft here, either, though I imagine that's how it sounds; Windows is the dominant environment, and if you depend on a Windows-specific app you should be using Windows, period. But a netbook is a low-power machine you're not likely to be using for high-intensity applications anyway, so if you really need a particular Windows program (games included), I don't know why you'd be getting a netbook in the first place. Find yourself a nice 12" laptop with more juice and just deal with the slightly greater size and price of the thing.

Unfortunately--and this may have changed, I don't know--the version of Ubuntu Dell was shipping was 8.04, a Long Term Support (LTS) release, but one already two versions behind what was current when I bought the machine. (It is quite possible that Dell is now offering Ubuntu 10.04 if they're still selling Linux preinstalls--10.04 is the current LTS.) I ended up reinstalling to take advantage of UNR/UNE--Ubuntu Netbook Remix, now called Ubuntu Netbook Edition. It was mostly hassle-free, though I did have some issues initially with video and wireless because of some hardware quirks specific to the Mini 10 (and apparently not applicable to the Mini 9 or Mini 10v). Those were solved fairly quickly thanks to the wonderfulness of the Linux community; let me say here that if you're concerned that "free software" doesn't offer "customer service," I believe you'll find that it's frequently far easier to get solutions from blogs, forums and bugfix pages than it is to get Microsoft to acknowledge your existence (and I'd be genuinely surprised if Apple was really better).

This is a 10.04 UNE desktop (as always, an image can be clicked for full view):



You can install a standard desktop environment like GNOME on a netbook, but the major point of the UNR/UNE releases is Maximus, an interface designed to be easy to use on a tiny screen. Maximus replaces the usual menus-and-folders interface with what is basically a window with a tab-bar running down the side. The first tab is a customizable favorites tab where you can put the stuff you like using most.

This is a tab opened to "Files & Folders," showing a USB flash drive plugged in:



Maximus really gets its name, however, from the fact that it maximizes every window by default. This, for instance, is Writer's Café on the netbook:



And here's OpenOffice.org Writer, displaying the same short story I had on display yesterday:



I use Dropbox to sync Writer's Café and stories I might be working on. That way, journal entries in WC (along with scrapbooks and notes) stay up to date between the machines, as do other projects: whichever machine I'm using uploads a file to Dropbox's servers and downloads it to the other machine when I turn it on (if there's a network handy, of course). It's great for personal use--I wouldn't do it with super-top-secret files or anything, but that goes without saying.
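If it helps to picture the "newest copy wins" idea, here's a minimal, purely illustrative sketch in Python--emphatically not how Dropbox works internally, just the concept applied to two ordinary folders. The paths are hypothetical (the second one would be something like a mounted share), and a real sync service adds conflict handling, versioning and the actual network transfer on top of this.

```python
#!/usr/bin/env python
# Toy sketch of "keep the most recent copy of each file in both places."
# Not Dropbox's actual algorithm; just the mental model.

import os
import shutil

def sync_newest(dir_a, dir_b):
    """One pass: for every file name seen in either folder, make sure both
    folders end up holding the most recently modified version."""
    names = set(os.listdir(dir_a)) | set(os.listdir(dir_b))
    for name in names:
        a = os.path.join(dir_a, name)
        b = os.path.join(dir_b, name)
        if os.path.isfile(a) and not os.path.isfile(b):
            shutil.copy2(a, b)                  # only A has it: copy to B
        elif os.path.isfile(b) and not os.path.isfile(a):
            shutil.copy2(b, a)                  # only B has it: copy to A
        elif os.path.isfile(a) and os.path.isfile(b):
            # Both exist: the newer modification time wins.
            if os.path.getmtime(a) > os.path.getmtime(b):
                shutil.copy2(a, b)
            elif os.path.getmtime(b) > os.path.getmtime(a):
                shutil.copy2(b, a)

if __name__ == "__main__":
    # Hypothetical paths, purely for illustration.
    sync_newest(os.path.expanduser("~/WritersCafe"), "/mnt/netbook/WritersCafe")
```

The point isn't that you should roll your own--Dropbox handles all of this, plus the genuinely hard parts, for you--just that the mental model really is that simple.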

All versions of Ubuntu 10.04, including UNE, integrate support for Ubuntu One, so why do I use Dropbox instead? Well, I tried U1 first, actually, and that's what sold me on the concept. Unfortunately, at the time I tried U1 it was very, very new and had a lot of bugs yet to be worked out--in particular, I ended up with a lot of conflicts during file syncs, and the uplink/downlink times were really, really slow. I am, as they say, cautiously optimistic about U1, however--the issues I was having may even have been resolved in whole or in part by now.

As for other things you can do with a netbook--here's Totem playing a Doctor Who episode:



Yep, the dude with the flashlight on his head and clothes dryer components on his chest is a Cyberman, busy menacing the Earth. Although I don't think it's actually stated in the episode, I presume that the ginormous fan was necessary because those older models were prone to overheating and shutting down at inopportune moments, and later versions presumably had better heat sinks or were maybe liquid-cooled.

This is fullscreen (the clip is a fairly low-fi .AVI file, and possibly not a great example of video, but it's what I had handy):



The speakers on a netbook, incidentally, are going to be predictably limited and a little tinny; watching a movie or listening to an album on headphones or earbuds, of course, solves those problems or at least replaces them with different ones.

All-in-all, using UNE on the netbook is a great experience. Maximus is the kind of thing that doesn't sound like it would be a good idea ("I like my menus and toolbars!"), but the truth is that it's astonishingly efficient on a small screen ("Converted!"). For a reliable little work-away-from-home device, it's pretty damn keen.

Hope that's a helpful follow-up, and I promise the rest of the posts this week will be about something other than Linux fanboyism. Really.





Since my Dad was asking...

>> Monday, June 21, 2010

The other day my Dad was asking about laptops and netbooks, etc., and of course I had to mention being a Linux disciple. And he happened to ask about the user interface, which gives me a little bit of an excuse to disciplize or whatever it is we're supposed to do when we preach the faith.

I should say, though, Linux isn't for everyone. Although Ubuntu is extremely user-friendly and in a lot of situations really will "just work," there are scenarios in which it can be a royal pain in the ass. I never have been able to get a Canon printer working under Linux, for instance (though my HP printer has worked so beautifully it's made me wonder why I kept trying to get my old Canon to work). There are certain applications that aren't quite "there"--GIMP may be a fantabulous graphics editing program, for example, but I'm willing to concede it's no Photoshop.

But then again, between reading some of the asinine Cult Of Mac responses to this Salon article from Dan Gillmor about his decision to go penguin--including one in which a Macolyte snarks "have fun compiling software" (something I've never had to do)--and fielding questions from my Dad, I realize there's a lot of mythology, misinformation and FUD still floating around out there. You really don't need to compile software unless you really, really want to (and it's nice having the option, I suppose); similarly, you don't ever need to look at a command line unless you want to (and the truth is that the command line is actually superior to a GUI in a lot of respects, it just requires learning another language, and that's a pain in the ass). And there are open source apps that are the peers of their proprietary counterparts: I'm sorry, but the word processor component of OpenOffice.org is superior to every version of Microsoft Word released in living memory in pretty much every respect I've been able to run across; this is simply a fact, as true as the sky being blue on a sunny day and fish preferring to live in water.

But whatever. The real point of this post was to proffer some screenshots from my desktop so possibly-interested parties such as my Dad could get a look at what a Linux desktop looks like--which is what a Windows or Mac desktop looks like, basically. The pictured setup is an Ubuntu Linux 10.04 (Lucid Lynx) installation running on a Dell Studio 17. I have a wallpaper-switching application that rotates background images; to avoid any kind of licensing and legal issues, the background used here is a photo I took in 2008 and processed with the GIMP.
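For the curious, a wallpaper rotator like that doesn't have to be anything fancy. Here's a rough sketch of the idea in Python--not the actual app I use, just an illustration--assuming the stock GNOME 2 desktop that ships with 10.04, where the background is read from the gconf key /desktop/gnome/background/picture_filename; treat the folder path and the fifteen-minute interval as made-up examples.

```python
#!/usr/bin/env python
# Bare-bones wallpaper rotator for a GNOME 2 era desktop (illustrative only).

import glob
import itertools
import os
import subprocess
import time

WALLPAPER_DIR = os.path.expanduser("~/Pictures/wallpapers")  # hypothetical folder
INTERVAL_SECONDS = 15 * 60                                   # switch every 15 minutes

def set_background(path):
    # gconftool-2 writes the key GNOME 2 watches for the desktop background.
    subprocess.call([
        "gconftool-2", "--type", "string",
        "--set", "/desktop/gnome/background/picture_filename", path,
    ])

def main():
    images = sorted(glob.glob(os.path.join(WALLPAPER_DIR, "*.jpg")))
    if not images:
        raise SystemExit("no .jpg files found in " + WALLPAPER_DIR)
    for image in itertools.cycle(images):
        set_background(image)
        time.sleep(INTERVAL_SECONDS)

if __name__ == "__main__":
    main()
```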

Alright, let's see what we have (all of these images may be clicked on for a large view at full 1920 x 1200 resolution)....

This is what the desktop looks like with nothing open:



The GUI used here is called GNOME, and is a shell with some similarities to earlier versions of the Mac shell--you may notice there are taskbars on the top and bottom of the screen, f'r'instance.

This is a screenshot with three windows open--a file folder, an OpenOffice.org writer window containing a recent short story, and a Writer's Café window (notice the tooltip of the day, aptly enough, is a certain famous two-word line from Douglas Adams--pure coincidence, believe it or not):



Ironically, this and other application views I'm sharing are sort of "unrealistic" in the sense that this isn't how I actually work on things--GNOME (like KDE and other Linux graphical interfaces) makes it easy to set up multiple virtual desktops; I like to have my word processor and Writer's Café maximized on separate desktops on the laptop (workflow is different on my netbook, which I may share some screens from later this week). Virtual desktops aren't unique to Linux: you can set them up on Windows (with third-party software and it's a pain in the ass) and I think they can be set up on Mac, though I don't know if Apple requires third-party tweaks to do it as well.

This is more like a typical screen when I'm writing:



Here are some more apps running:



What you see in the last picture are: a Terminal window (the command line), Rhythmbox playing Ennio Morricone's The Good, The Bad & The Ugly and a Totem movie player window playing an MP4 of NASA's video for "O Pato" with its hysterical sample of something that sounds like Donald Duck visiting a hooker (paused--while I could easily play the music video and the music player at the same time, obviously two tunes at once would sound like crap).

This is a Firefox window with the tab for Echo Bazaar (a browser-based game) in the foreground:



As you can see, using Linux is a lot like using any other contemporary OS. No scary command lines, strings of code, grotty interfaces or whatever. But if it's like using other OSes, with some quirks, why use it as opposed to Windows or Mac?

  • It's stable. Even when it crashes, it's stable.

  • It's free in several senses of the word--free as in beer and free as in you can do whatever you're willing to learn how to do.

  • As a related point, that freedom includes a lot of customization that Apple and Microsoft can't allow their users because they have trade dress to protect.

  • It's clean: the security of Linux (and, yes, Apple's OSX) isn't just due to the whole, "Oh, nobody writes viruses for Linux or Mac because everybody uses Windows" nonsense; while that may play a part, the fact is that the way applications interact with the kernel in a *NIX-based OS essentially precludes a virus or trojan from embedding itself without an administrator specifically authorizing it to do so. (There's a tiny illustration of this just after the list.)

  • This may tie into all of the above or it may merit its own bullet: I feel empowered using Linux, like my computer is mine and not just something I'm borrowing from Redmond or Cupertino.
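Here's the tiny illustration promised in the "It's clean" bullet: a few lines of Python that simply ask which directories the current user may write to. On a typical Linux box an ordinary account can scribble all over its own home directory but not over /usr/bin or /etc, which is the administrator-has-to-authorize-it point in miniature. (A sketch of the permission model, nothing more.)

```python
#!/usr/bin/env python
# Show which directories the current (presumably non-root) user can write to.

import os

def describe(path):
    writable = os.access(path, os.W_OK)
    print("{:<20} writable by this user: {}".format(path, writable))

if __name__ == "__main__":
    for path in ("/usr/bin", "/etc", os.path.expanduser("~")):
        describe(path)
    # Typical output for a regular user:
    #   /usr/bin             writable by this user: False
    #   /etc                 writable by this user: False
    #   /home/somebody       writable by this user: True
```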


There's probably something else, but that's enough for now.

I hope this post is helpful as a data point for somebody. Linux isn't for everyone, I'll say it again. But it also isn't really leaving anyone out--well, except for gamers, who are stuck with Microsoft or consoles. But for almost anything else you want to do: you can do it on Linux, and you can often (not always, but probably most of the time) do it as painlessly as you'd do it on a Mac or Windows machine.

Later in the week I may throw up some screenshots from the netbook--I have Ubuntu UNE Linux installed on it, and it's a slightly different beast--in a good way, a very good way.


Tree of life

>> Sunday, June 20, 2010

I was just going to give you a music video today because I didn't have any blogworthy material and expect to be a little busy with other things, but this was interesting; when I was looking at the RSS feed from Pharyngula, I found that P.Z. Myers had a piece up on this wonderful little--no, that's the wrong word, it's compact but ginormous--thing:



What you're looking at is a radial tree of life created by some University Of Texas researchers--David M. Hillis, Derrick Zwickl, and Robin Gutell--designed to show the biological relationships between about 3,000 species whose RNA has been sampled. The full size thing, a fifty-three inch by fifty-three inch PDF file, can be downloaded here, and it's worth visiting the page and downloading it to your machine just to get a good gander at it.

You should certainly take a look at this little arc, at the very, very least:



Hello, entire human race! Aren't we just adorable?

I don't know if it needs to be said but we'll mention it anyway: as many or all of you probably already know, the linear "tree of life" images most of us saw in our biology textbooks (and soon to be excised from Texas schoolbooks altogether, one imagines) are awful (Texas, ironically, may be doing students a small favor, on second thought). They create the illusion of progress in evolution, which is more than a little deceptive since evolution is not an inherently progressive process. Creatures whose innate traits give them advantages when passing on their genetic information to future generations pass those traits on, whether those traits involve greater complexity or greater simplicity of form. There are reasons for complexity, mind you, some selective (finding new environmental niches in which there's room for a new competing species frequently involves greater specialization), some accidental (once a physical feature has become fixed in a population, it tends to remain even as new features are added--hence, to provide a standard-issue f'r'instance, humans possess an organ that appears to serve little more purpose than to occasionally become infected and kill its owner unless surgically removed).

The linear tree spiking up from some distant past into an eternal present implies progress that isn't there, while an eternally-expanding ring whose edge holds a flickering array of appearing and vanishing species is, I think, probably as perfect as a visual can be (in two dimensions, at least--a sphere might be even better if you could maybe make it a hologram like the one used to plan the demise of the second Death Star in Return Of The Jedi). Somewhere at the distant center of the supernova is whatever microscopic, watery, self-contained chemistry kit that sat briefly on the threshold between adaptive self-replication and soup. And here on the surface--ourselves, crammed in with every single living creature we can see and the billions of living or lifelike critters we can't.

It is a thing, a hell of a thing, to marvel at, brothers and sisters.

Go in peace.



"Something Left, Something Taken"

>> Saturday, June 19, 2010

This is a remarkable little animated short with a unique visual design that rewards a full-screen view. It's also probably the most adorable film ever made about the Zodiac killer, which may sound like some kind of weirdly-backhanded compliment but really isn't: although the short doesn't use traditional stop-motion techniques, the creators constructed and photographed puppets and models made of various materials--fabric, pencil erasers, cardboard, yarn, etc.--and then digitally animated the images in Adobe After Effects. The result really is visually interesting and worth a look (and I recommend you visit the creators' blog at the link in the previous sentence--the behind-the-scenes view really does enhance one's appreciation of what's been created here).

Seriously, take a look. It's really, really good stuff.

Max Porter's and Ru Kuwahata's "Something Left, Something Taken":






A little love for my home state on a Saturday:

>> Friday, June 18, 2010

EDIT: Or for a Friday. Dagnabit. See, what happened is I pre-posted this a couple of days ago because I wasn't sure how the rest of the week would go and to get it off my to-do list. And then I miscounted the number of days and thought, hell, make it a Saturday post. Which it isn't. Which is ironic, actually, because this is better as a Friday post--after all, Nina Simone singing "I Got Life" ought to be a reminder that you've got it, too, and what a great thing to remind yourself of after a long week.

I'll go ahead and 'fess up that there's another post already in the pipe for tomorrow, with a charming-but-ghoulish little animated short, but it doesn't think it's firing on a Sunday.

And thanks to Vince for sending me a tweet about this, otherwise I would've noticed... I don't know... much later, probably.





A little home state pride on a Saturday, and why not? My North Carolina girl Eunice Waymon, a.k.a. Nina Simone, performing "Ain't Got No" / "I Got Life" from Hair. Give it up for the beautiful Ms. Nina, please:






The Noisettes, "When You Were Young"

>> Thursday, June 17, 2010

It is completely and utterly impossible to say too many nice things about The Noisettes.

Covering The Killers' "When You Were Young," live for the BBC:






Quote Of The Day

>> Wednesday, June 16, 2010

On American Europhilia:


Liberals want the U.S. to learn from contemporary Europe how to minimize extremes of inequality. Conservatives want the U.S. to learn from Victorian Britain how to design charity for the poor. Liberals want to learn from the successful practices of particular existing 21st-century European countries how to deal with particular issues like retirement policy, regional industrial clusters and apprenticeship policies. Conservatives want to learn about the modern economy by exhuming the speculations of long-dead European classical liberals -- French, English and Central European. Liberals want the U.S. to learn from the contemporary European Union how to structure a workable system of international institutions. Conservatives want the U.S. to study the grand strategy of the 19th-century British empire and the tactics of the 19th-century Prussian army. Liberals want to view the American founding and the Civil War through their own eyes. Conservatives want to view them through the eyes of a German professor born in 1899.

- Michael Lind, "Why do conservatives want to European-ize America?,"
Salon, June 15th, 2010


Well, now I've gone and done it...

>> Tuesday, June 15, 2010

So, I sent off a 1,000-word zombie erotica thing to that "Rigor Amortis" thing phiala brought to my attention. Kind of a minor sort of thing but not really, as I've never worked up the balls to submit anything to anybody before.

How do I feel about it? Sort of nauseous, actually. I don't really expect anything to come of it, so I can't quite put my finger on why I feel slightly terrible. Insecurity, I guess.

But, you know, what do I have to lose, right? I mean, what's the worst thing that can happen?

(Don't answer that. It probably involves, I don't know, rabid ferrets or something, right?)

Anyway, figured I'd let the few of you who don't tweet know that I've now officially joined the ranks of writers awaiting rejection notices. That'll be all. No, wait, not quite: I am grateful and would thank everybody who's been encouraging and supportive. Seriously, you're part of the reason I grew a pair, finally. Thank you.



Okay, this was interesting: the Leidenfrost Effect and medieval law

Sort of a random topic, but I thought this was kind of interesting: going through RSS feeds the other night, I stumbled across this item at Kottke, featuring this video edit of the Mythbusters dipping their hands in molten lead:






The reason they can do this is the Leidenfrost Effect; from Wikipedia (in case you can't watch the Mythbusters video right now):

The Leidenfrost effect is a phenomenon in which a liquid, in near contact with a mass significantly hotter than the liquid's boiling point, produces an insulating vapor layer which keeps that liquid from boiling rapidly.... The effect is also responsible for the ability of liquid nitrogen to skitter across lab floors, collecting dust in the process. It has also been used in some potentially dangerous demonstrations, such as dipping a wet finger in molten lead or blowing out a mouthful of liquid nitrogen, both enacted without injury to the demonstrator.

It is named after Johann Gottlob Leidenfrost, who discussed it in A Tract About Some Qualities of Common Water in 1756.


Now, what I found most fascinating about this, frankly, isn't the physics demonstration (which is undeniably très cool), but that it immediately reminded me of Charles Mackay's classic, Extraordinary Popular Delusions And The Madness Of Crowds.

If you haven't read Extraordinary Madness, it remains a fascinating and relevant work more than a century and a half after publication. Rambling in scope, Mackay fills out a history of early economic bubbles and crises with thoughts on everything from religious manias to fads in pop music (it turns out, if you didn't know, that a stupid song becoming inordinately, inescapably popular and then abruptly vanishing like it never existed when everybody was sick of it is something that happened as much in the 19th Century as now).

In his fifteenth chapter, "Duels And Ordeals," Mackay talks about--well, the chapter title is pretty self-explanatory. In its earliest phases, the legal system (such as it was) resolved disputes by letting people beat each other up or by engaging in casual torture under the theory that if God thought somebody was innocent, they'd live or at least not lose a limb; Mackay thinks this was really stupid and says so. He also describes some of the trials by ordeal used to determine guilt or innocence before they invented lawyers:

By the fire-ordeal the power of deciding was just as unequivocally left in their [the clergy's] hands. It was generally believed that fire would not burn the innocent, and the clergy, of course, took care that the innocent, or such as it was their pleasure or interest to declare so, should be so warned before undergoing the ordeal, as to preserve themselves without any difficulty from the fire. One mode of ordeal was to place red-hot ploughshares on the ground at certain distances, and then, blindfolding the accused person, make him walk barefooted over them. If he stepped regularly in the vacant spaces, avoiding the fire, he was adjudged innocent; if he burned himself, he was declared guilty. As none but the clergy interfered with the arrangement of the ploughshares, they could always calculate beforehand the result of the ordeal. To find a person guilty, they had only to place them at irregular distances, and the accused was sure to tread upon one of them. When Emma, the wife of King Ethelred, and mother of Edward the Confessor, was accused of a guilty familiarity with Alwyn, Bishop of Winchester, she cleared her character in this manner. The reputation, not only of their order, but of a queen, being at stake, a verdict of guilty was not to be apprehended from any ploughshares which priests had the heating of. This ordeal was called the Judicium Dei, and sometimes the Vulgaris Purgatio, and might also be tried by several other methods. One was to hold in the hand, unhurt, a piece of red-hot iron, of the weight of one, two, or three pounds. When we read not only that men with hard hands, but women of softer and more delicate skin, could do this with impunity, we must be convinced that the hands were previously rubbed with some preservative, or that the apparently hot iron was merely cold iron painted red. Another mode was to plunge the naked arm into a caldron of boiling water. The priests then enveloped it in several folds of linen and flannel, and kept the patient confined within the church, and under their exclusive care, for three days. If, at the end of that time, the arm appeared without a scar, the innocence of the accused person was firmly established


And even more revealing, in a related footnote Mackay adds:

Very similar to this is the fire-ordeal of the modern Hindoos, which is thus described in Forbes's "Oriental Memoirs," vol. i. c. xi.—" When a man, accused of a capital crime, chooses to undergo the ordeal trial, he is closely confined for several days; his right hand and arm are covered with thick wax-cloth, tied up and sealed, in the presence of proper officers, to prevent deceit. In the English districts the covering was always sealed with the Company's arms, and the prisoner placed under an European guard. At the time fixed for the ordeal, a caldron of oil is placed over a fire; when it boils, a piece of money is dropped into the vessel; the prisoner's arm is unsealed, and washed in the presence of his judges and accusers. During this part of the ceremony, the attendant Brahmins supplicate the Deity. On receiving their benediction, the accused plunges his hand into the boiling fluid, and takes out the coin. The arm is afterwards again Sealed up until the time appointed for a re-examination. The seal is then broken: if no blemish appears, the prisoner is declared innocent; if the contrary, he suffers the punishment due to his crime." * * * On this trial the accused thus addresses the element before plunging his hand into the boiling oil:—"Thou, O fire! pervadest all things. O cause of purity! who givest evidence of virtue and of sin, declare the truth in this my hand!" If no juggling were practised, the decisions by this ordeal would be all the same way; but, as some are by this means declared guilty, and others innocent, it is clear that the Brahmins, like the Christian priests of the middle ages, practise some deception in saving those whom they wish to be thought guiltless.


Notice that in the Indian version Mackay describes, the hand is washed, and therefore wet, just before being plunged into the boiling oil.

Obviously, some of these trials aren't situations where the Leidenfrost Effect might occur. But with regard to the trials in which it potentially came into play--e.g. the Indian trial, in which the hand is wet and then immersed in boiling oil--it certainly seems possible that the effect, and not fraud or divine favor, did the work, doesn't it? It seems impossible to me that this is a novel observation--surely somebody in the past 254 years has suggested it--but it's still fun to think about.

Mackay published Extraordinary Popular Delusions in 1841; the Leidenfrost Effect had been described in 1756 and seems to have been reasonably well known to engineers of Mackay's era; nonetheless, Mackay hypothesizes that fraud may have been at play when those undergoing ordeals to prove their innocence managed to grab a coin from a pot of boiling oil or water. One wonders how the effect escaped his notice, as thorough and wide-ranging as his work was.

The Leidenfrost Effect would explain, for one thing, why some people survived ordeals unscathed while others fried a limb. As the Mythbusters discovered, variations in the temperature of the molten lead had a profound effect on whether the Leidenfrost Effect actually occurred; in insufficiently hot lead, ironically enough, a test sausage cooked more thoroughly than it did in lead hot enough to trigger the effect.
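For a rough sense of why that skin of vapor matters so much, pretend the brief moment of contact is nothing but heat conduction across the same thin gap in both cases. This is strictly a back-of-the-envelope comparison with illustrative conductivity values, not a real boiling-heat-transfer calculation:

% illustrative conductivities near 100 °C, both in W/(m·K); only the ratio matters here
\[
q = \frac{k\,\Delta T}{d}
\qquad\Longrightarrow\qquad
\frac{q_{\text{through a steam film}}}{q_{\text{direct liquid contact}}}
\approx \frac{k_{\text{steam}}}{k_{\text{water}}}
\approx \frac{0.03}{0.6}
\approx \frac{1}{20}
\]

In words: for the same temperature difference and the same film thickness, heat leaks into the hand roughly twenty times more slowly through a layer of steam than through direct contact with the liquid--which, for the second or so the hand is in the pot, is plausibly the difference between a scare and a scar.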

One might also speculate that Mackay was partly right about the clergy engaging in some fraud: perhaps someone the priests believed was actually guilty was allowed to put a dry hand in the pot (ouch), or the pot wasn't heated enough to trigger the effect (again, ouch). Meanwhile, an innocent party might be subtly (or not-so-subtly) encouraged to move a little faster, before the protective layer of steam between limb and hot liquid boiled away.

The thing about this speculation is that physics seems to me rather more likely than either God or fraud. One issue I always had with Mackay's thoughts on ordeals was that fraud seemed a bit too pat--you can't disregard the possibility altogether (Uri Geller has convinced plenty of rational, intelligent, educated people that he has telespoonesis), but one imagines that, surely, someone would have noticed something a bit off about that pot of oil or hot lead if it were a con, or that somebody would have helpfully screwed it up ("Hey, this kettle isn't hot at all, let me throw some more coals on!"). But it turns out you don't need a miracle or deception, which is pretty awesome, really.

All you need is steam.

Read more...

"I don't know, man, can I just watch the snuff film instead?"

>> Monday, June 14, 2010

I can't believe I didn't see this sooner: the perfect mash-up of Nicolas Cage's godawful 8mm and his legendarily bad remake of The Wicker Man.

Honestly, I haven't seen the Wicker Man remake in its entirety, but I can tell you this clip does accurately capture my reactions to 8mm.

The police should investigate, indeed.





Read more...

An open letter to David Walker

>> Sunday, June 13, 2010

Mail From Dave!

From: David Walker (debbwalk141@gmail.com)

Sent: Fri 6/11/10 1:51 PM
To: debbwalk141@gmail.com

Hello , I will like to seek your help in a business proposal , which although is sensitive by nature and not what I should discuss with someone I don't know and have not met using a medium such as this but I do not have a choice .

I am Mr. David Walker personal attorney to late Dr. Edward, who died of a cardiac arrest a few years ago leaving behind a large sum of money with a commercial bank in the Island of Seychelles which is a tax free zone, a place where plenty of rich people tend to hide away funds not ready to be used or invested. I will not mention the amount of money which runs into several millions in United States Dollars and name of bank presently until we have agreed to deal.

I trust you will understand the need for such precautions. So far, valuable efforts has been made to get to his people but to no avail, as he had no known relatives more because he left his next of kin column in his account opening forms blank and he has no known relative. Due to this development the bank has come forward to ask us as his personal attorneys to bring forward a close relative to claim the funds otherwise as the Seychelles national laws would have it, any dormant account for five years will be declared unclaimed and then paid into the government purse. To avert this negative development my colleagues and I have decided to look for a reputable person to act as the next of kin to late Dr. Edward so that the funds could be processed and released into his account, which is where you come in.

My law firm will also act as your personal attorneys since we will be portraying you as being directly related to our late client being from the same country. All legal documents to aid your claim for this fund and to prove your relationship with the deceased will be provided by us. Your help will be appreciated with 30% of the total sum which I would disclose in my next email.

Please accept my apologies, keep my confidence and disregard this letter if you do not appreciate this proposition I have offered you. I wait anxiously for your response.

Yours Faithfully,

David Walker


Dave, old boy:

I don't know how you got this e-mail address, although it looks like you sent a copy to yourself for some reason and it was mis-forwarded to me, or maybe you just have a guardian angel. Anyway, it's your lucky day.

This sounds like the perfect job for me and my team. We work fast, we work smart, and if we weren't so soaked up with love for our fellow man, brother, we wouldn't work cheap. We all have our reasons for doing what we do, whether it's a passion for helping children, enjoyment of the company of the ladies, being just a little crazy, or simply loving it when a plan comes together. Maybe we're all crazy. But I don't know if I should get too much into that, brother, because the less you know, the less you have to lie about later.

Assuming your bona fides check out, pal, and you're not really a certain U.S. Army Colonel taking us for suckers, we'll need any other information you have about the subject's family. Names and physical descriptions would help, but so would things like: is their host state hostile, how many guards are there, what security systems are in place, will we need to bribe any officials, who can we disguise ourselves as, etc.? You'll be expected to cover expenses, of course, although with enough time my team does have the ability to "produce," shall we say, certain necessary documents if needed. We have transportation covered if you can assist in the shipping of a 1983 GMC G-15 van to the host state.

Once we're in, we'll extract Dr. Edward's heirs and bring them to a designated retrieval site (we'll choose it and let you know when and where to meet us for the transfer), and then they can collect the inheritance you're holding for them. We'll expect a modest fee, of course. Naturally, the exact amount can be negotiable if any of the heirs happen to be adorable orphans or an attractive young lady in her early 20s.

As for your offer of legal representation: does your firm have any experience with the UCMJ? As you might know, my pals and I have had certain legal difficulties stemming from a misunderstanding between ourselves and our previous employers resulting in certain further misunderstandings culminating in a huge misunderstanding that led my associates and me to alter our then-current residential arrangements against the preferences of the United States Government, which the Government didn't take quite as well as we'd hoped and which triggered even more misunderstandings which led to other misunderstandings, etc. Do you know anything about appealing a court-martial?

It sounds like you have a problem and no one else can help, and you've found us. I look forward to working with you.



Sincerely,
[name withheld], Lt. Col., U.S. Army (fmr.)


Read more...

Dear Abby

>> Saturday, June 12, 2010

Abby Sunderland (http://www.abbysunderland.com/) is coming home. And I'm glad, and I hope she takes another shot at sailing around the world, even if she won't be a record-setter.

And yet there are people who consider the whole venture to have been a bad idea in the first place.

I've suggested that humans are children until they're in their early twenties and that children ought to be entitled to special considerations in juvenile court proceedings. Judgement isn't something kids are known for, and I'm certainly not about to back down from that.

But I also don't really see that as being the issue. Or to the extent that I do see it as an issue, I see the issue as being that personal judgement is something that one develops through practice and experience, not by sitting at home being coddled by one's parents. One of the dominant themes in the letters one sees at Salon (and elsewhere--I merely provided the link to Salon as a handy exemplar; I've been seeing these kinds of comments on other sites as well) is safety: how dangerous this whole affair was, and what were the parents thinking? Which strikes me as bizarre, albeit typical: if I walk to the corner restaurant, I might be mugged by somebody, so I shouldn't do that... of course, if I lie in my bed hiding under the covers and never leave, I risk bedsores.

Life, as the old joke goes, will kill you. Some of us will die when we're infants and some of us will die when we're 112. Some of us will be hit by buses and some of us will have strokes and some of us will fall out a window and some of us will be mauled by bears; actually, at the present writing there is exactly one way to come into existence (your parents had sex) and several gajillion bazillion ways to exit the stage. Someone out there will die in a plane crash despite never setting foot on an airplane, and someone else will walk away from being shot several times in the face and subsequently die a half-century later from food poisoning. Another famous old expression says that the only things in life that are certain are death and taxes; this is a lie, as skilled lawyers and accountants can reduce your tax liability to zero, while the Grim Reaper will eventually hunt your ass down regardless of the fact you looked both ways before crossing every street, took plenty of vitamins (but not too many!), had all your shots and brushed your teeth thrice daily and exclusively ate organic macrobiotic meals between sleeping eight hours and obsessively doing aerobics.

You will die. I'm very sorry if this is news, somehow.

Now, to be frank, I have a special horror of drowning and find the thought of freezing to death rather frightening, and you can do either or both in the Indian Ocean in the middle of winter. On the other hand, to the extent we might worry about what our obituaries might say about us, "Heroic Teen Dies In Solo Circumnavigation Attempt" sounds a little more badass than "Old Fat Guy Chokes On Big Mac." You surely must concede this; how can you not?

I am thrilled that Abby Sunderland's obituary remains unpublished (I'd say unwritten, but no doubt there were news organizations preparing for the worst this past Thursday; their drafts will have to be revised and extended, huzzah!). While the worst wouldn't have been a surprise, I never shared Roger Ebert's despair. For one thing, kids are physically tough little bastards by evolutionary design; for another, judgement and resourcefulness are different things.

This deserves its own paragraph, you know. Just because a kid isn't as capable of consistently making good choices, or at least isn't as capable of the consistency that ought to be expected of an adult, doesn't mean the kid isn't clever or knowledgeable or smart. Indeed, a kid's problem is likely to be that he or she is too crafty for his or her own good, but that's another topic. (It might also be noted that adults don't always choose well or wisely.) The point here is that just because there's a high risk a kid will choose badly when all of his or her friends are sniffing paint thinner fumes and telling him or her how much fun it is, it doesn't follow that the kid is going to instantly drop dead the moment he or she gets lost in the woods or is facing thirty-foot waves in a small boat in the middle of the vast ocean. We get so used to kids choosing badly when it comes to things like what constitutes a nutritious dinner that we're more surprised than we ought to be if a kid is pulled out of a swamp alive after several days, as if a kid who thinks candy is a food group is likely to try hopping on an alligator's back for kicks, because, you know, the first choice is kind of dumb so it must follow that every single thing this child will ever do again will be dumb. Follow that logic to its inevitable conclusion, and our children are the equivalent of farm turkeys who will drown if they look up during a rainstorm.1

There is also an argument that I'm of two minds over, though it's the first thing that comes to mind in this context. That would be the fact that, once upon a time, the British Navy was full of twelve-year-olds. A sixteen-year-old would have probably been a leftenant or something, aside from the fact that Abby Sunderland is a girl and would, of course, be married and have three children. That last bit of snark is why I'm of two minds, actually: we don't have twelve-year-olds serving in the navy now because we have those evolving standards of decency that dictate things like "little kids shouldn't be in the armed forces" and "girls can do things, too," and it would be a fallacy2 to claim that because children of a particular era were considered eligible to be sailors, they should be now. And yet--the fact remains that people younger than Abby Sunderland, living in an age before GPS and cell phones and satellite beacons, would've been deemed competent sailors, though admittedly they wouldn't necessarily have been attempting solo circumnavigations (though one suspects this has less to do with the ability of those sailors-of-old than it does with technical limitations that made long-distance sailing a larger-scale and more arduous effort; e.g. even something as simple as food supplies, which would have consisted of tinned meat and vegetables and a complement of live animals, as opposed to freeze-dried anything stuffed in a small bin).

I suppose what I am really trying to get at with all of this is: I'm not a parent, so maybe it's easy for me to say this, but if I had a talented, bright sixteen-year-old with aptitude and experience in doing something risky, I'd let her do it even if it terrified me. And I'm sure it would. But the rub is that I'd probably be just as terrified if she tried doing it at age thirty-two, when all of the same awful things that could happen when she was sixteen could still happen. Because in the end we all die anyway, but the cliché that some of us never really live is absolutely true, and I would want my children to be tough and brave.

Happy sailing, Ms. Sunderland, and may a good wind be at your back on your next attempt.




1If your rebuttal is that this is exactly how most children are and they all grow up to become Fox News viewers, QED, I'm afraid I have to fold. You win.

2An "is-ought" (or I suppose in this context, "was-should") fallacy, but not technically a naturalistic fallacy.




Read more...

I'll bet Chinese Democracy would have sold more copies if this had been G'N'R's lineup at the time...

>> Friday, June 11, 2010

Axl Rose and Bruce Springsteen performing "Come Together." Because... because... because why the hell not?

Hope you're having a cromulent Friday, people.





Read more...

Thursday

>> Thursday, June 10, 2010

The alarm clock went off this morning. I thought, "That's weird that my alarm went off on a Saturday."

Then I remembered it was Friday.

And then I remembered it wasn't Friday, either.

Read more...

Making a note of it

Laura Miller is still at it. Good grief.

Some readers may recall that last month I ran a post critical of Miller's Salon review/embrace of Nicholas Carr's The Shallows, a book everybody seems to be talking about although I don't know if anyone is actually reading it. One of Miller's cutesy little touches was to dump what would have been inline hyperlinks at the end of the article instead of actually including them where they were relevant, a practice she's continued. She does this, evidently, because there are studies and anecdotal evidence suggesting hyperlinks are a distraction and make it hard for readers to focus on the text.

Whatever.

No, I mean, Miller and the studies and the anecdotes she's relying on may all be correct, but it doesn't make her right. I realize that including hyperlinks may distract you from what I'm saying in a paragraph if you can't resist the temptation to immediately left-click on the link, or if you know enough to right-click and open in a new tab or window but can't resist the temptation to read the new tab or window first, or if you just can't resist waving your mouse pointer over the link to see where it goes or what the hover text says, or, or, or.... All of that is something I'm happy to concede.

The problem, however, is that Miller has no actual point: reading a text is not necessarily an easy or linear or natural task for a reader, and writing isn't a linear task for an author, either. Miller seems to think a linkless text is somehow a sign of better writing, or that good writing is diminished by the use of links:

A sentence that's written to include hyperlinks won't necessarily make as much sense without them. You write differently when you know you can't dodge explaining yourself by fobbing the task off on someone more eloquent or better informed. You have to express what you want to say more completely, and you have to think harder about what information ought to be included and what's merely peripheral. (Knowing what to leave out is as important to writing well as what you include.) Furthermore, I've found that if I want to make my paragraph of end links meaningful, I need to include some additional text to explain what the source pages are and why the reader might find them valuable.

All of this adds up to more work for the writer. However, I'd argue that this work is precisely what a nonfiction writer is supposed to do. Our job is to collect and assimilate information about a particular subject, come to some conclusions and put all of this into a coherent linear form so that it can be communicated to other people. That's the service we provide. All of us may now swim in a vast ocean of interlocking data nuggets, but people can still only read one word at a time, and putting the best words (and the best ideas) in the best order remains the essence of the writer's craft.


Let me point out what I'd think would be an obvious flaw to any reader even remotely familiar with academic writing or the work of David Foster Wallace: the fact that text is a linear format while thought--written or read--is nonlinear forced writers to invent hypertext for the page ages ago. They're called footnotes or endnotes, depending on whether they appear at the bottom of a page or the end of the text, and might consist merely of a symbol (e.g. an asterisk) and accompanying note or a list of numbered break-outs or, depending on your preferred manual of style, an entire system of parenthetical-text-and-accompanying-source-or-explanation manifested at the end of the work.

There are assorted ways a reader might grapple with this "inconvenience to readers who are prodded to check out how clever the writer is" (as Miller quotes Sarah Hepola as saying). Some readers ignore notes altogether. Some readers read with a finger or bookmark in place at the end of a book and go back and forth (DFW's Infinite Jest was a two-bookmark book for me). Some readers read all of the notes in one burst after they're done with the meat. There are probably some other methods out there, as it seems like this is an area in which a reader learns what method(s) works best for himself or herself.

At the moment, my reading list includes Vincent Bugliosi's Reclaiming History, the famed former prosecutor's mammoth disquisition on the assassination of John F. Kennedy. I've been reading this book for a couple of years, actually, mostly because it's too heavy to take to lunch and I've had other things I've wanted to read in the meantime, not because there's anything that's hard to focus on. Now, this book happens to ship with a CD-ROM containing two PDF files: one contains Bugliosi's endnotes and the other his citations. I've copied these to my netbook, and I do sometimes read the endnotes at lunch when I have the netbook with me. Had Bugliosi included these documents as printed text, History (already the size of a respectable dictionary) would have been three times its present size, and therefore more expensive for the publisher to print and for the reader to buy, and it would have had proportions more suited to building a schoolhouse or bunker than to holding on one's chest with one hand in bed (the other hand being necessary for a suitable nightcap). Is Bugliosi's tome unfocused because he had additional materials that he wanted to make available although they didn't fit into the main thread of the primary text? No. Furthermore, was Bugliosi wrong to want to include references to every single primary and secondary source he relied upon in writing a book on a controversial subject in which most works have been poorly researched or have relied upon hearsay, misinformation and innuendo? Absolutely not.

I might add that, as a reader, I feel less inconvenienced and more like I managed to get at least two books for the price of one, alongside one of the most comprehensive bibliographies ever assembled on the JFK assassination. Score!

I'm also reading a translation of several of Akutagawa Ryunosuke's short stories; translator Jay Rubin helpfully offers endnotes explaining various points of history, culture, language, calligraphy (like many Asian authors, Akutagawa sometimes took advantage of the structure of Japanese/Chinese characters to convey secondary meanings), the art of translation, etc. This is one where my bookmark relocates itself from my starting point to the section in the back containing the notes for the story I'm reading. Inconvenient? Mildly and hardly at the same time--while I minored in Asian Studies in college, that was long ago and there is much I've forgotten or never learned (including Japanese), and it's far less distracting to thumb to the back to learn who a minor Tokugawa official was or how a character's name is really a pun (to offer a pair of f'r'instances) than to read through the story and miss a point or be lost at sea. This is an example of an annotated text in which I read a sentence or paragraph, thumb back to the note, thumb back to the story and re-read the sentence or paragraph with the new information. And it's really not that difficult, honest.

Drew Gilpin Faust's This Republic Of Suffering, a book about how death was handled during the American Civil War, which I'm reading partly because I've been grappling with trying to write a novel about Civil War zombies (yes, I know it's been done--my version is going to be better, nyah), is also annotated, but I've mostly skimmed past the notes--they're reference notes, not comment notes, and not too vital to me for now. I'll probably skim through them when I'm done with the text itself and make sure I didn't miss anything.

There's not a substantial difference between hyperlinks and annotations in a text; or, if there is, it's that HTML allows you to embed an annotation within the body of the text without a nubbin at the end of a word or sentence and that the annotation potentially takes you directly to the source (that is, instead of reading a footnote attributing a thought to Mr. Smith's fine meditation on the medieval antecedents of the doohickey which you must take on faith or drive to the library to double-check, you may immediately jump to Mr. Smith's work and read it or purchase it on the fly if you wish). With regard to the latter point, it might be noted that the immediacy of the link not only offers instant gratification, but instant truth: whereas certain unscrupulous authors (in any medium) might cite a nonexistent source or deliberately mischaracterize an authority, to do so with an HTML link is a suicidal act of chutzpah that invites instant unmasking (unbelievably, I've seen online frauds do this very thing, as I'm sure you have as well). But as far as "shallowness" or "ease of reading" goes, I'm not seeing it.
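If it helps to make that concrete, here is roughly what the two conventions look like under the hood; the URL, title text, and anchor names below are invented purely for illustration:

<!-- Inline hyperlink: the note lives inside the sentence itself -->
<p>Mr. Smith's <a href="http://example.com/doohickey" title="Smith on the medieval doohickey">fine meditation</a> covers the doohickey's medieval antecedents.</p>

<!-- Footnote style: a nubbin in the text points at a note elsewhere on the page -->
<p>Mr. Smith's fine meditation covers the doohickey's medieval antecedents.<sup><a href="#note1" id="ref1">1</a></sup></p>

<p id="note1"><a href="#ref1">1.</a> Smith, "On the Medieval Doohickey," http://example.com/doohickey.</p>

Either way it's the same gesture: a marker in the text pointing the reader somewhere else. The only real difference is whether the "somewhere else" sits at the bottom of the page or on somebody else's server.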

And Ms. Miller surely isn't about to castigate annotated works, is she? She surely doesn't mean to suggest that a footnoted work of nonfiction or fiction is inferior to one that doesn't "show its work," is she?

In my mind, I come back to one of my favorite writers, the poor, late David Foster Wallace. One of Wallace's masterpieces, a work frequently cited at his death as the thing of his you must read, was his (in)famous nonfiction essay for Gourmet, "Consider The Lobster." Wallace was hired by the magazine--a food-porn sort of publication--to report on the Maine Lobster Festival; once there, the sensitive, thoughtful writer found himself troubled by unanswerable questions about the ethics of eating, and wrote a provocative, questing, moving piece. Do lobsters feel? Is it okay to eat them if they do? Is it okay to eat them if they don't? How do we know what goes on in the head of any other creature? Unlike many shallower writers, Wallace avoided writing a polemic, mainly asking the troubling questions and ruminating on them. It definitely wasn't what Gourmet asked for or what they normally would run, but to the magazine's credit they asked for few changes and ran the piece almost entirely as it was written, in spite of the grief they correctly anticipated getting from readers who didn't want to question their gustatory habits at all.

And, like most of DFW's work, it's a heavily footnoted piece. There are footnotes describing things that Wallace thought were too interesting to leave out but too tangential to include in the main body of the text. There are footnotes citing people he talked to for the essay. There are footnotes where DFW goes back to something he wrote and argues against himself over it (as, I think, any thoughtful person facing a difficult question does). There are parallel stories and side-trips. And so forth. Anybody who would imply that these notes are the result of one of the most brilliant writers of the turn of the millennium engaging in the literary equivalent of "dumping a bunch of raw ingredients on the table [as] a substitute for cooking someone a meal" is an utter ass.

Of course, I imagine Miller would rejoin that DFW's notes are still his words, as opposed to what I did two paragraphs ago when I linked to Wallace's original essay, allowing a reader to visit DFW's essay (something I hope readers will do, and if they read all of "Consider The Lobster" and then don't bother coming back to this blog, I'd still consider that a win). But then Miller's argument wasn't that nuanced to start with: she would appear to be against hyperlinks generally, even ones similar to, say, Slate's sidebars (a link going to a parenthetical note; they've since mostly replaced these with little inline symbols that utilize hovertext). Nor, again, is it clear to me that there's a meaningful difference between, say, "One of Wallace's masterpieces, a work frequently cited at his death as the thing of his you must read, was his (in)famous nonfiction essay for Gourmet, "Consider The Lobster," and, "One of Wallace's masterpieces, a work frequently cited at his death as the thing of his you must read, was his (in)famous nonfiction essay for Gourmet, "Consider The Lobster."1 Surely the difference between those two modes is modest, at most.

And while I've made use of footnotes in this blog before, I've thought about setting up a second page to hyperlink-ize them instead; the main reason I haven't has been the trouble, frankly. But the beauty of hyperlinks (going back to the HyperCard days2) has always been that they allow a writer to more seamlessly integrate material that otherwise would have to be shunted off to a separate section.

I can nearly comprehend Miller's reservations about online writing; what I can't really abide is her smug superiority, especially when she backs it up with points that are sort of stupidly oblivious to her apparent area of expertise. I mean, why on Earth harp on hyperlinks when they merely offer digital analogues to forms of metatext that go back at least as far as the Talmud (there, by the way, is literary pedigree for you), including an analogue to established forms of metatext that millions of students, academics, scientists, doctors and laypeople read every day? This I don't get.

Miller closes with this nugget:

My little experiment [in infodumping at the bottom of the page because I'm too vain and dizzy to use established web protocols]3 may not last. But I'd still recommend it as an exercise to any writer who's become accustomed to the ease of studding his or her work with hyperlinks.4 Doing without them forces you to think harder about how important certain chunks of information are, whether that reference is as cool or funny as you think it is and just how much you're contributing to the conversation.


Setting aside the once-again-smug tone (yes, how much are you contributing to the conversation?), one would hope writers were doing all of these things without resorting to an irritating gimmick of an "experiment." Whether you're contributing to the net intellectual mass and being cool and funny and informative isn't a matter of whether your post is a linkfest--it's a matter of whether your writing is any good at all.





1Wallace, David Foster. "Consider The Lobster." Gourmet.com August 2004. 9 June 2010 <http://www.gourmet.com/magazine/2000s/2004/08/consider_the_lobster?currentpage=1>

2And while we're talking about hypertext and footnotes, here's a footnote about hypertext to make another point about hyperlinks that seems to be passing Miller by. Consider the clause this note goes back to: "going back to the HyperCard days." There are almost certainly two kinds of readers here as far as that comment is concerned: readers who know what HyperCard was (whether they used it or not; coming up mostly in the PCverse, I never used it myself, though I knew of it) and people who have no idea what I was saying when I said "HyperCard."

Those in the first group don't need an explanation and that link wasn't for them. They probably assumed as much and skipped past it, or waved their mouse pointer over it and said, "Whatever."

Those in the second group can use the link or not as they see fit. If they want an explanation, there it is. If they don't care, well, there it still is, but they're welcome to forget about it.

It is possible, from her article, that Miller thinks I'd be showing more skill and flair if, instead of linking to a Wikipedia article, I went ahead and wrote something like, "back to the HyperCard days (HyperCard was a program that allowed authors to create hypertext links between various data-handling virtual 'cards')." Of course, aside from the fact that this is essentially everything I know about HyperCard, and the fact that all my readers who know HyperCard either just huffed "I know that" or were distracted by some minor inaccuracy in my characterization, there's also the fact that that parenthesis is unwieldy and mostly irrelevant to the point I was making in the paragraph.

As is this footnote, really.

Which is the point in having it. Per Miller's apparent theory of writing and/or possible hypothetical rejoinder to my earlier point re: David Foster Wallace, I have proffered a footnote about an incidental matter that uses my own words in a separate section of the piece to explain a reference that was completely and thoroughly covered by an <a href=> tag. Weren't you happier with just the embedded link?

3As bracketed editorial contextual explanations go, this one may be a little... expansive.

4Also, you know what? I don't know how the writers do it at Salon, but when I want to insert a link, I have to do one of two things: I have to either click the little globey-chain icon in the Blogger interface and cut'n'paste my link into the dialogue box which subsequently opens or I have to type an "a href=" bracket into the text--which, ironically, I generally find to be quicker. Now, this certainly isn't hard in the way calculus or removing a stump from your yard are hard, but it's not exactly something done with "ease." It requires some to-ing-and-fro-ing and knowing a modest amount of HTML coding and checking the link to make sure the right item was copied from the clipboard and doing a preview to make sure I closed the link, and... well... point is, it's certainly enough effort that I don't do it without thinking about it and I don't imagine anyone else does, either, and there are certainly times I wish I didn't have to include a link but it seems like it's inviting more trouble not to. That's all.


Read more...
