Where everyone has to work all the time, where poets and writers should write as much as they can, until they collapse, until their fingers bleed.
This should be obvious to everyone, but I somehow didn't notice until now that Gladwell's now-famous 10,000 hour rule is the Calvinist Work Ethic by another name: sure, putting in the time doesn't guarantee success, but not putting in the time is a sign of moral, and likely practical, failure.
My grandfather is fond of saying that the optimists always win. I don't think that's true, but I agree that everybody who wins is an optimist, for reasons that parallel the Anthropic Principle. Still, because everyone who wins is an optimist, it pays not to be a pessimist. And I don't have to believe in the 10,000 hour rule to decide that it's better to work hard than to goof off. But accepting the notion that nobody can goof off and be a success seems somehow joyless and, well, Calvinist.
I've always thought I knew what "a watched pot never boils" means.
Like many others, I presumed that a pot only seems to boil more slowly when you pay attention to it, and that the best way to make a pot boil "faster" is to distract yourself with another task or thought.
But, while making lamb stew tonight, it occurred to me that I might be incorrect in my understanding. After all, you can make a pot of water boil faster by putting the lid on it, to conserve heat. Maybe it's because I'm presently enjoying Wolf Hall, but it occurred to me that there was a time in the not-too-distant past when cooking required wood, and that wood might be scarce or expensive, at least for some people.
If you're conserving wood or coal, I thought, your cooking fire might stay relatively cool by remaining small compared to the pot you're cooking in. It might be that, like a stove set on low, the pot's contents would just barely reach a boil with the lid on. And taking the lid off to check on it might lower the temperature below boiling.
Thus, I realized, if you constantly watched this pot, leaving it lidless, it would not boil at all!
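The argument above is really a claim about steady-state heat balance, and a toy calculation makes it concrete. Everything in this sketch is an illustrative assumption of mine (the fire's power, the heat-loss coefficients, the idea that a lid roughly halves losses), not a measurement:

```python
# Toy steady-state energy balance for a pot over a small cooking fire.
# All numbers are illustrative assumptions, not measurements.

AMBIENT_C = 20.0      # room temperature, degrees Celsius
BOILING_C = 100.0     # boiling point of water at sea level
FIRE_POWER_W = 500.0  # heat delivered to the pot by a small, frugal fire

# Heat-loss coefficient: watts lost per degree above ambient.
# In this toy model, removing the lid more than doubles the losses.
LOSS_W_PER_C = {"lid on": 4.0, "lid off": 9.0}

def steady_state_temp(power_w: float, loss_w_per_c: float) -> float:
    """Temperature at which heat in equals heat out, capped at boiling."""
    temp = AMBIENT_C + power_w / loss_w_per_c
    return min(temp, BOILING_C)  # past 100 C, extra energy makes steam

for condition, loss in LOSS_W_PER_C.items():
    temp = steady_state_temp(FIRE_POWER_W, loss)
    verdict = "boils" if temp >= BOILING_C else "never boils"
    print(f"{condition}: settles at {temp:.1f} C -> {verdict}")
```

With these made-up numbers, the covered pot's equilibrium sits above the boiling point (so it boils), while the uncovered pot settles in the seventies and never gets there, which is exactly the "watched pot" scenario sketched above.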
Does anyone know more about the origin of the idiom, and can confirm or deny this understanding?
In his otherwise spot-on post, The Collapse of Complex Business Models, Clay Shirky makes the confusing statement:
Bureaucracies temporarily reverse the Second Law of Thermodynamics. In a bureaucracy, it’s easier to make a process more complex than to make it simpler, and easier to create a new burden than kill an old one.
Now, I'm no physicist, but it seems pretty clear to me that bureaucracies absolutely obey the second law of thermodynamics.
"Process" isn't anti-entropic at all, any more than gasoline-powered engines are anti-entropic. The process itself, the bureaucratic imperative, is the entropy and waste heat. Which isn't to say that you can perform useful work without a certain amount of process, or that certain applications don't require an enormous quantity of it. A good startup is an efficient engine, generating lots of results with a minimum of process. A good large organization is less efficient, but trades that efficiency for reliability and consistency: it's a big rig, with a big diesel engine, or possibly an industrial motor of some sort.
The 1991 film stars Chevy Chase and Demi Moore as go-go '80s Wall Street types who wander off their GPS map while driving through New Jersey, and are arrested in the town of Valkenvania by the local police officer, played by John Candy. Taken to appear before the judge, played by Aykroyd, things get weirder and weirder, driving Chase and Moore to attempt an escape from the law.
On the DVD commentary track for Ghostbusters, Ivan Reitman, Harold Ramis & Joe Medjuck tell the story of how Dan Aykroyd turned in an unfilmable 600-page script set in the far future, and of how Harold Ramis helped him rework the material as the very funny, broadly accessible comedy it became.
By contrast, Aykroyd directed, co-wrote, and produced Nothing But Trouble. In other words, nobody existed who could say "no" to him, or who could help him rework his rough ideas into a form that others could appreciate. The resulting film is more or less a tour of the mind of Dan Aykroyd, and it's an extremely unusual place to be.
But the film's badness doesn't rest solely on Aykroyd's fevered imagination: Demi Moore's outfits might best be described as "catastrophe in white," and what she lacks in comic timing she amplifies by taking her role far too seriously. The Digital Underground appears as another group of detainees, mostly so that Aykroyd can jam with them as they play.
Finally, the film contains a series of false endings. More than once, you'll breathe a sigh of relief, thinking the worst is over and the credits are about to roll. Then the next scene begins, eliciting another round of groans from the audience.
If not for his obituary in the New York Times, I might never have heard of Lionel Casson, a historian focused on ancient maritime history. I've now placed one of his books on hold at the library, and hope to read it shortly.
The obituary was pretty near perfect: a description of Dr. Casson's accomplishments, and what made them unique and interesting, including an evocative distillation of his central subject. The obituary ends with a brief but witty anecdote.
In short, this obituary celebrates the man and his work, and generates additional interest in what might be an obscure subject that most people, myself included, had never heard of.
Usually I stay out of science versus religion debates on the grounds that both sides tend to be more interested in promoting their view than hearing and understanding the other side of the issue, whereas I tend to believe that the point of discussion is mostly to hear what the other guy has to say. For my taste, too much of this sort of debate ends up as preaching to the choir, if you'll pardon the idiom.
That said, Geoff Arnold makes a great point when he says:
You know, people seem to think that as soon as something is described as “contingent”, all bets are off: that it could be not just different, but anything at all. But “contingent” means “dependent”, and things like language and culture – and science – are strongly constrained by the facts that they depend on.
Perhaps I'm naive, but for the first time I think that I really understand the terms of the religion versus science debate: the "religion" contingent is really looking for absolute ground. They're looking for a rock upon which everything else can rest. Their question starts with wanting to know what we know for certain.
The "science" contingent, by contrast, is concerned with the epistemological question: how do we know? Their answer, which is backed up by nearly 400 years of science history, suggests that all knowledge is provisional—an answer anathema to the religious contingent's search for grounding. In short, the scientific perspective remains comfortable to a greater or lesser extent with the idea that maybe it is really turtles all the way down, whereas the religious contingent simply cannot accept that answer.
All of which means that perhaps I should pay closer attention to these debates, as I clearly have something left to learn here. I'm sure that the above is common sense to anyone who follows the issue, but it wasn't something I'd considered before.
All right, maybe he's not doomed, but once again I'm less than impressed with Slate's technology columnist, Farhad Manjoo, who says today that
Google's Chrome OS is doomed.
His article starts out by admitting that Google's Chrome OS (which I'll abbreviate GCOS, even if that name has been used before) is intended, at least initially, for Netbooks.
Nevertheless, his first point ("Linux is hard to love") complains that it's hard to install software on Linux and that Linux doesn't have much hardware peripheral support.
Of course, partnering with hardware manufacturers should address immediate peripheral support; the only other hardware most people want on a netbook is 802.11 support, CDMA or GSM wireless cards from various providers, and maybe an external mouse, if decadence is the order of the day.
His second point ("We aren't ready to run everything on the Web") is exactly why GCOS is being first targeted at Netbooks. They're called Netbooks for a reason: most folks aren't running Microsoft Office on these little boxes.
It's true that we aren't ready to run everything on the Web just yet. But Google isn't shipping GCOS on everything just yet, either.
His third and fourth reasons ("Microsoft is a formidable opponent" and "Google fails often") are true. But they apply equally to every product those vendors ship. If Manjoo were writing about the Zune, would he say that it's doomed because Apple is a formidable opponent? Unlikely, even if Microsoft shipped a great product. Does Manjoo think that Google Mail is doomed now that it's left beta? Of course not.
Really, points three and four are just puffery to stretch the whole article out to five points. Readers like lists. Well, editors do, anyway. And readers certainly click on "top ten" and "five reasons" headlines. But I'm not sure that readers are more satisfied after having clicked through, given how poor the analysis in these lists tends to be.
Manjoo's final point claims that "The Chrome OS makes no business sense," because they're giving it away for free. His central claim in this section is that it's "a wasteful customer acquisition expense": it's better for Google to spend more money improving their advertising engine than to branch out into new areas.
Correctly, he notes that the primary point of Chrome OS from a business sense "is to screw with Microsoft." I think that's a defensive move: every dollar that Microsoft spends chasing the netbook market and protecting their desktop franchise is a dollar they're not spending on Bing. Which I haven't tried, but I hear is pretty good. So by attacking Microsoft, Google is protecting its core franchise. (I doubt that Google's Web search improvements are hampered by pouring resources into GCOS.)
Within his "no business sense" claim, Farhad Manjoo also suggests that GCOS as customer acquisition is unnecessarily expensive because Gmail and Google Docs already run on Windows. The cost of customer acquisition may be high, but the defection rate of users back to Microsoft Office would be far lower on an operating system that doesn't run Office.
Really, though, whether Manjoo is right or wrong is beside the point. In a prior life, I wrote a column about computer security. My job, distilled to its most vaporous essence, was to be controversial, and attract readers; whether I was right or wrong was beside the point, as far as the bottom line was concerned. And make no mistake, attracting readers and clicks was the real objective. The content that did so was secondary.
Thus blogging about Manjoo's article helps accomplish his real goal: being controversial enough to attract readers. I'm torn between a cynical acceptance of that answer—which would push me to stop reading the damn articles, since they're worthless from an analytical perspective—and a utopian wish that higher-quality analysis would attract more readers, and be more valuable than "top five reasons X is doomed" journalism.
But how many of you clicked on this article because of its title?
Within a single week, I've passed two milestones with regard to my writing.
First, I've gone into positive territory on my royalties for Think Unix. Yes, after nearly eight and a half years I've earned back my advance, and am now owed approximately three dollars and seventy-five cents by the publisher.
I'm exceedingly pleased that people continue to read and purchase this book, and that except for the two chapters on Unix GUIs the book has remained useful. I wanted to write an "evergreen," and I feel like I succeeded. Not that I couldn't improve the book, or that there aren't things I wish I'd done better, but I think I did pretty well.
Second, I'm pleased to announce that a short story of mine is being published. I've waited until the magazine was printed and ready to go, as I've had things fall through in the past - but you can buy issue one of The Ne'er-Do-Well Magazine, which contains my short story "Lodestar."
If you've read previous versions of this story, I'd encourage you to buy the magazine and re-read it, as it's been substantially revised. Sheila, the editor, is exceedingly perceptive, and her input did the story a lot of good. I'm looking forward to my copy arriving, and reading the rest of the pieces too.
I think it's really very good when people reconsider the things they've said in conversation. My goodness, you can still find things I wrote on the Internet ten or fifteen years ago, and I certainly don't think all of the same things now. I think that the evolution of personal viewpoints is normal, and healthy, and should be welcomed.
However, I think it's really bad when you pretend not to have said the things that you previously did. To enforce this kind of ex post facto internal consistency is dishonest. Maybe not to-the-core dishonest, but certainly untrustworthy, and in general it marks you as not the kind of conversational partner I want to have.
I think that de-publishing is much closer to (but not the same as) pretending you never said something than it is to reconsidering previous viewpoints. It does strike me as uncomfortably Orwellian, even if it is a private group doing it, rather than the government. I mean, how would people feel if the New York Times decided to remove every mention of Monica Lewinsky from their archive due to poor behavior on her part? If Warren Ellis is right, and Cory and Xeni are the "cut and paste editors of the Internet," then it matters, regardless of whether that job was thrust upon them or one that they willingly embraced.
Finally, wading through blog comments on this whole issue reminds me why it's a good thing to keep your conversations small in the first place.
In his review of the new production of The Bacchae, starring Alan Cumming as Dionysus, Charles Isherwood compares Cumming's appearance to Shirley Temple and Boy George, missing (or ignoring) the obvious debt to The Rocky Horror Picture Show's Frank N Furter. So when Isherwood writes that the production's "insistent playfulness makes the transition to the horror of the final scenes troublesome," I wonder if he's watching the same play that I'd be seeing. (The use of pop R&B songs written for the production increases my sense that this production owes more than a little to Rocky Horror.)
In other words, if I magically end up with a couple of extra hours one night in New York, I'd love to see this. (Aw, crap, it ends on the 13th, several days before I make it to New York. Maybe I'll come down from Connecticut over the weekend? Anyone in New York want to see this next Friday night?)