rc3.org

Strong opinions, weakly held

Month: June 2008

Tim Bray on Wikipedia deletionists

Tim Bray denounces Wikipedia deletionists with great style and passion:

A little thought-experiment is in order: What harm would ensue were Wikipedia to contain an accurate if slightly boring entry on someone who was just an ordinary person and entirely fame-free? Well, Wikipedia’s “encyclopedia-ness” might be impaired… but I thought the purpose of Wikipedia was to serve the Net’s users, not worry about how closely it adheres to the traditional frameworks of the reference publishing industry?

I suggest the deletionist wankers go and cluster around an alternate online reference tome which has articles only about God, Immanuel Kant, and Britney Spears. Notability is not in question, so they should be happy.

Rogers Cadenhead has taken on the deletionists as well, regarding his own entry.

For a nice monument to the idiocy of Wikipedia deletionist sentiment, check out the “votes for deletion” page for Leslie Harpold. Her page has been deleted, but the reasons why she was deemed unworthy by the idiots who feel it’s important to keep Wikipedia smaller remain forever.

Pick one feed format

Nelson Minar recommends that the popular blog tools dump RSS and provide their feeds in one format, Atom. This echoes the best practice established at least two years ago, when Nick Bradbury and Sam Ruby recommended choosing a single format for your feeds and sticking with it rather than providing the same data in multiple formats.

I agree with that sentiment, and I also think that choosing Atom makes sense at this point. The only remaining advantages for RSS are that it has better name recognition and that its name doesn’t conflict with the smallest particle that comprises an element, or with any of the other things also called “atom”.
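For anyone who hasn’t looked at the format, here’s roughly what a minimal single-entry Atom feed looks like. (The titles, URLs, and UUIDs are placeholders I’ve made up for illustration, not anything from Minar’s post.)

    <?xml version="1.0" encoding="utf-8"?>
    <feed xmlns="http://www.w3.org/2005/Atom">
      <title>Example Weblog</title>
      <link href="http://example.org/"/>
      <id>urn:uuid:60a76c80-d399-11d9-b93c-0003939e0af6</id>
      <updated>2008-06-13T18:30:02Z</updated>
      <author>
        <name>Example Author</name>
      </author>
      <entry>
        <title>First post</title>
        <link href="http://example.org/2008/06/first-post"/>
        <id>urn:uuid:1225c695-cfb8-4ebb-aaaa-80da344efa6a</id>
        <updated>2008-06-13T18:30:02Z</updated>
        <summary>One entry is enough to show the required elements.</summary>
      </entry>
    </feed>

Note that Atom requires a permanent id and an updated timestamp on both the feed and each entry, which removes a lot of the guesswork aggregators face with the looser flavors of RSS.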

Winning process

Paul DePodesta is a baseball guy who was made famous in Michael Lewis’ book Moneyball. At the time the book was written, he was the assistant to A’s general manager Billy Beane; he later served as general manager of the Los Angeles Dodgers and now works in the front office for the San Diego Padres. On his blog, he writes about the basics of building a successful team. The key is to focus on process rather than outcome:

We all want to be in the upper left box – deserved success resulting from a good process. This is generally where the casino lives. I’d like to think that this is where the Oakland A’s and San Diego Padres have been during the regular seasons. The box in the upper right, however, is the tough reality we all face in industries that are dominated by uncertainty. A good process can lead to a bad outcome in the real world. In fact, it happens all the time. This is what happened to the casino when a player hit on 17 and won. I’d like to think this is what happened to the A’s and Padres during the post-seasons. 🙂

As tough as a good process/bad outcome combination is, nothing compares to the bottom left: bad process/good outcome. This is the wolf in sheep’s clothing that allows for one-time success but almost always cripples any chance of sustained success – the player hitting on 17 and getting a four. Here’s the rub: it’s incredibly difficult to look in the mirror after a victory, any victory, and admit that you were lucky.
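For readers who can’t see the chart he’s describing, the four boxes map out like this (the bottom-right label is implied by the excerpt rather than stated in it):

                      Good outcome         Bad outcome
    Good process      deserved success     tough luck
    Bad process       dumb luck            deserved failure

The discipline he’s arguing for is judging decisions by the row they came from, not the column they landed in.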

The whole article is well worth reading.

Gitmo detainees have rights under the Constitution

The Supreme Court ruled today that Gitmo detainees have rights after all:

In a stunning blow to the Bush Administration in its war-on-terrorism policies, the Supreme Court ruled Thursday that foreign nationals held at Guantanamo Bay have a right to pursue habeas challenges to their detention. The Court, dividing 5-4, ruled that Congress had not validly taken away habeas rights. If Congress wishes to suspend habeas, it must do so only as the Constitution allows — when the country faces rebellion or invasion.

The Court stressed that it was not ruling that the detainees are entitled to be released — that is, entitled to have writs issued to end their confinement. That issue, it said, is left to the District Court judges who will be hearing the challenges. The Court also said that “we do not address whether the President has authority to detain” individuals during the war on terrorism, and hold them at the U.S. Naval base in Cuba; that, too, it said, is to be considered first by the District judges.

About time. Don’t miss Balkinization for insightful running commentary.

I’ll also say that the bottom line is that if John McCain is elected President, cases like this go the other way for a long, long time.

Update: Dahlia Lithwick has posted her analysis of the ruling.

Is it OK to require JavaScript?

WebMonkey today asks whether or not it’s OK for Web sites to require JavaScript.

My opinion on this question has changed a lot over the past year or two. Not long ago, I would have said that it’s never OK to require JavaScript, but I don’t feel that way any more. I think that if your site centers around publishing, you should certainly make all of your content available to users who have disabled JavaScript, but if you provide more application-like functionality, requiring JavaScript for certain features is OK.

As Simon Willison notes, you should always use unobtrusive JavaScript, and progressive enhancement is my recommended approach, generally speaking.
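To make that concrete, here’s a minimal sketch of the pattern — not anything from Willison’s post, and the ids and the /comments URL are hypothetical. The link works as an ordinary page load everywhere, and browsers with JavaScript upgrade it to an in-page fetch:

    <a id="comments-link" href="/comments">Show comments</a>
    <div id="comments"></div>

    <script type="text/javascript">
    // Unobtrusive: behavior is attached here rather than in an onclick
    // attribute, so browsers without JavaScript simply follow the link.
    var link = document.getElementById("comments-link");
    if (link && window.XMLHttpRequest) {
      link.onclick = function () {
        var req = new XMLHttpRequest();
        req.onreadystatechange = function () {
          if (req.readyState === 4 && req.status === 200) {
            document.getElementById("comments").innerHTML = req.responseText;
          }
        };
        // Hypothetical endpoint that returns just the comments markup.
        req.open("GET", "/comments?fragment=1", true);
        req.send(null);
        return false; // cancel navigation only when the enhanced path runs
      };
    }
    </script>

Anyone whose browser fails the capability check (or who has JavaScript turned off) still gets the content via the plain link; everyone else gets the faster in-page version.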

In the end, how you employ JavaScript is a business decision. Developers have to weigh the costs of providing workarounds for people who don’t have JavaScript against the potential loss of revenue from not providing them.

I think the more interesting question is whether or not it makes sense, as a user, to disable JavaScript. Last February I tried an experiment where I disabled JavaScript by default and only turned it back on for specific sites. Disabling JavaScript eliminates an entire class of obnoxious advertisements and can really speed up the browsing experience on many Web sites. I found that in many cases, leaving out the JavaScript made Web sites better, not worse. I haven’t used NoScript in a while, but I still think that employing it is a good idea for most people.

How I roll

Inspired by this post where Flickr’s developers talk about their toolset, here are the tools I use to get the job done:

  • 20″ iMac with second 20″ monitor (in the office)
  • 24″ iMac (home)
  • black MacBook (don’t leave home without it)
  • Terminal.app
  • Quicksilver
  • TextMate (for Rails and PHP development)
  • Eclipse (for Java development)
  • vim (for everything else)
  • Subversion
  • MySQL
  • Perl (I still rely on it for one-offs)
  • Adium
  • Gmail (all my mail goes through my Gmail account)
  • MarsEdit
  • Google Reader
  • NetNewsWire

Update: I inadvertently removed Firefox (and Firebug) from the list when I was working on this post. I honestly don’t know how people do JavaScript without Firebug.

Adventures in consulting (from June 30, 1998)

Here’s an old essay I wrote back in 1998, reproduced for no good reason. The story is completely true, and the client in question was Pfizer.

Adventures in Consulting, Episode 1

Sometimes as a consultant, you make seemingly innocent suggestions to your clients that end up haunting you for months. For example, I was working on a groupware project, and the client wanted usage reports to see how the application was being utilized.

I innocently suggested that we could beautify the reports with some simple graphs generated using Java applets. The applets came prepackaged with the middleware we were using, and setting them up with dynamic data was trivial. They had drop shadows, nice legends, and all sorts of other eye candy, and the performance was pretty darn good, all things considered. The client was impressed. This is why they pay the big bucks to consultants for these types of things.

Then I noticed a problem. The client’s site was standardized on Netscape Navigator, and Netscape Navigator wouldn’t print Java applets. Faced with the prospect of being unable to print out these eye-pleasing graphs, the client was nonplussed. Naturally, with our solution crumbling before our eyes and the deadline staring us down, the project shifted into panic mode.

It was too late to pull back; what was once a feather in our cap was swiftly starting to look like a black eye. The graphs had become a requirement. We were redeemed by a third-party product that was only going to cost the client another thousand bucks. It was another suite of Java graphing applets, but this one came with a funky client-server application that allowed you to click a button and create a duplicate of the page with a GIF instead of an applet.

Sure, compared to the single Java applet, this was a bit more complex. It required two Java applets instead of one, and oh yeah, you had to run a separate server process on the Web server to generate the GIFs dynamically. Naturally, these additional components hurt performance a bit, but hey, it worked, and when you start throwing in requirements as the project progresses, these things happen.

After we demoed this fine working alternative, there were questions about the speed of Java, firewall issues with the client-server applet, and general issues with performance. Panic again reared its ugly head.

It’s not like we were out of alternatives. Remember, I’m a consultant. The Java applet suite shipped with yet another utility, one that would run a Java applet locally and save a snapshot of it as a GIF. We could just create a snapshot of each report nightly and let the users go to static HTML pages. This seemed like a fine solution; at the low price of slightly stale data, we’d get great performance and something that worked in almost every browser. The verdict was rendered: the graphs had to be generated from real-time data.

At this point, you may think that I was scrambling, but it takes more than a few serious setbacks to put me out of the game. Never mind that the deadlines had come and gone and that our new deadline was a couple of short days away; failure was not an option. In the shower one morning, I had an epiphany. I would write a Perl CGI script that would figure out which report a user wanted (and whether they were allowed to see it), call the Java utility to load the dynamically scripted page and save a snapshot of the page and its graph, and then redirect the user to the new page that had just been created on the fly. Brilliance.

Sure, it sounds a bit complex, and you might think that there are performance issues with a Perl script that launches Java, which in turn loads a scripted Web page, writes it out as new HTML and GIF files, and then sends the user to that page, but hey, that’s the price you pay for long lists of convoluted requirements. With a devilish gleam in my eye, I delivered my newest Rube Goldberg creation to the project manager, proud that I had managed to meet every single requirement of the project with no tradeoffs (except for perhaps a small performance hit).

Sitting in my cube, getting started (late) on my next project, I was jarred by the very project manager to whom I had just made the delivery. “It doesn’t work.” “What do you mean?” I asked. “Of course it works.” “The page never comes up.”

Innocently I responded, “How long did you wait?”

A couple of Obama items

I saw two interesting Barack Obama-related items over the past few days. (This isn’t a “vote for Obama” post; it’s more about process.) The first is this spreadsheet of internal predictions from the Obama campaign that was leaked on February 7. In it, the campaign predicted how the popular vote and delegate allocation would turn out in each state, and its predictions were amazingly accurate. The campaign underestimated Obama’s margin in some states he won, but it wasn’t far off until late in the primary season.

Well ahead of time, his campaign predicted that it would lose Ohio, Texas, and Pennsylvania. Given that they had a plan for victory, it’s clear that the plan accounted for not winning those states. I think his underperformance late in the season has more to do with the fact that he didn’t have to try very hard by then. He was going to win the nomination with or without Kentucky or West Virginia, and so he put less money and face time into those races than he would have had he needed to do better there to secure the nomination.

As someone who’s asked every day to predict how long it will take to fix bugs and add features to software, I’m impressed with this degree of accuracy in projecting the future. I’d love to read a post-mortem after the election that explains how the campaign came up with its forecast.

The other thing I found interesting was Obama’s June 6 speech to campaign staff. It answers the question, “How do you explain to employees that they don’t get any days off for the next five months?” I think he does a pretty good job.

John Royal’s obituary for Jim McKay

Houston blogger John Royal writes an obituary for sportscaster Jim McKay. It seems silly to let someone who was truly great at what they did pass without notice. It’s odd to think that humanity will never again witness the birth of the electronic mass media, and the iconic figures who set the standards for that medium will hold a sort of unique historic position. As Royal points out, McKay is one of the last few of a generation that will be much missed.

HBO is going to re-air its 2003 McKay documentary on Thursday and Sunday; watch it if you get a chance.

Warren Buffett is down on hedge funds

Warren Buffett has placed a $1 million long bet on the following prediction:

Over a ten-year period commencing on January 1, 2008, and ending on December 31, 2017, the S&P 500 will outperform a portfolio of funds of hedge funds, when performance is measured on a basis net of fees, costs and expenses.

Interestingly, the exact terms of the bet are confidential. The challenger is, of course, a hedge fund partnership.
