rc3.org

Strong opinions, weakly held

Month: January 2008 (page 2 of 4)

Hoping for better politics

Veteran North Carolina political observer Kirk Jones explains why 2008 may really turn out to be a transformational election (http://www.exileonjonesstreet.com/2008/01/27/impressive-win/). Not only did Democrats turn out almost twice as many voters for their 2008 primary as they did in 2004, but they also did it without the kinds of petty corruption that politicians traditionally resort to. I’m as skeptical as anyone about just about everything, but even I look for signs of hope.

iPhone WebClip icons

A couple of notes for iPhone users. The first is that Web pages you add to your home screen do not automatically update their icons when the creator of a site changes them. I noticed that Google Reader had a new favicon this morning, and guessed that they’d added an iPhone WebClip icon as well. I opened Google Reader to see if the WebClip icon changed automatically — it did not. However, when I deleted the WebClip bookmark and added it again, Google Reader did in fact have a custom icon.
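As an aside for people who run their own sites: as far as I can tell, the way a site advertises one of these custom icons is either to drop a PNG named apple-touch-icon.png at the root of the site or to point at an icon from the page head with a link element, along these lines (the file name below is just a placeholder):

<link rel="apple-touch-icon" href="/my-webclip-icon.png" />

Apple’s guidance seems to be a 57 by 57 pixel PNG; the iPhone appears to add the rounded corners and glossy highlight on its own.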

The second note is that Google Reader has a new WebClip icon, so go get it. Google Mail and Google Calendar still don’t have their own icons.

Quit blaming poor people

As the mortgage crisis unfolds and expands, you see a lot of blame laid on subprime loans, and more specifically, on the people who signed up for them. In fact, subprime was voted the word of the year. People are clearly responsible for the contracts they sign, but simply blaming people who took out mortgages they couldn’t pay back is the wrong way to look at the problem.

Last month I talked about the similarity of the mortgage crisis to the junk bond crisis. One of the most important similarities between that implosion and this one is that the entire market was driven not by demand for loans but rather by demand for investments. Here’s the question you rarely see asked: why were banks so eager to sign people up for such incredibly risky mortgages?

The reason is that they had already originated as many good mortgages as they could, and there was still more demand for mortgage-backed securities. So mortgage brokers had to find more mortgages to sell, and the easiest way to do that was to loan money to people who really shouldn’t be buying a house, or to convince people to upgrade into larger houses they couldn’t afford by offering them low monthly payments.

So when you search for the source of the crisis, look in the direction of the big investors who were willing to buy up any old mortgage-backed security, no matter what its risk profile was. Those people put billions and billions of dollars on the line, and funded an avalanche of loans sold to the confused, the ignorant, the overly optimistic, and the dishonest.

As the economy continues to deteriorate, you’ll see more and more people blaming the same people who are losing their homes and watching their financial futures go down the tubes. And while I agree that they bear responsibility for the decisions they made, they didn’t create this crisis. In many ways they’re the victims.

Definition of the law of unintended consequences

Alex Tabarrok posts the best short definition of the law of unintended consequences I’ve seen:

The law of unintended consequences is what happens when a simple system tries to regulate a complex system.

Mozilla is 10 years old

The Mozilla Foundation is celebrating the tenth anniversary of Netscape’s decision to release the Navigator source code, the event that ultimately led to the creation of the foundation. Here’s the original press release. One thing I remember is that Slashdot broke the story of the source code release before it was officially announced — it was the first really big story Slashdot broke. (In fact, I think I learned of the existence of Slashdot when someone pointed me to the story there.)

Not long after the code was released, there was a big argument about whether Mozilla should dump the Netscape 4 HTML rendering engine and use a new, modern, standards compliant engine called NGLayout, or whether they should just get a release out the door built on the existing code. Back in October of 1998, I wrote a scathing piece insulting the Web Standards Project for lobbying the Mozilla folks to move to NGLayout, which I’ve quoted in full below. (This was in my pre-blog days when I was more an essayist.) The Mozilla Foundation rewarded me for defending them so ardently by announcing that they were adopting NGLayout just a month later.

“I Want My NGLayout!”

October 5, 1998

As regular Outraged! readers already know, this writer is generally dissatisfied with the so-called standards process in the computer industry. Standards which are written before working code is created are more often than not doomed to failure, standing instead as filthy monuments to the capriciousness and excess energy of companies with time and money to burn.

One particular showpiece of the standards process is the current state of HTML, as implemented by the world’s two most popular browsers, Netscape and Internet Explorer. They both comply to varying degrees with the relevant standards (CSS1, CSS2, and the DOM, to the buzzword savvy), but neither browser maker has shown much initiative in the race for 100% standards compliance. This indicates two things: one, that writing browsers that comply with standards isn’t a high priority, and two, that it isn’t particularly easy (if implementing CSS and the DOM were easy, both browsers would support them).

Anyway, some disgruntled Web developers have banded together to cajole Microsoft and Netscape into providing full standards compliance in their Web browsers, in order to further the Web as a platform for deploying applications, and to make the job of designing nice Web sites easier and less expensive. Anyone who has attempted to use the latest features in the Web browsers (generally mashed together under the misnomer DHTML) can attest to the fact that this is a worthy effort; the current state of standards compliance basically dictates that everything must be written twice (once for each browser).

Unfortunately, the members of the Web Standards Project, as this nascent group is called, have decided that their first axe to grind is with Netscape over which “rendering engine” will be included with version 5.0 of its browser. As most everyone who hasn’t been in a deep sleep for most of 1998 knows, Netscape released the source code to its browser back on March 31. Since then, even though Netscape has retained the prime caretaking role over the code, the browser has been open to public input, contributions, and scrutiny.

Thus, the public has gotten a rare inside look at the guts of the development process of an incredibly complex, popular application. Not long after the Mozilla project got underway, Netscape released the source code to NGLayout (which was, at the time, called Raptor). NGLayout is Netscape’s next generation rendering engine; it will provide tighter standards compliance and better performance than the current rendering engine, which is known as Mariner. Unfortunately, it is also significantly further from completion than Mariner, and hasn’t been integrated with the rest of the browser. Today, you can download a rough build of NGLayout which runs in a skeleton window and renders HTML extremely quickly during the brief period of time before it crashes.

Under ordinary circumstances, the public wouldn’t even know that NGLayout existed, and certainly wouldn’t know how its level of completion compares to that of the Mariner engine, which is undergoing incremental improvements for the first public release of Mozilla. But now that it’s part of the Mozilla project, people are free to view and toy with the source code, and compare it to what’s currently out there.

The Web Standards Project (WaSP) has started a petition to urge Netscape to forget about Mariner (the current rendering engine) and focus all of its energies on NGLayout, which is going to be much better than Mariner when it is completed. While this seems like a good idea, and I have no doubt that the WaSP means well, this effort betrays a baffling lack of understanding of the way the open source development model works, and a poor choice of tactics.

First, let me talk about the sheer inanity of the very concept of the Internet petition. Perhaps, at one time, the online petition was a fine way to demonstrate that there was a groundswell of support behind an issue, but I firmly believe that day has passed. There are petitions for everything on the Internet, covering everything from television shows that get cancelled to the lack of a particular game for the Macintosh. News of various petitions spreads like wildfire, and since the cost of filling out a petition is nothing, people fill out petitions campaigning for issues they scarcely care about. Unfortunately, because the level of effort required to circulate a petition online is so low, the petitions get no respect. Decision makers just aren’t interested in reading a report saying that 150,000 people want the latest version of QuickBooks to be ported to the Macintosh without knowing how many of them are willing to put their money where their mouth is.

The fact that Mozilla is an open source project only further dooms the WaSP petition to irrelevance. Even if the signatories of the petition have money to spend, it doesn’t matter, because Mozilla is totally and completely free. The blessing and the curse of the open source movement is that the areas of development are driven by the aims of the people who actually work on the projects. Mozilla will support Apple’s ColorSync because people at Apple felt it was worthwhile to contribute that code, not because somebody signed a petition saying they should do it.

The galling thing is that if the WaSP wants better standards support in Mozilla, they should be working to contribute to the Mozilla project directly, or to find some friends who can. The reason Mariner is slated to be part of Mozilla is that NGLayout doesn’t look like it will be ready in time to meet the project’s timetable. Does it really make sense to hold up the release of the first public version of Mozilla in order to appease a few puling Web developers?

Jeffrey Zeldman, one of the leading members of the WaSP, urges Netscape to “do the right thing,” but I’m forced to wonder if he even really knows what that is. What he and the other members of the WaSP seem to be saying is, “do the right thing for us.” Dan Shafer, pundit at large for CNet’s Builder.com, lays down an ultimatum: “Netscape must not ship a 5.0 browser without NGLayout.” He further urges Netscape to pull out all the stops and commit its entire engineering resources to this effort.

What he and the other folks behind this petition fail to realize is that they are part of Netscape’s engineering resources. The success or failure of Mozilla depends on the Internet community at large as much as it does on Netscape. If they, or others, want Mozilla to have a particular feature, or look a certain way, or run on a certain platform, then they’re as empowered as anyone at Netscape to make it happen.

The source code is out there. The rest is up to you.

Spring is a more desirable skill than EJB

The Spring Team blog announces that the Spring framework is now a more commonly requested skill for developers than EJB. About four years ago, I started building applications using Spring and Hibernate, even though much of the industry focus was still on EJB. Although I thought that the approach that I and a number of other people had started taking was correct, I was a little bit scared about neglecting a skill that so many employers were seeking.

So I went out and bought the book Mastering EJB and read it cover to cover, just to make sure I wasn’t missing something. After reading it, I realized that while some projects would require EJB, it would not have made writing any application I’d ever worked on easier, nor would it have improved the functionality of any of those applications. So I promptly forgot everything I’d read and continued to use the lightweight libraries.

It’s nice to look back and find myself completely vindicated.

Bill Gates at Davos

FP Passport reports that Bill Gates will give a 30 minute speech Thursday at the World Economic Forum entitled “A New Approach to Capitalism in the 21st Century.” He will challenge business and government to do more to address the problems of disease and poverty in the developing world.

The implications of IE8

Microsoft’s Monday announcement of the new browser compatibility features in Internet Explorer 8 has set off a torrent of commentary. They described their new approach to markup versioning in an article at A List Apart and on the Internet Explorer blog.

The basic idea is that Internet Explorer 8 will let you specify which rendering engine you want IE to use via a meta tag on your page. So you can tell it your markup is IE7 compatible, IE8 compatible, or bleeding edge compatible, in which case it will use whatever the latest and greatest rendering engine is. We can assume that IE9 will retain all of those modes and add a new IE9 mode as well, unless Microsoft decides by then that it’s too much trouble to maintain all of the different rendering modes and tosses some of them out. If you don’t include the special meta tag to turn on IE8 rendering mode, IE8 will default to using the IE7 rendering engine.
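For the curious, the tag described in the A List Apart article looks roughly like this (the exact values could still change before IE8 ships, so treat this as a sketch rather than gospel):

<meta http-equiv="X-UA-Compatible" content="IE=8" />

Swapping in IE=7 locks a page to the IE7 engine, and IE=edge is supposed to mean “always use the newest engine available.”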

I appreciate that Microsoft is trying to solve a tough problem here — not breaking the sites of people who have included hacks for specific versions of Internet Explorer while at the same time giving themselves the opportunity to fix bugs and add new features in the browser. I don’t know if it’s an elegant solution, but it is a solution to that specific problem. What I wonder, though, is what this means for Web developers. I imagine the question most developers are asking is, “What’s the most sensible way to write markup, CSS, and JavaScript in the coming browser ecosystem?”

Sam Ruby points out that any DOCTYPE that’s unknown to IE will be rendered with the IE8 rendering engine. So if you want to bank on the improved standards support in IE and avoid using the new meta tag, just pick a DOCTYPE that is correct but that IE doesn’t do anything special with. I have a feeling that’s the approach most standards oriented Web developers will wind up taking.
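To make Ruby’s point concrete, the trick is to pick a correct DOCTYPE that IE has no special handling for. Purely as an illustration of the shape of the thing, it would look something like the deliberately made-up example below (the identifier and URL are hypothetical; I haven’t tested which real DOCTYPEs IE8 treats as unknown):

<!DOCTYPE html PUBLIC "-//EXAMPLE//DTD Example HTML Strict//EN" "http://example.com/strict.dtd">

By Ruby’s reading of the announcement, a page served with a DOCTYPE IE doesn’t recognize gets the IE8 engine without any meta tag at all.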

Those developers whose sites depend on specific quirks of IE7 will be able to force IE8 to render pages in that mode, and developers who code to standards can pick a DOCTYPE that IE will use to render pages in its newest, most standards compliant mode. That strikes me as a pretty reasonable workaround for the meta tag business, which I don’t see any real need for anyone to use.

The other question I have is how the other browser makers will react. Firefox and Safari seem content to keep improving the standards support in their browsers without dithering over breakage of older pages, but their situation is different from Microsoft’s. For one thing, they have always been significantly more standards compliant than IE, so they don’t inflict severe breakage on people when they change. Second, people rarely include Safari and Firefox hacks on their pages. They code to the standards supported by Firefox and Safari, and then include whatever hacks they need in order to make their pages look right in IE6 and IE7. The developers of WebKit are not going to adopt the versioned rendering engine approach IE is taking, and really there’s no reason for them to do so. I expect the Mozilla folks will say the same thing.

The specific bit of good news here is that it should free developers from adding yet another browser-specific style sheet for IE8. Lots of people these days are using hackish workarounds to make sure their pages look right in both IE6 and IE7. When IE8 is released, you won’t have to do anything to make sure your IE7-specific code doesn’t cause breakage, the way IE6 code did in IE7. And it looks like there is a path to using the latest and greatest features without any of this meta tag foolishness. So I think it’s going to be all right.

There is a whole lot more on this topic just about everywhere, so much so that I’m going to punt on linking to all of the reaction. You’ve probably read it all anyway.

Andrew Leonard on today’s rate cut

Here’s Salon’s Andrew Leonard on today’s 75-basis-point emergency rate cut from the Federal Reserve:

If Bernanke has been “wrong” so many times, was he wrong Tuesday morning? As of this writing, around 2:20 p.m. EST, the lead headline on the Wall Street Journal declared “Fed’s Deep Cut Appears to Soothe Markets.” After falling almost 500 points at the start of trading, impelled by massive sell-offs on stock exchanges around the world, the Dow Jones industrial average had fought its way back to a relatively minor 118-point drop. What if Bernanke had done nothing, or even waited just eight days until the regular meeting of the Federal Reserve Board of Governors to deliver his rate cut? If Monday, Jan. 21, is already being called the 21st century version of Black Monday, summoning up memories of the crash of 1987, what would Tuesday have looked like without a rate cut bailout?

Given the clear connection between Tuesday’s rate cut and global market turmoil, it is hard to avoid at least one conclusion. Bernanke has proven, once and for all, that juicing the stock market is now considered Job No. 1 for the Federal Reserve Bank. The material effects of rate cuts do not show up in economic growth statistics for months or even years after their enactment. By making an emergency “inter-meeting” cut a mere eight days before its regularly scheduled meeting, Bernanke is conducting economic policy in order to appease market psychology. The fragile psyches of Wall Street traders, who played such a pivotal role in creating this mess by romping through the derivatives wonderland, are now in control of government strategy.

David Simon is wrong about the news

One assertion I’ve seen David Simon make in multiple places is that newspapers blew it by not charging for online access to their content when they could.

I think he’s just wrong about that, as does former newspaperman Scott Rosenberg:

I always saw print journalism as doomed. I loved it anyway, the way you might love a beautiful old car whose engine leak is too costly to repair. There was no way to know how much longer the old newspapers would run, but — outside of exceptional cases like the Times and the Journal, which face their own struggles — they plainly weren’t going to run forever. When the opportunity to leave for the Web came along in 1995, I took it without hesitation.

Here we are, a dozen years later, and only now does it seem to be dawning on many newsroom veterans that the entire industry missed the boat. Simon blames narrow-minded executives, and they are surely at fault, but they were also stuck in a transition that was bound to overpower them. Complaining that newspapers should have charged for their online wares “when they had the chance” is foolish and self-deluding — like wondering why you missed the chance to boost your restaurant’s profits by charging for air. That model was never going to work.
