rc3.org

Strong opinions, weakly held

Author: Rafe

Will North Carolina ban community broadband?

Larry Lessig begs Democratic North Carolina governor to veto a bill that would ban community broadband networks:

On your desk is a bill passed by the overwhelmingly Republican North Carolina legislature to ban local communities from building or supporting community broadband networks. (H.129). By midnight tonight, you must decide whether to veto that bill, and force the legislature to take a second look.

North Carolina is an overwhelmingly rural state. Relative to the communities it competes with around the globe, it has among the slowest and most expensive Internet service. No economy will thrive in the 21st century without fast, cheap broadband, linking citizens, and enabling businesses to compete. And thus many communities throughout your state have contracted with private businesses to build their own community broadband networks.

This bill is a terrible idea. Banning local communities, on behalf of corporate political contributors, from providing services that voters are willing to pay for is fundamentally undemocratic.

I’m going to skip the diatribe about what this kind of legislation says about Republican priorities. I will add that Governor Perdue is almost certainly not going to be reelected next year, so she may as well do the right thing.

Update: Governor Perdue will allow the bill to become law without her signature, taking the most gutless possible course.

Links for May 20

Finding the value in acquired companies

Horace Dediu on acquiring companies:

Clayton Christensen succinctly defined the value in any company as the sum of three constituent parts: resources, processes or business models. Market value can be nothing more and nothing less than these three things.

An acquisition has to be positioned on one of these targets just like a product is positioned on a specific market. The problem with being deliberate about where the value lies is that once positioned a certain way, the integration team will begin to execute on that plan. This means that the thing you decided was worth most (e.g. resources) gets all the attention and the other potential sources of value (processes or profit models) are discarded.

This argument reduces to there being three separate companies being available. The buyer pays for all but gets to keep only one.

When you look at it this way you realize that the reason most acquisitions fail is because the buyer throws away most of the real value in the company.

He goes on to analyze the acquisition of Skype by Microsoft by these criteria. I’d like to see older acquisitions analyzed in this way. My gut feeling is that nearly all acquisitions focus on resources, since they’re the most obvious repository of value and the easiest (relatively speaking) thing to integrate into the acquiring company. It’s very hard to merge the processes of two different companies, or for an acquiring company to support new business models that didn’t develop organically.

Jumping back into Rails with both feet

Since February, I’ve been working on a pro bono project that launched last week. What’s in it for them is that they got a brand new application that replaces an unwieldy bunch of paper and email-based processes. Before, they had a Web page that was protected by basic authentication and led to a set of form-to-mail scripts and downloadable Word documents. Now they have an application with real user accounts that can be carried over from year to year. There’s still a lot of work to do on the administrative side of the application, but I’m quite proud of the results so far.

What’s in it for me is that I got to revisit Ruby on Rails after a couple of years away working primarily with Java and PHP. I also used Git, GitHub, and Pivotal Tracker, and switched from TextMate to Vim for this project. The project has been very entertaining for me, aside from some stress as the release date moved closer to the early bird deadlines on some of the forms we were replacing.

I found it really easy to jump back into Rails, despite the fact that a lot of things have changed since the last time I used it. Most of the utility scripts that used to be run individually are now bundled under a single rails script, and Rails is geared much more strongly toward resource-oriented controllers rather than letting developers structure controllers in their own way. Beyond that, things are basically the same. I also found that most of the things that are different are, in fact, improved. Rails 3 handles dependencies far, far more elegantly than its predecessors did, which makes deployment much better. It’s easier than it has ever been to set up deployment using Capistrano. The testing framework is more flexible and powerful.
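To give a sense of the two changes (the version numbers and gem names here are illustrative, not my project’s actual dependencies): the old per-task scripts are folded into one rails command, and a Gemfile now declares every dependency so that Bundler can resolve and lock them.

```ruby
# Rails 2: script/server, script/console, script/generate ...
# Rails 3: rails server,  rails console,  rails generate ...

# Gemfile -- Rails 3 declares all dependencies here; 'bundle install'
# resolves them and writes Gemfile.lock, so every machine and every
# deploy gets exactly the same gem versions.
source 'http://rubygems.org'

gem 'rails', '3.0.7'
gem 'devise'       # authentication
gem 'cancan'       # authorization

group :test do
  gem 'rspec-rails'
end
```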

For some time I wondered whether Rails was going to be an evolutionary dead end in terms of Web application platforms, but I’m very impressed at where it is as a platform right now.

I used two plugins that were a huge help in terms of getting the application built. The first is the Devise plugin, which provides user registration and authentication functionality. I’ve been very impressed with its capabilities and ease of use. The second is CanCan, which provides a role-based authorization system that integrates nicely with Devise. The two of them saved me weeks of work.
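Roughly, the two fit together like this. This is a sketch rather than code from the project — the Registration model, the admin? flag, and the rule set are all hypothetical, and Devise’s configuration depends on which of its modules you enable.

```ruby
# app/models/user.rb -- Devise mixes authentication into the model;
# each symbol enables one of its modules.
class User < ActiveRecord::Base
  devise :database_authenticatable, :registerable, :recoverable
end

# app/models/ability.rb -- CanCan centralizes authorization rules in
# one class. It builds an Ability from the current_user helper that
# Devise provides to controllers, so the two plugins dovetail neatly.
class Ability
  include CanCan::Ability

  def initialize(user)
    user ||= User.new # not logged in: treat as a guest
    if user.admin?
      can :manage, :all
    else
      can :read, :all
      # users may only manage their own records
      can :manage, Registration, :user_id => user.id
    end
  end
end
```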

The one big mistake I made was not taking a test-first approach to development. I had intended to, but I was in a rush and I ran into an inflector bug that caused me some grief. That bug prevented my tests from working properly, and rather than tracking down the issue, I did a bunch of development without writing accompanying tests. Now I’m backfilling the tests, which is never fun.
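(For reference, the usual workaround for inflector surprises is to register the problem word explicitly in an initializer; the words below are placeholders, not the ones involved in the bug I hit.)

```ruby
# config/initializers/inflections.rb -- teach the inflector about
# irregular or uncountable words so that generated class names,
# table names, and routes all resolve consistently.
ActiveSupport::Inflector.inflections do |inflect|
  inflect.irregular 'person', 'people'
  inflect.uncountable %w(equipment)
end
```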

The organization I’m building this application for uses DreamHost for hosting, and I assumed I’d be able to use their existing hosting account to deploy this application. Unfortunately, while DreamHost does support Rails, they do not yet support Rails 3. I wound up having to deploy the application on my own slice running at Linode. I considered sites like Heroku, but they were just too expensive. I had thought we’d be closer to turnkey hosting for Rails by now, but that still appears not to be the case. On the other hand, getting the Rails application up and running from scratch on my own Linux server was simpler than it has ever been.
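For anyone setting up something similar, a minimal Capistrano 2 recipe looks roughly like this — the application name, repository, hostname, and paths are placeholders, and the restart task assumes a Passenger-style setup:

```ruby
# config/deploy.rb -- minimal Capistrano 2 recipe (all names are
# placeholders). 'cap deploy' checks out the code on the server,
# installs gems, and restarts the app.
require 'bundler/capistrano'   # runs 'bundle install' on each deploy

set :application, 'example-app'
set :repository,  'git@github.com:example/example-app.git'
set :scm, :git
set :deploy_to, '/var/www/example-app'
set :user, 'deploy'
set :use_sudo, false

server 'app.example.com', :app, :web, :db, :primary => true

namespace :deploy do
  # Passenger picks up a touched restart.txt and reloads the app
  task :restart do
    run "touch #{release_path}/tmp/restart.txt"
  end
end
```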

Rails is no longer the hot young thing that all of the developers are swooning over, but I’m finding it to be more excellent than ever. I still can’t imagine building an application in PHP if Ruby on Rails is also an option. The other takeaway is that developers need to be on GitHub, period. I’m using a private repository to host this project and it’s working beautifully.

Hacking Tyler, Texas

Here’s a project that looks interesting: journalist/programmer Christopher Groskopf is moving from Chicago to Tyler, Texas for reasons mostly not of his choosing. In doing so, he’s going to work to adapt the town to be what he wants it to be. Here’s a taste:

Tyler has information that could be freed. Tyler has government that could be opened. Tyler has news that could be hacked. Moreover, Tyler has an almost completely unexploited market. There are no hackers there. The small number of high-tech businesses that exist in the region are either web development shops serving local businesses or robotics companies.

Should be interesting.

Against arbitrary measures of worth

Chris Dixon explains how Tom Pinckney got into MIT without a high school diploma:

Tom grew up in rural South Carolina and mostly stayed at home writing video games on his Apple II. There was no place nearby to go to high school. He took a few community college classes but none of those places could give him a high school degree. It didn’t really matter – all he wanted to do was program computers. So when it came time to apply to college, Tom just printed out a pile of code he wrote and sent it to colleges.

It’s worth noting that the incentives for college administrators are completely misaligned with this sort of flexibility in admissions standards.

In the larger sense, this is a reminder that when it comes to evaluating things, room should be made for the exercise of human judgement, especially when the relationship between measurable factors and the final results cannot be easily quantified.

The latest on torture and the hunt for Osama bin Laden

Nobody rounds up the news like Dan Froomkin, and his latest piece covers the reaction among interrogators and intelligence professionals to the question of whether torture helped us track down Osama bin Laden. Here’s the summary:

Defenders of the Bush administration’s interrogation policies have claimed vindication from reports that bin Laden was tracked down in small part due to information received from brutalized detainees some six to eight years ago.

But that sequence of events — even if true — doesn’t demonstrate the effectiveness of torture, these experts say. Rather, it indicates bin Laden could have been caught much earlier had those detainees been interrogated properly.

The truth is that the US captured a number of people who knew the name of the courier who ultimately led us to bin Laden, and none of them ever gave up the name, even under torture.

It’s sad but unsurprising that having invested their legacies in the promotion and defense of interrogation techniques that the US has, in the past, treated as war crimes, the defenders of torture are absolutely compelled to make completely unjustifiable claims about its efficacy. And, as Andrew Sullivan pointed out yesterday, the eagerness of torture apologists to justify its use in response to Osama bin Laden being found shows that they were always lying about reserving it for “ticking time bomb” scenarios.

Tim Bray on Derek Miller

Tim Bray mourns Derek K. Miller, the Vancouver blogger who wrote about his battle with cancer right up to (and indeed, past) the end. Here’s Tim’s last bit of advice:

If your plans for your approaching death include a closing magnum opus, well then get your caching setup right.

Rackspace is shutting down Slicehost

Rackspace is shutting down Slicehost and migrating customers to their own cloud hosting service. This blog is hosted on a 512 MB slice there now, for $38 a month. It’s impossible to see what an equivalent cloud server from Rackspace would cost without registering for an account. I’ve been meaning to shut down my slice anyway because of some problems that are my own fault. I’ll be migrating the site to Linode, where I already have a virtual server.

Did torture lead us to Osama bin Laden?

Andrew Sullivan rounds up what we know about the trail to Osama bin Laden. Despite the fact that torture apologists, as predicted, are crediting waterboarding with leading us to bin Laden, that does not appear to be the case.

Update: You’ll want to read this post at the Inverse Square Blog as well.

I would imagine that none of this is news to people who regularly read my blog, but in the coming days people will be asking whether torture was helpful in capturing Osama bin Laden. I think the facts on the record show it did not, and if my linking can lead other people to those facts, I am happy to assist.

Here’s another exhaustive post from Naked Capitalism on the same topic.

Further Update: This is the nonsense we’re up against.


© 2024 rc3.org
