rc3.org

Strong opinions, weakly held

Author: Rafe

I regret that I didn’t use Google Code Search more

Miguel de Icaza writes about the bad news that Google is shutting down Code Search. In his post, he lists a number of things Code Search was useful for that never really occurred to me. I hate missing out. I particularly regret not taking advantage of it when I was wrestling with connection and socket timeouts with Commons HttpClient a while back.
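For context, the kind of thing I was trying to get right looks roughly like the sketch below, written against the Commons HttpClient 3.x API (assuming that line of the library). The URL and the timeout values are placeholders, not what we actually used:

import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.methods.GetMethod;

public class TimeoutExample {
    public static void main(String[] args) throws Exception {
        HttpClient client = new HttpClient();
        // Give up if a TCP connection can't be established within 5 seconds
        client.getHttpConnectionManager().getParams().setConnectionTimeout(5000);
        // Give up if the socket goes silent for more than 10 seconds mid-response
        client.getHttpConnectionManager().getParams().setSoTimeout(10000);

        GetMethod get = new GetMethod("http://example.com/");  // placeholder URL
        try {
            int status = client.executeMethod(get);
            System.out.println("Status: " + status);
        } finally {
            get.releaseConnection();
        }
    }
}

Being able to search real-world code for how other people set those two parameters is exactly the kind of thing Code Search was good for.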

The purposes of punishment

On the Advanced NFL Stats blog, Brian Burke has written an interesting post about the philosophy of punishment. First, he lists the reasons why you punish people:

… punishment has at least five possible purposes: incapacitation, restitution, rehabilitation, deterrent, and prevention of retribution.

He then goes on to elaborate on each of those. I particularly liked his explanation of the last purpose:

Another purpose of punishment, one that I think we’ve lost touch with in modern society, is to prevent a cycle of vigilante retribution. The ancient verse eye for an eye, tooth for a tooth is widely understood as recommending retaliation. In tribal societies absent of a central authority, it was common for cycles of retribution to spiral out of control. If one party knocks out another’s tooth, pretty soon the victim’s cousins would be exacting revenge on the offender’s family. I understand the verse to mean hey dummy, don’t take an eye in exchange for a tooth. Knock out the other guy’s tooth and let that be the end of it. Otherwise we’ve got the Hatfields and McCoys, Montagues and Capulets, or Bosnians and Serbs.

Given the pervasiveness of vigilantism as a theme in fiction, I don’t think we’ve lost touch with this aspect of punishment to the degree that he supposes.

Why do good people build bad applications?

Lots of people are commenting on the Gun.io blog post The Government’s $200,000 Useless Android Application. Android developer Rich Jones stumbled across an application provided by the Occupational Safety and Health Administration (an agency of the US government). In the end, he discovers that the government paid $96,000 to have a contractor build this buggy application that would take him around 6 hours to write. You really should read the whole thing — the steps he went through to get the details of the contract are interesting.

The obvious response to this is to point out the pervasiveness of government waste, but everybody already knows that the government wastes a lot of money. In fact, pretty much any institution that has lots of money and lots of bureaucracy wastes a lot of money. For example, here’s a recent Tweet from Horace Dediu:

HP spent $1.2 billion to buy Palm, earned $600 million in losses and $1.5 billion to shut it down.

When I look at the OSHA app, what I wonder about is what kind of process led to something so bad being built and released. Who thought it was a good idea to spend $96,000 to build something so simple? What did the actual project entail?

I’ve seen seemingly simple tasks mushroom into huge projects in large organizations. Last year at work we were talking to a customer about integrating with our Web service. I can write the code to integrate with every feature we provide in a day or two — their internal estimate for the entire project was 1,000 to 2,000 hours, mainly because the customer bundled a lot of needed internal changes into the project. They did take on the project, and I have no idea how long it took them in the end.

I figured that this was a case where honest people made their best effort to produce something good, and wound up with a subpar result. In the end, though, I wasn’t so sure. The original blog post mentioned that the application was buggy, and I wanted to figure out how you could pay $96,000 for a simple, buggy application.

You can download the source to the application on the OSHA Web site. The Android version of the application contains 2134 lines of Java code, plus various layout elements. Unsurprisingly, there are no tests at all in the packaged source. There’s no documentation, either.

Just for fun, I decided to try to track down the bug mentioned in the Gun.io post — the application showed the current temperature in Boston as 140 degrees.

The application retrieves weather data from a NOAA Web service and then parses it using a SAX content handler that they wrote. Unfortunately, the code that constructs the URL for the Web service has been removed, probably because there’s no authentication. Lacking any example data, I looked at the code that processes the data instead.

The first thing that stood out to me was that the variable name of the SAX content handler is myExampleHandler. A quick Google search revealed that they just copied that part of the code from this blog post and didn’t bother to change the variable names or the comments. That’s a pretty clear indicator that the code was not written by a professional who cares about their work.

The content handler itself is really badly written. For example, the developer sets a bunch of boolean variables to keep track of which elements they’re processing, but then never actually uses them. Instead, they use a completely separate set of integer variables as booleans. Why did they create two sets of variables for roughly the same purpose? I have no idea, but it could be because multiple people worked on it and didn’t even bother to try to figure out what was going on before they started adding code.
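For contrast, here is a minimal sketch of the usual SAX pattern: one boolean per element of interest, set in startElement, consulted in characters, and cleared in endElement. The element name and class name are made up for illustration; they aren’t taken from the actual NOAA response or from the OSHA code:

import java.util.ArrayList;
import java.util.List;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

// Hypothetical handler, for illustration only.
public class WeatherHandler extends DefaultHandler {
    private boolean inTemperature = false;  // tracks which element we're inside
    private final StringBuilder buffer = new StringBuilder();
    private final List<Double> temperatures = new ArrayList<Double>();

    @Override
    public void startElement(String uri, String localName, String qName, Attributes attrs) {
        if ("temperature".equals(qName)) {
            inTemperature = true;
            buffer.setLength(0);
        }
    }

    @Override
    public void characters(char[] ch, int start, int length) {
        if (inTemperature) {
            buffer.append(ch, start, length);
        }
    }

    @Override
    public void endElement(String uri, String localName, String qName) {
        if ("temperature".equals(qName)) {
            // Parse the text once, right here, instead of stashing strings for later
            temperatures.add(Double.parseDouble(buffer.toString().trim()));
            inTemperature = false;
        }
    }

    public List<Double> getTemperatures() {
        return temperatures;
    }
}

One set of flags, used consistently, is all the state tracking this kind of parsing needs.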

In the end, I wasn’t able to find the bug. My original guess was that the service returned the temperature in Fahrenheit and the developer converted it from Celsius to Fahrenheit mistakenly, but that’s not the case. Now I think it’s presenting the Heat Index and labeling it as the temperature, but I don’t understand how Android layouts work so I’m not sure that’s it either.

In any case, the application was probably not built by an experienced Java developer. They didn’t follow any normal conventions with regard to naming methods or variables. When you do not camel-case the names of your accessor methods, you’re clearly not used to reading other people’s Java code. Furthermore, there are plenty of other signs that the application was written without a lot of care. The code isn’t even properly indented, and it looks as though the developer is not very comfortable with data structures.

For example, temperature and humidity values are extracted from the XML returned by the Web service and stored as Vectors of strings in a data transfer object. When the developer goes to use them, they construct new arrays and copy all of the values in the Vectors into plain old arrays, converting them to numeric values at that time. Why not parse the numbers at extraction time? I have no idea. Why not store the corresponding humidity and temperature values together in a single data structure rather than keeping them in two separate indexed data structures? I have no idea. But these sorts of rookie mistakes are a good indication that the developer who wrote this was in way over their head.
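Here’s the sort of thing I have in mind: a tiny value object (with hypothetical names; this isn’t the app’s code) that parses the strings once, at extraction time, and keeps each temperature and humidity pair together:

// Illustrative only: one observation, with related values stored together
// instead of in two parallel Vectors of strings.
public class Reading {
    private final double temperatureF;
    private final double humidityPercent;

    public Reading(double temperatureF, double humidityPercent) {
        this.temperatureF = temperatureF;
        this.humidityPercent = humidityPercent;
    }

    // Parse once, where the XML text is extracted, rather than re-parsing later.
    public static Reading fromXmlText(String temperatureText, String humidityText) {
        return new Reading(Double.parseDouble(temperatureText.trim()),
                Double.parseDouble(humidityText.trim()));
    }

    public double getTemperatureF() { return temperatureF; }
    public double getHumidityPercent() { return humidityPercent; }
}

The content handler would then hand back a single List<Reading> instead of two Vectors that happen to line up by index.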

My guess is that OSHA hired Eastern Research Group to build this application because they were already on an approved contractor list and the business development person from ERG told them that they were capable of doing it. The application was then built in-house at ERG by a developer who had no clue what they were doing, probably under a tight deadline, or it was outsourced to some other firm that was fleecing ERG the same way they were fleecing OSHA. Clearly there was nobody on the OSHA side who was capable of doing even the rudimentary inspection of what was delivered that it took me 30 minutes or so to perform.

I have worked as a consultant before and I see this a lot. People who outsource software development simply lack the expertise to assess the applications that are built for them. They don’t know how much they should cost, what to look for in a vendor, or how to evaluate what’s delivered to them to make sure they got their money’s worth.

I went into this thinking that maybe everybody involved was honest and the bad result was due to flaws in the process, but now I think it’s pretty clear that ERG sold OSHA a bill of goods and wound up fleecing them pretty badly. I hope it’s not too late to get their money back.

The image embedded in this blog post is from the source for the application — it represents heat stroke. Maybe some of the money was spent on illustrations.

Steve Silberman profiles Susan Kare

Susan Kare is the artist who created the original icons for the Macintosh. She started at Apple by designing proportional fonts but graduated to icon design. The degree to which her work made the original Macintosh software easier for humans to relate to can’t be overstated. The blog post features work from her original sketchbook, in which she designed icons on graph paper by using the squares as pixels.

Big Data demands better shell skills

At work, I’ve been experimenting with Apache Solr to see whether it’s the best choice for searching a very large data set that we need to access. The first step was to just set it up and put a little bit of data into it in order to make sure that it meets our current and anticipated future requirements. Once I’d figured that out, the next step was to start loading lots of data into Solr to see how well it performs, and to test import performance as well.

Before I could do that, though, I generated about 33 million records to import, which take up about 10 gigabytes of disk space. That’s not even 5% of the space that the full data set will take up, but it’s a start.

What I’m quickly learning is that when it comes to dealing with Big Data, knowledge of the Unix shell is a huge advantage. To give an example, I’m currently using Solr’s CSV import feature to import the test data. If we wind up using it in production, writing our own DataImportHandler will certainly be the way to go, but I’m just trying to get things done right now.

Here’s the command the documentation suggests you use to load a CSV file into Solr:

curl http://localhost:8983/solr/update/csv --data-binary @books.csv \
    -H 'Content-type:text/plain; charset=utf-8'

I quickly found out that when you tell curl to post a 10 gigabyte file to a URL, it runs out of memory, at least on my laptop.

These are the kinds of problems for which Unix provides ready solutions. I used the split command to split my single 10 gigabyte file into 33 files, each a million lines long. split helpfully named them things like xaa, xab, etcetera, all the way through xbh. You can use command line arguments to tell split to use more meaningful names. Anyway, then I used a for loop to iterate over each of the files, using curl to submit them:

for file in x* ; do
    curl http://localhost:8983/solr/update/csv --data-binary @$file \
        -H 'Content-type:text/plain; charset=utf-8'
done

That would have worked brilliantly, except that Solr wants you to list the fields on the first row of your CSV file, so only the first file imported successfully. I wound up opening all of the others in vim* and copying the header row over rather than writing a script, proving that I need to brush up on my shell skills as well, because prepending a line to a file is easy, if not elegant.
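For the record, here’s roughly how it could have been scripted instead, assuming the chunks are named xaa through xbh and only xaa already contains the header row:

# Grab the header row from the first chunk, which already has it
head -n 1 xaa > header.csv

# Prepend the header to every other chunk
for file in x??; do
    if [ "$file" != "xaa" ]; then
        cat header.csv "$file" > "$file.tmp" && mv "$file.tmp" "$file"
    fi
done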

Once the files were updated, I used the loop above to import the data.

When it comes to working with big data sets, there are many, many tasks like these. Just being able to use pipes to make sure that your very large data files are always compressed can be a life-saver. Understanding shell scripting is the difference between accomplishing a lot in a day through automation and doing lots of manual work that makes you hate your job.
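For example, something like the following (a sketch; the file name is made up) keeps the raw dump compressed on disk, feeds split a plain stream, and names the chunks with a more meaningful prefix than xaa:

gzip -dc records.csv.gz | split -l 1000000 - records_part_

The trailing argument is the prefix split uses for its output files, so the chunks come out as records_part_aa, records_part_ab, and so on, which is what I meant above about telling split to use more meaningful names.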

* I should add that MacVim gets extra credit for opening 33 files of 252 megabytes each at once without complaining. I just typed mvim x* and up popped a MacVim window with 33 buffers. Unix is heavy duty.

Facebook is on the Web but not of the Web

It’s becoming increasingly clear that while Facebook is a Web site, they don’t want to join the other Web sites in the pool we know as the Web. Anil Dash has the details and a way to encourage Facebook to change its behavior. First, he makes the case that Facebook is trying to drive its users away from the larger Web:

Facebook has moved from merely being a walled garden into openly attacking its users’ ability and willingness to navigate the rest of the web. The evidence that this is true even for sites which embrace Facebook technologies is overwhelming, and the net result is that Facebook is gaslighting users into believing that visiting the web is dangerous or threatening.

This is, to me, the latest front in the battle for users on the Web. Ultimately, Facebook wants users to view ads on Facebook pages, not on your Web site. Furthermore, they want to be able to observe the behavior of their users wherever they go in order to serve up ads that users are more likely to click on. Publishers want access to Facebook’s user base. Currently, Facebook is forcing them to give up an awful lot in order to get it, but hopefully that can be changed.

It’s for these sorts of reasons that I sort of passively resist Facebook. I am still using Facebook only in Chrome’s Incognito mode so that they can’t track me across the Web, and I still refuse to use services that require you to sign in using a Facebook account. I just don’t want to cede more control to Facebook.

The privacy risks of using Google Analytics

Did you know that there’s a reverse lookup for Google Analytics IDs? I didn’t. Andy Baio has the details.

The only exercise advice you really need

The New York Times ran an article this week about how beginning runners are not well served by the massive amounts of advice being offered on running form and running shoes. What do the doctors say?

When it comes to running form, Dr. Bredeweg said, “we don’t know what is the right thing to do.” For example, he noted, forefoot strikers place less stress on their knees but more on their calves and Achilles tendons.

“We tell people we don’t know a thing about the best technique,” he said. He tells runners to use the form they naturally adopt.

The problem of excessive advice is pervasive in the world of fitness. Everyone is trying to sell an exercise routine that they claim is the best. Whether it’s weight training, CrossFit, yoga, Pilates, or running, people are evangelists of what they do, and professionals are even worse.

For people who aren’t exercising regularly, the most important thing is to start doing something. It doesn’t even matter what it is. If you don’t like what you’re doing, or it doesn’t feel good, try something else, but keep exercising. The idea that there’s one master program is completely false.

Eventually, once you’ve been exercising for a while, you may set goals that your exercise routine isn’t helping you meet, and you’ll need to find a coach, do more research, or just up your intensity, but it’s not worth worrying about before you reach that point.

The truth is that Nike has always provided the best advice when it comes to working out — just do it. If you can consistently challenge yourself over a long period of time, almost everything else will take care of itself.

The real state of government regulation

Right-wingers like to blame our lack of economic growth on excessive government regulations. While I would agree that there are likely plenty of regulations on the books that could be repealed without harming consumers, the truth is that we have plenty of big problems with industries that are under-regulated and cases where regulations are not enforced. All too often, this occurs when the people who suffer are poor. Here are a couple of examples.

In the first case, the LA Times published a three-part series on Buy Here Pay Here car dealerships, which sell cars under a model similar to rent-to-own furniture stores. This industry is mostly unregulated, and involves loaning money to desperate car buyers at usurious rates. Many customers default on their loans, and the cars are repossessed and sold to the next person who comes along.

In the second case, NPR and the Center for Public Integrity produced a multi-part series on how clean air regulations are not preventing industrial plants from discharging massive amounts of air pollution and damaging people’s health, mostly due to lax enforcement. Activists in Tonawanda, New York fought for thirty years to curb pollution from a nearby plant. The plant, in the meantime, systematically deceived regulators and continued its polluting ways. For all the talk of excess regulation, the plant remains open and is still producing pollution, albeit at a lower level.

For all the talk I hear about excessive regulation, what I read a lot of are stories about insufficient regulation. It’s also worth pointing out this Treasury Department blog post that examines what an economy that’s stagnating due to regulatory uncertainty might look like, and argues that the US economy is not showing those symptoms.

On a related topic, I found Tyler Cowen’s theory that regulatory enforcement depends more on the number of regulators than on the number of regulations to have interesting implications. There’s an argument to be made that to implement an effective regulatory regime, it’s just as important to get rid of useless old regulations as it is to implement new ones.

Profanity limits your audience

When is it OK to swear? Scott Hanselman takes on the issue of using profanity in conference presentations, blog posts, and other public communications. I find this interesting because he brings it up in light of the Don’t Give Your Users Shit Work blog post that I linked to the other day. The main reason I didn’t link to it right away was the title. I don’t normally use profanity here, and I wasn’t really sure about using it even in a direct quote.

The thing is, I’m not a shrinking violet. In fact, I generally describe myself as being nearly impossible to offend, and I am never offended by profanity. However, I share Hanselman’s concerns about using profanity:

My question is, do swear words add as much as they subtract? Do they increase your impact while decreasing your potential audience? I believe that swearing decreases your reach and offers little benefit in return. Swearing is guaranteed to reduce the size of your potential audience.

In my opinion, using coarse language in public, whether it’s in a blog post, a conference presentation, or a meeting with a bunch of people you don’t know well, violates the Robustness principle:

Be liberal in what you accept, and conservative in what you send.

If the impression people take from something I wrote or said was, “That guy has a foul mouth,” then chances are that I wasn’t able to get my point across. Besides, if you are sparing in your use of profanity, when you do swear, people really pay attention.
