What should the technology industry be famous for?

I had one of those “reading your own obituary” moments this morning when I picked up the New York Times. In an article about how banks are trying to change the customs that coerce junior employees into working nights and weekends, the writers bring up the increased competition from the technology industry. Here’s how they put it:

Though the prospect of a large salary and experience in finance still draws many college graduates to the industry, some ambitious students are considering other career paths, including those in the technology industry, famous for its employee perks like free oil changes or staff masseuses.

It irritates me to see that the industry in which I’ve spent my entire career is defined to outsiders by the perks offered, and not just because I’ve never gotten a free oil change or a massage at the office. I have plenty of friends who work in industries full of people who are motivated by compensation, perks, prestige, and everything but the work itself. Most of them are eagerly looking forward to retirement.

Weeding out those sorts of people is probably my number one goal in the interview process. As I said when I reviewed The Soul of a New Machine, for me, the work is the reward. If you don’t see it that way, you should consider the opportunities available in finance.

How to hire a superhero

University of Rochester computer science professor Philip Guo has written a great post, shining a light on the bias in the computer industry in favor of people who look like him (or me). In it, he explains how he has gotten the benefit of the doubt in situations simply because he looks like what people expect a computer programmer to look like. It’s a first-hand account of what John Scalzi talked about when he wrote about straight white males playing the game of life on easy mode.

Here’s a bit from the post, Silent Technical Privilege:

… people who look like me can just kinda do programming for work if we want, or not do it, or switch into it later, or out of it again, or work quietly, or nerd-rant on how Ruby sucks or rocks or whatever, or name-drop monads. And nobody will make remarks about our appearance, about whether we’re truly dedicated hackers, or how our behavior might reflect badly on “our kind” of people. That’s silent technical privilege.

He contrasts that with the challenges placed before people who are outside the demographic sweet spot that so many people associate with being a computer programmer. As he says, you don’t have to be a superhero to succeed in software development, and we shouldn’t demand that people who don’t enjoy these privileges be superheroes to make it in this business.

This leads me to the theory that if you want to hire superheroes, you should ditch your biases and expand your hiring pool. The people who have persisted and succeeded in this industry in the face of the inequities that Philip documents are much likelier to be superheroes than your run-of-the-mill bro who shows up to the interview in his GitHub T-shirt with a cool story about making money in high school by creating Web sites for local businesses. There are plenty of superheroes who fall into the privileged category, but they already have good jobs because they will never, ever be overlooked in the hiring process. The awesome programmer who also seems like they’d be super fun to have around the office? They get a job offer every single time.

The enterprising hiring manager has to think outside the box a bit. You know the feeling. The person seems like they’d be great at the job, but is different enough from everyone else that you wonder a bit how hiring them would affect team chemistry. I’m not talking about personality here, but demographics. (I’d never recommend that a non-competitor hire brilliant jerks.) The hypothetical candidate I’m talking about here has to deal with these unspoken questions every single time they apply for a job. That’s not fair, and it’s incumbent on our industry to change, but in the meantime, it’s possible to exploit these widespread biases to hire the superheroes that most companies won’t even consider. At least that’s the theory.

Felix Salmon on the evolution of Netflix

Felix Salmon: Netflix’s dumbed-down algorithms

Good piece on the evolution of the Netflix user experience, necessitated by the transition from the “rent movies by mail” model to the streaming model, and by the licensing fees for streaming content online. I find that Netflix is OK if you just want to watch something, and mostly terrible if you want to watch a specific thing. The Netflix experience has been changed to discourage you from attempting the latter.

Management, the hard parts

Camille Fournier: 2013: The Constant Introspection of Management

Nice post on the hard parts of being a manager. One thing I’d add is that as a manager, the mistakes you make all too often have an impact on real people, an impact that’s impossible to undo. If you put someone on the wrong project and they spend six months not really learning anything, that’s six months they can never have back. It’s a tough job.

Some thoughts on delegating

John D. Cook has a post on one of my favorite management topics, when to delegate.

My thoughts on delegation are probably fairly radical. The first principle is that if a project strikes me as interesting to work on, I must delegate it immediately. Managers almost never have the bandwidth to tackle interesting technical problems; it’s the nature of the beast. When I delegate projects, I can remain engaged to a degree that allows me to learn and to hopefully add some value, without stalling progress because I’m too busy dealing with the other million distractions that are part of manager life. I tend to stick with working on the tasks that are too boring or repetitive to bother delegating to other people – they’re generally about the right size to get out of the way when I’m between other things.

The second principle is: delegate everything, especially if you’re in a new role. When you take on a new role, there is no way to estimate how much work is going to be coming your way, or to list all the demands on your time. All of the stuff you were doing is either going to keep you from succeeding in your new role, or just isn’t going to get done, unless you can delegate it to someone else. Of course, in reality you’ll never be able to delegate everything, but proceeding as though you can will hopefully keep you mindful that you should always be looking for opportunities to delegate.

The point I’m really getting at is that most people are far too reluctant to delegate, and far too selfish when it comes to delegating. Usually when you’re in a position to delegate, you are also at least partially responsible for the professional well-being of the people you’re delegating things to. The things you delegate should hurt a little to give away.

Paul Graham is wrong about it being “too late”

Katie Siegel: It’s Not “Too Late” for Female Hackers

Paul Graham’s comments in an interview behind the paywall at The Information (and unflatteringly excerpted by ValleyWag) have been the subject of much discussion this week. Siegel’s post gets at what was really problematic about them. I’m one of those people who started programming at age 13, and it has always been easy for me to advocate hiring people like myself, for exactly the reasons Paul Graham gives. Siegel ably makes the point that this line of thinking is unfair to everybody who doesn’t fall into his select group, but I’d argue further that it’s also just bad business. More on that later.

Update: Paul Graham says he was misquoted. I’m glad Graham cleared things up. In his update, Graham says:

If you want to be really good at programming, you have to love it for itself.

That’s a sentiment I agree with completely. One reliable indicator that a person loves programming for itself is that they started as a kid. The problem arises when we treat that as the only reliable indicator, which Graham does, at least based on this passage from the transcript:

The problem with that is I think, at least with technology companies, the people who are really good technology founders have a genuine deep interest in technology. In fact, I’ve heard startups say that they did not like to hire people who had only started programming when they became CS majors in college. If someone was going to be really good at programming they would have found it on their own. Then if you go look at the bios of successful founders this is invariably the case, they were all hacking on computers at age 13.

Great reads of 2013: why you should A/B test on the Web

I enjoy the great privilege of working with Dan McKinley, whose blog is at mcfunley.com. He doesn’t post very often, but his posts are reliably great. I want to call out one post specifically: Testing to Cull the Living Flower. Here’s one sentence:

When growth is an externality, controlled experiments are the only way to distinguish a good release from a bad one.

In it, he argues that sites enjoying rapid growth must use A/B testing when building features if they want to be confident that they are making a measurable, positive impact on that growth rate.
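
To make the mechanics concrete, here’s a minimal sketch (mine, not anything from Dan’s post) of the arithmetic behind a simple two-bucket experiment: a two-proportion z-test comparing conversion rates between control and variant. The numbers are made up.

    import math

    def two_proportion_z(conversions_a, visitors_a, conversions_b, visitors_b):
        """z statistic for the difference between two conversion rates.

        Roughly speaking, |z| > 1.96 means the difference is unlikely
        to be noise at the 95% confidence level (two-sided).
        """
        p_a = conversions_a / visitors_a
        p_b = conversions_b / visitors_b
        # Pool the rates under the null hypothesis that A and B convert equally.
        p = (conversions_a + conversions_b) / (visitors_a + visitors_b)
        se = math.sqrt(p * (1 - p) * (1 / visitors_a + 1 / visitors_b))
        return (p_b - p_a) / se

    # Made-up numbers: 10,000 visitors per bucket; the variant converts
    # at 4.6% against the control's 4.0%.
    z = two_proportion_z(400, 10_000, 460, 10_000)
    print(f"z = {z:.2f}")  # z = 2.09 -- probably a real lift, not noise

The controlled split is the whole point: on a rapidly growing site, comparing this week’s numbers to last week’s tells you almost nothing, because the growth itself swamps the effect of any single release.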

Great reads of 2013: Jay Kreps on logs

For the past 18 months or so, I’ve been working on data engineering at Etsy. That involves a pretty wide array of technologies – instrumenting the site using JavaScript, batch processing logs in Hadoop, warehousing data in an analytics database, and building Web-based tools for analytics, among other things. One of the persistently tough problems is moving data between systems in an accurate and timely fashion, sometimes transforming it into a more useful form in the process.

Fortunately, Etsy is anything but the only company in the world with this problem. When I was preparing for 2014, I scheduled meetings with my counterparts at a number of other companies to learn what they were doing in terms of data and analytics. When I came back, I had a pretty good idea of what we needed to do next year to harmonize all of our disparate systems – set up infrastructure to allow them to communicate with one another using streams of events, otherwise known as logs.

Then, just about the time I was finished writing the 2014 plan, Jay Kreps published The Log: What every software engineer should know about real-time data’s unifying abstraction on the LinkedIn Engineering Blog. In it, he documents the philosophical underpinnings of pretty much everything I talked about in the plan.

The point of his post is that a log containing all of the changes to a data set from the beginning is a valid representation of that data set. This representation offers some benefits that other representations do not, especially when it comes to replicating and transforming the data.
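
To make that concrete, here’s a minimal sketch of the idea (my illustration, not code from Kreps’ post): a change log, replayed in order, reconstructs the data set, and any consumer that has applied the log up to the same point holds exactly the same state.

    # A change log: every write to the data set, from the beginning.
    log = [
        ("set", "user:1", {"name": "Ada"}),
        ("set", "user:2", {"name": "Grace"}),
        ("set", "user:1", {"name": "Ada", "email": "ada@example.com"}),
        ("delete", "user:2", None),
    ]

    def replay(entries):
        """Fold the change log into the current state of the data set."""
        state = {}
        for op, key, value in entries:
            if op == "set":
                state[key] = value
            elif op == "delete":
                state.pop(key, None)
        return state

    print(replay(log))
    # {'user:1': {'name': 'Ada', 'email': 'ada@example.com'}}

Replication falls out almost for free: ship the log to another machine, replay it there, and you have an identical copy.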

As it turns out, even though most engineers use logging, they are probably seriously underexploiting logs in the form that Kreps describes. How do we scale up on the Web? We move more operations from synchronous to asynchronous execution, and we diversify our data stores and the ways in which we use them. (Even at Etsy, where we are heavily committed to MySQL, we use it in a variety of ways, some of which resemble key/value stores.) Representing data as a log makes it easier to handle asynchronous operations and diverse data stores, providing a clear path to scaling upward.
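
Here’s a hypothetical sketch (again mine, not from Kreps’ post) of why that helps: each downstream system – a cache, a search index, a warehouse loader – tracks its own offset into the same log and catches up asynchronously, at its own pace, with no coordination between consumers.

    log = []  # append-only stream of change events

    class Consumer:
        """A downstream system consuming the log at its own pace."""

        def __init__(self, apply_event):
            self.offset = 0  # how far into the log this consumer has read
            self.apply_event = apply_event

        def poll(self):
            # Apply everything appended since the last poll, in order.
            while self.offset < len(log):
                self.apply_event(log[self.offset])
                self.offset += 1

    cache = {}
    search_index = []
    cache_consumer = Consumer(lambda e: cache.update({e["key"]: e["value"]}))
    index_consumer = Consumer(lambda e: search_index.append(e["key"]))

    log.append({"key": "listing:42", "value": "hand-knit scarf"})
    cache_consumer.poll()  # the cache is current...
    log.append({"key": "listing:43", "value": "ceramic mug"})
    index_consumer.poll()  # ...the index catches up on both events later
    cache_consumer.poll()
    print(cache)         # both listings present
    print(search_index)  # ['listing:42', 'listing:43']

The offset is the only state a consumer needs in order to resume after a failure, which is a big part of what makes the asynchronous style manageable.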

Kreps’ thoughts on logs are a must-read for any engineer working on the Web. I expect I’m not the only person who found that post to be perfectly timed.

Great reads of 2013: 1491

I’m going through some of my favorite long reads from 2013 that I didn’t get around to blogging at the time I read them.

My first pick is actually not from 2013 but from 2002; I just happened to read it this year. It was later expanded into a book, published in 2006, which I plan to read in 2014. The book and the article, written by Charles C. Mann, are both entitled 1491. Mann’s argument is revisionist history at its best, and I mean that as a compliment.

From both school and popular culture, I learned that the Americas were a sparsely populated pristine wilderness, largely unsullied by human impact. Mann sets out to demolish that narrative. His argument is that the evidence points to pre-Columbian America being far more densely populated than we’ve been taught, and the landscape having been heavily shaped by human habitation. The America encountered by later colonists had been depopulated thanks to pathogens carried by earlier European voyages, and was in transition due to the near-extinction of its keystone species – Native Americans.

1491 is probably the most interesting thing I read in 2013.