Strong opinions, weakly held


Be suspicious of the worst-case

Bruce Schneier cautions people to be wary of worst-case scenarios:

There’s a certain blindness that comes from worst-case thinking. An extension of the precautionary principle, it involves imagining the worst possible outcome and then acting as if it were a certainty. It substitutes imagination for thinking, speculation for risk analysis, and fear for reason. It fosters powerlessness and vulnerability and magnifies social paralysis. And it makes us more vulnerable to the effects of terrorism.

Worst-case thinking means generally bad decision making for several reasons. First, it’s only half of the cost-benefit equation. Every decision has costs and benefits, risks and rewards. By speculating about what can possibly go wrong, and then acting as if that is likely to happen, worst-case thinking focuses only on the extreme but improbable risks and does a poor job at assessing outcomes.

I never really thought about the fundamental laziness involved in obsessing over the worst case before.

The 2010 Edge Annual Question

This year’s Edge Annual Question is, “How is the Internet changing the way you think?” Follow the link to see answers from a lot of smart people. My answer follows.

The Internet has trained me to be less reactionary and to actually consider the positions I take before I take them. I have written so many blog posts, blog comments, discussion group posts, and emails over the years, and been embarrassed by my own half-thought-out positions enough times, that I’m better at thinking things through than I once was. A fair amount of the time, I realize that my original argument is not correct, and I wind up thinking differently. That results from putting one’s opinions on display in a medium where it’s possible to get instant feedback from people who have little to dissuade them from being honest, and where it’s easy to find contrary and complementary arguments to measure yours against.

More importantly, though, the Internet provides a go-to community of experts on just about any topic. The widespread availability of massive amounts of raw data and trenchant analysis makes the Internet age unlike any other. The Internet makes it easier to interact with knowledgeable people. In what other era would it be easy for people to get questions about Mexican cooking answered by Rick Bayless, its foremost evangelist?

Let’s say you want to get up to speed on Yemen, given the connection of terrorists based in Yemen to the underpants bomber. Here’s a blog by a Yemen expert, Waq al-Waq. Here’s Middle East and Islam expert Juan Cole. Here’s Middle East expert Marc Lynch. And that’s just the beginning. For basic Yemen facts, there’s the Yemen page in the CIA World Factbook and the Yemen article in Wikipedia. You can brief yourself quite well on Yemen over your lunch break. Twenty years ago, access to the same kind of knowledge was simply unavailable. You could go to the library and pick up books and journal articles by those authors, filled with information that was likely to be dated. The most recent information would probably be in encyclopedia articles that were updated annually.

The second order effect of having access to all of this information has been to make it easier to apply the lessons of other fields to my profession — software development. I find it fascinating to find patterns in economics, or cooking, or sports, or military strategy that can be applied to making better software. Sifting through all of that information to look for useful bits would have been too time consuming in the age before the Internet, but now it’s almost easy.

Altogether, the Internet is the best tool for getting smarter and better informed we’ve ever known. The key is learning how to use the tool. I wouldn’t pick any other era to live in.

On a related note, see Tyler Cowen on blogging as a learning mechanism.

Context is everything

I don’t trust anyone who doesn’t appreciate context.

I write that sentence after reading Andrew Brown’s post on how the failure to appreciate history on its own terms clouds the thinking of fundamentalists. (In this case, he’s talking about fundamentalist atheists.)

This is the paragraph that grabbed me:

Thinking about the ignorant, angry atheists who infest the Guardian’s comment pages I realised one thing they have in common with scriptural fundamentalists: they have no idea of history. They live in an eternally dazzling present and they can’t imagine that there is anything outside it. Oh, sure, they have legends — the inquisition, the crusades, the middle ages — but within these legends the actors move, as they do in renaissance paintings, entirely in contemporary dress. There is no sense of the strangeness and difficulty of the past; no sense that many things have been tried and failed; no sense that words once meant things entirely different and possibly inexpressible now.

It’s impossible to properly appreciate anything without understanding, to some degree, where it came from. Failure to appreciate things in their own context is a problem I often find when people talk about software development. I read arguments about the superiority of Ruby on Rails to J2EE without any appreciation of the fact that Ruby on Rails is built upon many lessons that were learned the hard way as Java frameworks evolved. Without the 1999 article Understanding JavaServer Pages Model 2 architecture, Struts, and plenty of other lessons learned along the way, there would be no Rails as it exists today. Without Active Server Pages there would have been no JavaServer Pages. Without CGI there would have been no ASP. Without Perl and Lisp and Scheme there would have been no Ruby.

Whether the topic is programming, history, politics, or music, attempting to explain or criticize things without judging them within their own context is a waste of time and energy. The only upside is that when someone persists in doing so, it’s a good signal that their analysis can be safely dismissed without further consideration.

© 2024 rc3.org
