Worse Really is Better

On outcomes over nerd snipes.

Xander Dunn, 19 March 2024

In 1991, a now-(in)famous essay, The Rise of Worse is Better, was published online. Go read it. It's worth it.

The author originally framed Worse is Better in terms of the implementation complexity of programming languages. I'm going to push this concept of "Worse is Better" a bit beyond the author's original intention to make my own point. I'll say the central point of Worse is Better is this:

The thing that succeeds is the thing that works just well enough to get into the most hands and iterate the fastest. This is both the iteration rate of the thing itself, as well as the iteration rate it enables for the people who are using it.

Here I define success for a technical outcome as having users and solving problems for people. Generally that is going to be the end goal. A business that never has any users won't be a business for long. An open source project that doesn't have any users will stop being maintained. A non-profit that never reaches its target audience has no reason to exist. Even a pure mathematics researcher cares that their paper reaches peers and gains their approval.


A scene from the dystopian sci-fi movie Brazil, where everyone's job is to create paperwork for the sake of paperwork.

What "Worse is Better" is Not

I spent this whole essay burying the Better approach, but it's important to realize that Worse <-> Better is a long, continuous spectrum, and I am not advocating for the absolute worst of engineering practices. Worse is Better is not about being a lazy, sloppy engineer. It's not about never writing unit tests, never commenting your code, or writing an O(n²) implementation when the O(n) one would've been just as simple. There is a very large gamut between "let's rewrite everything in Haskell and make sure no PR is merged unless test coverage is >98%" and "let's copy-paste Jupyter notebooks and use email as our revision control system and what even is a unit test?" Be somewhere in the middle.
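To make the O(n²)-vs-O(n) point concrete, here is a hypothetical illustration (not from the original essay): two duplicate-detection functions in Python. The linear version is no harder to write than the quadratic one, so there's no "Worse is Better" excuse for picking the slow one.

```python
def has_duplicates_quadratic(items):
    # O(n^2): compare every pair of elements.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False


def has_duplicates_linear(items):
    # O(n): one pass, tracking what we've already seen in a set.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Both are a handful of lines; the second is simply the better habit when elements are hashable.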


I think the original Rise of Worse is Better article actually didn't go far enough. In the conclusion the author writes:

A wrong lesson is to take the parable literally and to conclude that C is the right vehicle for AI software. The 50% solution has to be basically right, and in this case it isn’t.

Lisp was everyone's favorite AI language for years around this time. But here we are in 2024 and it turns out CUDA kernels are written in C! So actually C is the right vehicle for AI software, even if it's the janky 50% solution.

As an engineer and technically minded person I still feel the nerd snipe urge. But now rather than eagerly following that signal, I typically view it as a negative signal and steer clear of it. I want to work on things that see the light of day, have users, solve problems, and impact the world. Reinventing the wheel hones my engineering skills and scratches a theoretical itch, but I'd rather be pulling my hair out on a janky system that changed the world than working on a beautiful system no one's ever used.