Some universal code optimization rules

From time to time I have advocated for design approaches that value consistency and ease of change over raw operations per second. I realize that this strategy makes some programmers uncomfortable. So I thought I’d document some guidelines for optimizing your code.

In my experience these rules are reliable regardless of programming language.

Baseline

Consistently styled, well-factored code that makes it easy to apply targeted optimizations, based on profiling data, in response to business performance requirements.

5x slowdown

A mostly consistent, well-factored codebase in which certain modules break the rules because “this part needs to be fast”.

10x slowdown

Code that has been optimized based on algorithmic profiling without taking I/O time into account.
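This failure mode is easy to demonstrate: a CPU-time profile of I/O-bound code reports almost nothing to optimize, so the “optimization” effort lands on the wrong 1% of the runtime. A minimal Python sketch (the `fetch_record` function is a hypothetical stand-in for a network or disk call):

```python
import time

def fetch_record():
    # Simulated I/O wait (assumption: stands in for a real network/disk call).
    time.sleep(0.2)
    # Trivial CPU work -- the only thing a CPU-time profiler will notice.
    return sum(range(1000))

cpu_start = time.process_time()    # counts CPU time only, excludes sleeping
wall_start = time.perf_counter()   # counts real elapsed (wall-clock) time
fetch_record()
cpu_time = time.process_time() - cpu_start
wall_time = time.perf_counter() - wall_start

# The CPU-only measurement sees almost nothing; the wall clock sees the real cost.
print(f"CPU time:  {cpu_time:.3f}s")
print(f"Wall time: {wall_time:.3f}s")
```

Profile by CPU time alone and `fetch_record` looks essentially free, even though it dominates the wall clock.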

20x slowdown

Pervasive duplication and violation of encapsulation, mandated by a style guide built on one influential developer’s anecdotal past experience with slow code.

50x slowdown

“We expect to eventually scale up to a million concurrent users, therefore…”

1,000,000x slowdown

Spending time arguing with a developer who is convinced that their microbenchmark trumps all other considerations.


Naturally, the numbers above refer to the relative performance of the team in delivering on business goals.

Did you want to talk about operations per second? Sure, I’d be happy to. Except I just received a SIGINT from a watchdog process I set up to detect and halt infinite busy loops. Maybe another time?