rants: Fri Nov 10 01:26:00 2006

Premature Optimization and Object-Oriented Blame

The year is 1989. A small team working on a C++ project (at a time when "5 years of C++ experience" meant your name was Bjarne or Andrew): two adventurous types who'd used the language on toy projects, one experienced but not terribly detail-oriented developer, and an enthusiastic newbie. Rush-rush application, starting from a throwaway prototype in Smalltalk that showed a great UI but 1% of the required performance; thus, management sufficiently neophilic to buy into the idea of doing rapid development with brand new tools - or maybe they felt that if we were wrong, we'd at least fail quickly too :-)

About two months in, the UI is going well (one of the things the C++ object model was actually quite well suited for was graphics classes, something like
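this rough shape - the original code is long gone, so the class and method names below are invented for illustration:

```cpp
// Sketch only: these names are guesses at the shape described, not
// the original code. A geometric class hierarchy draws through a
// narrow backend interface, so SunView, Domain, and X11
// implementations could be swapped in behind it.
#include <cassert>
#include <cstdio>

class device {                      // the replaceable backend
public:
    virtual ~device() {}
    virtual void line(int x0, int y0, int x1, int y1) = 0;
};

class drawable {                    // root of the geometric hierarchy
public:
    virtual ~drawable() {}
    virtual void draw(device& d) const = 0;
};

class rectangle : public drawable {
    int x, y, w, h;
public:
    rectangle(int x_, int y_, int w_, int h_) : x(x_), y(y_), w(w_), h(h_) {}
    void draw(device& d) const {
        d.line(x, y, x + w, y);          // four edges of the box
        d.line(x + w, y, x + w, y + h);
        d.line(x + w, y + h, x, y + h);
        d.line(x, y + h, x, y);
    }
};

// An X11 (or SunView, or Domain) backend would subclass device;
// a debug backend can simply print coordinates instead of drawing.
class printing_device : public device {
public:
    void line(int x0, int y0, int x1, int y1) {
        std::printf("line (%d,%d)-(%d,%d)\n", x0, y0, x1, y1);
    }
};
```

In this sketch the porting point is the device interface; the real code may well have sliced it differently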

and maybe another layer of input handling, broken down along the same geometric hierarchy - oh, and replacing drawable with implementations in SunView, Domain, and X11 in about a day made us very popular) but we're starting to load enough data to see slow-to-reproduce, difficult-to-narrow-down bugs in the underlying data structures.

After some in-fighting over whose classes were the problem (the UI side or the data-wrangling side - different developers), we come upon our first significant application of the true power of C++: object-oriented blame. Since the UI classes and the data classes talked to each other over a very narrow interface, it was easy to stub out one or the other with a "stupid but obviously correct" implementation for testing... UI classes that just printed text instead of drawing anything, data classes with hard-coded values, that sort of thing. Not only does this technique work for finding bugs - in 2006, we call it Mock Object Testing - but it also has a psychological benefit for the programmers involved: it keeps the mistake-finding reality-based.
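A minimal sketch of the idea in today's terms - the interface and class names here are invented, not the project's actual ones:

```cpp
// "Object-oriented blame" (all names invented for illustration):
// the UI and data sides meet at a narrow interface, so either side
// can be replaced by a trivial, obviously-correct stand-in to
// isolate which side a bug lives in.
#include <cassert>
#include <cstddef>
#include <cstdio>
#include <string>

class data_source {                 // the narrow interface
public:
    virtual ~data_source() {}
    virtual std::size_t count() const = 0;
    virtual std::string item(std::size_t i) const = 0;
};

// Data-side stub: hard-coded values, no real data wrangling.
class fake_data : public data_source {
public:
    std::size_t count() const { return 2; }
    std::string item(std::size_t i) const {
        return i == 0 ? "alpha" : "beta";
    }
};

// UI-side stub: prints text instead of drawing anything.
void text_ui(const data_source& d) {
    for (std::size_t i = 0; i < d.count(); ++i)
        std::printf("%lu: %s\n", (unsigned long)i, d.item(i).c_str());
}
```

Run the real UI against fake_data and any remaining bug is in the UI; run text_ui against the real data classes and it's in the data.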

In this story, the suspected data structure was a doubly linked ordered list, with a cached "recent" pointer to (attempt to) speed up insertion. (Remember, this was 5 years before the SGI STL release...) That's kind of complex, and the performance aspects were entirely speculative. Since a test harness would have been on the "wrong" side of the argument (that is, the claims that it "wasn't the data structure" would have been wielded against the tests too; yes, we know better today), the alternative was a testing-only implementation: a simple vector based on realloc and memcpy. A completely naive implementation, on the principle of making it easy to verify: no buffer gap or indirection, every insert a realloc and data move, every reference a linear search.
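That testing-only container might have looked something like this - a sketch under the description above, not the original code:

```cpp
// Naive testing-only container: a flat realloc'd buffer. Every
// insert grows the block and shifts the tail (memmove rather than
// memcpy, since the regions overlap); every lookup is a linear
// scan. Deliberately simple so its correctness is obvious.
#include <cassert>
#include <cstddef>
#include <cstdlib>
#include <cstring>

struct naive_vec {
    int*        data;
    std::size_t len;
};

// Insert v so the array stays sorted; one realloc + one move per call.
void nv_insert(naive_vec* nv, int v) {
    nv->data = (int*)std::realloc(nv->data, (nv->len + 1) * sizeof(int));
    std::size_t i = 0;
    while (i < nv->len && nv->data[i] < v) ++i;       // linear search
    std::memmove(nv->data + i + 1, nv->data + i,
                 (nv->len - i) * sizeof(int));        // shift tail up
    nv->data[i] = v;
    ++nv->len;
}

// Linear search for v; returns its index, or len if absent.
std::size_t nv_find(const naive_vec* nv, int v) {
    for (std::size_t i = 0; i < nv->len; ++i)
        if (nv->data[i] == v) return i;
    return nv->len;
}
```

Each insert moves O(n) bytes, but one block move is cheap next to per-node allocation and pointer chasing - which, per the story, is why it looked linear in practice.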

As might be expected from the tone of the story, dropping in the naive implementation made the bugs vanish, which pretty effectively ended the argument... but it turned out to be a bigger hammer than expected: after we finally fixed the bugs in the original data structure, some "large scale" data-load tests could at last be run. They took... rather a long time. Swapping in the naive implementation solved that problem too - graphing the two showed an apparent difference in order! Empirically, the memcpy-based implementation took time linear in the quantity of input data, while the list was at least quadratic. This was probably the underlying realloc doing bucketing for us, with most of its overhead per-object rather than per-byte, so the single-block technique won out over many smaller allocations.

At the time, the idea that "this application is all about manipulating this set of data in this way - therefore that needs something sophisticated enough to perform well" made sense. Perhaps we can be forgiven for this - Hoare's dictum about premature optimization, popularized by Knuth, dates to 1974, but in 1989 it had not yet become the reflex it is today. But clearly the smarter approach would have been to take our knowledge about the structure of the problem and conclude "therefore the data performance should be measured more closely."

This is a story I've told a number of times in person, and it has held its relevance for over 15 years.