Software reaches a state where bugfixing is just as likely to make it
worse as it is to make it better. Metacity is kind of in this state;
most (but not all) of the open bugs really are not very important,
relative to other aspects of the desktop that people could work
on. A more interesting observation: it’s not clear that fixing the
last 1% ever gets you to 100%. Why is this?
- Some minor bugs require large code changes to fix, and thus fixing
  them will probably introduce another set of unintended bugs worse
  than the original.
- Even if the code changes are small, there’s a nonzero chance your
  bugfix is broken or inadvertently creates a new bug, and if the bug
  you’re fixing is lame enough, this chance outweighs the value of the
  fix.
- Some “bugs” are really just arbitrary decisions that were made, and
  “fixing” them leads to toggling the arbitrary decision back and
  forth every couple of years, which is worse than just picking
  something and leaving it alone.
- A UI design is always about tradeoffs (if only because human
  tolerance for complexity is finite), so there are a bunch of
  deliberately traded-off downsides to a design, and these are “bugs.”
- After some amount of time, bugs become features because people are
  used to them and have already worked around them; you’re likely to
  break a lot of working code or human work habits by fixing the bug.
In short, there are some serious diminishing returns on bugfixing
after a certain point.
I think one of the important things software developers only learn
through experience (you can’t learn it from books) is how to tell the
difference between a bug that should be fixed and a bug that should be
left alone.
Most open source developers just leave these “would be nice to fix
someday, I guess” bugs open in the bugtracker forever rather than
argue with the bug reporter, even though an experienced maintainer
knows they aren’t going to get fixed and aren’t worth fixing.
I’m not trying to trivialize the worthwhile bugs, and there certainly
are a few left in metacity (which we would love help with).
Also, I think major changes to something like metacity remain
interesting, for example adding a compositing manager. Or a full
rewrite of some kind, if it had some good goals. Directed major
changes remain valuable even when random bugfix churn is a negative.
The “it will never be finished, ever” aspect of software can be very
depressing. Sometimes I think the videogames industry would be cool
for this reason, especially writing console games: you ship the game,
it’s burned to the CD, and that’s it. The code is set in stone. Much
more like other creative endeavors such as painting or carpentry. None
of this “updates” and “next version” nonsense.
(This post was originally found at http://log.ometer.com/2005-07.html#6.2)