03 April 2016
A few weeks ago I wrote an article titled Inheritance Is Terrible in which I argued that, well, inheritance is terrible. The post generated a lot of discussion on Hacker News, Reddit, and Lobsters. Someone on Lobsters even posted an old JavaWorld article as a better explanation of why inheritance is bad.
In the article, I presented an example of inheritance as I've seen it used in the wild. I then showed how this usage ran into problems as the code changed. Many commenters, however, pointed out that this didn't really show a problem with inheritance so much as a problem with how it was used. To quote one comment:
The first implementation was weak from the start. By exposing the instance variable directly, you threw away all the power of inheritance and guaranteed a future meltdown. If you had correctly implemented setters and getters from the beginning, you would have been perfectly prepared for what would have been a minor and appropriate change to the pricing scheme; accommodating changes like that is the whole point of encapsulation.
This is a fair point; running into problems after misusing a tool doesn't mean the tool is bad. I still think inheritance is a bad tool, though: it's easy to misuse, it rarely adds much value, and safer tools exist that do the same job. However, I no longer think it's possible to demonstrate this in a short blog post with code samples.
When discussing the article with my coworkers, someone pointed out that this is a problem that often comes up when discussing programming languages. Consider C, for example. If you wrote a blog post arguing C is a bad programming language and "proved" it by presenting a toy example with buffer overflows, many readers would simply say that the problem isn't with the language but with the way you used it. "Well-written C", they would say, "checks for this problem and handles it".
The readers are technically correct. But they're missing the forest for the trees. In general, you can defend bad tools by arguing they don't cause problems when used correctly. The question to ask, though, isn't whether or not the problems bad tools cause can be avoided through careful use -- it's whether alternatives exist that can eliminate these problems altogether. In the case of C, decades of work have gone into languages with safer ways to manage memory; is manually checking for overflow still a good use of our time? (Not coincidentally, the person who raised this line of thinking is a big Rust advocate.)
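To make the analogy concrete, here is a small sketch (the function and data are hypothetical, not from the original article) of what "the check is built in" looks like in practice. In C, every array access is a place where someone must remember to write a bounds check; in a language like Rust, the safe accessor refuses to read out of bounds by construction:

```rust
// Hypothetical example: looking up a price in a fixed-size table.
fn price_at(prices: &[u32], i: usize) -> Option<u32> {
    // `get` returns None instead of reading past the end of the slice,
    // so the bounds check is part of the API rather than a convention
    // each caller must remember to follow.
    prices.get(i).copied()
}

fn main() {
    let prices = [100, 250, 75];
    assert_eq!(price_at(&prices, 1), Some(250));
    // An out-of-range index is a recoverable None, not a buffer overflow.
    assert_eq!(price_at(&prices, 10), None);
}
```

The point isn't that careful C programmers can't write the check themselves; it's that here there is no way to forget it.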
The quality of a tool can't be decided by looking at the tool on its own; you can only measure its usefulness by comparing it to other tools that do the same job. A good tool solves problems with few downsides and doesn't create extra work for the user. A bad tool is bad not because it fails to solve the problem, but because it introduces more problems in the process.
That is why I still believe inheritance is terrible. It's not because the fragile base class problem is unavoidable -- it's that by using other techniques, you never have to worry about avoiding it in the first place.
Inheritance is a largely useless technique that is only prominent because the software development community has not yet recovered from the OO mania of the 90s. It is strictly inferior to two other, safer techniques: interfaces and composition. Don't use it!
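As a hedged sketch of what those two techniques look like together (the names here are invented for illustration, not taken from the original article), consider a discounted product that reuses a product's behavior by holding one (composition) and exposes a common capability through a trait, which plays the role of an interface. There is no base class whose internals can shift underneath a subclass:

```rust
// Interface: anything that can report a price.
trait Priced {
    fn price(&self) -> u32;
}

struct Product {
    base_price: u32,
}

impl Priced for Product {
    fn price(&self) -> u32 {
        self.base_price
    }
}

// Composition: DiscountedProduct *has a* Product rather than *is a* Product.
struct DiscountedProduct {
    inner: Product,
    discount: u32, // percent off, in the range 0..=100
}

impl Priced for DiscountedProduct {
    fn price(&self) -> u32 {
        // Delegation is explicit: this type only depends on Product's
        // public price(), so changes to Product's internals can't
        // silently alter this calculation the way a fragile base
        // class can break its subclasses.
        self.inner.price() * (100 - self.discount) / 100
    }
}

fn main() {
    let p = DiscountedProduct {
        inner: Product { base_price: 200 },
        discount: 25,
    };
    assert_eq!(p.price(), 150);
}
```

Both types satisfy the same trait, so code can be written against Priced without caring which concrete type it gets, which is the reuse inheritance promises without the coupling it imposes.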