The Once and Only Once principle can be thought of as a subset of the Don’t Repeat Yourself principle, and is one of the most fundamental principles of software development.
Duplication of behavior is one of the most common sources of bugs in software systems: when the same behavior is defined in several places, a change made in one place may never be propagated to the others.
Eliminating the duplication caused by not following the Once and Only Once principle is one of the primary reasons for refactoring and is also at the core of many design patterns.
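To see the failure mode concretely, here is a minimal sketch in Python (the names and the discount rule are invented for illustration): the same rule is written out twice, so a change to one copy silently leaves the other behind.

 # Before: the discount rule is stated twice.
 def invoice_total(prices):
     return sum(p * 0.9 for p in prices)   # 10% discount
 
 def receipt_total(prices):
     return sum(p * 0.9 for p in prices)   # second copy - a rule change here is easy to miss
 
 # After: the rule is stated OnceAndOnlyOnce.
 def discounted(price):
     return price * 0.9                    # the single authoritative statement of the rule
 
 def invoice_total_2(prices):
     return sum(discounted(p) for p in prices)
 
 def receipt_total_2(prices):
     return sum(discounted(p) for p in prices)

When the discount changes, only discounted needs to change; the two totals cannot drift apart.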
Eliminating duplication is one of the main goals (if not the main goal) when ReFactoring code.
Each and every declaration of behavior should appear OnceAndOnlyOnce.
Conceptually analogous to normalization in the RelationalModel. See also DontRepeatYourself.
Code wants to be simple.
If you are aware of CodeSmells, and duplicate code is one of the strongest, and you react accordingly, your systems will get simpler.
When I began working in this style, I had to give up the idea that I had the perfect vision of the system to which the system had to conform.
Instead, I had to accept that I was only the vehicle for the system expressing its own desire for simplicity.
My vision could shape initial direction, and my attention to the desires of the code could affect how quickly and how well the system found its desired shape,
but the system is riding me much more than I am riding the system. -- KentBeck, feeling mystical, see MysticalProgramming
Beware of introducing unnecessary coupling (CouplingAndCohesion) when refactoring for OnceAndOnlyOnce.
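For instance - a hedged sketch, with invented names - two rules may merely happen to coincide today; folding them into one function would couple concepts that can change independently:

 # Today both thresholds happen to be 100, but they are different concepts.
 FREE_SHIPPING_MINIMUM = 100
 BULK_DISCOUNT_MINIMUM = 100
 
 def qualifies_for_free_shipping(order_total):
     return order_total >= FREE_SHIPPING_MINIMUM
 
 def qualifies_for_bulk_discount(order_total):
     return order_total >= BULK_DISCOUNT_MINIMUM
 
 # Merging these into a single qualifies(order_total) would remove the
 # textual duplication, but it would couple two rules that can change
 # independently; the "duplication" here is coincidental, not semantic.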
Refactoring is the moving of units of functionality from one place to another in your program. Refactoring has as a primary objective getting each piece of functionality to exist in exactly one place in the software. -- RonJeffries
It's not OAOO, and this comment probably ought to be somewhere else, but doesn't refactoring also cover replacing one piece of code with another, simpler piece of code that has the same external "appearance" and function?
Yes - good point. Note Ron's subtle use of "a primary objective" instead of "the primary objective". I personally use two "first tier" refactoring rules - OnceAndOnlyOnce and SeparateTheWhatFromTheHow (my name - the common name is ComposedMethod).
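A minimal ComposedMethod sketch in Python (all names invented): the top-level method states the what as a sequence of intention-revealing calls, and the helpers hold the how.

 def process_order(order):
     # the "what" - one level of abstraction, reads as a narrative
     validate(order)
     total = price(order)
     ship(order, total)
 
 def validate(order):
     if not order["items"]:
         raise ValueError("empty order")
 
 def price(order):
     return sum(item["price"] for item in order["items"])
 
 def ship(order, total):
     print(f"shipping {len(order['items'])} items, total {total}")

Because each method sits at a single level of abstraction, OnceAndOnlyOnce violations tend to surface plainly, as two helpers telling the same story.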
OnceAndOnlyOnce is a profound concept, but difficult to apply. I've spent my entire professional life (25 years) learning how to apply it to programs. This page [many versions ago] ... was rewritten to make OnceAndOnlyOnce seem like a simple rule to apply, instead of a prime principle. OnceAndOnlyOnce is NOT easy! And it was wrong to refactor this page so that all hints of tension and disagreement are removed from it.
OnceAndOnlyOnce is not a pattern. A pattern is something you can teach someone to do in a fairly short amount of time. A day, usually. Perhaps a few weeks.
But learning how to refactor classes to form a TemplateMethod does not help you see how to use XML to represent your user interfaces (a recent OnceAndOnlyOnce technique applied to Squeak), or how to make a good virtual machine. These are patterns; OnceAndOnlyOnce is not a pattern. OnceAndOnlyOnce is a principle. -- RalphJohnson
Well said. OnceAndOnlyOnce is not just a simple rule, but one of the core goals of all software design. It's why functions were invented. Remember that your program could have been written as a single long function using only ifs, whiles, and try/catch blocks for flow control, and primitives for all the data. Consider what that would look like. (For "hello world", that single long function is the default.)
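To make the thought experiment concrete, a sketch in Python (invented names): in the single-long-function style the same block must be pasted at every use; the function is the device that lets it be said OnceAndOnlyOnce.

 # Single-long-function style: the clamping logic is pasted at each use.
 def main_inline():
     x = -5
     if x < 0:
         x = 0
     if x > 10:
         x = 10
     y = 17
     if y < 0:
         y = 0
     if y > 10:
         y = 10
     print(x, y)
 
 # With a function, the logic is stated OnceAndOnlyOnce.
 def clamp(v, lo=0, hi=10):
     return max(lo, min(hi, v))
 
 def main_factored():
     print(clamp(-5), clamp(17))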
I once saw Beck declare two patches of almost completely different code to be "duplication", change them so that they WERE duplication, and then remove the newly inserted duplication to come up with something obviously better. -- RonJeffries, from the XpMailingList
or, the long version...
I recall once seeing Beck look at two loops that were quite dissimilar: they had different for structures and different contents - pretty much nothing duplicated except the word "for" and the fact that they were looping, differently, over the same collection.
He changed the second loop to loop the same way the first one did. This required changing the body of the loop to skip over the items toward the end of the collection, since the previous version only did the front of the collection.
Now the for statements were the same. "Well, gotta eliminate that duplication," he said, and moved the second body into the first loop and deleted the second loop entirely.
Now he had two kinds of similar processing going on in the one loop. He found some kind of duplication in there, extracted a method, did a couple of other things, and voila! the code was much better.
That first step - creating duplication - was startling. -- RonJeffries, from the XpMailingList
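A loose reconstruction of the move, sketched in Python (the actual code was Smalltalk, and every name here is invented): first create the duplication, then remove it.

 def report_before(items, k):
     # Two dissimilar loops over the same collection.
     for item in items[:k]:           # front of the collection
         print("front:", item)
     i = k
     while i < len(items):            # rest of it, looped differently
         print("back:", items[i])
         i += 1
 
 def report_duplicated(items, k):
     # Step 1: rewrite both loops with the same for statement,
     # moving the front/back distinction into the bodies.
     for idx, item in enumerate(items):
         if idx >= k:
             continue
         print("front:", item)
     for idx, item in enumerate(items):
         if idx < k:
             continue
         print("back:", item)
 
 def report_merged(items, k):
     # Step 2: the for statements are now duplicates, so merge the
     # bodies and delete the second loop entirely.
     for idx, item in enumerate(items):
         if idx < k:
             print("front:", item)
         else:
             print("back:", item)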
It isn't so startling. That technique is necessary for the more powerful space optimizations of program code: you reduce everything as much as possible, then you find two subgraphs that are similar, you add conditional nodes to them until they're identical (and inject the proper conditions), then you combine them. Repeat until the apparent gains aren't worth the resource cost to acquire them. Beck is just doing it by hand.
It is also frequently used to allow consistent idiomatic expressions to emerge. This allows bugs to be detected via inspection in all the cases where the idiom was not followed properly. This is more than superficial coding standards. For example: a loop guard of < n or <= n will behave differently depending on whether the counter starts at 0 or at 1. It is best to pick one idiom and stick with it. I suspect the duplication here was semantic in nature, and culling it out allowed a reuse opportunity to emerge.
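The off-by-one hazard in that example, made concrete in a small Python sketch:

 n = 5
 # Idiom A: counter starts at 0, guard is "< n"  -> visits 0..4, five items.
 a = list(range(0, n))
 # Idiom B: counter starts at 1, guard is "<= n" -> visits 1..5, five items.
 b = list(range(1, n + 1))
 # Mixing the idioms - starting at 1 but guarding with "< n" - drops an item:
 c = list(range(1, n))           # visits 1..4, only four items
 assert len(a) == 5 and len(b) == 5 and len(c) == 4

Either idiom is fine on its own; bugs come from mixing them, which is why picking one and applying it OnceAndOnlyOnce pays off.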