9th February 2020 - 8 minutes read time
The other day I was conducting a code review and found that a developer had used a trait to give two classes the same group of utility methods. Whilst there was nothing wrong with this in terms of functionality, I asked the developer why they had chosen to use traits instead of inheritance. We eventually decided that an inheritance model would be better suited to the situation, but I thought I would go through some of the thought processes here.
What Is A Trait?
A trait, if you weren't aware, is like a class, except that it can't be instantiated on its own. Traits are defined using the trait keyword and are otherwise quite similar to a class in structure.
The idea is that the code from the trait is essentially copied into the class that uses it, and the class then acts as if it had that code all along. For example, let's take a simple trait.
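A minimal sketch of what that might look like (the Greeting trait, Page class, and hello() method are illustrative names, not taken from the original example):

```php
<?php

// A trait groups reusable methods that can be pulled into any class.
trait Greeting
{
    public function hello(): string
    {
        return 'Hello from ' . static::class;
    }
}

// The "use" statement copies the trait's methods into the class,
// as if they had been written inside the class itself.
class Page
{
    use Greeting;
}

$page = new Page();
echo $page->hello(); // Prints "Hello from Page".
```

The use statement inside the class body is what pulls the trait in; the class can then call hello() as though it were one of its own methods.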