Software craftsmanship: adding a new dependency is a breaking change
No classes were updated; only a new dependency was added to the “build.gradle” or “pom.xml”. Is it a breaking change?
Software craftsmanship is not only clean code but also backwards compatibility, semantic versioning, and thoughtful dependency management. Even if you don’t change your code at all, but only its dependencies, you can still introduce backwards-incompatible changes that potentially break your clients.
This post was triggered by a discussion about a new feature coming with Mockito 2. The feature brings opt-in support for mocking final classes and methods. We are very excited about it. Mockito keeps improving the testing experience!
Back to dependencies...
Adding a new dependency is a breaking change
- The new dependency is not allowed in your client’s corporate repository (for example, due to poor reputation or security concerns). The client cannot use your new version.
- The new dependency is already declared by the client, but with a different, incompatible version. Conflict resolution picks the later version, causing failures at runtime.
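The second scenario can be sketched in a client’s build script (hypothetical coordinates; by default Gradle resolves a version conflict to the newest version):

```groovy
// Client's build.gradle. Suppose your-library 1.1.0 newly adds a
// transitive dependency on Guava 25.0-jre. Gradle's default conflict
// resolution picks the newest version, so the client is silently moved
// from Guava 18.0 to 25.0-jre -- and breaks at runtime if it relied on
// an API removed in between.
dependencies {
    compile 'com.example:your-library:1.1.0'
    compile 'com.google.guava:guava:18.0'
}
```

Running `gradle dependencies` (or `gradle dependencyInsight --dependency guava`) makes such silently resolved conflicts visible.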
Updating a dependency can be a breaking change
If your code depends on version 2.0 and you bump the dependency to 3.0, it is most likely a breaking change for your clients, because a client may already be using this dependency at an incompatible version (for example, at version 2.0).
In the same scenario, if you bump the dependency from 2.0 to 2.1, you should be safe, as long as the minor version is truly backwards compatible.
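A defensive client can surface such bumps at resolution time instead of at runtime. A minimal sketch using Gradle’s `failOnVersionConflict()` resolution strategy (this is real Gradle API; the choice to use it is the client’s):

```groovy
// Client's build.gradle: fail the build as soon as two different
// versions of the same module meet in the dependency graph, instead
// of letting Gradle silently pick the newest one.
configurations.all {
    resolutionStrategy {
        failOnVersionConflict()
    }
}
```

With this in place, a library bumping its dependency from 2.0 to 3.0 produces an explicit build failure that the client resolves deliberately, rather than a surprise at runtime.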
Removing a dependency is typically safe
Software components should not rely on the presence of transitive dependencies but declare what they need explicitly. Say you remove a dependency and ship your software. If a client breaks, it means they have incorrectly declared their own dependencies.
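Declaring dependencies explicitly looks like this in a client’s build script (hypothetical coordinates):

```groovy
// Client's build.gradle. Even if your-library currently brings in
// commons-lang3 transitively, a client that uses commons-lang3 classes
// directly should declare it itself -- then the build keeps working
// when your-library drops that dependency.
dependencies {
    compile 'com.example:your-library:1.2.0'
    compile 'org.apache.commons:commons-lang3:3.4'
}
```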
Happy semantic versioning!
I think the root issue is the usage of wildcard versions. None of these changes will break the build of an unchanged project unless you use wildcard dependencies.
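For example, a dynamic (“wildcard”) version like the following resolves to the newest matching release at build time, so a new library release you never asked for can change the outcome of an otherwise unchanged build (illustrative coordinates):

```groovy
// Dynamic version: "2.+" means "the newest 2.x available at resolution
// time", so the resolved version can change between builds without any
// change to this project.
dependencies {
    compile 'org.mockito:mockito-core:2.+'
}
```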
I love these short, insightful posts! I've been bitten in the past by transitive dependency bumps, but I hadn't thought of it formally like that. It sounds very strict at first glance, but when you come to think of it, I think it makes a lot of sense.
(Typo in title :p) Sof*t*ware
Different context; I live in apps with hundreds of dependencies (both internal and external), which bring in massive major version dependency shifts, and at times require Java 5 compatibility. You name a framework, and I probably have at least two major versions of it in the classpath, sometimes not even evicted. There is a premium to pay and occasional problems to troubleshoot, but overall the update ball keeps rolling and fragility due to dependencies is not a major concern slowing development. The entire Java dependency/classpath system is set up for fragility from day one. It is interesting why it works so seamlessly in practice and has served developers for nearly 20 years. Particularly in the most common scenario, when a dependency bump "could/should be a breaking change," it rarely is. This is probably not where you wanted to go with your post, but it does keep me up at night. Could it be that we don't need OSGi and Jigsaw, and that our problem is being organically solved by some high-level process resembling homeostasis or natural selection?