Eliminating Technical Debt
Technical debt has long been the dirty little secret of the overwhelming majority of software development endeavors. Some would even argue that it is an unavoidable cost of doing the business of software development. This sentiment runs so deep that we as an industry have learned to accept it. On the contrary, it has been my long-standing opinion that we can do better. In fact, technical debt can be more than reduced; it can be virtually eliminated altogether.
However, before we go further, let's consider what technical debt really is. There is a plethora of definitions of this development phenomenon, so let's define it informally in order to form a basis for the discussion. Technical debt is existing code, usually created unintentionally, that slows the development of future releases. Technical debt causes long tails of effort after deployment, far beyond the normal defect cycle. It prevents software from morphing with the needs of the business and can lock it into technologies that inevitably become stale and unsupported over time.
If one subscribes to the theory that the company with better technology has a competitive advantage, then technical debt is the bane of the company's existence. It will either cause the company to invest an ever-increasing amount of resources just to maintain the software or put it out of business as other companies come along with better software and less technical debt.
Whatever your definition, all can agree that technical debt is a serious roadblock to a company's health and, in many cases, allows its competitors to overpower the company in the marketplace. Unfortunately, measuring technical debt, and indeed the efficiency of a software effort, is difficult, almost always misunderstood, and certainly mismanaged. There are those who believe that if software is deployed on time, it is a successful project. This type of thinking is at the very core of the industry's misconception of what separates the great software shops from everyone else.
When a house is built, it can be closely compared to another house of similar size, design, quality, and ultimately, cost. The same is true of automobiles, clothes, electronics, food, and most every other industry. With software, however, the number of variables with respect to features, defects, time-to-market, performance, scalability, volatility, extensibility, maintainability and, yes, technical debt makes these comparisons very difficult, if possible at all. This inability to compare clouds and diminishes our ability to measure the real effectiveness of a development effort and its reasonable cost, both before release and after.
Technical debt can be reduced to near-zero through great architectural practices, superior tooling, and the leadership and mentoring to execute each. While these concepts are completely different and require different skill sets, it is their combination that forms the basis for significant competitive advantage in the marketplace.
Architecture is a very nebulous and misunderstood part of software development. Architecture is the intersection of science and art. Computer science programs in academia are very good at teaching the science of software, but true architecture is not taught because it is not understood. Architecture is not boxes, flow charts, class diagrams, and activity diagrams. It is far more akin to building architecture than to the mathematical concepts taught in computer science. It has little, if anything, to do with the "Data Structures", "Algorithms", and "Computational Theory" that form the basis of computer science curricula. It is no wonder that architecture in software does not carry the weight that it does in other industries such as building construction.
Software architecture starts with an understanding of the domain at a high level. That domain knowledge, along with the basic (not detailed) requirements, forms the basis of the software's architecture. It is clear that the requirements will change, sometimes dramatically, and sometimes before completion of the project. Why, then, would we want to architect to functions detailed by requirements? Surely we are setting ourselves up for technical debt. Conversely, if the architecture conforms to the high-level requirements and is built with the volatility of the detailed requirements in mind, it will be far less susceptible to ever-changing requirements and more capable of morphing with the competitive nature that business demands. If the architecture is on solid ground, technical debt will be minimized and a competitive advantage will be realized.
While intensive mathematical, computer science, and business courses, along with countless conferences, books, presentations, and on the order of 100,000 hours of coding, have afforded me a grasp of the technical nature of implementing software and of the business of running a software development shop, it is building architecture that has allowed for breakthroughs in the very nebulous field of software architecture. It has provided a unique viewpoint on what architecture really is and how it can be used to gain competitive advantages in business.
As stated earlier, however, it takes more than just architecture to minimize technical debt; it also takes great tooling. I have been working on Frameworks in one form or another since my days as an architecture student at Wentworth Institute of Technology. It has matured over the years into a tool that not only promotes superior architectural practices but also allows for standardization of code through modeling and code generation across the enterprise. Indeed, its usage has reduced technical debt to near-zero levels. Let's consider some of the features of Frameworks so that we can substantiate these claims.
Frameworks is a modeling tool, code generator, build tool, deployment tool, project design tool, and core library. Each of these reduces technical debt, and the whole is greater than the sum of its parts.
Modeling is at the heart of nearly everything that is built, in every industry across the globe. The types of models change from industry to industry, but few would deny their advantages, and indeed their necessity. Frameworks promotes modeling with three types of models: the Static Diagram, the Call Chain, and the Detailed Design. Models allow developers to quickly understand software at a high level, because the old adage that "a picture is worth a thousand words" certainly holds true. Models form the basis for discussions, serve as help files for QA, and aid tremendously in white-box testing efforts.
Models are a great way to develop and express architecture, but they can also be the basis of much more. An architect produces three types of models in Frameworks.
The Static Diagram model provides a very high-level view of the software components that comprise the domain. It also captures information that feeds into Frameworks' other modeling artifacts. Finally, it creates Visual Studio Solutions at sub-second speed. Solutions contain Projects, and Projects contain references, directories, configuration, and other building blocks. Creating a Solution with the proper Projects, references, directories, and other characteristics is an extremely tedious and time-consuming process that is fraught with inconsistency and error. Even smaller micro-service development shops can have hundreds of Solutions and thousands of Projects. Because each Solution looks exactly the same, a developer knows where everything is without prior knowledge of that Solution, which dramatically reduces the company's dependence on any individual developer. This standardization provides tremendous time savings and reduces technical debt while increasing developer satisfaction.
Since Frameworks intimately understands each Solution and Project, it can change a Solution without the need for hand-coding. This is a powerful debt reducer that will be discussed later.
The Call Chain models depict the software components, and the dependencies between them, needed to implement one or more use cases. The sum of all Call Chain models forms the basis of the project design because it describes the dependencies between software components. Dependencies are the basis of any real project management tool, as they are the only way to determine how and why a project will be at risk, on time, or late.
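The mechanics behind that claim can be sketched in miniature: once call chains are captured as a dependency graph, standard graph algorithms answer scheduling questions such as "what must be built before what?" The sketch below uses Python and invented service names purely for illustration; Frameworks itself and its actual model format are not shown here.

```python
from collections import deque

def build_order(deps):
    """Topologically sort services so each comes after its dependencies.
    `deps` maps a service name to the list of services it depends on."""
    # Count how many unbuilt dependencies each service still has.
    pending = {s: len(d) for s, d in deps.items()}
    dependents = {s: [] for s in deps}
    for s, d in deps.items():
        for dep in d:
            dependents[dep].append(s)
    ready = deque(s for s, n in pending.items() if n == 0)
    order = []
    while ready:
        s = ready.popleft()
        order.append(s)
        for t in dependents[s]:
            pending[t] -= 1
            if pending[t] == 0:
                ready.append(t)
    if len(order) != len(deps):
        raise ValueError("cyclic dependency detected")
    return order

# Hypothetical call chains: Billing calls Orders and Auth; Orders calls Auth.
deps = {"Auth": [], "Orders": ["Auth"], "Billing": ["Orders", "Auth"]}
print(build_order(deps))  # ['Auth', 'Orders', 'Billing']
```

The same graph supports critical-path and risk analysis, which is why a project design derived from dependencies can say *why* a project is at risk, not merely that it is.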
The Detailed Design model sits at a lower level and is intimate with a single service. It depicts many aspects of the service and generates more than 50 types of artifacts each time the model is saved. I encourage the reader to ponder that last sentence: it generates more than 50 types of artifacts. These artifacts are code that the developer would otherwise have to design, write, and test; QA would then need to test it as well, and the defect cycle would be in play. These artifacts form the foundation for business-rule coding. Without the need to write, test, and debug non-business-rule code, the software is assembled faster, with fewer defects, using chosen standards, all at a greatly increased velocity. The tool responds to the detailed architecture quickly and efficiently, which reduces error, promotes standardization, and increases developer satisfaction.
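The principle of model-driven generation can be illustrated with a toy example: a model (here, just a service name plus its operations) drives a generator that emits the repetitive scaffolding, leaving only the business-rule bodies for the developer. Everything below, including the model shape and class naming, is an invented miniature in Python; Frameworks itself targets C# and emits far more artifact types.

```python
def generate_service_stub(model):
    """Emit a simplified service stub from a model dict.
    A real model-driven generator would emit many artifact types
    (contracts, hosts, tests, config); this shows just one."""
    name = model["service"]
    lines = [f"class {name}Service:"]
    for op in model["operations"]:
        lines.append(f"    def {op}(self, request):")
        lines.append(f"        # TODO: business rule for '{op}' goes here")
        lines.append(f"        raise NotImplementedError('{op}')")
    return "\n".join(lines)

model = {"service": "Order", "operations": ["create", "cancel"]}
print(generate_service_stub(model))
```

Because the generated portion is never hand-written, it is never hand-broken; regenerating on every model save keeps the scaffolding consistent across every service in the shop.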
Core libraries, modeling, code generation, promotion of business rules, and reduction of defects and testing are all valuable tools to have at one's disposal. There is yet another feature that reduces technical debt to near-zero levels. Technologies naturally come and go, and when a new technology or pattern presents an advantage, legacy code needs to be refactored to employ it. Rewrites and refactoring are extraordinarily expensive undertakings in a large code base. The expense is often so great that, on most occasions, they are simply not feasible. Companies then pile up dangerous levels of technical debt and eventually lose their competitive advantage.
Frameworks combats this in a very elegant way, as it accounts for future technologies. Part of the Frameworks family of tools is a set of libraries that use the new C# compiler, Roslyn. Roslyn can be used to analyze code at a very intimate level, a level that only a compiler can provide, and it exposes that information as a service.
When a new technology avails itself, Frameworks uses Roslyn to analyze the entire code base and regenerates the code for the new pattern or technology. This is NOT a sophisticated find-and-replace feature; rather, it is the ability to rewrite old patterns or old technologies as new ones across the entire code base. Old patterns spread across the code base can be analyzed at the syntax level, refactored, and rewritten at will, just as a human being would do it, only at lightning speed and without error.
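The underlying idea of syntax-tree rewriting (parse the source, match the old pattern structurally, emit the new one) can be sketched with Python's `ast` module; Roslyn offers the analogous `CSharpSyntaxRewriter` in C#. The names `OldClient` and `NewClient` below are invented stand-ins for a migrating API; this is a toy analogy, not Frameworks itself.

```python
import ast

class CallRewriter(ast.NodeTransformer):
    """Rewrite every OldClient(...) construction into NewClient(...),
    preserving the original arguments: a toy version of pattern rewriting
    done at the syntax-tree level, not by text search-and-replace."""
    def visit_Call(self, node):
        self.generic_visit(node)  # rewrite nested calls first
        if isinstance(node.func, ast.Name) and node.func.id == "OldClient":
            node.func = ast.Name(id="NewClient", ctx=ast.Load())
        return node

def rewrite(source):
    tree = ast.parse(source)
    tree = CallRewriter().visit(tree)
    return ast.unparse(ast.fix_missing_locations(tree))

print(rewrite("conn = OldClient(host, retries=3)"))
# conn = NewClient(host, retries=3)
```

Because the transformation operates on the tree rather than on text, it only fires on actual calls to the old API, never on comments, strings, or similarly named identifiers, which is precisely what separates compiler-assisted rewriting from find-and-replace.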
To the educated reader, this probably sounds near impossible. However, this is exactly what happened when a company moved from WCF to ServiceFabric. If you know the Windows Communication Foundation (WCF), you understand that it is an incredibly powerful but expert-level technology; using it to its fullest takes years of experience and education. The company was completely married to WCF, as it was entrenched in all 400+ services. Refactoring to take advantage of the much superior model of the new ServiceFabric would have been an extremely difficult task and, at a minimum, cost prohibitive.
Utilizing Frameworks, we were able to decipher the deep WCF patterns and the unending configuration and variables. Frameworks was then trained to rewrite the WCF code as ServiceFabric code across all services. The result was the rewriting of all the WCF code without a team of developers. This took place without disrupting the team's velocity; in fact, the only reason the team was aware of the transformation is that they were told about it. This is not fantasy or theory; I offer it as a real use case that took place last year.
One fact intimated above needs to be emphasized. WCF is an expert technology that takes years to master, but there is no need to master it, because Frameworks was writing ALL of the WCF code. ServiceFabric is a great new technology that developers would normally need to train on and get used to. Once again, there is no need, because Frameworks writes ALL of the ServiceFabric code on behalf of the developer. The same is true for the ServiceBus, NHibernate, and other staples of the development environment. Developers write only business rules, and thus the ROI of each developer is greatly increased. This is only possible because of Frameworks and the superior architectural practices employed by the companies using it.
In closing, utilizing superior architectural techniques, a tool with unparalleled capability, and the leadership and mentoring to execute them allows us to produce considerably better software, with less technical debt, at increased velocity. It also allows an ever-growing code base to continuously adopt emerging technologies without tremendous human effort. Consequently, a code base need not grow old and out of date.
Interesting points and insights you put forward, John. Technical debt can also be linked to the increased cost of maintaining software, largely because the architecture was not sound or, alternatively, was not implemented with the rigor with which it was defined. While a good architecture is important, I believe that focus (right from the top) on ensuring the architecture is implemented as defined is equally important.