One of the most important metrics a project can and should track is technical debt. In the projects I've seen during my career, technical debt was usually forgotten or consciously ignored. In the rare cases where it is taken into account, it is usually assessed informally, without a clear model, and neither the people evaluating the debt nor those who use the assessment know exactly what lies behind it.

The model presented below, together with the assessment tool, makes it easier to rate the technical debt of your project, communicate the metrics clearly and, more importantly, explore opportunities to mitigate the debt.

The model is designed to be language-agnostic and to have no preference for one software development methodology, such as Waterfall, over another, such as XP. The model also aims to be scale-agnostic, but keep in mind that while it is crucial to run at level 3 for a large-scale project, there is nothing wrong with running at level -1 for a tiny personal project.

The name of the model, ABCDE-T, reflects the six factors it takes into account: architecture, bugs, code, documentation, environment, and tests. Each of these six factors is important, and being at level 3 for one of them doesn't justify remaining at level -1 for another.

Discover the model

Architecture and design (project)

    Level -1: There is no architecture or design, just code written from scratch.
    Level 0: Informal architecture and design are done at the beginning, but not kept up to date.
    Level 1: The architecture is summarized in an informal document which is updated on major changes.
    Level 2: The architecture is explained in an informal document kept up to date. The team seeks to improve the document.
    Level 3: The architecture is explained clearly in a formal document kept up to date. Design choices are justified, unless they are obvious.

Architecture and design (developers)

    Level -1: The team is unaware of design patterns.
    Level 0: The team knows some basic design patterns but doesn't use them, or often uses them incorrectly.
    Level 1: The team knows some design patterns and uses them in obvious cases.
    Level 2: The team knows some design patterns and uses them. Code reviews help improve their usage.
    Level 3: Every member of the team knows the most common design patterns and uses them appropriately.

Bugs (project)

    Level -1: Bugs are not reported or collected. There are no metrics.
    Level 0: Bugs are reported and collected ad hoc, without a uniform, documented approach.
    Level 1: Bugs are reported and stored in a single location. Metrics are not gathered.
    Level 2: Bugs are reported and stored in a single location. Metrics are used only by management.
    Level 3: Every bug is reported and stored in a single location. Metrics are public. Retrospection uses BI and data mining.

Bugs (developers)

    Level -1: The team fixes bugs on demand, often using dirty hacks when under time pressure.
    Level 0: The team fixes bugs on demand, trying to avoid dirty hacks even under time pressure.
    Level 1: The team prioritizes bugs and searches for other occurrences of the same bug in the code base.
    Level 2: The team is proactive in fixing bugs and doesn't write new code before fixing all known issues.
    Level 3: Developers seek to improve the process to prevent the same bug from appearing in the future.

Code

    Level -1: Every programmer writes code according to his own taste. The code base grows organically, resulting in spaghetti code and code duplication.
    Level 0: Style is enforced manually. Programmers tend to structure their code and to reuse existing blocks.
    Level 1: Style is enforced on commit. Code reviews focus on code duplication and other maintenance issues.
    Level 2: Style is enforced on commit. The team knows and regularly uses basic code analysis tools to find and improve the worst parts.
    Level 3: Style is enforced on commit. Code analysis is integrated into CI. Metrics are used to continuously improve the analysis and mitigate risks.

Documentation

    Level -1: What documentation?
    Level 0: Some parts of the code are documented through code comments, and the general architecture may be documented without being up to date.
    Level 1: The team cares to document new choices. The documentation is partially updated when and if the team has time.
    Level 2: Code comments are reviewed as carefully as code. The architecture is documented, as well as tricky design choices, following existing patterns and models.
    Level 3: Documentation is written by developers and professional technical writers working together, and continuously adapts to project changes.

Environment

    Level -1: The team shares code by e-mail or through shared folders. Some parts of the code or configuration may be kept by a single person. Deployment is manual.
    Level 0: The team uses basic features of version control, but deploys manually. Configuration and the database schema are not under version control.
    Level 1: The team uses version control to its full extent, and CI to automate testing. Rollbacks exist and are completely or partially manual.
    Level 2: The team uses CI to partially automate integration and deployment. Delivery is manual.
    Level 3: DevOps use CI to fully automate the process of testing and delivering the product. CI reports are public.

Tests

    Level -1: Programmers test manually when developing a feature.
    Level 0: Unit tests cover cases where they are straightforward to write. Only some members write tests.
    Level 1: Every member of the team writes unit, integration, and system tests on a daily basis. Tests focus on tricky parts of the code base.
    Level 2: Code coverage is measured during CI and reported to developers. Functional tests cover every case of the backlog.
    Level 3: The team has good coverage for unit, integration, system, and functional tests. The results are public. The team invests time in pdiff, A/B testing, formal proof, etc.
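To make "style is enforced on commit" concrete, here is a toy sketch of the kind of check a pre-commit hook could run. Real teams would delegate this to an existing linter rather than write their own; the 79-character limit below is purely an illustrative assumption.

```python
# Toy style check that a pre-commit hook could run before accepting a commit.
# A real project would call an established linter instead; the line-length
# limit here is an illustrative assumption, not a recommendation.

MAX_LINE_LENGTH = 79

def find_style_violations(source: str) -> list[str]:
    """Return a human-readable message for each overlong line."""
    violations = []
    for number, line in enumerate(source.splitlines(), start=1):
        if len(line) > MAX_LINE_LENGTH:
            violations.append(
                f"line {number}: {len(line)} characters "
                f"(max {MAX_LINE_LENGTH})"
            )
    return violations

if __name__ == "__main__":
    sample = "short line\n" + "x" * 100
    # A hook would reject the commit if this list is non-empty.
    for message in find_style_violations(sample):
        print(message)
```

The point is not the specific rule but the enforcement point: once such a check runs automatically on every commit, style stops depending on individual discipline, which is the difference between level 0 and level 1 of the Code factor.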

Assess your project

Architecture and design (project)

Do you have, or did you ever have, an architecture?

Architecture and design (developers)

Are design patterns used within the team?

Bugs (project)

Do you use a bug tracking system?

A simple Excel spreadsheet qualifies.

Bugs (developers)

Are dirty hacks accepted when fixing bugs under time pressure?


Code

Do the team members tend to take care of code duplication?


Documentation

Do you have any documentation (simple comments in code qualify too)?


Environment

Do you use version control?


Tests

Do you have automated tests?



Understand the result

The level you obtain should be interpreted according to the scale of the project. For a small personal project, a final result of -1 is nothing alarming: improving the score might slightly benefit the project, but wouldn't be particularly useful. On the other hand, for a large project involving hundreds of developers, a score below 2 indicates a serious issue.
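The model does not prescribe how the six per-factor levels combine into an overall score. One simple convention, assumed here purely for illustration, is to take the minimum, in the spirit of the earlier remark that being at level 3 for one factor doesn't justify remaining at level -1 for another:

```python
# Sketch of one possible aggregation of per-factor ABCDE-T levels.
# Taking the minimum is an assumption made for this example; the model
# itself does not prescribe an aggregation rule.

VALID_LEVELS = range(-1, 4)  # valid levels: -1, 0, 1, 2, 3

def overall_level(factor_levels: dict[str, int]) -> int:
    """Return the overall level as the weakest per-factor level."""
    for factor, level in factor_levels.items():
        if level not in VALID_LEVELS:
            raise ValueError(f"{factor}: level {level} is outside -1..3")
    return min(factor_levels.values())

if __name__ == "__main__":
    assessment = {
        "architecture": 2,
        "bugs": 1,
        "code": 2,
        "documentation": -1,  # the weakest factor drives the overall score
        "environment": 3,
        "tests": 1,
    }
    print(overall_level(assessment))  # -> -1
```

Using the minimum makes the weakest factor visible in the final score, which matches how technical debt behaves in practice: a project with excellent tests but no documentation still carries the debt of the missing documentation.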

Contact me

Do you have questions, remarks or constructive criticism? Contact me:

Arseni Mourzenko