Agile Tester vs. Technical Debt

  • 10/10/2013
  • Posted by EuroSTAR

Below is an excerpt from the book “Agile Testing – Der agile Weg zur Qualität” (Agile Testing: The Agile Way to Quality) by Manfred Baumgartner, Martin Klonk, Helmut Pichler, Richard Seidl and Siegfried Tanczos.

“Technical debt” is a term for the work required to bring a piece of software into the state it should be in. It may be that this work is never done and the software remains substandard forever, but the debt remains nonetheless. In other words, technical debt is a measure of substandard software. The term was coined to describe poor quality software in terms that business managers can relate to, namely in terms of money: the money required to fix the problem [KrNo12]. Managers should become aware that neglecting software quality costs them money, and that these costs can be calculated. The notion of “debt” should remind them that someday they will have to pay it back, or at least try to reduce it, just as a country should reduce its national debt. The size of the national debt indicates that a national economy is spending more than it produces; it has a negative balance. The size of the technical debt indicates that an organization is producing more software than it can build correctly. The national debt can be measured absolutely, in dollars or euros, and relatively, in relation to the gross national product. The same applies to technical debt: it can be measured absolutely, as the money required to renovate the software, and relatively, as the cost of renovation in relation to the development costs. Just as a country whose national debt exceeds its annual gross national product is in danger of bankruptcy, a software producer whose technical debt exceeds its annual development budget is in danger of collapsing. Something must be done to eliminate, or at least reduce, the debt.
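To make the relative measure concrete, here is a minimal sketch in Python. The figures and the warning threshold are invented for illustration; this is not a costing method from the book:

    # Hypothetical figures, for illustration only.
    remediation_cost = 1_200_000   # estimated cost (EUR) to bring the software up to standard
    annual_dev_budget = 1_000_000  # annual development budget (EUR)

    # Absolute technical debt: the money required to renovate the software.
    print(f"Absolute debt: EUR {remediation_cost:,}")

    # Relative technical debt: renovation cost in relation to development cost,
    # analogous to national debt as a share of gross national product.
    debt_ratio = remediation_cost / annual_dev_budget
    print(f"Debt ratio: {debt_ratio:.0%}")

    # The warning sign named above: debt exceeding the annual development budget.
    if debt_ratio > 1.0:
        print("Warning: technical debt exceeds the annual development budget.")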

What is Technical Debt?

The notion of “technical debt” was coined by Ward Cunningham at the OOPSLA conference in 1992. As Cunningham originally used it, it meant “all the not quite right code which we postpone making it right” [Cunn92]. With this statement he was referring to the inner quality of the code. Later the term was extended to cover everything that should belong to a properly developed software system but was purposely left out to stay on schedule or within budget: system features such as error handling routines, exception conditions, security checks, and backup and recovery procedures, as well as essential documents such as the architecture design, the user guide, the data model and the updated requirement specification. Developers can leave out many of these security and emergency features, and users will never notice until a problem comes up. Should someone inquire about them, the developers can always say that these features were postponed to a later release; in an agile development they would say they were put in the backlog. In his book “Managing Software Debt – Building for Inevitable Change”, Sterling describes how this debt accumulates – one postponement at a time, one compromise after another [Ster10].

 

Many compromises are made in the course of a software development. Each one is in itself not much of a problem, but in sum the compromises add up to a tremendous burden on the product. The longer the missing or poorly implemented features are pushed off, the less likely it is that they will ever be added or corrected. Someone has to be responsible for the quality of the product under construction. It is the job of the testers in the team to insist that these problems be attended to before they get out of hand. That means the testers not only test but also inspect the code and review the documents. Missing security checks and exception handling are just as important as incorrect results. For that, the testers must be in a position to recognize what is missing in the code and what has been coded poorly. This will not come out of the test: a test only shows what has been implemented, not what should have been implemented. What is not there cannot be tested, nor will testing show how the code has been implemented. Testers must have both the opportunity and the ability to look into the code and see what is missing. Otherwise they would have to test each and every exception condition, security threat and incorrect data state, which would require far too much testing effort. For this reason testers should have good knowledge of the programming language and be able to recognize what is missing in the code. Missing technical features make up a good part of the technical debt [LTSe12].
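As a rough illustration of what such a code inspection could look like, here is a minimal sketch in Python that flags functions making I/O-like calls without any exception handling. The set of risky calls and the heuristic are assumptions for this example, not a tool described in the book:

    import ast

    # Flag functions that make I/O-like calls but contain no exception
    # handling at all. The list of "risky" calls is an assumption.
    RISKY_CALLS = {"open", "connect", "recv", "send"}

    def functions_without_error_handling(source: str):
        tree = ast.parse(source)
        findings = []
        for node in ast.walk(tree):
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                has_try = any(isinstance(n, ast.Try) for n in ast.walk(node))
                calls_risky = any(
                    isinstance(n, ast.Call)
                    and isinstance(n.func, ast.Name)
                    and n.func.id in RISKY_CALLS
                    for n in ast.walk(node)
                )
                if calls_risky and not has_try:
                    findings.append((node.name, node.lineno))
        return findings

    # A function with a risky call and no try/except: a candidate for review.
    sample = "def load_config(path):\n    return open(path).read()\n"
    for name, line in functions_without_error_handling(sample):
        print(f"{name} (line {line}): risky call without try/except")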

The other part of the technical debt is the poor quality of the code. This is what Ward Cunningham was getting at when he coined the term. In the heat of development, developers make compromises in regard to the architecture and the implementation of the code in order to get on with the job, or simply because they don’t know better. Instead of adding new classes they embed additional code into existing classes because that is simpler. Instead of calling new methods they nest the code deeper with more if statements; instead of using polymorphic methods, they use switch statements to distinguish between variations of the same functions; instead of thinking through the class hierarchy, they tend to clone classes, thus creating redundant code. There is no end to the possibilities. In their book “Refactoring – Improving the Design of Existing Code”, Fowler and Beck identify 22 code deficiencies, which they term bad smells, or candidates for refactoring [Fowl99]. These smells include such bad practices as duplicated code, long methods, large classes, long parameter lists, divergent change, shotgun surgery and feature envy. In a contribution to the Journal of Software Maintenance and Evolution entitled “Code Bad Smells – a Review of Current Knowledge”, Zhang, Hall and Baddoo review the literature on bad smells and their effects on maintenance costs [Zhan11]. The effects of bad smells range from decreased readability to an entire system having to be rewritten. In any case they cause additional costs and are often the source of runtime errors, which cost even more. It is undisputed that certain bad coding habits drive up maintenance costs. Unfortunately these costs are not obvious to the user, since they do not immediately affect runtime behavior. The product owner in an agile development project will not recognize what is taking place in the code. However, if nothing is done to clean up the code, the technical debt grows from release to release.
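As a minimal illustration of one smell named above, here is an invented Python example (not from the book) of a switch-style conditional and its polymorphic refactoring:

    # Smell: a switch-style conditional that distinguishes variants of the
    # same function. Every new variant forces another branch here.
    def area_switch(shape_type, *dims):
        if shape_type == "circle":
            return 3.14159 * dims[0] ** 2
        elif shape_type == "rectangle":
            return dims[0] * dims[1]
        else:
            raise ValueError(shape_type)

    # Refactored: each variant carries its own behavior, so adding a new
    # shape means adding a class, not editing a growing conditional.
    class Circle:
        def __init__(self, radius):
            self.radius = radius
        def area(self):
            return 3.14159 * self.radius ** 2

    class Rectangle:
        def __init__(self, width, height):
            self.width = width
            self.height = height
        def area(self):
            return self.width * self.height

    shapes = [Circle(2.0), Rectangle(3.0, 4.0)]
    print([round(s.area(), 2) for s in shapes])   # [12.57, 12.0]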

 

Early Recognition of Mounting Technical Debt

The main contribution of agile testing is to establish rapid feedback from testing to development. This is the rationale for having the testers in the team. When it comes to preventing quality degradation and growing debt, the testers in an agile project have more to do than just test. They are there to assure the quality of the product at all levels during construction. This is accomplished through a number of control measures: reviewing the stories, transforming the stories into testable specifications, reviewing the architectural design, inspecting the code, validating the unit tests, performing continuous integration testing and running an acceptance test with the user.
The testers in an agile development team must strive to be at most one or two days behind the developers [Bloc11]. No later than two days after turning over a component or class, the developer should know what he has done wrong. “Continuous integration” makes this possible [Duva07]. The tester should maintain an automated integration test frame into which he can insert new components; the existing components will already be there. Every day, new or corrected components are taken over into the build. In the case of new functions, the test scripts and test data have to be extended. Not only are the new functions tested; the tests of the old functions are repeated as well. Setting up such an automated regression test may take days, but after that each additional test run should take only a few hours. The goal is to keep up a steady flow of testing. The actual results should be constantly compared with the specified results by means of an automated data comparator. Should new input data be required, it should be generated by an automated test data generator. The two tools should be joined by a common test language which ensures that the test outputs match the test inputs. It is imperative that all deviations from the expected behavior of the software be automatically registered and documented. This should go so far that the defect reports are also generated automatically; the tester should only have to enhance them. In agile development, time is of the essence, and anything which can save time should be introduced.
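What such an automated regression frame might look like in miniature is sketched below in Python. The file layout, function names and report format are assumptions for illustration, not the tooling described in the book:

    import json
    from pathlib import Path

    # Feed stored inputs to the system under test, compare actual against
    # expected results, and generate a defect report stub for every deviation.

    def system_under_test(record):
        # Stand-in for the real component taken over into the build.
        return {"total": record["price"] * record["quantity"]}

    def run_regression(case_dir: Path, report_path: Path):
        deviations = []
        for case_file in sorted(case_dir.glob("*.json")):
            case = json.loads(case_file.read_text())
            actual = system_under_test(case["input"])
            if actual != case["expected"]:          # automated data comparator
                deviations.append({
                    "case": case_file.name,
                    "expected": case["expected"],
                    "actual": actual,
                })
        # Auto-generated defect report stub; the tester only enhances it.
        report_path.write_text(json.dumps(deviations, indent=2))
        return deviations

    # Example run against a directory of stored test cases:
    # deviations = run_regression(Path("regression_cases"), Path("defects.json"))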

The Agile Tester

The main goal of agile development is rapid feedback. Problems should be reported back immediately to the developer so they can be resolved before they become acute. This is also the great advantage of agile development vis-à-vis classical phase-oriented software development, where it often took weeks, if not months, before problems were detected. In the meantime, the developers assumed that all was hunky-dory and went on developing new erroneous components or, in the worst case, did nothing and just waited for feedback from the testers. In this way many valuable hours of tester time were lost. Thus, it is truly beneficial to have component integration testing running parallel to component development [Humb11]. The early recognition of defects and the quick, non-bureaucratic handling of defect reports is one of the main advantages of the agile development process over the classic waterfall process [Cock02]. To function well, the testers should be in constant contact with the developers. Whether this contact must be physical is disputed. The authors of the agile manifesto put great emphasis on “face to face” communication. Many proponents of agile development agree and insist that the testers be physically together with the developers. Others argue that it is enough for the testers to be in virtual contact with the developers; physically they might be on another continent. The proponents of the virtual team claim that it suffices to communicate daily via a social networking system and video conferences over the internet. Whether this is really feasible remains to be seen. For the moment there remain two schools of thought [Bava12].

 


References

[KrNo12] Kruchten, P./Nord, R.: “Technical Debt – from Metaphor to Theory and Practice”, IEEE Software, Dec. 2012, p. 18

[Cunn92] Cunningham, W.: “The WyCash Portfolio Management System”, Proc. of ACM Conference on Object-Oriented Programming Systems, Languages and Applications – OOPSLA, New Orleans, 1992, p. 29

[Ster10] Sterling, C.: Managing Software Debt – Building for Inevitable Change, Addison-Wesley, 2010

[LTSe12] Lim, E./Taksande, N./Seaman, C.: “A Balancing Act – What Practitioners Say about Technical Debt”, IEEE Software, Dec. 2012, p. 22

[Fowl99] Fowler, M.: Refactoring – Improving the Design of Existing Code, Addison-Wesley, 1999

[Zhan11] Zhang, M./Hall, T./Baddoo, N.: “Code Bad Smells – a Review of Current Knowledge”, Journal of Software Maintenance and Evolution, 2011

[Bloc11] Bloch, U.: “Wenn Integration mit Agilität nicht Schritt hält”, Computerwoche, No. 24, June 2011, p. 22

[Duva07] Duvall, P./Matyas, S./Glover, A.: Continuous Integration – Improving Software Quality and Reducing Risk, Addison-Wesley, Reading, MA, 2007

[Humb11] Humble, J./Farley, D.: Continuous Delivery, Addison-Wesley, Boston, 2011

[Cock02] Cockburn, A.: Agile Software Development, Addison-Wesley, Reading, MA, 2002

[Bava12] Bavani, R.: “Distributed Agile Testing and Technical Debt”, IEEE Software, Dec. 2012, p. 2
