Your Testing Standard

“At least my standards are at a height from which they can fall.” A friend said this to me many years ago, and I am reminded of it over and over again when I hear that an organization is struggling with its testing and needs help.

Or rather, they think they’re struggling with their testing, when what they’re actually struggling with is that the information coming from their testing team contradicts what they were told, before testing began, about the quality of the software and its readiness for deployment. Add to this the varying interpretations of terms like “Done”, “Ready” and even “Quality”, and you’ve got yourself a problem – but not solely a testing one.

Last year I joined a gym. I walked in telling myself that I’d fit in a workout whenever I could, perhaps twice a week if I were lucky, and my underlying assumption was that twice a week was probably good enough to get the results I wanted. I felt good saying this because it showed I was going out of my way to train – I was making an effort, wasn’t I? Much to my surprise (replace “surprise” with “shock and horror”), the trainer behind the desk told me to walk away and not bother joining if that was going to be my attitude. He told me that unless I sorted out my priorities, put a training plan on my daily schedule and stuck to it, increasing the intensity every time the pain lessened, I was wasting my time. After initially feeling indignant and then humiliated, I decided to dig my heels in and make an effort, and over time I got the results I wanted.

Quality is a bit like that. It’s not a case of “I’ll get around to it at some point after all the real work (aka feature development) is done”. The trainer in question was doing me a huge favour by telling me the truth without sugar-coating it. I may not have liked it, but I needed to hear it.

In theory, having an agreed Definition of Done (DoD) should alleviate some quality issues; when we commit to implementing a story, we commit to meeting the Definition of Done for that story. But what if, in meeting the DoD for a story, we break functionality for another, already-implemented story – when do we get around to fixing that? If we push out the fix, aren’t we also pushing out the technical debt associated with it? Maybe we need to aim higher with our DoDs – I really believe that in the majority of cases, deferring bug fixes for these kinds of issues is false economy and just moves costs from Development to Support.
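One way to make that kind of regression visible before a story is ever called “Done” is to put “all existing regression tests pass” into the DoD itself and automate it. Here is a minimal sketch in Python using pytest – the stories, function names and numbers are hypothetical, purely for illustration:

```python
# test_regression.py -- a toy DoD check: a story is only "Done" when the
# whole suite passes, so breaking story A while implementing story B
# fails the build. The functions below are hypothetical examples.

def apply_discount(price: float, percent: float) -> float:
    """Story A (already shipped): percentage discount on a price."""
    return round(price * (1 - percent / 100), 2)

def apply_loyalty_bonus(price: float, years: int) -> float:
    """Story B (new): 1% off per loyalty year, capped at 10%."""
    return apply_discount(price, min(years, 10))

def test_story_a_discount_still_works():
    # Regression guard for the already-implemented story.
    assert apply_discount(100.0, 20) == 80.0

def test_story_b_loyalty_bonus():
    # The new story, including how it reuses story A's code.
    assert apply_loyalty_bonus(100.0, 3) == 97.0
    assert apply_loyalty_bonus(100.0, 15) == 90.0  # capped at 10%
```

Run with `pytest` on every commit, this turns the DoD from a tick-box into something that actually blocks “Done” the moment an already-implemented story breaks.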

Problems also happen when we make an exception for one thing and suddenly we’re making exceptions for everything. Maybe we are told that a customer absolutely must get something on a particular date or the world as we know it will end, so we cut corners to hit that date, intending to go back and fix things up afterwards. The odds are that something else will then come in as another high priority and we never get around to that much-needed clean-up; it just remains an item in an ever-growing backlog of bugs masquerading as enhancement requests (and the world doesn’t end).

Maybe it’s time to draw a line in the sand, set some higher standards and establish a culture where we encourage the best quality from the outset – quality of features rather than number of features. Lowering standards in order to get into the testing phase and tick the “Done” box will always come back to bite, and that bite will hurt a lot more than enforcing higher standards in the first place.

If you’re in an organization where there’s constant pressure on teams to bypass these standards, then using software to enforce gating processes can contain poor-quality software within an area until it’s improved – and it has the added benefit of removing the blame from the person who says “No”.
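As a minimal sketch of such a gate (the specific check commands here are placeholders – substitute whatever your team’s standards actually require), a script like this can be wired into CI so that any failing check blocks the merge:

```python
#!/usr/bin/env python3
"""A toy CI quality gate: run each check and fail the build on any error.

The commands below are illustrative placeholders; swap in your own test
runner, linter and spec validator. CI systems treat a non-zero exit code
as "stop here", which is what contains poor-quality software in place.
"""
import subprocess
import sys

CHECKS = [
    ("unit tests", ["pytest", "-q"]),
    ("lint", ["ruff", "check", "."]),
]

def main() -> int:
    failed = []
    for name, cmd in CHECKS:
        print(f"--- running {name}: {' '.join(cmd)}")
        if subprocess.run(cmd).returncode != 0:
            failed.append(name)
    if failed:
        print(f"Quality gate FAILED: {', '.join(failed)}")
        return 1
    print("Quality gate passed.")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

The point is as much cultural as technical: it’s the gate, not an individual, that stops poor-quality software from moving on, so nobody has to wear the blame for saying no.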

The converse is worth thinking about too – every time a piece of poor-quality software is knowingly allowed out of a Development team, it costs the company money, whether directly in terms of reputation or indirectly in terms of the increased time required to deliver new features. Eventually the standards (and the software) are at a height from which they can fall no further, and you’re looking at a vision of ugliness.

So, whatever you’re doing, please make sure your standards are at a height from which they can fall. Your customers will thank you for it.

If you are developing or testing RESTful APIs, we can help you start raising your bar right now. Start with static analysis of your API definition file using our free Swagger / OAS validator.



Within seconds you will have a list of errors, warnings and useful suggestions for your dev team to work on, ensuring your definition file is not just compliant but solid. This is the first step to RESTful API happiness. Please use the tool regularly, share it with your network, and let us know what you think.
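If you’d like a feel for what this kind of static analysis involves, here is a deliberately tiny sketch in Python – not our validator, just an illustration – that checks a few structural basics of an OpenAPI 3 definition file (it assumes PyYAML is installed):

```python
# oas_sanity_check.py -- a toy illustration of OAS static analysis.
# Not a real validator: it only checks a few structural basics of an
# OpenAPI 3 document. Requires PyYAML (pip install pyyaml).
import sys
import yaml

def check(spec: dict) -> list[str]:
    """Return a list of problems found in a parsed OpenAPI 3 spec."""
    problems = []
    if not str(spec.get("openapi", "")).startswith("3."):
        problems.append("missing or non-3.x 'openapi' version field")
    info = spec.get("info") or {}
    for field in ("title", "version"):
        if field not in info:
            problems.append(f"'info.{field}' is required but missing")
    paths = spec.get("paths") or {}
    if not paths:
        problems.append("'paths' is empty: the API defines no operations")
    for path, ops in paths.items():
        for method, op in (ops or {}).items():
            if method in {"get", "put", "post", "delete", "patch"}:
                if not (op or {}).get("responses"):
                    problems.append(f"{method.upper()} {path}: no 'responses' defined")
    return problems

if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        problems = check(yaml.safe_load(f))
    for p in problems:
        print("WARN:", p)
    sys.exit(1 if problems else 0)
```

A real validator goes much deeper – schema references, parameter types, security definitions and so on – which is exactly why it’s worth running one against your definition file early and often.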

Book your FREE one hour consultancy session right now