Monday, April 21, 2008

QA is not just another place to cut costs

In every professional development job I've had, QA has been the bastard child of software development, and perhaps of development in general.
Within software development there's a particularly harmful trend in the role of QA: it is treated as a proving ground for developers. Software development shops bring in inexperienced software developers and have them 'pay their dues' in QA until they prove themselves worthy of a development role. I think you're better off hiring a bunch of people off the street to do your QA testing than having people who want to develop perform the testing. Reasons:
  • People who lack specialized computer training are going to reflect the actual users of a system better than people who spend 10 or more hours on a computer a day.
  • People who lack specialized skills are going to appreciate the role of a QA tester more than a person who has the training and ability to contribute as a software developer.
  • People who have gone through the effort of earning a degree in computer science with the intention of programming will take the first opportunity to get a programming job. It may be outside your organization.
  • In the same vein, why waste someone's development abilities? Would you have a developer spend their time manually testing code?
  • Lastly, in the same way that Bruce Schneier describes in The Security Mindset, people who want to create software don't necessarily have the ability to test software for flaws. Some people are simply better suited to finding flaws.
The only thing I really like about having software developers pay their dues in QA is that it provides an opportunity to learn the products, and it gives soon-to-be developers the chance to see the flaws in a system before they work on it.
With that said, having people regularly rotate into QA roles may actually be a healthy activity. If the worst side of the product is apparent to the entire organization, then it's difficult to ignore it. This can work especially well if the people who have the ability to make changes are exposed to the correctable pain points of QA testing.
That's not the practice in most places, though. In reality, QA's role ranges from being a speed bump to actually reducing the number of defects that get propagated to production. Unfortunately, most QA departments trend toward the former. The purpose of organizing people into QA departments is to prevent defects from reaching the customers, and I think everyone can agree that customers seeing defects is bad.
There are a few reasons that I believe QA departments become ineffective.
  • Gaps between QA and production environments. Production equipment can be expensive, and many organizations do not believe that an identical, or sufficiently similar, test environment justifies the expense. But if you aren't going to replicate production as closely as can reasonably be done, why bother having a QA environment at all? It would be far quicker to go straight from dev to prod. If you're interested in having a place where defects can be found and fixed, then build a QA environment that's like production.
  • QA environments that are like production, but still have gaps. If there are cron jobs on prod, have the same jobs on QA. If prod is behind a firewall, put QA behind the same one. If you can safely replicate prod's data into QA, do it. Every difference between QA and prod is an opportunity for someone to dismiss a defect as an environmental issue.
  • Productivity initiatives. Managers who want to get the most out of their QA resources will encourage them to script and streamline their work. That's good; take it a step further and have a computer do the work. Use Selenium. Seriously, anything that can be scripted for a human to do can be scripted for a computer. Humans don't like that kind of work; let them focus on the things humans are good at, like exploring.
  • Let me restate the last point: people who use software don't use that software according to a script. Even when they do, they won't stick to it for long. They also don't use software consistently. Users forget what they should be doing, or they play around with the interface to discover functionality. If the actual users are going to use the software that way, shouldn't the testers test the software in a way that accommodates the users?
To many of those who manage QA departments, the previous four points sound like they cost more money than necessary. They see exploratory testing as inconsistent and wasteful. They prefer detailed, standardized test plans that anyone can follow. I say they don't 'get it' when it comes to quality assurance.
I would argue that if you're going to go through the trouble of writing those test plans, why not write them in Selenium and have computers run them as JUnit tests? Better yet, integrate them into a continuous build server.
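To make that concrete, here is a minimal sketch of what one scripted test-plan step might look like as a Selenium JUnit test. It assumes a Selenium RC server running on localhost:4444 and a browser installed; the application URL, page locators, account credentials, and expected text are all hypothetical placeholders, not part of any real product.

```java
import com.thoughtworks.selenium.DefaultSelenium;
import com.thoughtworks.selenium.Selenium;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.assertTrue;

public class LoginTest {
    private Selenium selenium;

    @Before
    public void setUp() {
        // Assumes a Selenium RC server on the default port and a local QA URL
        selenium = new DefaultSelenium("localhost", 4444, "*firefox",
                "http://qa.example.com/");
        selenium.start();
    }

    @Test
    public void validLoginShowsWelcomeMessage() {
        // Each step below mirrors a line a human tester would follow in a
        // written test plan; the locators and text are illustrative only.
        selenium.open("/login");
        selenium.type("id=username", "testuser");
        selenium.type("id=password", "secret");
        selenium.click("id=submit");
        selenium.waitForPageToLoad("30000");
        assertTrue(selenium.isTextPresent("Welcome, testuser"));
    }

    @After
    public void tearDown() {
        selenium.stop();
    }
}
```

Because it's an ordinary JUnit test, a continuous build server can run it on every check-in, which is exactly the integration suggested above.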
Why not keep a QA environment that is as close to production as reasonably possible? If the worst were to happen, it could serve as a failover, or act as an emergency production server.
The thing that probably bothers me the most is when testers are encouraged to get through their test cases as quickly as possible and discouraged from looking into anomalies.
A big part of this comes from bad scheduling. Testers tend to get stuck under a time crunch because their allotted QA time is cut short when the rest of the schedule falls behind.
QA analysts are often put under the gun to finish their testing as quickly as possible. If you aren't going to give QA time to find defects, why even go through the motions of having QA? As crazy as that sounds, I believe most QA departments are sadly adding negative value: if QA only slows the project without finding defects, it isn't adding value.
We can all agree that even the best of us make mistakes. Those of us who are married can attest that when we make a mistake, it's only a matter of time before someone notices. Let's call that moment the when. In software development, the sooner the when, the better. If the when comes when a customer is depending on your product, it will cost you far more than if it comes while you're still designing the application. Testers and developers should strive to make that when now.
To achieve this you need to get the testers involved as soon as possible. Provide them something to test and give them the freedom to get creative in their testing. Have testers of varying levels of experience with the product. Have testers with no experience with the product and let them loose on it. You're going to have untrained users, why not have untrained testers?
The biggest expense with untrained testers is the drain on developers when untriaged defect reports are allowed to reach them. There's a simple way to prevent this: have someone with experience act as the point person between development and QA. That person can weed out duplicates and can build a relationship with the developers.
Although this may appear expensive, it's cheaper in the long run to invest in quality. Defects are expenses that cost more than can be clearly quantified: they cost credibility and reputation, which does turn into real money.
Think of it this way:
My first two cars were a Chrysler and a Chevy. Neither one of those vehicles would run for more than six weeks without breaking down.
My third car was a 1987 Toyota Camry. The Camry lasted me four years without a single breakdown, even the year I didn't change the oil.
Since then I've owned another Toyota and a Subaru. Both of them have provided me with a perfect record of reliable transportation.
The next time I buy a car, do you think I'm even going to consider a GMC or Chrysler?


Anonymous said...

QA is a good job to start an IT career. You don't need a serious background, just some knowledge you can find in books and on the internet. It's not as prestigious as being a developer, but it's still a very good and well-paid job. I started as a trainee two years ago and now I am a QA engineer.