Concurrent testing is the practice of testing software while it is still being developed. It can be done in several ways; one of the most common is to perform testing at the system level. As the development team completes the code for a requirement, that code becomes testable, and other team members can execute test cases against it while development continues.
Personal experience with Agile testing under SCRUM, XP, Crystal, and other processes has shown that concurrent testing can be used to shorten the feedback loop to the Agile team. This helps address the Cost of Failure: the longer you wait to get something fixed, the higher the cost of fixing it. As the feedback loop is shortened, the cost of each defect/bug is reduced.
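The Cost of Failure concept can be illustrated with a toy model. The base cost and the doubling-per-sprint growth factor below are illustrative assumptions, not figures from this article:

```python
# Toy model of Cost of Failure: a defect's fix cost grows the longer it waits.
# The $48 base cost and the doubling factor are illustrative assumptions only.
def cost_of_failure(base_cost: float, growth_per_sprint: float, sprints_delayed: int) -> float:
    """Cost to fix a defect after it has been deferred some number of sprints."""
    return base_cost * growth_per_sprint ** sprints_delayed

# A fix that would cost $48 today, doubling each sprint it is deferred:
for delay in range(4):
    print(f"Deferred {delay} sprint(s): ${cost_of_failure(48, 2, delay):.2f}")
```

Whatever the actual growth rate, the shape is the same: the curve is flattest at the start, which is why a short feedback loop pays off.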
TASK SCENARIO
Here is a sample scenario where concurrent testing can add value.
Team Process Scenario
You are working in a software development company where Agile practices have been adopted; SCRUM is also used with the Agile team. The team consists of:
- 4 developers
- 1 QA Analyst
- 1 GUI Tester
- 1 Technical Writer
Average cost per team member is $72/hr, which includes energy costs, application hardware and software usage, etc.
Development Process Scenario
Currently in Iteration/SPRINT 3, the developer has:
- Set up test cases (based on Test-Driven Development processes)
- Created tasks
- Executed the test cases and experienced failure
- Developed solutions via software to ensure the tests pass
- Had the code reviewed by the Senior Developer, who agreed that the coding was done within Story guidelines (tasks completed, tests passing)
Testing Process Scenario
- Tester is required to run test cases in a like-for-like system environment (e.g., virtual machines)
- Tester discovered 2 critical defects that the developer missed (the tester executed negative tests, which were not included in the developer's unit tests)
- Tester and Developer worked together on problem discovery and solutions, OR the tester documented the issues and moved on
Aftermath Scenario
- 2 weeks later, all stories completed, review and retrospective done, planning for next SPRINT commences
- Developer asks about the defect discovered in SPRINT-3; after a 30-minute review with the team, the Product Owner agrees that it is a major defect and requires a fix
- After 5 minutes spent estimating the scope of the fix, the QA Analyst works with the tester to create test cases to validate the defect fixes (10 minutes)
- Developer spends 15 minutes during SPRINT-4 reworking the code and 5 minutes getting tests to pass, then returns the tests to the tester for final execution
Cost Comparisons
- 8 people (the 7-member team plus the Product Owner) in review for 35 minutes (280 person-minutes total)
- QA Analyst and Tester rebuild test cases for the defect (10 minutes each, 20 person-minutes total)
- Developer re-work (20 minutes total)
- 320 minutes × $72/hr = $384 for work during SPRINT-4
- If the re-work had been completed in SPRINT-3: 40 minutes × $72/hr = $48
- The deferred fix costs 8 times as much; the $336 saved against the $48 SPRINT-3 cost is a 700% return on concurrent testing with a short feedback loop
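The cost comparison can be verified with a short script using the scenario's own figures (the $72/hr rate and the minute counts come from the scenario above):

```python
HOURLY_RATE = 72  # average loaded cost per team member, $/hr (from the scenario)

# Deferred fix, handled in SPRINT-4:
review_minutes = 8 * 35        # 8 people in the 35-minute review = 280 person-minutes
test_rebuild_minutes = 20      # QA Analyst + Tester rebuild test cases (10 min each)
rework_minutes = 20            # developer re-work: 15 min code + 5 min tests
deferred_minutes = review_minutes + test_rebuild_minutes + rework_minutes

deferred_cost = deferred_minutes * HOURLY_RATE / 60   # dollars
immediate_cost = 40 * HOURLY_RATE / 60                # 40 person-minutes if fixed in SPRINT-3

savings = deferred_cost - immediate_cost
roi = savings / immediate_cost * 100

print(f"Deferred: ${deferred_cost:.2f}")    # Deferred: $384.00
print(f"Immediate: ${immediate_cost:.2f}")  # Immediate: $48.00
print(f"ROI: {roi:.0f}%")                   # ROI: 700%
```

Note that the review meeting dominates the deferred cost: 280 of the 320 person-minutes are spent re-establishing context that the team still had for free during SPRINT-3.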
Conclusion
There are significant savings when defect discovery and resolution happen in SPRINT-3: the review and re-planning overhead is removed from the equation, and fewer person-hours are required to complete the fix.
Additional Solution Scenario
Work from a model where issues are raised in the Daily Stand-Up Meeting and the team decides whether to schedule and work on the fix immediately. A useful rule of thumb: if a defect takes less than 15 minutes to fix, fix it immediately; otherwise, document the issue and include it on the Product Backlog for the next iteration. This works well unless defects begin to pull developers away from their scheduled tasks in the current iteration. That is a textbook example of scope creep, and the team may not deliver a finished product for the iteration.
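The stand-up rule of thumb can be sketched as a small triage function. The names and the 15-minute threshold mirror the rule described above and are illustrative only, not part of any SCRUM framework:

```python
# Sketch of the stand-up triage rule of thumb: fix small defects on the spot,
# defer everything else to the Product Backlog. Illustrative names only.
product_backlog = []

def triage(defect: str, estimated_fix_minutes: int, threshold: int = 15) -> str:
    """Decide whether a defect raised in stand-up is fixed now or deferred."""
    if estimated_fix_minutes < threshold:
        return "fix-now"
    product_backlog.append(defect)  # document it for the next iteration
    return "backlog"

print(triage("missing null check", 10))    # fix-now
print(triage("report layout broken", 45))  # backlog
```

The threshold is the lever that controls scope creep: lowering it protects the current iteration's commitments, at the price of deferring more small fixes into the feedback-loop delay described earlier.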
About the Author
Bob Small has 10 years of experience in the IT industry. He has been a developer for a professional senior-care provider, started as a System Tester for the world's number one domain registrar, and continued his career in testing before advancing into Quality Assurance at a leading contact-center solution provider. Bob has recently begun guest lecturing at local universities and colleges, and he has won worldwide online testing contests.
Bob continues to learn Agile techniques and mentors those around him in testing techniques and methods, teaching developers and mentoring junior QA analysts in testing methodologies and QA responsibilities. His favorite quote is: “Plan your work, work your plan.”
Chuck Gadinis is a certified quality assurance analyst with over 20 years' experience in information technology. He has a wide-ranging background in the testing and use of object-oriented software and languages, as well as mainframe software. Chuck has managed the installation, use, and training of the full range of HP Mercury automated testing tools, including QuickTest Pro, Quality Center, and LoadRunner. He has served in numerous leadership roles, from project management to the creation and support of all documentation used within a system development life cycle, and has applied his quality assurance and system-test skills in numerous industries, including public utilities, financial, educational, and healthcare services, and telecommunications.