Conference Presentations

Managing a Successful User Acceptance Test

It's just days before you plan to go live with the new system. The User Acceptance Test (UAT) is the only thing that stands in the way. Will it be successful? Will the users devote the time they committed to so long ago to perform the tests? Will the users agree on whether the system is "acceptable" to them, or will the debate come down to "It doesn't do what I want" vs. "It meets the specifications"? Sara Jones describes strategies that empower test teams and users to plan and execute an efficient UAT. She describes techniques she uses to secure and maintain a time commitment from users, ensure the users are ready for UAT, manage scope creep during UAT planning, and process feedback from the users. Learn how to present the UAT results in a complete and understandable format to quickly enable the correct "Go" or "No Go" decision.

Sara Jones, SAIC
STARWEST 2009: Resistance as a Resource: Moving Your Organization to Higher Quality

As a tester, you are an agent of change and a creative, intelligent, and insightful member of your team. You have good ideas about how to improve your organization and its products. You make your proposal. You hear: "We tried that before, and it didn't work"; "We've never done that before"; "That's not a bug, it works as designed"; and a chorus of "No real user would ever do something like that!" You're getting resistance. So, what do you do? Dale Emery explores an approach that works to resolve resistance and help your organization move to higher quality. Whatever else it may be, resistance is information: information about others' values and beliefs, about the organization, about the change you are proposing, and about you as a change agent.

Dale Emery, Consultant
Seven Factors for Agile Testing Success

What do testers need to do differently to be successful on an agile project? How can agile development teams employ testers' skills and experience for maximum value to the project? Janet Gregory describes the seven key factors she has discovered for testers to succeed on agile teams. She explains the whole-team approach of agile development that enables testers to do their job more effectively. Then, Janet explores the "agile testing mindset" that contributes to a tester's success. She describes the different kinds of information that testers on an agile team need to obtain, create, and provide for the team and product owner. Learn the role that test automation plays in the fast-paced development within agile projects, including regression and acceptance tests. By adhering to core agile practices while keeping the bigger picture in mind, testers add significant value to agile projects and help ensure their success.

Janet Gregory, DragonFire Inc.
STARWEST 2009: Seven Habits of Highly Effective Automation Testers

In many organizations, test automation is becoming a specialized career path. Mukesh Mulchandani and Krishna Iyer identify seven habits of highly effective automation specialists and compare them with Stephen Covey's classic "Seven Habits of Highly Effective People." Mukesh and Krishna not only describe behavior patterns of effective automation testers, but also discuss how to internalize these patterns so that you use them instinctively. Drawing on their experience of managing large test automation projects for financial applications, they describe obvious habits such as saving and reusing tests. They then describe the uncommon but essential habits of strategizing, seeking, simplifying, selling, and communicating. Learn how to avoid the bad habits that automation test novices, and even experts, may subconsciously adopt.

Mukesh Mulchandani, ZenTEST Labs
Make Defects Pay with Root Cause Analysis

Although finding and fixing a defect can improve software quality, often a defect's greatest value comes from using it as a catalyst for preventing similar problems in the future. If you identify a defect's preventable cause and permanently correct the issue, your organization can quickly recoup the costs to find, fix, and clean up a defect. Root cause analysis is a powerful technique that has long been used in manufacturing industries to learn from mistakes. Randy Rice presents a simple way to adapt this technique to the software in your organization. Randy recommends practicing root cause analysis on different classes of defects before deploying it on a wider scale. The beauty of this simple approach is that any organization can apply it with minimal investment. Learn the pitfalls to avoid and how to isolate the root cause from other contributing causes to make your defects pay.

Randy Rice, Rice Consulting Services, Inc.
Managing a Globally Distributed Test Organization

Although many businesses have successfully outsourced software development and testing activities, managing a truly globally distributed test organization comes with a unique set of challenges. Traditional test processes often break down under the pressure of multiple time zones, varying cultures, and numerous technology issues. Communicating standard test procedures, managing exit criteria, and determining release readiness are all more difficult. Anu Kak provides insight into the practices he has employed in his professional career to manage multi-country, distributed test organizations. These practices range from a one-stop portal for communicating goals and status for all phases of development and testing to opportunities for cross-global teams to collaborate on solving test, development, and automation issues.

Anu Kak, PayPal
Detective Work for Testers: Finding Workflow-based Defects

Workflow-based Web application security defects are especially troublesome for enterprises because they evade traditional simple point-and-scan vulnerability detection techniques. Understanding these defects, and how and why black-box scanners typically miss them, is the key to creating a testing strategy for successful detection and mitigation. Rafal Los describes the critical role that application testers play in assessing application workflows and how business process-based testing techniques uncover these flaws. Rafal demystifies the two main types of workflow-based application vulnerabilities: business process/logic vulnerabilities and parameter-based vulnerabilities. As the complexity of Web applications continues to increase, learn how to adjust your testing strategy to make sure you don't miss these unique types of defects.

Rafal Los, Hewlett-Packard Application Security Center
Coloring Outside the Lines: Web Services Interoperability Testing

Web services interoperability testing is complex, subjective, and unlike traditional testing in many ways. The Web Services Interoperability (WS-I) organization provides a wealth of materials and tools about interoperability. With this help, Christopher Ferris' organization increased confidence in the interoperability of its Web services beyond the scope of vendors' testing tools. Christopher describes the WS-I test information that helped justify additional interoperability testing. In addition, WS-I testing tools helped him determine when its Web services "colored outside the lines" by going beyond what the Web services community has agreed upon as the foundations for interoperability. Learn to assess your specific implementation against the requirements of the WS-I Profiles while determining if your application's Web services extensibility points could adversely affect interoperability.

Christopher Ferris, IBM Software Group, Standards Strategy
Test Automation Objectives

Test automation efforts frequently fail because of unrealistic expectations, often the result of choosing poor objectives for automation. Dorothy Graham explains the pitfalls of a number of commonly held objectives for automation and describes the characteristics of good automation objectives. These objectives seem sensible at first and are common in organizations: find more bugs, run regression tests overnight and on weekends, reduce testing staff, reduce elapsed time for testing, and automate x% of the testing. Finding more bugs is a good objective for testing, but not for automation, especially automation of regression tests. Running tests outside working hours is only worth doing if the tests are worth running. Reducing testing staff is a management issue, not an automation objective; in the majority of cases, more staff is needed, not less!

Dorothy Graham, Consultant
Testing Lessons from Classic Fairy Tales

Once upon a time, in testing conferences not so long ago, Rob Sabourin presented useful testing lessons from the most unlikely sources: the Looney Tunes gang, the Great Detectives, Dr. Seuss, Hollywood movies, the game of baseball, Monty Python, labor and delivery nursing, and The Simpsons. Now he turns his attention to lessons from classic fairy tales, those timeless fables designed to entertain and teach simple moral truths to children that also hold important lessons for testers. What can the Three Pigs teach us about contingency planning? Can Mother Goose teach us to be great test leads? Can testers get the message across without crying wolf and live to tell the tale? What does Red Riding Hood teach us about critical thinking? Can we learn fundamental test design approaches from Goldilocks and the Three Bears? Were Hansel and Gretel test-driven developers?

Robert Sabourin, AmiBug.com, Inc.
