Conference Presentations

Today's Top Ten Controversies in Testing

Having been in the IT industry for thirty years, Lloyd Roden believes that we often adopt behaviors even though there is little evidence that they are correct or beneficial. Some are justified with "that's the way we've always done it, so it must be right." Others come from the latest development and testing philosophies (fads). Lloyd suggests we challenge our assumptions on a regular basis. His "top ten" list includes ideas such as "managers should be judged on product quality rather than delivery date," "acceptance testing should not find bugs," "defect logs are crucial, even in agile developments," "certification plays a vital role in defining our profession," "we define entry and exit criteria but we don't follow them," and five others. You may not agree with everything Lloyd suggests, but this session will help you understand which of your practices are based on evidence and which are habits you've adopted without testing them.

Lloyd Roden, Grove Consultants
GUI Testing for Multi-language Applications

An analysis of defect reports on several multi-language projects showed that almost 80% of localization bugs were cosmetic issues. Because of this tendency, GUI tests are always an important part of the localization testing process. However, manual GUI testing is time consuming, labor intensive, and therefore expensive; for testers, it is boring, tedious, and error prone. An automated test suite is the most efficient way to detect these defects, especially in multi-language applications. Marco Torres demonstrates how to simultaneously verify the quality of the GUI in all the languages supported by an application. Using real-world examples, he shows you how to create automated tests for multi-language environments. Join Marco and learn how to write test scripts that consider future language additions.

Marco Torres, Citrix Systems Japan R&D
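The simultaneous, all-languages verification Marco describes can be approximated even before driving a real GUI: check that every supported locale's resource bundle is complete and that each translated string fits its widget. The bundle contents, keys, and character budgets below are hypothetical illustrations; a real suite would pull strings from the application's resource files and measure rendered widths with a GUI automation tool. A minimal sketch in Python:

```python
# Hypothetical resource bundles: every UI key must exist in every locale,
# and each translated string must fit its widget (modeled here as a
# simple character budget per key).
RESOURCES = {
    "en": {"file.save": "Save", "file.open": "Open"},
    "de": {"file.save": "Speichern", "file.open": "Öffnen"},
    "ja": {"file.save": "保存", "file.open": "開く"},
}
WIDGET_BUDGET = {"file.save": 12, "file.open": 12}

def check_locales(resources, budget):
    """Check all locales at once; return (locale, key, problem) tuples."""
    issues = []
    # The union of keys across bundles catches strings added in one
    # language but forgotten in another.
    keys = set().union(*(bundle.keys() for bundle in resources.values()))
    for locale, bundle in resources.items():
        for key in sorted(keys):
            if key not in bundle:
                issues.append((locale, key, "missing translation"))
            elif len(bundle[key]) > budget.get(key, 0):
                issues.append((locale, key, "text overflows widget"))
    return issues
```

Under this scheme, supporting a future language is a one-line change: register its bundle in `RESOURCES` and the same checks run against it automatically.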
Perils and Pitfalls of the New Agile Tester

If your background is testing on traditional projects, you are used to receiving something called "requirements" to develop test cases, and sometime later receiving an operational system to test. In an agile project, you are expected to continually test changing code based on requirements that are being uncovered in almost real time. Many perils and pitfalls await testers new to agile development. For example, as a tester new to agile, you might think, "I'll test the latest 'stories' on Tuesday when I get my next build." And you would be WRONG! Waiting for a new build will almost always put you at least one iteration behind the developers and in a schedule hole from which you cannot recover. To avoid this trap, you must start testing a feature story as soon as the developer begins work on it, and even before coding begins.

Janet Gregory, DragonFire Inc.
Systematic Test Design...All on One Page

Good test design is a key ingredient for effective and efficient testing. Although there are many different test design methods and a number of books explaining them in detail, studies have shown that the regular use of these methods is actually quite limited. What are the reasons behind our neglecting to use these methods? How can we improve our practices to design better tests? Peter Zimmerer shares a poster-sized document, "Test Design Methods on One Page," that displays the big picture of test design through a systematic, structured, and categorized overview of different test design methods. When designing test cases, testers in his organization systematically consider each technique as they design and develop tests. He urges you to give this poster to every developer and tester on your team and put it on the wall in your office.

Peter Zimmerer, Siemens AG
Testing Disasters and Turnarounds

It's good to learn from your own mistakes, but even better to learn from the mistakes of others. Randy Rice presents case studies of testing projects that have gone horribly wrong and reveals the one characteristic they all have in common. Although many of these projects ultimately ended in failure, Randy presents examples where mid-course corrective actions were very successful. Learn important lessons that address test team organization, test environment design, working with software developers, test outsourcing, test tool integration, team building, and the importance of strong leadership to the success of your testing. Find out how to become a change agent and what you need to do in order to turn around testing projects that are headed toward disaster. Avoid the mistakes of others and focus your efforts in leading your test team to get the best possible results.

Randy Rice, Rice Consulting Services Inc.
Growing Our Industry: Cultivating Testing

Although software testing is a relatively young discipline, immaturity is not the only reason we are still developing our methods, professional qualifications, trade associations, and our position in the software industry and society. All successful professions must continuously evolve and grow. For example, horticulture has been practiced for about 8,000 years longer than software testing. During those millennia, horticultural practices have continued to develop, supported by accidental discovery, increased scientific understanding, and improved technology. Horticulture has brought many benefits and, at the same time, dangers and environmental damage. Just like horticulture, software testing is a multi-discipline, science- and technology-driven industry with political, sociological, and economic implications.

Isabel Evans, Testing Solutions Group Ltd
Assuring Web Service Quality

David Fern demystifies Web services technology, explaining that Web services are loosely coupled, language-independent processes that communicate following SOAP standards using XML messages. He describes how to ensure the quality and compatibility of these unique applications by addressing their specific challenges and risk mitigation test strategies. David shares some of the tools that help him perform functional testing as you would with a GUI application: comparing and validating the request and response XML files, ensuring language independence and cross-platform compatibility with .NET and Java services, and ensuring that the Web Services Description Language (WSDL) specifications meet Web Services Interoperability Organization (WS-I) standards. Finally, David describes how to integrate these tools and strategies to support and automate your Web service testing process.

David Fern, Social Security Administration
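The compare-and-validate step David mentions can be illustrated with a small structural XML comparison: two SOAP payloads should be treated as equal when their elements, attributes, and text match, regardless of whitespace and formatting. The payload below is invented for illustration, and a real suite would use dedicated tools against live services. A minimal sketch in Python using the standard library:

```python
import xml.etree.ElementTree as ET

def xml_equal(a: str, b: str) -> bool:
    """Structurally compare two XML documents, ignoring formatting whitespace."""
    def normalize(elem):
        # Reduce an element to a comparable tuple: tag, sorted attributes,
        # trimmed text, and recursively normalized children.
        return (
            elem.tag,
            sorted(elem.attrib.items()),
            (elem.text or "").strip(),
            [normalize(child) for child in elem],
        )
    return normalize(ET.fromstring(a)) == normalize(ET.fromstring(b))

# Hypothetical response payload: the expected file and the actual service
# response differ only in indentation, so they should compare as equal.
expected = "<GetBalanceResponse><balance currency='USD'>42.00</balance></GetBalanceResponse>"
actual = "<GetBalanceResponse>\n  <balance currency='USD'>42.00</balance>\n</GetBalanceResponse>"
```

The same normalization makes a genuine content difference (a changed value, a dropped element) surface as an inequality rather than being masked by pretty-printing.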
Agile Testing: Traditional Testing Meets Agile Development

Agile development methodologies are taking center stage in many software organizations today. Testing in a highly iterative environment offers great opportunities for success, but it also brings challenges. Dietmar Strasser explains how to successfully transform testing from a traditional process to a highly iterative approach that aligns testing efforts around requirements while fostering communication and collaboration among all team members in a distributed development environment. Dietmar describes how to move to an iterative, Scrum-based development approach and, at the same time, align testing activities around it while dealing with an ever-evolving set of processes and technologies. He shares lessons learned, tips, and tricks from Borland's experience moving to an iterative approach.

Dietmar Strasser, Borland Software
Test Cases as Executable Specifications

While testing major architectural changes to a legacy product, it became clear to Andy and Ronelle's team that without a close association between test cases and application requirements there was no assurance that those requirements were met. They detail the processes, tools, and workflows their team developed to produce high-quality, searchable, well-documented test suites that provided direct transfer of testing requirements from developers to test engineers. These test suites now serve as continuous verification that product functionality remains correct in their dynamic development environment. Any test failure is a clear indicator that the product does not implement the functionality called for in the specification. Reporting capabilities of the tools provide insight into the progress of testing throughout the development cycle.

Ronelle Landy, The Mathworks Inc.
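The idea of a test failure pointing directly at a specification clause can be sketched by tagging each test with the requirement it verifies. The requirement ID, decorator convention, and discount rule below are hypothetical, not the team's actual tooling:

```python
import unittest

def requirement(req_id):
    """Tag a test with the requirement it verifies (hypothetical convention)."""
    def tag(test):
        test.requirement_id = req_id
        return test
    return tag

def apply_discount(price, percent):
    # Function under test; hypothetical REQ-104 says discounts cap at 50%.
    return price * (1 - min(percent, 50) / 100)

class DiscountSpec(unittest.TestCase):
    @requirement("REQ-104")
    def test_discount_is_capped_at_fifty_percent(self):
        # The test restates the specification: an 80% request yields
        # the capped 50% price.
        self.assertEqual(apply_discount(100.0, 80), 50.0)
```

When such a test fails, the attached requirement ID tells the team exactly which specification clause the product no longer implements, and a report generator can walk the tags to show requirement coverage.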
The Rise of the Customer Champions

The customer champion model is a new way for test teams to systematically collect, organize, and act on customer feedback. This model helps test teams think more strategically about their overall customer connection approach, in addition to growing the test discipline in the long term. Mike Tholfsen describes how the Office OneNote test team employed innovative customer connection techniques to improve product quality and customer satisfaction during the Microsoft Office 2007 release. Mike also describes how the Office "14" development team brought together test customer champions across forty client, server, service, and shared teams to ensure a unified Office voice when gathering user feedback and customer data.

Michael Tholfsen, Microsoft Corporation

CMCrossroads is a TechWell community.
