STAREAST 2001 - Software Testing Conference

PRESENTATIONS

Release Criteria: Defining the Rules of the Product Release Game

How do you know when you're finished testing? How do you know when the product is ready to ship? Sometimes the decision to stop testing and release a product seems as if someone is making deals in a smoke-filled room, or as if there are rules of the game of which we are unaware. At times, these rules seem completely arbitrary. Instead of relying on arbitrary decisions, it is possible to come to an agreement about when the product is ready to release, and even about when it's time to stop testing.

Johanna Rothman

Removing Requirement Defects and Automating Test

Organizations face many problems that impede rapid development of the software systems critical to their operations and growth. This paper discusses model-based development and test automation methods that reduce the time and resources necessary to develop high-quality systems. The focus is on how organizations have implemented model-based verification to reduce requirements defects, manual test development effort, and development rework, achieving significant cost and schedule savings.

Mark Blackburn, Software Productivity Consortium

Results From Inspecting Test Automation Scripts

In many ways, development of scripts for automated testing is similar to software development. It involves requirements, design, code, test, and use. So why not use proven improvement activities to enhance the test script development process? This presentation discusses how one software test team adjusted and applied inspections to test script development. Learn the results of these inspections and how you might use this technique to improve the test script development activity in your organization.

Howie Dow, Compaq Computer Corporation

Risk: The New Language of eBusiness Testing

Balancing testing against risk in eBusiness and e-commerce applications is essential because we never have the time to test everything. But it's tough to "get it right" with limited resources and pressure to release software quickly. Paul Gerrard explains how to talk to the eBusiness risk-takers in their own language to get the testing budget approved and the right amount of testing planned. Find out how to identify failure modes and translate them into consequences for the project's sponsors.

Paul Gerrard, Systeme Evolutif Limited

Software Test Automation: Planning and Infrastructure for Success

Automation tools are often viewed as a cure-all that will instantly reduce test cost and effort. However, without up-front planning and infrastructure design, automated tests can quickly become difficult to create and maintain, and the tools nothing more than expensive shelfware. This paper describes how to initiate a successful automation effort by developing standards and processes for automation, along with an infrastructure designed for success.

Bill Boehmer and Bea Patterson, Siemens Building Technologies, Inc.

Software Testing at a Silicon Valley High-Tech Software Company

This paper describes a methodology for allocating priority levels and resources to software testing and other quality activities to achieve "customer satisfaction." The methodology is based on an understanding of what the market and the target users require at any point in the product technology adoption lifecycle. The paper also describes how a leading market-driven company deployed effective software testing processes and methods that reflect real-world customer issues.

Giora Ben-Yaacov and Lee Gazlay, Synopsys Inc.

Standards for Test Automation: A Case Study

Implementing a set of automation standards adopted and followed by the test team will benefit everyone. This presentation discusses methods of creating and implementing standards, guidelines, and practices for teams of testers writing automated tests. Learn about decisions that can be made early in the product cycle that will have a long-term impact. Explore examples of systems that have worked well, and of those that have not.

Brian Tervo, Microsoft Corporation

Bug Hunting: Going on a Software Safari

This presentation is about bugs: where they hide, how you find them, and how you tell other people they exist so they can be fixed. Explore the habitats of the most common types of software bugs. Learn how to make bugs more likely to appear and discover ways to present information about the bugs you find to ensure that they get fixed. Drawing on real-world examples of bug reports, Elisabeth Hendrickson reveals tips and techniques for capturing the wiliest and squirmiest of the critters crawling around in your software.

Elisabeth Hendrickson, Quality Tree Software, Inc.

Designing an Automated Web Test Environment

This paper offers an alternative to the typical automated test scripting method of "record and playback now and enhance the automation environment later." It explores a regression automation system design for testing Internet applications through the GUI, along with scripting techniques that enhance the scalability and flexibility of an automated test suite. The paper presents a basic structure for an automated test environment and expands on each of the items found in that structure.

Dion Johnson, Pointe Technology Group, Inc.

Exploratory Testing in Pairs

Exploratory testing involves simultaneous activities: learning about the program and the risks associated with it, planning and conducting tests, troubleshooting, and reporting results. This highly skilled work depends on the ability of the tester to stay focused and alert. Based on a successful pilot study, Cem Kaner discusses why two testers can be more effective working together than apart.

Cem Kaner, Florida Institute of Technology, and James Bach, Satisfice Inc.
