|
Developing an Automated Regression Test Set Automating a regression test is a tremendous effort, but the payoff is big in situations where continuous, repeatable, repetitive testing is required. This presentation describes a real-world example of a successful team effort toward developing a reusable automated regression test set for legacy medical software products in a client/server environment. Learn the principles of team building and test case design, and the tools and utilities you need to get the job done. Patricia George also discusses how test data management, the breakdown of programming tasks, and date-driven project milestones increase efficiency to keep the team on track.
|
Patricia George, Sunquest Info Systems, Inc.
|
|
How to Evaluate and Select a High-End Load Testing Tool This presentation addresses the following topics related to selecting a load testing tool: what tool characteristics matter; gathering information from vendors; determining metrics to collect; executing the test; analyzing the results; the recording process; and lessons learned.
|
Marquis Harding, TestMark.net
|
|
An Execution Framework for Java Test Automation This presentation introduces the Java Execution Framework, describing test suites, test cases, and the JEF test harness.
|
Erick Griffin, Tivoli Systems Inc.
|
|
Automating Test Design The goals of this presentation are to: Redefine the term "path"; Introduce four value selection paradigms; Discuss strengths & weaknesses of each; Examine how value selection relates to automated test design capability; and Examine how test requirements identification relates to each paradigm.
|
Steve Morton, Applied Dynamics International
|
|
Implementing an Automated Regression Test Suite Many efforts to automate regression testing have failed or fallen short of expectations, resulting in "shelfware." Lloyd Roden presents a real-world case study based on the successful implementation of a regression test tool within a software company. Learn the steps taken in evaluating and deploying the tool. Discover the key benefits and successes achieved over a three-year period, as well as the challenges faced while using the tool.
|
Lloyd Roden, Grove Consultants
|
|
Adventures in Web Application Performance Testing Examine the challenges and successes experienced by a test team analyzing application and systems performance for applications moving from distributed Client/Server solutions to centralized, Web-based designs. In this presentation, Nancy Landau presents case studies to address the changes made in automated testing methods to handle compressed delivery schedules, new architectures, new test tools requirements, and changing customer expectations. These case studies encompass principles such as managing iterative test development, creating reusable tests, standardizing application metrics, migrating from simple to complex networking environments, and predicting performance bottlenecks.
|
Nancy Landau, ALLTEL
|
|
Scripts on My Tool Belt The aims of this presentation are to: convince you that "test automation" is more than automating test execution; show some examples of the kinds of things that can be accomplished with scripting languages, using simplified code samples; and make you aware of three different scripting languages (shells, perl, and expect).
|
Danny Faught, Tejas Software Consulting
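
The kind of scripting the abstract above alludes to can be illustrated with a short sketch. This is a hypothetical example, not code from the presentation: a small POSIX shell helper that checks a command's output against an expected value, showing automation of result checking rather than just test execution.

```shell
#!/bin/sh
# Hypothetical helper: run a command, capture its output, and compare it
# to an expected value -- a result-checking task beyond merely executing tests.
run_and_check() {
  expected="$1"
  shift
  actual=$("$@")            # run the command, capture stdout
  if [ "$actual" = "$expected" ]; then
    echo "PASS: $*"
  else
    echo "FAIL: $* (got: $actual)"
  fi
}

# Example usage with a trivial command:
run_and_check "hello" echo hello
```

The same pattern could be written in perl or driven through expect for interactive programs; the point is that a few lines of scripting can automate the checking and reporting around a test, not only its execution.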
|
|
STARWEST 2001: Designing an Automated Web Test Environment This paper offers an alternative to the typical automated test scripting method of "record and playback now and enhance the automation environment later." It explores a regression automation system design for testing Internet applications through the GUI, along with scripting techniques to enhance the scalability and flexibility of an automated test suite. This paper will present a basic
structure for an automated test environment, and will expand on each of the items found in that structure. Web testing levels will be laid out, along with a basic approach to designing test scripts based on those Web testing levels.
|
Dion Johnson, Pointe Technology Group, Inc.
|
|
Three Seasons of Test Automation: A Case Study This presentation makes the following recommendations related to automating testing: don't automate all of an application (seventy to eighty percent); don't automate all applications (only stable, long-term ones); don't take a 3G approach for short-term gain; if shelf life and maintenance costs are important, a 3G approach is best; ensure proper roles are filled and people are trained; have requirements before you start; have good access to data and test oracles; and spend time in design to set the right level of granularity for the test cases and action words.
|
Russell Roundtree, Landmark Graphics and Mike Sowers, Software Development Technologies
|
|
Ready to Automate? Is your organization ready to benefit from automation? The decision to automate your test process can sometimes raise more questions than you expect. What tools do I need? Whom should I hire? Do I need to outsource? This presentation will help you determine how your organization can make the best use of test automation. Learn key steps to ensure your automation efforts get off on the right foot.
|
Bret Pettichord, Pettichord Consulting LLC
|