Conference Presentations

Adventures in Testing Data Migration

Many organizations do not recognize the need for formal testing approaches to data migrations or systems mergers. Migrations are often performed by specially built data conversion utilities that should be considered new software applications in their own right. Because the conversion is a one-time occurrence, testing is often skipped, and the migrated data can be riddled with defects and inaccuracies. Geoff Horne discusses the different testing levels that you can apply to data migration and the inherent risks associated with such migrations. Take away strategies you can use to bring quality and focus to your data conversions. Examine examples of migration disasters, and learn how to avoid them in your organization.
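
One basic testing level for a migration is reconciliation between the source and target systems. The following is a minimal sketch of such a check, not Horne's specific approach; the database files, table, and column names are placeholders to adapt to your own conversion.

    # Minimal reconciliation check between a source and target database.
    # Assumes two sqlite3 files and a hypothetical "customers" table; adapt
    # connection details and table/column names to your own migration.
    import sqlite3

    def row_count(conn, table):
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

    def checksum(conn, table, column):
        # Cheap content fingerprint: sum of per-row hashes of one column.
        rows = conn.execute(f"SELECT {column} FROM {table}").fetchall()
        return sum(hash(r[0]) for r in rows)

    source = sqlite3.connect("legacy.db")      # placeholder source system
    target = sqlite3.connect("migrated.db")    # placeholder target system

    assert row_count(source, "customers") == row_count(target, "customers"), "row counts differ"
    assert checksum(source, "customers", "email") == checksum(target, "customers", "email"), "email column differs"
    print("reconciliation checks passed")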

Geoff Horne, iSQA
Square Peg, Round Hole: Matching Testing with Business Needs

There are many software testing methodologies, ranging from exploratory testing to full CMM Level 5 compliance. Choosing the wrong testing methodology can jeopardize your company's software success. In this presentation, Patrick Callahan discusses his real-world experience at ePeople and other Silicon Valley startup companies and outlines proven strategies for selecting and implementing the right one for you. Learn about the four stages of test evolution, from chaos to a continuous improvement process.

Patrick Callahan, ePeople Inc
STARWEST 2003: Testing Dialogues - Management Issues

Testing dialogues are a unique platform for you to share your ideas and learn from experienced testers from around the world. In this double session, test managers engage in in-depth, small group discussions with their peers. You'll share your expertise and experiences, learn from others' challenges and successes, and generate new topics in real time. Johanna Rothman and Esther Derby facilitate this session, focusing on management issues such as release criteria, determining the ROI of testing, coaching and feedback, managing new employees, and estimating test time and resources. Bring your BIG issue and start a new dialogue with your management peers. Discussions are structured in a framework so that participants receive a summary of their work product after the conference.

Facilitated by Esther Derby and Johanna Rothman
Large Database Load and Stress Testing

No one looks forward to load and stress testing a large database, but it is a critical task in the test process. Michele Rossi focuses on practical strategies to test software built for large database environments. Before designing your next database load and stress tests, find out what questions to ask and how to model realistic database activity. With the right test scripts and automated tools to create sufficient activity, you'll go a long way toward improving product quality under heavy database loads.
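
As a rough illustration of driving concurrent database activity from a test script (not the specific tooling Rossi covers), here is a minimal sketch using Python threads against a local SQLite database; the table, transaction mix, thread count, and duration are assumptions you would replace with a realistic workload model against your own engine.

    # Simple load driver: worker threads issue a read-heavy transaction mix
    # for a fixed duration. SQLite keeps the sketch self-contained; a real
    # load test would target the production database engine.
    import random
    import sqlite3
    import threading
    import time

    DB = "loadtest.db"
    setup = sqlite3.connect(DB)
    setup.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)")
    setup.commit()
    setup.close()

    def worker(duration_s, stats):
        conn = sqlite3.connect(DB, timeout=30)
        end = time.time() + duration_s
        while time.time() < end:
            if random.random() < 0.8:    # assumed mix: 80% reads
                conn.execute("SELECT COUNT(*) FROM orders").fetchone()
            else:                        # 20% writes
                conn.execute("INSERT INTO orders (amount) VALUES (?)", (random.random() * 100,))
                conn.commit()
            stats.append(1)
        conn.close()

    stats = []
    threads = [threading.Thread(target=worker, args=(10, stats)) for _ in range(8)]
    for t in threads: t.start()
    for t in threads: t.join()
    print(f"completed {len(stats)} operations in 10 seconds")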

Michele Rossi, BMC Software Inc
Automating J2EE and .NET Tests

With the introduction of the new J2EE (1.4) and .NET platforms, the middleware tier of software applications just became more complex. How do you test an application that uses up to 20 major APIs? Using a case study of such an application, Frank Cohen describes how to test functionality, scalability, and performance. In this environment, testers and developers must work together, beginning with automated unit testing and continuing through integration and into system testing. Take away the test agent code, documentation, and installation instructions you'll need to run the same tests in your own environments under a free, open-source license.
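
To give a flavor of the automated checks such an effort starts from (this is not Cohen's test agent code), here is a minimal sketch of a functional check plus a crude response-time check against a middleware endpoint over HTTP; the URL and expected payload are placeholders.

    # Illustrative functional and response-time checks against a web-facing
    # middleware API, using only the standard library. The endpoint URL and
    # expected payload are placeholders for a real service under test.
    import time
    import unittest
    import urllib.request

    ENDPOINT = "http://localhost:8080/api/ping"   # hypothetical service under test

    class MiddlewareSmokeTest(unittest.TestCase):
        def test_functionality(self):
            with urllib.request.urlopen(ENDPOINT) as resp:
                self.assertEqual(resp.status, 200)
                self.assertIn(b"ok", resp.read().lower())

        def test_response_time(self):
            start = time.time()
            urllib.request.urlopen(ENDPOINT).read()
            self.assertLess(time.time() - start, 1.0, "response exceeded 1 second")

    if __name__ == "__main__":
        unittest.main()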

Frank Cohen, PushToTest
Using Test Objectives to Define, Summarize, and Report Your Test Efforts

A large system test can consist of hundreds or even thousands of test cases, making it difficult to report results to management in a meaningful way. We typically use summary metrics, but they don't always present a clear picture. In this session, Jan Scott shows you how to develop business-driven test objectives, measure your testing progress against these objectives, and present your results to management. Improve your test process while giving management a better tool for deciding when the software is ready to go into production.
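
One simple way to make such reporting concrete, assuming results are tagged with the objective they support (the objectives and results below are invented for illustration), is to roll individual test cases up into a per-objective summary:

    # Roll up individual test case results into a per-objective summary that
    # can be reported to management. Objectives and results are illustrative.
    from collections import defaultdict

    results = [
        {"objective": "Process a payment end to end", "case": "TC-101", "passed": True},
        {"objective": "Process a payment end to end", "case": "TC-102", "passed": False},
        {"objective": "Generate monthly statements",  "case": "TC-210", "passed": True},
    ]

    summary = defaultdict(lambda: {"passed": 0, "total": 0})
    for r in results:
        summary[r["objective"]]["total"] += 1
        summary[r["objective"]]["passed"] += r["passed"]

    for objective, s in summary.items():
        print(f"{objective}: {s['passed']}/{s['total']} cases passing")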

Jan Scott, QB Software
Creating and Maturing a Testing Center of Excellence

How can your test organization help drive improvement in the overall software lifecycle? During the past several years, Capital One Financial has developed a Testing Center of Excellence (COE) that brought together disparate testing organizations to align their test processes and technical discipline. In addition to the measurable results of the COE, this initiative has supported and encouraged similar improvements in requirements development, project management, systems architecture, and software development methods. Learn how to significantly increase the business awareness of testing's value and establish testing as a critical element of quality delivery in your organization.

Thomas George, Capital One Financial Corp
Data-Driven Techniques to Test XML APIs

By adapting Convergys's Advance Data-Driven Techniques (ADDT) process, the company has successfully automated the testing of XML APIs. In a highly complex, PC-based billing application, ADDT has been used to improve the reliability of the product and significantly reduce testing time. With this approach, automated scenario-based tests are implemented for XML APIs, and test case templates are generated automatically from the schema. The technique is generic and can be applied to any XML API.
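
The general data-driven idea can be sketched as follows; this is an illustration of the pattern, not ADDT itself, and the element names, test data, and stand-in API handler are all placeholders.

    # Data-driven XML test sketch: each row of test data is turned into an
    # XML request and the (simulated) API response is checked against the
    # expected value. Element names and fake_api() are placeholders for a
    # real XML API and its transport layer.
    import xml.etree.ElementTree as ET

    test_data = [
        {"account": "1001", "action": "balance", "expected": "ok"},
        {"account": "",     "action": "balance", "expected": "error"},
    ]

    def build_request(row):
        root = ET.Element("request")
        ET.SubElement(root, "account").text = row["account"]
        ET.SubElement(root, "action").text = row["action"]
        return ET.tostring(root)

    def fake_api(request_bytes):
        # Stand-in for the system under test: reject empty account numbers.
        doc = ET.fromstring(request_bytes)
        status = "error" if not doc.findtext("account") else "ok"
        return f"<response><status>{status}</status></response>"

    for row in test_data:
        response = ET.fromstring(fake_api(build_request(row)))
        actual = response.findtext("status")
        print(f"{row['account'] or '<empty>'}/{row['action']}: "
              f"{'PASS' if actual == row['expected'] else 'FAIL'} (got {actual})")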

Shakil Ahmad, Convergys
The Performance Management Lifecycle: Benchmarking, Methodology, and Criteria

Reliable and consistent performance must be an integral part of your software's release criteria and must be specifically tested during quality assurance. Learn the key elements of building a performance benchmark for your application. Steve Rabin describes a roles-based approach to performance benchmarking and shares the methodology he has used numerous times. With this process, you define metrics, workload characteristics, transactional definitions, and utilization assumptions.
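
One way to picture the output of such a process, purely as an illustration rather than Rabin's methodology, is to capture the benchmark definition as structured data that can be reviewed, versioned, and replayed; all names and numbers below are placeholders, not a recommended baseline.

    # Capture a performance benchmark definition as data: workload mix,
    # transaction definitions, response-time goals, and utilization limits.
    from dataclasses import dataclass, field

    @dataclass
    class Transaction:
        name: str
        weight_pct: float          # share of the workload mix
        target_p95_ms: float       # response-time goal at the 95th percentile

    @dataclass
    class Benchmark:
        concurrent_users: int
        duration_minutes: int
        max_cpu_utilization_pct: float
        transactions: list = field(default_factory=list)

    benchmark = Benchmark(
        concurrent_users=200,
        duration_minutes=60,
        max_cpu_utilization_pct=70.0,
        transactions=[
            Transaction("login", 10.0, 500.0),
            Transaction("search", 60.0, 800.0),
            Transaction("checkout", 30.0, 1500.0),
        ],
    )

    assert abs(sum(t.weight_pct for t in benchmark.transactions) - 100.0) < 1e-6
    print(f"{len(benchmark.transactions)} transaction types over {benchmark.duration_minutes} minutes")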

Steven Rabin, Insight Venture Partners
Getting More Mileage Out of Your Automation

Don't settle for rerunning the same automated test cases over and over again. Instead, get more mileage out of your automation! Learn how to add real-time variety and randomness to automated tests and make your data-driven test cases even more dynamic. Kelly Whitmill offers hints, guidelines, and tradeoffs for automated verification of test executions and explains how to verify results when the expected outcome isn't known until runtime. Find out why you don't need formal models and fancy tools to use test case generation in your projects.
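
The following minimal sketch shows the general idea of randomized inputs with a logged seed (so failures can be reproduced) and verification against invariants computed at runtime rather than stored expected values; it is an illustration of the pattern, not Whitmill's material, and sort_records() is a hypothetical function under test.

    # Randomized, self-verifying test: inputs vary on every run, the seed is
    # logged so failures can be reproduced, and results are checked against
    # runtime invariants instead of precomputed expected values.
    import random
    import time
    from collections import Counter

    def sort_records(records):
        # Hypothetical function under test.
        return sorted(records)

    seed = int(time.time())
    random.seed(seed)
    print(f"test seed: {seed}")    # rerun with this seed to reproduce a failure

    for run in range(100):
        data = [random.randint(-1000, 1000) for _ in range(random.randint(0, 50))]
        result = sort_records(data)
        # Runtime verification: ordering and record preservation, no fixed oracle.
        assert all(result[i] <= result[i + 1] for i in range(len(result) - 1)), "not ordered"
        assert Counter(result) == Counter(data), "records were lost or duplicated"

    print("100 randomized runs passed")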

Kelly Whitmill, IBM Corporation
