Conference Presentations

Looking Ahead: Testing Tools in 2010

It's May 15, 2010, and you're in a triage meeting reviewing the testing status and bugs in your telemedical software. The system uses real-time voice, video, graphics, and an expert knowledge base to support expert medical procedures in remote locations. As the test manager, you're using trace diagrams, deployment diagrams, runtime fault injection, coverage views, test patterns, built-in self-test, and other modern, agile techniques to review the bugs, diagnose faults, assign priorities, and update your test plans. Sam Guckenheimer contrasts the methods available to you in 2010 with the techniques you used years ago, when you were starting out as a test manager.

Sam Guckenheimer, Rational Software ATBU
Beyond Record and Playback: The Behind-the-Scenes View of Web Test Automation

Web-based test automation goes well beyond merely recording manual test scripts and replaying them. Test automation is more a development process than a typical quality assurance or test effort. This presentation takes an in-depth look at what it takes to truly automate Web site testing. You'll explore the following building blocks: planning/analysis, design/development, implementation, and support.

Michael Prisby, UPS
Automated Web Testing Strategies

As Web applications move from static content to dynamic transactions, the risk of failure increases while cycle time collapses. Although automation is the ideal answer to this combination, those who've ventured into automated Web testing have discovered a whole new world of unexpected challenges. For instance, dynamic page layouts and content frustrate test automation's need for predictability and repeatability, while the lack of meaningful, let alone consistent, object names further complicates reliable execution. Ultimately, this leads to excessive maintenance and lower productivity. This presentation shows you how to identify the potential issues that come with automated Web testing, then offers ways to incorporate site and test development strategies to overcome them.
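To make the object-naming problem concrete, here is a minimal sketch, not taken from the presentation, of one common site-and-test strategy: centralize element lookup behind stable, application-assigned identifiers rather than positional paths. It uses the Selenium WebDriver Python bindings; the URL and element names are hypothetical.

# Sketch: insulating Web tests from dynamic layouts by resolving logical
# element names to stable locators instead of positional paths.
from selenium import webdriver
from selenium.webdriver.common.by import By

# Brittle: depends on page structure, which shifts with dynamic content.
BRITTLE_LOCATOR = (By.XPATH, "/html/body/div[3]/table/tbody/tr[2]/td/a")

# Stable: depends only on identifiers the development team agrees to keep fixed.
STABLE_LOCATORS = {
    "login_button": (By.ID, "login-submit"),   # hypothetical id
    "search_box":   (By.NAME, "q"),            # hypothetical name
}

def click(driver, logical_name):
    # Resolve a logical element name to its locator and click the element.
    by, value = STABLE_LOCATORS[logical_name]
    driver.find_element(by, value).click()

if __name__ == "__main__":
    driver = webdriver.Chrome()
    driver.get("https://example.com/login")    # placeholder URL
    click(driver, "login_button")
    driver.quit()

Keeping the locator table in one place means a layout change touches one map, not every script.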

Linda Hayes, WorkSoft
STAREAST 2002: A Case Study In Automating Web Performance Testing

Key ideas from this presentation include: define meaningful performance requirements; changing your site (hardware or software) invalidates all previous predictors; reduce the number of scripts through equivalence classes; don't underestimate the hardware needed to simulate the load; evaluate and improve your skills, knowledge, tools, and outsourced services; document your process and results so that others may learn from your work; and use your new knowledge to improve your site's performance, focusing on progress, not perfection.
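As a rough illustration of the equivalence-class idea above, and not material from the presentation itself, the sketch below groups transactions by the server-side path they exercise and scripts only one representative per group; the transaction catalog is invented for the example.

# Sketch: reducing the number of load-test scripts via equivalence classes.
# Transactions that exercise the same server-side path form one class,
# and only one representative per class needs a load script.
from collections import defaultdict

# (transaction name, server-side path it exercises) -- hypothetical catalog
transactions = [
    ("view product A",   "catalog/detail"),
    ("view product B",   "catalog/detail"),
    ("search by name",   "catalog/search"),
    ("search by SKU",    "catalog/search"),
    ("add to cart",      "cart/update"),
    ("remove from cart", "cart/update"),
    ("checkout",         "order/submit"),
]

classes = defaultdict(list)
for name, path in transactions:
    classes[path].append(name)

for path, members in classes.items():
    # Seven transactions collapse into four scripts in this example.
    print(f"{path}: script '{members[0]}' covers {len(members)} transaction(s)")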

Lee Copeland, Software Quality Engineering
Investing Wisely: Generating Return on Investment from Test Automation

Implementing test automation without following best practices and tracking your ROI is a prescription for failure. Still, many companies have done so, seeking the elusive promise of automated testing: more thorough testing done faster, with fewer errors, at a substantially lower cost. However, fewer than fifty percent of these companies realize any real success from these efforts. And even fewer have generated any substantial ROI from their automated testing initiatives. This presentation takes an in-depth look at the specific pitfalls companies encounter when implementing automated functional testing, and offers proven best practices to avoid them and guarantee long-term success.

Dale Ellis, TurnKey Solutions Corp.
Problems with Vendorscripts: Why You Should Avoid Proprietary Languages

Most test tools come bundled with vendor-specific scripting languages that I call vendorscripts. They are hard to learn, weakly implemented, and most importantly, they discourage collaboration between testers and developers. Testers deserve full-featured, standardized languages for their test development. Here’s why.

Bret Pettichord, Pettichord Consulting
Test Automation of Distributed Transactional Services

Distributed transactions are being implemented everywhere. Web services, EAI, and B2B are just a few examples. Testing these transactions across disparate systems, sometimes even across organizations and firewalls, is difficult, yet vital. But automating the testing is impossible without the right tools. Manish Mathuria offers you a test automation framework built specifically for transactional and component-based implementations. He addresses the practical problems of testing such systems, and suggests solutions for many of them.

Manish Mathuria, Arsin Corporation
Automated Testing Framework for Embedded Systems

Is it possible to use an "open architecture" automation test tool to avoid the pitfalls of testing in the embedded, real-time world? It is now. In this session, Michael Jacobson presents an architecture that allows existing testing tools to be connected as components in an automated testing framework targeted at embedded systems and built on network communications. He shows you how existing testing tools can become servers with just a couple of lines of code. You'll even learn how each component can be changed and tested without requiring an update to the rest of the components, as long as the interface communication is maintained.
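As a hedged sketch of the "tools become servers" idea, and not Jacobson's actual framework, the following wraps an existing command-line test tool behind a small TCP server so that other framework components can drive it over the network; the executable name run_tests, the port, and the line protocol are all assumptions made for the example.

# Sketch: exposing an existing command-line test tool as a network server so it
# can act as one component in an automated testing framework.
import socketserver
import subprocess

class TestToolHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Treat each received line as arguments for the wrapped tool.
        args = self.rfile.readline().decode().split()
        result = subprocess.run(["run_tests"] + args,    # hypothetical tool
                                capture_output=True, text=True)
        # Return the tool's exit status and output to the caller.
        self.wfile.write(f"{result.returncode}\n{result.stdout}".encode())

if __name__ == "__main__":
    # Any component that speaks this simple line protocol can invoke the tool,
    # regardless of which machine the tool actually runs on.
    with socketserver.TCPServer(("0.0.0.0", 9000), TestToolHandler) as server:
        server.serve_forever()

Because the components share only this interface, either side can be replaced or retested independently, which is the property the session emphasizes.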

Michael Jacobson, Northrop Grumman Corporation
Teach Your Automation Tool To Be As Smart As You

Teach your automation tool to speak your language instead of the other way around. This presentation demonstrates how test professionals can write automated scripts, without writing code, while producing a full complement of management reports covering project progress, script status, and error tracking. You'll learn to fully integrate requirements, project management, and test automation. Don't just use an automation tool; get it to do what you need it to do.
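One widely used way to let testers write automated scripts without coding is a keyword-driven table interpreted by a small runner; the sketch below illustrates that general approach only and is not a description of Anteon's tool. The keyword vocabulary and test table are hypothetical.

# Sketch: keyword-driven testing. Testers author rows of (keyword, arguments)
# as plain data -- which could live in a spreadsheet -- and a runner executes them.

def open_page(url):          print(f"opening {url}")
def type_text(field, text):  print(f"typing '{text}' into {field}")
def click(element):          print(f"clicking {element}")
def verify_text(expected):   print(f"verifying page contains '{expected}'")

KEYWORDS = {
    "open":   open_page,
    "type":   type_text,
    "click":  click,
    "verify": verify_text,
}

# What a tester writes: data, not code.
test_table = [
    ("open",   ["https://example.com/login"]),
    ("type",   ["username", "qa_user"]),
    ("type",   ["password", "secret"]),
    ("click",  ["login button"]),
    ("verify", ["Welcome, qa_user"]),
]

for keyword, args in test_table:
    KEYWORDS[keyword](*args)   # each step could also be logged for reporting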

Bonnie Bayly, Anteon Corporation
Test Automation with Pure Data

While Web-based GUI testing is all the rage, many of us still operate in a world of UNIX shells, command lines, and scripts. Automated testing in this world traditionally consists of executing the command being tested, then running a series of additional commands that perform validation. But how do you automate the test when the command being run expects answers? The solution: an Intelligent, Interactive Testing Tool (IITT). An IITT requires no scripts to write or maintain because it is completely data driven, so non-programmers can create and maintain their own automated tests. This presentation demonstrates the ease with which an automated test for a non-GUI interactive application can be developed using an IITT's logic.
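As a hedged sketch of what a data-driven test for an interactive command can look like, illustrated here with the widely available pexpect library rather than the IITT described above: the prompts and answers are plain data, and a generic driver supplies each answer as the command asks for it. The command name, prompts, and expected output are invented for the example.

# Sketch: data-driven testing of an interactive, non-GUI command.
# The dialog lives in data, so non-programmers can create and maintain it.
import pexpect

test_case = {
    "command": "configure_backup",                  # hypothetical command
    "dialog": [                                     # (expected prompt, answer)
        ("Enter target directory:", "/var/backups"),
        ("Compress output? (y/n)",  "y"),
        ("Proceed? (y/n)",          "y"),
    ],
    "expect_at_end": "Backup configured successfully",
}

def run_interactive_test(case):
    child = pexpect.spawn(case["command"], timeout=30, encoding="utf-8")
    for prompt, answer in case["dialog"]:
        child.expect_exact(prompt)   # wait for the command's question
        child.sendline(answer)       # send the scripted answer
    child.expect_exact(case["expect_at_end"])
    child.expect(pexpect.EOF)        # make sure the command exits cleanly
    return True

if __name__ == "__main__":
    print("PASS" if run_interactive_test(test_case) else "FAIL")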

Brian Brumfield, Hewlett-Packard Openview
