Conference Presentations

Mission Possible: An Exploratory Testing Experience

Interested in exploratory testing and its use on rich Internet applications, the new interactive side of the Web? Erik Petersen searched the Web to find some interesting and diverse systems to test using exploratory testing techniques. Watch Erik as he goes on a testing exploration in real time with volunteers from the audience. He demonstrates and discusses the testing approaches he uses every day, from the purely exploratory to more structured approaches suitable for teams. You'll be amazed, astounded, and probably confounded by some of Erik's demonstrations. Along the way, you'll learn a lot about exploratory testing and have some fun as well. Your mission, should you choose to accept it, is to try out your testing skills on the snappiest rich Internet applications the Web has to offer.

  • Key concepts in exploratory testing demonstrated
  • Learn to test rich Internet applications (RIAs)
Erik Petersen, Emprove
User Interface Testing with Microsoft Visual C#

Manually testing software with a complex user interface (UI) is time-consuming and expensive. Historically, the development and maintenance costs associated with automating UI testing have been very high. Vijay Upadya presents a case study on the approaches and methodologies his Microsoft Visual C# test team adopted to answer the testing challenges that have plagued them for years. Vijay explains how the test team worked with developers to design high levels of testability into Microsoft Visual Studio 2005. These testability features enabled the test team to design a highly robust and effective test suite that completely bypasses the UI. Join Vijay to find out how they adopted data-driven testing below the UI and achieved dramatic cost reductions in developing and maintaining their tests.

  • How to bypass the user interface without compromising test effectiveness
  • Designs for software with high testability
Vijay Upadya, Microsoft Corporation
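
The "below the UI" approach Vijay describes can be sketched roughly as data-driven tests that call the application's underlying layer directly instead of automating menus and dialogs. The sketch below is a minimal illustration only, not the Visual Studio team's actual suite: the compile_snippet function and its test data are hypothetical, and Python stands in for the C# the talk covers.

```python
# A minimal, hypothetical sketch of data-driven testing below the UI:
# rather than automating menus and dialogs, each test case feeds input
# directly to the layer the UI would have called and checks the result.

import unittest

def compile_snippet(source: str) -> list[str]:
    """Hypothetical stand-in for the service layer the UI normally invokes.
    Returns a list of diagnostic messages (empty means the snippet compiled)."""
    return [] if source.strip().endswith(";") else ["error: expected ';'"]

# Test data lives in a table, so adding coverage means adding rows, not code.
CASES = [
    ("int x = 1;", []),                      # valid statement -> no diagnostics
    ("int x = 1",  ["error: expected ';'"]), # missing semicolon -> one diagnostic
]

class BelowTheUITests(unittest.TestCase):
    def test_compile_snippets(self):
        for source, expected in CASES:
            with self.subTest(source=source):
                self.assertEqual(compile_snippet(source), expected)

if __name__ == "__main__":
    unittest.main()
```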
Taming the Code Monolith: A Tester's View

Many organizations have systems that are large, complex, undocumented, and very difficult to test. These systems often break in unexpected ways at critical times. This is not limited to older legacy systems; even recently built Web sites are in this condition. Randy Rice explores strategies for testing these types of systems, which are often monolithic mountains of code. He describes methods he has used to understand and "refactor" them, breaking up their huge, complex codebases into something more testable and maintainable. Randy describes how to build a set of tests that can be reused even as the system is being restructured. Find out how to perform regression, integration, and interoperability testing in this environment. See how new technologies such as service-oriented architecture (SOA) can help achieve better system structures, and learn when and where test automation fits into your plans.

Randy Rice, Rice Consulting Services Inc
Selecting Mischief Makers: Vital Interviewing Skills

Much of testing is tedious: the focus on details, the repetitive execution of the same code, the detailed paperwork, the seemingly endless technical discussions, and the complex data analysis. All good testers have the skills and aptitude necessary to deal with these activities. However, great testers have one other characteristic: they are mischievous. As a hiring manager, you should take up the challenge of detecting mischievous testers in order to build the best testing staff. How do you uncover a candidate's mischievous traits during the selection process? Résumés do not help, and phone interviews or email conversations are too easily misunderstood. The best chance you have for detecting mischief is during the interview. Andy Bozman explores the ways he identifies the clever people who make great testers and shares techniques that you can easily add to your interview process to find the best people for your team.

Andy Bozman, Orthodyne Electronics
Load Testing New Web Technologies

Web 2.0 applications represent a major evolution in Web development. These applications are based on new technologies such as AJAX, RIAs, Web services, and SOA. Unless you, as a tester, understand the inner workings of these technologies, you cannot adequately test their functionality or prepare realistic and valid performance tests. Eran Witkon explains these new Web technologies and shows how to design and implement appropriate load tests, execute them, and interpret the results. For example, Eran describes why the classic "client requests a page and then waits" model used in performance testing the old Web does not adequately represent AJAX processing, in which only parts of pages are requested and one request need not complete before another is initiated.

Eran Witkon, RadView Software
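
To illustrate the contrast Eran draws, the sketch below shows one "virtual user" under each model. The application, URLs, and page fragments are assumptions for illustration only, and a real load test would run many such users through a dedicated tool; the point is simply that an AJAX-style user keeps several partial-page requests in flight at once instead of requesting a page and waiting.

```python
# Hypothetical sketch contrasting the classic "request a page, then wait"
# load model with AJAX-style traffic, where one virtual user keeps several
# partial-page requests in flight at the same time.

from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

BASE = "http://localhost:8080"   # hypothetical application under test

def fetch_status(path: str) -> int:
    # One HTTP request; returns the response status code.
    with urlopen(BASE + path) as response:
        return response.status

def classic_virtual_user() -> list[int]:
    # Old model: request the whole page and wait for it before doing anything else.
    return [fetch_status("/orders")]

def ajax_virtual_user() -> list[int]:
    # AJAX model: the page fires several small requests and does not wait for
    # one to complete before the next one is initiated.
    fragments = ["/orders/header", "/orders/grid?page=1", "/orders/summary"]
    with ThreadPoolExecutor(max_workers=len(fragments)) as pool:
        return list(pool.map(fetch_status, fragments))

if __name__ == "__main__":
    print("classic:", classic_virtual_user())
    print("ajax:   ", ajax_virtual_user())
```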
Apodora: An Open Source Framework for Web Testing

Are you frustrated with automated test scripts that require constant maintenance and don't seem to be worth the effort? Seth Southern introduces Apodora, a new open source framework for automating functional testing of Web applications. Apodora was released under the GNU General Public License to the open source community with the goal of collaboratively creating a superior, free, automated Web testing tool. The key benefit of Apodora is that it helps you reduce the maintenance effort and overhead of test automation scripts. Seth introduces you to the open source project, demonstrates the use of Apodora, and highlights some of the key differences between Apodora and other test automation tools currently available. Seth shows how Apodora can save you time when the software under test changes and scripts require maintenance.

  • Web test tool gaps that Apodora fills
  • Features of Apodora for functional Web testing
Seth Southern, ACULIS - Software Development Services
Emotional Test Oracles

An oracle is a heuristic principle or mechanism by which we may recognize a problem. Traditionally, discussion within testing about oracles has focused on two references: (1) requirements specifications that provide us with the "correct" answer and (2) algorithms we execute to check our answers. Testing textbooks talk about identifying a bug by noting the differences between the actual results and those references. Yet high-quality software is not created by merely analyzing conformance to specifications or matching some algorithm. It is about satisfying, and not disappointing, the people who interact with the product every day. Michael Bolton introduces the idea that our emotional reactions to programs as we test them (frustration, confusion, annoyance, impatience, depression, boredom, irritation, curiosity, and amusement) are important triggers for noticing real problems that matter to real people.

Michael Bolton, DevelopSense
A "Framework for Test" for Repeatable Success

Do you have defined and documented processes that describe all the activities and deliverables for testing? Do you have a documented road map for repeating test project successes? The test group at Kaiser found themselves overwhelmed by too many projects, understaffed on most of them, lacking repeatable procedures, and without testing tools. Randy Slade describes how they identified the needed test processes and tools, set priorities, developed new procedures, and implemented them. Their "Framework for Testing" has become the blueprint for all testing activities. Its flexibility makes it applicable to software projects of all types and sizes. It guides testers and managers from A to Z in performing their duties by describing the "what, when, how, and why" of all testing activities and deliverables.

  • Five phases of a software testing life cycle
  • How to develop, pilot, and evaluate new processes
Randy Slade, Kaiser Permanente
Lightning Talks: A Potpourri of 5-Minute Presentations

Lightning Talks are nine five-minute talks in one conference session. Lightning Talks represent a much smaller investment of time than track speaking and offer the chance to try conference speaking without the heavy commitment. Lightning Talks are an opportunity to present your single biggest bang-for-the-buck idea quickly. Use this format to give a first-time talk or to present a new topic for the first time. Maybe you just want to ask a question, invite people to help you with your project, boast about something you did, or tell a short cautionary story. These things are all interesting and worth talking about, but there might not be enough to say about them to fill a full conference session.

Dawn Haynes, PerfTestPlus, Inc.
Testing SOA Applications: What's New, What's Not

The service-oriented architecture (SOA) approach to building applications is rapidly approaching critical mass. With this architecture comes a new set of challenges for testers. Brian Bryson demystifies the testing practices needed to ensure SOA application quality. He begins by building and deploying a Web service to introduce you to SOA. Brian then examines the requirements and risks of SOA quality management, including functional, performance, and security testing challenges. Brian demonstrates testing a Web service using both open source and commercial software. Throughout his demonstration, Brian discusses the new skills and strategies, such as a strong focus on unit testing, that are required for SOA testing and the more common strategies, such as a strong focus on requirements-based testing, that still apply in the new world of SOA.

  • The test and quality ramifications of SOA
Brian Bryson, IBM Rational
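
As a rough illustration of the kind of UI-independent check such a Web service demonstration might include, the sketch below calls a hypothetical REST-style endpoint directly and asserts on its response payload. The URL, the fields, and the service itself are assumptions for illustration only; the talk uses a mix of open source and commercial tools.

```python
# Hypothetical sketch of functional testing against a Web service:
# call the service endpoint directly and assert on the response payload,
# independent of any user interface.

import json
import unittest
from urllib.request import urlopen

SERVICE_URL = "http://localhost:8080/api/quote?symbol=ACME"  # hypothetical endpoint

class QuoteServiceTests(unittest.TestCase):
    def test_quote_response_shape(self):
        with urlopen(SERVICE_URL) as response:
            self.assertEqual(response.status, 200)
            payload = json.loads(response.read())
        # Functional checks on the service contract, not on any rendered page.
        self.assertEqual(payload["symbol"], "ACME")
        self.assertGreater(payload["price"], 0)

if __name__ == "__main__":
    unittest.main()
```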
