Conference Presentations

Life as a Performance Tester

At the core of most performance testing challenges and failed performance testing projects are serious misunderstandings and miscommunications within the project team. Scott Barber and Dawn Haynes share approaches to overcoming some of the most common frustrations facing performance testers today. Rather than simply telling you how to improve understanding and communicate performance testing concepts, Scott and Dawn demonstrate their approaches through an amusing role play of interactions between a lead performance tester and a non-technical executive.

Scott Barber, PerfTestPlus, Inc.
Are Agile Testers Different?

On an agile team everyone tests, blurring the lines between the roles of professional developers and testers. What's so special about becoming an agile test professional? Do you need different skills than testers on traditional projects? What guides you in your daily activities? Lisa Crispin presents her "Top Ten" list of principles that define an agile tester. She explains that when it comes to agile testers, skills are important but attitude is everything. Learn how agile testers acquire the results-oriented, customer-focused, collaborative, and creative mindset that makes them successful in an agile development environment. Agile testers apply different values and principles (feedback, communication, simplicity, continuous improvement, and responsiveness) to add value in a unique way. If you're a tester looking for your place in the agile world or a manager looking for agile testers, Lisa can help.

Lisa Crispin, ePlan Services, Inc.
Acceptable Acceptance Testing

This is the tale of a team of software professionals at the Microsoft patterns & practices group who wrote a book on software acceptance testing. Grigori Melnik was the content owner, writer, and project manager. Jon Bach was the writer, material producer, and acceptance testing reality checker, ensuring that the project team used its own methods so the book would be acceptable to you, the reader. To develop the book, Grigori and Jon employed key ideas of agile projects: creating a backlog using story cards, working in short iterations, exploring requirements and expectations, building customer trust through iterative acceptance, and staying connected to the customer community through frequent preview releases, surveys, and interviews. They created a heuristic acceptance testing model for knowing when they had reached enough "acceptability" to stop "developing" the book and publish it.

Grigori Melnik, Microsoft Corporation
The Three Faces of Quality: Control, Assurance, Analysis

Many of the misunderstandings within software development organizations can trace their roots to different interpretations of the role of testers. The terms quality control (QC), quality assurance (QA), and quality analysis are often used interchangeably. However, they are quite different and require different approaches and very different skill sets. Quality control is a measurement of the product at delivery compared to a benchmark standard, at which point the decision is made to ship or reject the product. Quality assurance is the systematic lifecycle effort to assure that a product meets expectations in all aspects of its development. It includes processes, procedures, guidelines, and tools that lead to quality in each phase. Quality analysis evaluates historical trends and assesses the future customer needs as well as trends in technology to provide guidance for future system development.

Stephen Michaud, Luxoft Canada
Test Management for Very Large Programs: A Survival Kit

In large organizations with multiple, simultaneous, and related projects, how do you coordinate testing efforts for better utilization and higher quality? Some organizations have opened Program Test Management offices to oversee the multiple streams of testing projects and activities, each with its own test manager. Should the Program Test Manager be an über-manager in control of everything, or is this office more of an aggregation and reporting function? Graham Thomas examines the spectrum of possible duties and powers of this position. He also shares the critical factors for successful program test management, including oversight of the testing products and deliverables; matrix management of test managers; stakeholder, milestone, resource, and dependency management; and the softer but vital skills of influence and negotiation with very senior managers.

Graham Thomas, Independent Consultant
STARWEST 2008: Performance Engineering: More Than Just Load Testing

Performance testing that is done once or a few times as part of the system test is not the right approach for many systems that must change and grow for years. Rex Black discusses a different approach, performance engineering, that is far more than performing load testing during the system test. Performance engineering takes a broad look at the environment, platforms, and development processes and how they affect a system's ability to perform at different load levels on different hardware and networks. While load testers run a test before product launch to alleviate performance concerns, performance engineers have a plan for conducting a series of performance tests throughout the development lifecycle and after deployment. A comprehensive performance methodology includes performance modeling, unit performance tests, infrastructure tuning, benchmark testing, code profiling, system validation testing, and production support.

Rex Black, QA Software Consultant/Trainer
Driving Development with Tests: ATDD and TDD

A perennial wish of testers is to participate early in the projects we test, as early as when the requirements are being developed. We also often wish for developers to do a better job unit testing their programs. Now with agile development practices, both of these wishes can come true. Development teams practicing acceptance test-driven development (ATDD) define system-level tests during requirements elicitation. These tests clarify requirements, uncover hidden assumptions, and confirm that everyone has the same understanding of what "done" means. ATDD tests become executable requirements that provide ongoing feedback about how well the emerging system meets expectations. Agile developers practicing test-driven development (TDD) create automated unit tests before writing component code.

Elisabeth Hendrickson, Quality Tree Software, Inc.
Reloadable Test Data for Manual Testing

Do you need to execute and then quickly re-execute manual test cases under tight timelines? Do bugs marked as "Cannot Reproduce" bouncing back and forth between developers and testers frustrate your team? Would you like to have more realistic, production-like test data? Join Tanya Dumaresq as she explains the hows and whys of developing and using pre-created, reloadable test data for manual testing. By planning ahead when designing test cases, you can cut test execution time in half and virtually eliminate those "works on my machine" bugs. Learn how to create and load test data in different formats and choose the one that is best for your application under test. Sometimes, you can even use the application itself to create the data! You'll end up with test data and an environment far more representative of your users' world than if you create data on the fly during test execution.

Tanya Dumaresq, Macadamian Technologies
Managing Your Personal Stress Level

In a recent survey of 130 U.S. software testers and test managers, Randall Rice learned that 83 percent of the respondents have experienced burnout, 53 percent have experienced depression of some type, and 97 percent have experienced high levels of stress at some time during their software testing careers. Randall details the sources of these problems and the most common ways to deal with them, some healthy, some not. There are positive things testers and managers can do to reduce and relieve their stress without compromising team effectiveness. By understanding the proper role of testing inside your organization and building a personal support system, you can manage stress and avoid its destructive consequences. Randall identifies the stress factors you can personally alleviate and helps you deal with those stressors you can't change.

Randy Rice, Rice Consulting Services, Inc.
Beyond Functional Testing: On to Conformance and Interoperability

Although less well known than security and usability testing, conformance and interoperability testing are just as important. Even though conformance and interoperability testing, with its standards and thick technical specification documents, may seem dull, Derk-Jan de Grood believes that these testing objectives can be interesting and rewarding if you approach them the right way. SOA is one example in which numerous services must interact correctly with one another, conforming to specs, to implement a system. Conformance and interoperability testing ensures that vendors' scanners can read your badge at the EXPO and that your bank card works in a foreign ATM. Derk-Jan explains important concepts of interface standards and specifications and discusses the varied test environments you need for this type of testing. Get insight into the problems you must overcome when you perform conformance and interoperability testing.

Derk-Jan de Grood, Collis