Conference Presentations

The Estimate Is Nothing, The Estimating Is Everything

In many software projects, estimation is consistently troublesome, contentious, and unreliable. A big part of the problem is that we fantasize that estimates tell us about the future, and so management wants "accurate" estimates that we can "commit" to. In fact, estimates tell you nothing about the future. Estimates are entirely about the present: they express our expectations, based on what we believed when we made them. Dale Emery invites you to explore a gold mine of information left untapped by most estimation processes–the extensive range of knowledge, assumptions, risks, and unknowns that influence our estimates and are often unexpressed and forgotten. The value of making this information explicit and visible can outweigh the value of the estimate itself.

Dale Emery, DHE
Exploratory Validation: What We Can Learn from Testing Investment Models

Over the past few years, the airwaves have been flooded with commercials for investment-support software. "Do your research with us," they promise, "and you can make scads of money in the stock market." How could we test such a product? These products provide several capabilities. For example, they estimate the value or direction of change of individual stocks or the market as a whole, and they suggest trading strategies that tell you whether to buy, hold, or sell. Every valuation rule and every strategy is a feature. We can test the implementation of these features, but the greater risks lie in the accuracy of the underlying models. If you execute the wrong trades perfectly, you will lose money. That's not a useful feature, no matter how well implemented.
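Kaner's distinction between a correctly implemented feature and an accurate underlying model can be made concrete with a small backtest. The sketch below is not from the talk; the moving-average crossover rule, the synthetic random-walk prices, and the buy-and-hold benchmark are all illustrative assumptions. It shows how a strategy can be coded exactly as specified and still lose to a naive baseline–the kind of risk that implementation-level testing alone will not reveal.

# Hypothetical sketch: validating a trading-strategy model, not just its code.
# The crossover rule and the synthetic price series are illustrative
# assumptions and do not come from the talk itself.
import random

def moving_average(prices, window):
    """Trailing moving average; returns None until enough data exists."""
    if len(prices) < window:
        return None
    return sum(prices[-window:]) / window

def crossover_signal(prices, short=5, long=20):
    """Return 'buy', 'sell', or 'hold' based on a short/long MA crossover."""
    short_ma = moving_average(prices, short)
    long_ma = moving_average(prices, long)
    if short_ma is None or long_ma is None:
        return "hold"
    if short_ma > long_ma:
        return "buy"
    if short_ma < long_ma:
        return "sell"
    return "hold"

def backtest(price_history, starting_cash=10_000.0):
    """Replay history, trading one share per signal, and report final value."""
    cash, shares = starting_cash, 0
    for i in range(1, len(price_history) + 1):
        seen = price_history[:i]
        price = seen[-1]
        signal = crossover_signal(seen)
        if signal == "buy" and cash >= price:
            cash -= price
            shares += 1
        elif signal == "sell" and shares > 0:
            cash += price
            shares -= 1
    return cash + shares * price_history[-1]

if __name__ == "__main__":
    random.seed(42)
    # Synthetic random-walk prices stand in for real historical data.
    prices = [100.0]
    for _ in range(500):
        prices.append(max(1.0, prices[-1] + random.gauss(0, 1)))

    final_value = backtest(prices)
    buy_and_hold = 10_000.0 / prices[0] * prices[-1]
    # The exploratory-validation question: does the model beat a dumb baseline?
    print(f"strategy final value:   {final_value:,.2f}")
    print(f"buy-and-hold benchmark: {buy_and_hold:,.2f}")

A real validation effort would, of course, replace the synthetic prices with historical market data and compare against more credible benchmarks; the point is only that the implementation can pass every functional test while the model it implements loses money.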

Cem Kaner, Florida Institute of Technology
STAREAST 2011: Performance Engineering: More Than Just Load Testing

Performance testing that is done only at the last minute–just prior to launch–is the wrong approach for complex systems, which have many opportunities for performance bottlenecks. Rex Black discusses a different approach–performance engineering–that is far more comprehensive and valuable than merely performing load testing during the system test. Performance engineering takes a broad look at the environment, platforms, and development processes, and how they affect a system's ability to perform at different load levels on different hardware and networks. Performance engineers use a structured process to conduct a series of performance activities throughout development and after deployment, including performance modeling, unit performance tests, infrastructure tuning, benchmark testing, code profiling, system validation testing, and production support.

Rex (Red) Black, J9 Technologies
Automation Strategies for Testing Complex Data and Dashboards

Test automation engineers are inevitably confronted with the difficult challenge of testing a screen containing hundreds–if not thousands–of data values. Designing an approach to interact with this complex data can be a nightmare, often resulting in countless programming loops that navigate through volumes of data. Greg Paskal shares an innovative way to approach these automation challenges by breaking the problem into its logical parts: first, understand the data and how to organize it using the Complex Data Methodology; second, execute common programmatic tasks that result in shorter automation run times. This approach can be applied to Web and client systems, and adapted easily to other technologies. Automators scripting in languages such as VBScript will find this approach useful for breaking down automation challenges to optimize performance while still producing meaningful results.
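The details of Paskal's Complex Data Methodology are his own, but the general shift he describes–from nested loops that poke at a grid one cell at a time to organizing a single bulk capture of the data for fast lookups–can be sketched roughly as follows. The row structure, column names, and helper functions here are hypothetical and purely illustrative.

# Hypothetical sketch: verifying values on a large data grid by capturing the
# grid once and indexing it, instead of driving the UI through nested loops.
# The raw_rows structure and column names are invented for illustration.

# Imagine this came from a single bulk read of the dashboard's table
# (one call returning every visible row) rather than thousands of
# per-cell UI interactions.
raw_rows = [
    {"account": "A-100", "region": "East", "balance": "1,250.00"},
    {"account": "A-101", "region": "West", "balance": "980.50"},
    {"account": "A-102", "region": "East", "balance": "3,400.75"},
]

def index_by_key(rows, key):
    """Organize the captured rows into a dictionary keyed for O(1) lookups."""
    return {row[key]: row for row in rows}

def verify(rows_by_account, expected):
    """Compare expected balances against the captured data; report mismatches."""
    failures = []
    for account, expected_balance in expected.items():
        actual = rows_by_account.get(account, {}).get("balance")
        if actual != expected_balance:
            failures.append((account, expected_balance, actual))
    return failures

if __name__ == "__main__":
    grid = index_by_key(raw_rows, "account")
    expected_balances = {"A-100": "1,250.00", "A-102": "3,400.00"}  # one deliberate mismatch
    for account, want, got in verify(grid, expected_balances):
        print(f"{account}: expected {want}, found {got}")

The design point is that the expensive part–touching the application–happens once, while verification becomes cheap in-memory lookups, which is where the shorter automation run times come from.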

Greg Paskal, JCPenney
Thread-based Exploratory Testing

Although most of us begin our day with a prioritized plan–our To Do list–almost instantly we are plagued by distractions and interruptions. When your exploratory testing work faces the same challenges, thread-based test management (TBTM) is just what the doctor ordered. TBTM is non-committal and not time-boxed, embracing start-stop interruptions and even longer delays in your testing. Jon Bach explains and demonstrates TBTM, in which the unit of work is a thread–a flow of activities to solve a specific problem. Using the TBTM approach, you'll give your testing attention to the thread that provides the most immediate value. Jon shares his daily To Do list and the Thread Board he uses to help surf through those uncontrollable interruptions. Rather than focusing on completing test documents, he focuses on completing threads, one at a time.
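As a rough illustration only–the fields, statuses, and value scores below are invented, not Jon Bach's actual TBTM artifacts–a thread board can be pictured as a small data structure in which interrupted threads are parked rather than abandoned, and attention always returns to the unfinished thread with the most immediate value.

# Hypothetical sketch of a "thread board" as a simple data structure; the
# fields, statuses, and value scores are invented for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Thread:
    """A thread: a flow of testing activities aimed at one specific problem."""
    name: str
    value: int              # rough judgment of immediate value; higher = more
    activities: List[str] = field(default_factory=list)
    status: str = "open"    # open, parked (interrupted), or done

@dataclass
class ThreadBoard:
    threads: List[Thread] = field(default_factory=list)

    def next_thread(self):
        """Pick the unfinished thread (open or parked) with the most value."""
        candidates = [t for t in self.threads if t.status != "done"]
        return max(candidates, key=lambda t: t.value, default=None)

    def park(self, thread):
        """An interruption arrives: park the thread instead of abandoning it."""
        thread.status = "parked"

if __name__ == "__main__":
    board = ThreadBoard([
        Thread("Investigate intermittent login failure", value=8,
               activities=["reproduce", "check auth logs"]),
        Thread("Charter: explore new export feature", value=5),
    ])
    current = board.next_thread()
    print(f"Working on: {current.name}")
    board.park(current)  # a meeting interrupts the session
    print(f"After the interruption, resume: {board.next_thread().name}")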

Jon Bach, eBay Inc
Top Testing Challenges We Face Today

Some people thrive on challenges; others struggle daily to deal with them. Handled well, challenges can make us stronger, more passionate, and more determined to succeed. Lloyd Roden describes the top software testing challenges facing many of us today and explores how we can respond in a positive, constructive way. One challenge Lloyd often sees is identifying and eliminating metrics that deceive. While we (hopefully) do not set out to lie with our metrics, we must endeavor to employ metrics that have significance, integrity, and operational value. Another challenge test leaders face is providing reports that are clear, accurate, relevant, and tailored to the recipient. A third challenge is convincing test managers to test regularly themselves, gaining credibility and respect with the teams they lead.

Lloyd Roden, Grove Consultants
The 2011 Survival Guide: Lessons for Test Professionals

When we are in dangerous situations, we need a well-thought-out survival guide to help save ourselves and others. These lifesaving principles and skills provide the basic necessities for life and help us think straight, navigate safely, signal for help, and avoid unpleasant consequences of interactions with our environment. Julie Gardiner shares her 2011 Survival Guide for testers and test managers living in today's challenging business and technical environments.

Julie Gardiner, Grove Consultants
STAREAST 2011: Lightning Strikes the Keynotes

Lightning Talks have been a very popular part of many STAR conferences throughout the years. If you're not familiar with the concept, a Lightning Talk session consists of a series of five-minute talks by different presenters within one presentation period. For the speakers, Lightning Talks are the opportunity to deliver their single biggest-bang-for-the-buck idea in a rapid-fire presentation. And now, lightning has struck the STAR keynote presentations. Some of the experts in testing–James Bach, Jon Bach, Michael Bolton, Dawn Cannan, Dale Emery, Bob Galen, Jonathan Kohl, Randy Rice, Lloyd Roden, and Rob Sabourin–will each step up to the podium and give you their best shot of lightning. With no time to dither or vacillate–and hemming and hawing forbidden–you'll get ten keynote presentations for the price of one and have some fun at the same time.

Lee Copeland, Software Quality Engineering
Sleeping with the Enemy

Unfortunately, traditional software delivery models are often based on a lack of trust among stakeholders. Because the business doesn't trust developers, testers are asked to provide independent validation. Because developers don't trust testers, everyone wastes a lot of time arguing about whether a problem is in the code or in the tests. And testers–they are taught not to trust anyone! All of this distrust persists even though we share the same end goal–delivering a product that satisfies our customers. Gojko Adzic describes why independent testing should be a vestige of the past. He explains how engaging with developers and business users gives testers opportunities to accomplish things they cannot do otherwise.

Gojko Adzic, Neuri Ltd.
How to Win Friends and Influence People - and Deliver Quality Software

Since it was first published, Dale Carnegie's How to Win Friends and Influence People has motivated generations of aspiring leaders to polish up their people skills. Yet imagine the reaction of a typical software quality assurance or test professional opening the book and reading the first principle: Don't criticize, condemn, or complain. "Don't criticize? Isn't that a tester's job?" They turn to the next chapter to find: Give honest and sincere appreciation. "Honest feedback, perhaps, but appreciation for the buggy code we get from the developers?" Concerned, they check out the third chapter: Arouse in the other person an eager want. "Now wait a minute. That sounds like something that would get me called into HR!" It's easy to discount and even parody the lessons from Dale Carnegie's work.

Andy Kaufman, Institute for Leadership Excellence & Development, Inc.
