Conference Presentations

STARWEST 2011: Session-based Exploratory Testing on Agile Projects

One of the challenges associated with testing in agile projects is selecting test techniques that “fit” the dynamic nature of agile practices. How much functional and non-functional testing should you do? What is the appropriate mix of unit, integration, regression, and system testing? And how do you balance these decisions in an environment that fosters continuous change and shifting priorities? Bob Galen has discovered that session-based exploratory testing (SBET) thrives in agile projects and supports risk-based testing throughout the project. SBET excels at handling dynamic change while also finding the defects with the greatest technical and business impact. Join in and learn how to leverage SBET for test design and as a general-purpose agile testing technique.

Bob Galen, iContact Corp
New Generation Record/Playback Tools for AJAX Testing

While some in the test community talk about record/playback technology as a dead-end test automation approach, a new generation of open source record/playback test tools that every tester should consider is now available. Tools like Sahi and TestMaker Object Designer were built for AJAX environments and support thousands of web objects and the asynchronous nature of AJAX. Frank Cohen shows you how to install and use these free tools in your environment and record test scripts for a complicated AJAX application in IE, Chrome, Firefox, Safari, and Opera. Learn how to data-enable applications without coding, use branching and looping commands, construct advanced element target locators without using XPath, and package tests as reusable test objects to share with other testers.

Frank Cohen, PushToTest
Structural Testing: When Quality Matters

Jamie Mitchell explores an underused and often forgotten test type: white-box testing. Also known as structural testing, white-box techniques require some programming expertise and access to the code. Using only black-box testing, you could easily ship a system having tested only 50 percent or less of the code base. Are you comfortable with that? For mission-critical systems, such low test code coverage is clearly insufficient. Although you might believe that the developers have performed sufficient unit and integration testing, how do you know that they have achieved the level of coverage that your project requires? Jamie describes the levels of code coverage that the business and your customers may need, from statement coverage to modified condition/decision coverage. He explains when you should strive to achieve different code coverage target levels and leads you through examples of pseudocode.
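
The coverage levels mentioned above can be illustrated with a small, hypothetical Java fragment (not taken from the session materials): a single decision with two conditions is enough to show why statement coverage is weaker than decision coverage, which in turn is weaker than modified condition/decision coverage (MC/DC).

```java
// Illustrative only -- a hypothetical discount rule used to contrast coverage levels.
public class DiscountCalculator {

    // Returns the discount rate for an order.
    static double discountRate(boolean isMember, double orderTotal) {
        double rate = 0.0;
        if (isMember && orderTotal > 100.0) {   // one decision, two conditions
            rate = 0.10;
        }
        return rate;
    }

    public static void main(String[] args) {
        // Statement coverage: this single test executes every statement,
        // yet never exercises the "no discount" path.
        System.out.println(discountRate(true, 150.0));   // 0.1

        // Decision coverage adds a test where the whole condition evaluates false.
        System.out.println(discountRate(false, 150.0));  // 0.0

        // MC/DC also requires showing that each condition independently
        // affects the outcome:
        System.out.println(discountRate(true, 50.0));    // 0.0 -- orderTotal flips the result
        // (false, 150.0) above already shows isMember flipping the result.
    }
}
```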

Jamie Mitchell, Jamie Mitchell Consulting
Get Testers Out of the QA Business

Why is the testing department often misnamed "Quality Assurance"? We testers usually aren't allowed to control the scope of the product or change the source code. We don't have authority over budgets, staffing, schedules, customer relationships, market placement, or development models. So how, exactly, can we testers assure quality? We can't. Quality assurance is in the hands of those with authority over it: the programmers who write the code and the managers who run the project. We're extensions of their senses: extra professional eyes, ears, fingertips, noses, and taste buds. Join Michael Bolton and learn why and how to focus your testing energy on exploring, discovering, investigating, and learning about the product. Then, you'll be empowered to provide management with the information they need to make informed technical and business decisions.

Michael Bolton, DevelopSense
Testing in Production: Which Version Wins?

Would your marketing department like to know which website feature will excite online customers to buy more products, return to your site again and again, and increase revenue and profits? Harish Narayan describes how his team uses risk-based testing and statistical test design to optimally check features deployed with multiple website options. At Vistaprint, their measurement-focused marketing department requires live production tests of multiple web options (split runs, in their jargon) that expose different features for different customer sessions; they retain the one that “wins” to maximize returns. Preproduction testing of split-run features, especially when multiple runs are deployed in every release, has presented challenges for Vistaprint’s testers.
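
Vistaprint's internal tooling is not described in this abstract; purely as a rough, hypothetical sketch of the split-run idea, the Java example below assigns each customer session deterministically to one of several feature variants, so a given visitor sees the same version throughout the test and the metrics for each variant stay comparable.

```java
import java.nio.charset.StandardCharsets;
import java.util.zip.CRC32;

// Hypothetical illustration of a split run: each customer session is hashed to a
// variant bucket so the same visitor consistently sees the same feature version.
public class SplitRunAssigner {

    private final String[] variants;

    public SplitRunAssigner(String... variants) {
        this.variants = variants;
    }

    // Deterministic assignment: hash the session id and map it onto a variant.
    public String variantFor(String sessionId) {
        CRC32 crc = new CRC32();
        crc.update(sessionId.getBytes(StandardCharsets.UTF_8));
        int bucket = (int) (crc.getValue() % variants.length);
        return variants[bucket];
    }

    public static void main(String[] args) {
        SplitRunAssigner assigner = new SplitRunAssigner("checkout_v1", "checkout_v2");
        // The same session always lands in the same bucket, which is what lets the
        // team compare conversion metrics between variants and keep the "winner".
        System.out.println(assigner.variantFor("session-8271"));
        System.out.println(assigner.variantFor("session-8271")); // identical result
        System.out.println(assigner.variantFor("session-1942"));
    }
}
```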

Harish Narayan, Vistaprint
The Force of Test Automation in the Salesforce Cloud

What would happen if your company doubled or even tripled its number of releases and asked you to do the same with your testing? What if the number of developers doubled and your testing staff remained the same size? Would your test automation be capable of meeting the demand? How would you ensure that one hundred Scrum teams are investing enough in test automation? How would you triage hundreds of test failures each day? How would you validate each of more than one hundred releases to production per year? These are the questions Salesforce.com has had to answer during its twelve-year history. These are the challenges that led to the creation of its "test automation cloud." Chris Chen shares how Salesforce.com's test automation cloud works and gives you an inside look at the different technologies and methodologies they use today.
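
The abstract does not detail how Salesforce.com triages its failures; as one hypothetical illustration of the triage problem it raises, the sketch below collapses many raw test failures into a few groups that share a normalized error signature, so a large daily failure list becomes a short list of distinct issues.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of one triage tactic: collapse many test failures into
// groups that share the same normalized error signature.
public class FailureTriage {

    // Strip volatile details (numbers, hex ids) so similar failures share a key.
    static String signature(String errorMessage) {
        return errorMessage
                .replaceAll("0x[0-9a-fA-F]+", "<id>")
                .replaceAll("\\d+", "<n>")
                .trim();
    }

    public static void main(String[] args) {
        List<String> failures = List.of(
                "Timeout after 30000 ms waiting for element #save",
                "Timeout after 45000 ms waiting for element #save",
                "NullPointerException at OrderService.java:212");

        Map<String, Integer> groups = new LinkedHashMap<>();
        for (String f : failures) {
            groups.merge(signature(f), 1, Integer::sum);
        }
        // Two distinct groups instead of three raw failures.
        groups.forEach((sig, count) -> System.out.println(count + " x " + sig));
    }
}
```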

Chris Chen, Salesforce.com
STARWEST 2011: Concurrent Testing Games: Developers and Testers Working Together

The best software development teams find ways for programmers and testers to work closely together to build quality into their software. These teams recognize that programmers and testers each bring their own unique strengths and perspectives to the project. However, when working in agile teams, we need to unlearn many of the patterns that traditional development taught us. In this interactive session with Nate Oster, you learn how to use the agile practice of "concurrent testing" to overcome common "testing dysfunctions" by having programmers and testers work together, rather than against each other, to deliver quality results throughout an iteration. Join Nate and practice concurrent testing with games that demonstrate just how powerfully dysfunctional approaches can act against your best efforts and how agile techniques can help you escape the cycle of poor quality and late delivery.

Nate Oster, CodeSquads LLC
Pushing the Boundaries of User Experience Test Automation

Although full test automation of the user experience (UX) is impractical and unwise, there are approaches that can save you time and resources. At eBay, Julian Harty and his colleagues are finding new ways to automate as much of UX testing for eBay.com as is reasonably possible. Even with a highly complex, web-based application, they have found that automation finds many potential problems in the user experience, even in rich application scenarios. Julian shares a practical experience report of their successes together with the barriers and boundaries they discovered: detecting navigation issues, layout bugs, and problematic differences between the behavior of various web browsers. Learn from eBay's experiences why automated testing can be beguiling and, paradoxically, increase the chances of missing critical problems if you choose to rely mainly or even solely on the automated tests.
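
The abstract does not say which tools eBay used; purely as an illustration of one automatable layout check, the sketch below uses Selenium WebDriver (an assumption, not necessarily eBay's tooling) to flag visible elements whose bounding boxes overlap, one of the classic layout bugs mentioned above.

```java
import java.util.List;

import org.openqa.selenium.By;
import org.openqa.selenium.Dimension;
import org.openqa.selenium.Point;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;

// Illustrative layout-bug check (Selenium WebDriver assumed): flag visible
// elements whose bounding boxes overlap on the rendered page.
public class LayoutOverlapCheck {

    static boolean overlaps(WebElement a, WebElement b) {
        Point pa = a.getLocation();   Dimension da = a.getSize();
        Point pb = b.getLocation();   Dimension db = b.getSize();
        boolean separatedHorizontally =
                pa.getX() + da.getWidth() <= pb.getX() || pb.getX() + db.getWidth() <= pa.getX();
        boolean separatedVertically =
                pa.getY() + da.getHeight() <= pb.getY() || pb.getY() + db.getHeight() <= pa.getY();
        return !(separatedHorizontally || separatedVertically);
    }

    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://example.com");   // placeholder URL
            List<WebElement> elements = driver.findElements(By.cssSelector("button, a"));
            for (int i = 0; i < elements.size(); i++) {
                for (int j = i + 1; j < elements.size(); j++) {
                    if (elements.get(i).isDisplayed() && elements.get(j).isDisplayed()
                            && overlaps(elements.get(i), elements.get(j))) {
                        System.out.println("Possible layout bug: overlapping elements "
                                + i + " and " + j);
                    }
                }
            }
        } finally {
            driver.quit();
        }
    }
}
```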

Julian Harty, eBay, Inc.
Managing Test Data in Large and Complex Web-based Systems

Are you testing an application or web site whose complexity has grown exponentially through the years? Is your test data efficiently and effectively supporting your test suites? Does the test data reside in systems not under your direct control? Learn how the WellsFargo.com test team integrated test data management processes and provisioning to gain control over test data in their very large and complex web system environment. Join Ron Schioldager to explore the lifecycle of data, its relationship to effective testing, and how you can develop conditioned, trusted, and comprehensive test data for your systems. Learn about the tools Wells Fargo developed and employs today to support their test data management process, enabling them to shorten the data maintenance cycle while improving test reliability.

Ron Schioldager, Wells Fargo
Top Ten Disruptive Technologies You Must Understand

The consumerization of enterprise software applications is no longer on its way; it is here. Emerging technologies such as mobile apps, tablets, 4G, cloud computing, and HTML5 are impacting software engineering and testing organizations across all industries. Because these technologies allow sensitive data to be accessed through the web and on mobile devices, there is immense pressure to ensure that apps are reliable, scalable, private, and secure. Using real-world examples, Doron Reuveni identifies the top ten disruptive technologies that have transformed the software industry and outlines what they mean for the testing community now and in the future. The ways in which web and mobile apps are designed, developed, and delivered are changing dramatically, and as a result the ways these apps are tested are being taxed and stretched to the breaking point.

Doron Reuveni, uTest
