Application Lifecycle Management
Conference Presentations
Creating the Right Environment for Mobile Applications Testing
Is your organization releasing applications that target multiple mobile devices, platforms, or browsers? If so, you have faced, or soon will face, the challenge of choosing and setting up a test environment for these devices and platforms. Nat Couture shows how to develop a cost-effective application test environment to mitigate the risks associated with deploying mobile applications. He shares his latest research on mobile device, mobile platform, and mobile browser usage, and explains in detail what you need to consider when choosing a test environment. Learn how to select a winning combination of device-specific, platform-specific, and browser-specific simulation, coupled with tests on the actual devices. Build a mobile device testing program that reduces cost, increases coverage, and helps achieve the level of confidence you need to release mobile applications into production.
Nat Couture, Professional Quality Assurance Ltd.
Enabling Agile Testing through Continuous Integration
Continuous integration is one of the key processes that support an agile software development and testing environment. Sean Stolberg describes how a traditional software tester, transitioning to an agile development environment, put a continuous integration infrastructure in place. In doing so, he helped improve development practices and made possible his team's transition to agile testing. Sean discusses his team's initial motivations for adopting agile development practices and dives into the nuts-and-bolts implementation details. He shares their post-assessment of the implementation using Martin Fowler's "Practices of Continuous Integration" and concludes with a retrospective on implementing and promoting continuous integration within the context of agile testing. Find out how continuous integration can help improve your testing results and the quality of the software your team delivers.
Sean Stolberg, Pacific Northwest National Laboratory
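To make the continuous integration idea concrete, here is a minimal sketch, not Sean's actual infrastructure: a commit-triggered pipeline that runs each step in order and breaks the build on the first failure. The build, test, and packaging commands are hypothetical placeholders.

```python
#!/usr/bin/env python3
"""Minimal sketch of a commit-triggered CI pipeline (illustration only)."""
import subprocess
import sys

def run_step(name, command):
    """Run one pipeline step and stop the pipeline on the first failure."""
    print(f"== {name} ==")
    result = subprocess.run(command, shell=True)
    if result.returncode != 0:
        print(f"{name} failed; breaking the build.")
        sys.exit(result.returncode)

if __name__ == "__main__":
    # Hypothetical commands; a real pipeline would call the project's own build tooling.
    run_step("Build", "make build")
    run_step("Unit tests", "make test")
    run_step("Package", "make package")
```

The pattern, fast feedback on every commit with an immediate stop on failure, is what lets testing keep pace with agile development.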
STAREAST 2010: Tour-based Testing: The Hacker's Landmark Tour
Growing application complexity, coupled with the exploding increase in application surface area, has resulted in new quality challenges for testers. Some test teams are adopting a tour-based testing methodology because it breaks testing down into manageable chunks. However, hackers are paying close attention to systems and developing new targeted attacks to stay one step ahead. Rafal Los takes you inside the hacker's world, identifying the landmarks hackers target within applications and showing you how to recognize the defects they seek out. Learn what "landmarks" are, how to identify them from functional specifications, and how to tailor negative testing strategies to different landmark categories.
Rafal Los, Hewlett-Packard
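As a hedged illustration of tailoring negative tests to a landmark, the sketch below probes a hypothetical input validator the way an attacker might probe an input field that reaches a database; the validate_username function and its rule are assumptions, not material from the talk.

```python
"""Sketch: negative tests aimed at a common landmark, a user-supplied input field.

The validate_username function and its validation rule are hypothetical.
"""
import re
import unittest

def validate_username(value):
    """Hypothetical validator: accept only short alphanumeric/underscore names."""
    return bool(re.fullmatch(r"[A-Za-z0-9_]{1,32}", value))

class UsernameNegativeTests(unittest.TestCase):
    def test_rejects_sql_metacharacters(self):
        self.assertFalse(validate_username("admin'; DROP TABLE users;--"))

    def test_rejects_overlong_input(self):
        self.assertFalse(validate_username("a" * 10_000))

    def test_accepts_ordinary_name(self):
        self.assertTrue(validate_username("tester_01"))

if __name__ == "__main__":
    unittest.main()
```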
Focusing Test Efforts with System Usage Patterns
Faced with the reality of tight deadlines and limited resources, many software delivery teams turn to risk-based test planning to ensure that the most critical components of the software are production ready. Although this strategy can prove effective, it is only as good as the underlying risk analysis. Unfortunately, understanding where risk lies within a product is difficult, and the analysis often results in little more than an "educated guess." These risk-based testing exercises can lead to uneven test coverage and the uneasy feeling that the team has neglected to test what is really important. Dan Craig describes how to employ system usage patterns and production defect reports to identify the real risks in a system.
Dan Craig, Coveros, Inc.
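One way to picture the approach, as a rough sketch rather than Dan's actual method, is to blend production usage share and defect history into a per-feature risk score; the feature names, numbers, and weights below are invented for illustration.

```python
"""Sketch: ranking features for test focus by production usage and defect history.

All feature names, counts, and weights here are illustrative assumptions.
"""
from collections import namedtuple

Feature = namedtuple("Feature", "name usage_share defect_count")

# Hypothetical data pulled from web analytics and the defect tracker.
features = [
    Feature("checkout", usage_share=0.35, defect_count=12),
    Feature("search", usage_share=0.40, defect_count=4),
    Feature("account settings", usage_share=0.10, defect_count=9),
    Feature("reporting", usage_share=0.15, defect_count=1),
]

def risk_score(f, usage_weight=0.6, defect_weight=0.4, max_defects=12):
    """Blend usage share and normalized defect count into a single risk score."""
    return usage_weight * f.usage_share + defect_weight * (f.defect_count / max_defects)

# Highest-risk features first: these are candidates for the deepest test coverage.
for f in sorted(features, key=risk_score, reverse=True):
    print(f"{f.name:18} risk={risk_score(f):.2f}")
```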
Proving Our Worth: Quantifying the Value of Testing
Over the years, experts have defined testing as a process of checking, a process of exploring, a process of evaluating, a process of measuring, and a process of improving. For a quarter of a century, we have focused on the internal process of testing while generally disregarding its real purpose: creating information that others on the project can use to improve product quality. Join Lee Copeland as he discusses why quantifying the value of testing is difficult work. Perhaps that's why we concentrate so much on test process; it is much easier to explain. Lee identifies stakeholders for the information we create and presents a three-step approach to creating the information they need to make critical decisions. He shares key attributes of this information: accuracy, timeliness, completeness, relevancy, and more.
Lee Copeland, Software Quality Engineering
Creating Crucial Test Conversations
Many test leaders believe that development, business, and management don't understand, support, or properly value our contributions. You know what? These test leaders are probably right! So, why do they feel that way? Bob Galen believes it's our ineffectiveness in communicating and selling ourselves, our abilities, our contributions, and our value to the organization. As testers, we believe that the work speaks for itself. Wrong! We must work harder to create the crucial conversations that communicate our value and impact. Bob shares specific techniques for holding context-based conversations, producing informative status reports, conducting attention-getting quality assessments, and delivering solid defect reports. Learn how to improve your communication skills so that key partners understand your role, value, and contributions.
Bob Galen, iContact
Avoid Failure with Acceptance Test-Driven Development
One of the major challenges confronting traditional testers in agile environments is that requirements are incrementally defined rather than specified at the start. Testers must adapt to this new reality to survive and excel in agile development. C.V. Narayanan explains the Acceptance Test-Driven Development (ATDD) process that helps testers tackle this challenge. He describes how to create acceptance test checkpoints, develop regression tests for these checkpoints, and identify ways to mitigate risks with ATDD. Learn to map acceptance test cases against requirements in an incremental fashion and validate releases against acceptance checkpoints. See how to handle risks such as requirements churn and requirements that overflow into the next iteration. Using ATDD as the basis, learn new collaboration techniques that help unite testing and development toward the common goal of delivering high-quality systems.
C.V. Narayanan, Sonata Software Ltd.
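As a hedged example of an acceptance-test checkpoint, not C.V.'s specific process, the sketch below expresses one requirement as an executable test written before the code; the apply_discount function and the business rule it encodes are hypothetical.

```python
"""Sketch of an acceptance checkpoint written ATDD-style, ahead of the feature.

The apply_discount function and the loyalty-discount rule are hypothetical.
"""
import unittest

def apply_discount(order_total, loyalty_years):
    """Hypothetical implementation written to satisfy the acceptance test below."""
    return order_total * 0.90 if loyalty_years >= 3 else order_total

class LoyaltyDiscountAcceptanceTest(unittest.TestCase):
    """Acceptance checkpoint: customers with 3+ loyalty years get 10% off."""

    def test_long_term_customer_gets_ten_percent_off(self):
        self.assertAlmostEqual(apply_discount(100.0, loyalty_years=3), 90.0)

    def test_new_customer_pays_full_price(self):
        self.assertAlmostEqual(apply_discount(100.0, loyalty_years=1), 100.0)

if __name__ == "__main__":
    unittest.main()
```

Each checkpoint like this can then join the regression suite, so later iterations are validated against every requirement accepted so far.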
Performance Testing SQL-based Applications
Often, we discover the "real" software performance issues only after deploying the product in a production environment. Even though performance, scalability, stability, and reliability are standards of today's software development, organizations often wait until the end of the development life cycle to discover these limitations, resulting in late deliveries and even chaos. Alim Sharif embraces agile development's philosophies to explain how performance testers can identify and resolve software performance issues early and continue performance testing throughout the development process. Learn how to optimize the use of performance tuning tools such as SQL Profiler and MS PerfMon to identify and fix MS SQL Server, application, and Web server performance issues. Institute agile methods in your performance testing efforts to avoid that "Oh, no!" moment when the system goes live.
Alim Sharif, The Ultimate Software Group
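As a rough sketch of running performance checks early and often rather than only at the end, the example below times a query and fails when it exceeds a budget. It uses an in-memory SQLite database so it is self-contained; the table, query, and threshold are assumptions, and a real MS SQL Server effort would lean on SQL Profiler, PerfMon, or an appropriate driver as discussed in the talk.

```python
"""Sketch: a repeatable query-timing check that can run in every build."""
import sqlite3
import time

BUDGET_SECONDS = 0.5  # hypothetical performance budget for this query

# Build a small throwaway dataset so the timing check has something to measure.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
conn.executemany("INSERT INTO orders (total) VALUES (?)",
                 [(i * 1.5,) for i in range(10_000)])

start = time.perf_counter()
rows = conn.execute(
    "SELECT COUNT(*), AVG(total) FROM orders WHERE total > 100"
).fetchall()
elapsed = time.perf_counter() - start

print(f"Query returned {rows[0]} in {elapsed:.4f}s")
assert elapsed < BUDGET_SECONDS, f"Query exceeded its {BUDGET_SECONDS}s budget"
```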
Testing Lessons Learned from the Great Detectives
Robert Sabourin shares what the great detectives have taught him about testing.
Robert Sabourin, AmiBug.com
Heuristics for Rapid Test Management
Whether you are a tester or a test manager, Jon Bach believes you have little time to do the things you want to do. Even the things on your "absolutely must do" list are competing for your limited time. Jon has a list of what he calls "half-baked" ideas on how to cope. That is, these ideas are still in the oven, still being tested. In his role as a tester and manager, Jon has learned that it's not about time management; it's really about energy management: where you focus your personal energy and direct your team's energy. Jon shares ideas that have worked for him and some that have failed: Open-Book Testing, Dawn Patrols, Tester Show-and-Tell, Test Team Feud, and Color-Aided Design. Learn how these ideas may solve your problems with test execution, reporting, measurement, and management, all at low or no cost and relatively easy to implement.
Jon Bach, Quardev, Inc.