|
Beyond GUI: What You Need to Know about Database Testing Today's complex software systems access heterogeneous data from a variety of back-end databases. The intricate mix of client-server and Web-enabled database applications is extremely difficult to test productively. The data access layer is the point at which your application communicates with the database, and tests at this level are vital to improving not only your overall test strategy but also your product's quality. Mary Sweeney explains what you need to know to test the SQL database engine, stored procedures, and data views. Find out how to design effective automated tests that exercise the complete database layer of your applications. You'll learn about the most common and vexing defects related to SQL databases and the best tools available to support your testing efforts.
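For readers who want a concrete starting point, the sketch below shows one shape such an automated data-layer test can take. It is illustrative only: the table, view, and test names are hypothetical, and SQLite stands in for whatever engine your application actually uses (a stored-procedure check would go through your database driver in the same way).

```python
# A minimal, self-contained sketch of an automated test at the data access
# layer. SQLite and a view keep it runnable anywhere; the schema is hypothetical.
import sqlite3
import unittest


class OrderTotalsViewTest(unittest.TestCase):
    def setUp(self):
        # An in-memory database keeps each test isolated and repeatable.
        self.conn = sqlite3.connect(":memory:")
        self.conn.executescript("""
            CREATE TABLE orders (id INTEGER PRIMARY KEY,
                                 customer TEXT, amount REAL);
            CREATE VIEW order_totals AS
                SELECT customer, SUM(amount) AS total
                FROM orders GROUP BY customer;
        """)

    def test_view_aggregates_per_customer(self):
        self.conn.executemany(
            "INSERT INTO orders (customer, amount) VALUES (?, ?)",
            [("acme", 10.0), ("acme", 2.5), ("globex", 7.0)])
        rows = dict(self.conn.execute(
            "SELECT customer, total FROM order_totals"))
        self.assertEqual(rows, {"acme": 12.5, "globex": 7.0})

    def tearDown(self):
        self.conn.close()


if __name__ == "__main__":
    unittest.main()
```

Keeping such tests in an isolated, repeatable database is what lets them run unattended alongside application builds.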
|
Mary Sweeney, Exceed Training
|
|
Test Automation with Open Source Tools Using an Agile Development Process Test automation, open source tools, and agile methods are three important trends in software development. By employing and integrating all three, a project team at Comcast was able to quickly build and deliver a critical application to its customers. Pete Dignan and Dan Lavender discuss the rationale behind the decision to follow an XP-like process in this case study. They explain how the
|
Peter Dignan, ProtoTest LLC
|
|
Testing "Best Practices": From Microsoft's Context to Yours Testing is a never-ending series of trade-off decisions, what to test and what not to test; when to stop testing and release the product; how to budget your testing resources for automated vs. manual testing; how much code coverage is good enough; and much more. To make these difficult judgement calls, we often turn to the "best practices" recommended by testing experts and others who have encountered similar problems. The key to successful implementation is matching their "best practices" to your own context (team make-up, company culture, market
environment, etc.). Barry Preppernau shares his insights gathered from over 20 years of testing experience at Microsoft. You'll learn about the tools and processes that have been successful within Microsoft and ways for you to identify, adapt, and implement successful test improvement
initiatives within your organization.
|
Barry Preppernau, Microsoft Corporation
|
|
TestCafe: A Vendor-Independent Test Execution Controller This article gives examples of modular test design diagrams. It also discusses why using a custom test execution controller for your network project will benefit your team.
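As a rough illustration of what an execution controller does, the sketch below runs a list of test commands and collects pass/fail results from their exit codes. It is not the TestCafe tool described in the article, and the suite contents are hypothetical.

```python
# A minimal sketch of a test execution controller: run each test command,
# record pass/fail from its exit code, and keep the captured output.
import subprocess
import sys
from dataclasses import dataclass


@dataclass
class Result:
    name: str
    passed: bool
    output: str


def run_suite(tests):
    """tests: mapping of test name -> command (list of arguments)."""
    results = []
    for name, command in tests.items():
        proc = subprocess.run(command, capture_output=True, text=True)
        results.append(Result(name, proc.returncode == 0,
                              proc.stdout + proc.stderr))
    return results


if __name__ == "__main__":
    suite = {
        "smoke": [sys.executable, "-c", "print('ok')"],
        "failing_example": [sys.executable, "-c", "raise SystemExit(1)"],
    }
    for r in run_suite(suite):
        print(f"{r.name}: {'PASS' if r.passed else 'FAIL'}")
```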
|
Jeff Feldstein, Cisco Systems Inc
|
|
Adventures in Session-Based Testing Many projects' first test approaches are characterized by uncontrolled, ad hoc testing. Session-based testing can help you manage unscripted, reactive testing. By using sessions to control and record the work done by the test team, you can support and give impetus to ongoing learning and team improvement. You'll be introduced to simple tools and metrics to support test sessions, illustrated by real-world examples from two case studies.
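The sketch below illustrates the kind of lightweight metrics such sessions can feed; the field names and sample charters are hypothetical rather than drawn from the case studies.

```python
# A minimal sketch of simple session metrics: each recorded session carries
# a charter, a duration, a rough breakdown of how the time was spent, and a
# bug count. The fields and sample data are illustrative only.
from dataclasses import dataclass


@dataclass
class Session:
    charter: str
    minutes: int
    pct_testing: int      # time spent actually testing
    pct_bug_invest: int   # time spent investigating and reporting bugs
    pct_setup: int        # time spent on setup and obstacles
    bugs_found: int


def summarize(sessions):
    total_minutes = sum(s.minutes for s in sessions)
    bugs = sum(s.bugs_found for s in sessions)
    avg_testing = sum(s.pct_testing for s in sessions) / len(sessions)
    return {
        "sessions": len(sessions),
        "total_minutes": total_minutes,
        "bugs_found": bugs,
        "avg_pct_on_testing": round(avg_testing, 1),
    }


if __name__ == "__main__":
    log = [
        Session("Explore login error handling", 90, 70, 20, 10, 3),
        Session("Import of very large files", 60, 50, 10, 40, 1),
    ]
    print(summarize(log))
```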
|
James Lyndsay, Workroom Productions
|
|
Testing Component-Based Software Today, component engineering is gaining substantial interest in the software engineering community. Jerry Gao provides insight and observations on component testability and proposes a new model to represent and measure the maturity levels of a component testing process. The presentation also identifies, classifies, and discusses new issues in testing component-based software.
|
Jerry Gao, San Jose State University
|
|
Understanding Test Oracles To get value from test execution, the results must be determined and evaluated. This presentation describes the dimensions of, and alternative approaches to, evaluating results. It identifies three types of oracles and more than ten different reference functions. Listen as David Gelperin discusses design-for-testability issues relating to lower-cost oracles and the elements of an oracle strategy.
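The abstract above mentions reference functions; the sketch below illustrates one simple form of that idea, checking an implementation under test against a slower but trusted reference over generated inputs. Both functions are hypothetical stand-ins, not examples from the presentation.

```python
# A minimal sketch of a reference-function oracle: compare the output of the
# implementation under test with a trusted but naive alternative.
import random


def sort_under_test(values):
    # Stand-in for the optimized implementation being tested.
    return sorted(values)


def reference_sort(values):
    # Trusted but slow reference: repeatedly extract the minimum.
    remaining, out = list(values), []
    while remaining:
        smallest = min(remaining)
        remaining.remove(smallest)
        out.append(smallest)
    return out


def check_against_oracle(trials=100):
    for _ in range(trials):
        data = [random.randint(-50, 50)
                for _ in range(random.randint(0, 20))]
        assert sort_under_test(data) == reference_sort(data), data


if __name__ == "__main__":
    check_against_oracle()
    print("oracle agreed on all generated inputs")
```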
|
David Gelperin, Software Quality Engineering
|
|
Managing User Acceptance Testing in Large Projects Managing user acceptance testing poses many challenges, especially in large-scale projects. Julie Tarwater explores the issues of planning, coordinating, and executing effective user testing with a large number of end users. Learn strategies for ensuring user acceptance while exploring the pros and cons of each. Discover ways to prioritize issues that arise from user testing.
|
Julie Tarwater, T. Rowe Price Associates
|
|
STAREAST 2000: A Risk-Based Test Strategy Testing of information systems should be based on the business risks those systems pose to the organization that uses them. In practice, test managers often take an intuitive approach to covering those risks. In this double-track presentation, discover how a "stepwise" definition of test strategy can be used for any test level as well as for the overall strategy, providing better insight and a sound basis for negotiating testing depth.
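To make the "stepwise" idea concrete, the sketch below shows one way a risk class per subsystem could drive how deeply that subsystem is tested. The classes, depths, and subsystem names are illustrative assumptions, not the presentation's actual method.

```python
# A minimal sketch of risk-driven test depth: each subsystem gets a risk
# class, and the class determines how thoroughly it is tested.
RISK_TO_DEPTH = {
    "high": "full requirements coverage plus exploratory sessions",
    "medium": "main scenarios and boundary values",
    "low": "smoke tests only",
}


def plan(subsystem_risks):
    # subsystem_risks: {subsystem name: risk class}
    return {name: RISK_TO_DEPTH[risk]
            for name, risk in subsystem_risks.items()}


if __name__ == "__main__":
    print(plan({"payment processing": "high",
                "user profiles": "medium",
                "report layout": "low"}))
```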
|
Ingrid Ottevanger, IQUIP Informatica
|
|
Using Risk Analysis in Testing Anne Campbell provides insight into how a risk analysis grid was effectively implemented at IDX, both as a device to gauge QA team performance and as a communication tool for the development team. This valuable tool listed each piece of functionality and the risk associated with it, helping IDX focus its testing on higher-risk areas. Discover how a risk analysis matrix can be used to plan your testing efforts and increase the quality of your software project.
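The sketch below shows the basic arithmetic behind such a grid: each functional area gets a likelihood and an impact score, and testing effort goes to the largest products first. The areas and scores are illustrative, not IDX's actual matrix.

```python
# A minimal sketch of a risk analysis matrix: score each area by
# likelihood x impact and rank the areas so testing targets the top of the list.
def rank_risks(areas):
    # areas: {name: (likelihood 1-5, impact 1-5)}
    scored = [(name, likelihood * impact)
              for name, (likelihood, impact) in areas.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)


if __name__ == "__main__":
    grid = {
        "claims submission": (4, 5),
        "account settings": (3, 3),
        "printing": (2, 2),
    }
    for name, score in rank_risks(grid):
        print(f"{score:2d}  {name}")
```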
|
Anne Campbell, Channel Health
|