Regression Testing

Conference Presentations

Customer Focused Business Metrics throughout the SDLC

Focusing on the customer throughout the software development lifecycle (SDLC) is difficult. Teams can easily become mired in technical problems, internal resource limitations, or other issues. Following the customer mantra of "Faster! Better! Cheaper!", Steve Wrenn offers measurement and process techniques he has used to deliver projects on time, on budget, and, most importantly, meeting customers' needs. By focusing on the development cycle from the outside in, his organization provides business-based metrics dashboards to monitor and adjust the project plan throughout the development project. Find out how their performance dashboard helps the team and the customer stay on course and drive directly to the targeted results. Discover an approach to determining what customers really want and matching product development to customer expectations.

Steve Wrenn, Liberty Mutual Insurance Information Systems
Architectures of Test Automation

Regression test automation is just one example of automated testing, and it is probably not the best one. This double-track presentation considers the problems inherent in regression automation and outlines alternatives that involve automated generation, execution, and evaluation of large numbers of tests. Explore oracle-based, high-volume comparison tests, stochastic tests, and configuration tests.

Cem Kaner, Florida Institute of Technology
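
To make the kinds of tests Kaner describes concrete, here is a minimal Python sketch of a high-volume, oracle-based stochastic test. The square-root function under test is a hypothetical stand-in; the technique is what matters: generate many random inputs, compare the implementation's output against a trusted oracle, and report any mismatches.

```python
import math
import random

# Hypothetical implementation under test: a float-based integer square root.
def isqrt_under_test(n: int) -> int:
    return int(n ** 0.5)

def run_stochastic_tests(iterations: int = 100_000, seed: int = 1) -> None:
    """Generate many random inputs, compare the implementation against a
    trusted oracle (math.isqrt), and report any mismatches."""
    rng = random.Random(seed)
    failures = []
    for _ in range(iterations):
        n = rng.randrange(0, 10 ** 15)
        expected = math.isqrt(n)       # oracle: trusted reference result
        actual = isqrt_under_test(n)   # implementation under test
        if actual != expected:
            failures.append((n, expected, actual))
    print(f"{iterations} random tests run, {len(failures)} mismatches")
    for n, expected, actual in failures[:5]:
        print(f"  n={n}: oracle={expected}, implementation={actual}")

if __name__ == "__main__":
    run_stochastic_tests()
```

Because the inputs are seeded, any mismatch can be reproduced exactly, which is what makes high-volume random testing practical for regression work.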
Functional and Regression Testing of Web Applications

Gone are the days when, for most commercial Web sites, the "application" on the site was the site itself. Now Web sites are often just the presentation layer for sophisticated applications that interact with a complex mix of internal and external systems, all glued together in an elaborate architecture using CORBA or DCOM. Learn how to ensure that transaction-based Web sites function properly. Explore the benefits of automated testing in these environments.

Peter Cook, Watchfire
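
As a rough illustration of what automated functional testing of a transaction-based Web site can look like at the HTTP level, here is a hedged Python sketch using the requests library. The base URL, endpoints, and payloads are hypothetical; a real suite would target the site's actual transaction flow.

```python
import requests

BASE_URL = "https://example.com"  # hypothetical storefront, for illustration only

def test_order_transaction() -> None:
    """Drive one purchase transaction end to end through the HTTP layer and
    verify that each step of the back-end workflow responds as expected."""
    session = requests.Session()

    # Step 1: log in (hypothetical endpoint and credentials).
    resp = session.post(f"{BASE_URL}/login",
                        data={"user": "testuser", "password": "secret"})
    assert resp.status_code == 200, "login failed"

    # Step 2: add an item to the cart.
    resp = session.post(f"{BASE_URL}/cart", data={"sku": "ABC-123", "qty": 1})
    assert resp.status_code == 200, "add-to-cart failed"

    # Step 3: submit the order and check the confirmation produced by the
    # systems behind the presentation layer.
    resp = session.post(f"{BASE_URL}/checkout", data={"payment": "test-card"})
    assert resp.status_code == 200, "checkout failed"
    assert "order confirmed" in resp.text.lower(), "transaction did not complete"

if __name__ == "__main__":
    test_order_transaction()
    print("transaction test passed")
```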
Requirements-Driven Automated Testing

Studies have shown that over fifty percent of software defects can be attributed to poorly defined requirements. From a process improvement perspective, it is imperative that project managers establish a more effective and efficient way of defining and tracking business requirements. Jeff Tatelman describes a "how to" approach for developing a practical automated regression testing process using a traceability matrix and business event scenarios. Learn how requirements-based testing, coupled with a data-driven approach to test automation, can solve problems that plague most software development projects.

Jeff Tatelman, Spherion Technology Architects
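
The abstract above pairs a requirements traceability matrix with data-driven test automation. Below is a minimal Python sketch of that combination, under the assumption that scenarios and expected results live in a table keyed by requirement ID; the credit-limit check and the CSV rows are invented for illustration.

```python
import csv
import io

# Hypothetical traceability data: each row ties a business requirement ID
# to a business event scenario and its expected outcome.
TRACEABILITY_CSV = """requirement_id,scenario,input_amount,expected_status
REQ-101,standard order under credit limit,500,approved
REQ-101,order exactly at credit limit,1000,approved
REQ-102,order over credit limit,1500,rejected
"""

# Hypothetical function under test: credit-limit check for an order.
def check_order(amount: float, credit_limit: float = 1000) -> str:
    return "approved" if amount <= credit_limit else "rejected"

def run_data_driven_tests() -> dict:
    """Execute each scenario row and collect results keyed by requirement ID,
    so test coverage can be traced back to the requirements."""
    results = {}
    for row in csv.DictReader(io.StringIO(TRACEABILITY_CSV)):
        actual = check_order(float(row["input_amount"]))
        passed = actual == row["expected_status"]
        results.setdefault(row["requirement_id"], []).append(
            (row["scenario"], passed))
    return results

if __name__ == "__main__":
    for req_id, outcomes in run_data_driven_tests().items():
        for scenario, passed in outcomes:
            print(f"{req_id}: {scenario}: {'PASS' if passed else 'FAIL'}")
```

Because results are keyed by requirement ID, the same run doubles as a coverage report against the traceability matrix.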
Developing an Automated Regression Test Set

Automating a regression test is a tremendous effort, but the payoff is big in situations where continuous, repeatable, repetitive testing is required. This presentation describes a real-world example of a successful team effort toward developing a reusable automated regression test set for legacy medical software products in a client/server environment. Learn the principles of team building and test case design, and the tools and utilities you need to get the job done. Patricia George also discusses how test data management, the breakdown of programming tasks, and date-driven project milestones increase efficiency to keep the team on track.

Patricia George, Sunquest Info Systems, Inc.
Implementing an Automated Regression Test Suite

Many efforts to automate regression testing have failed or fallen short of expectations, resulting in "shelfware." Lloyd Roden presents a real-world case study based on the successful implementation of a regression test tool within a software company. Learn the steps taken in evaluating and deploying the tool. Discover the key benefits and successes achieved over a three-year period, as well as the challenges faced while using the tool.

Lloyd Roden, Grove Consultants
STARWEST 2001: Designing an Automated Web Test Environment

This paper offers an alternative to the typical automated test scripting method of "record and playback now and enhance the automation environment later." It explores a regression automation system design for testing Internet applications through the GUI, along with scripting techniques to enhance the scalability and flexibility of an automated test suite. The paper presents a basic structure for an automated test environment and expands on each of the items in that structure. It also lays out Web testing levels, along with a basic approach to designing test scripts based on those levels.

Dion Johnson, Pointe Technology Group, Inc.
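
One common way to realize the kind of structure this paper describes is to separate the test environment into tool, data, function, and script layers so that scripts stay thin and reusable. The Python sketch below assumes a placeholder driver object standing in for whatever GUI/Web test tool the environment wraps; the layering, not the driver API, is the point.

```python
# --- Tool layer: thin wrapper around the actual test tool -----------------
class Driver:
    """Placeholder driver; a real environment would wrap the GUI/Web test tool."""
    def goto(self, path): print(f"open {path}")
    def fill(self, field, value): print(f"type {value!r} into {field}")
    def click(self, control): print(f"click {control}")
    def text(self, element): return "results found"  # canned response for the sketch

# --- Data layer: test data kept out of the scripts ------------------------
SEARCH_CASES = [
    {"term": "regression testing", "expect": "results"},
]

# --- Function layer: reusable business-level actions ----------------------
def do_search(driver, term):
    driver.goto("/search")
    driver.fill("q", term)
    driver.click("submit")
    return driver.text("result-summary")

# --- Script layer: thin scripts that combine data and functions -----------
def run_search_suite(driver):
    failures = sum(
        case["expect"] not in do_search(driver, case["term"])
        for case in SEARCH_CASES
    )
    print(f"{len(SEARCH_CASES)} cases run, {failures} failures")

if __name__ == "__main__":
    run_search_suite(Driver())
```

Keeping data, reusable actions, and scripts in separate layers is what gives an automated Web test suite the scalability and flexibility the paper emphasizes.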
STAREAST 2001: Designing an Automated Web Test Environment

This paper offers an alternative to the typical automated test scripting method of "record and playback now and enhance the automation environment later." It explores a regression automation system design for testing Internet applications through the GUI, along with scripting techniques to enhance the scalability and flexibility of an automated test suite. The paper presents a basic structure for an automated test environment and expands on each of the items in that structure. It also lays out Web testing levels, along with a basic approach to designing test scripts based on those levels.

Dion Johnson, Pointe Technology Group, Inc.
