Conference Presentations

Performance Testing 101

Organizations are often so eager to "jump in" and use load testing tools that the critical steps necessary to ensure successful performance testing are sometimes overlooked, leading to testing delays and wasted effort. Learn the best practices and tips for successful automated performance testing in areas such as assembling a proper test team, planning, simulating a production environment, creating scripts, and executing load tests.
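For a flavor of the scripting and execution steps the session covers, here is a minimal hand-rolled load-test sketch in Python; the target URL, virtual-user count, and iteration count are illustrative assumptions, not details from the presentation.

```python
# Minimal load-test sketch: N concurrent "virtual users" each issue
# sequential HTTP GETs and record response times. The target URL and
# user/iteration counts are illustrative placeholders.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET = "http://localhost:8080/"  # hypothetical system under test
USERS = 10
ITERATIONS = 5

def virtual_user(user_id: int) -> list[float]:
    latencies = []
    for _ in range(ITERATIONS):
        start = time.perf_counter()
        with urllib.request.urlopen(TARGET, timeout=10) as resp:
            resp.read()
        latencies.append(time.perf_counter() - start)
    return latencies

with ThreadPoolExecutor(max_workers=USERS) as pool:
    results = list(pool.map(virtual_user, range(USERS)))

all_latencies = [t for user in results for t in user]
print(f"requests: {len(all_latencies)}, "
      f"mean latency: {sum(all_latencies) / len(all_latencies):.3f}s")
```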

David Torrisi, Mercury Interactive
Automated Test Results Processing

This paper introduces techniques used to automate the results analysis process. It examines the analysis of crash dump files and log files to extract consistent failure summaries and details, showing how these are used in problem reporting. It then studies the practical application of Automated Test Results Processing at MangoSoft Corporation and presents data showing the impact this approach has had in product testing.

Edward Smith, MangoSoft Corporation
Results From Inspecting Test Automation Scripts

In many ways, development of scripts for automated testing is similar to software development. It involves requirements, design, code, test, and use. So why not use proven improvement activities to enhance the test script development process? This presentation discusses how one software test team adjusted and applied inspections to test script development. Learn the results of these inspections and how you might use this technique to improve the test script development activity in your organization.

Howie Dow, Compaq Computer Corporation
Managing Test Automation Projects

Automation has three dimensions: organizational, process, and technical. Success requires a matching three-part solution: match skills to tasks; define requirements, environment, and hand-off; and adopt an automation approach and architecture.

Linda Hayes, WorkSoft, Inc.
Succeeding with Automation Tools

The problems with using record/playback as your only test automation strategy are well known. But the other option, full script programming, is unattractive to many due to its high cost and long development time. This presentation discusses a strategy called defensive programming that incorporates the best of both worlds. Learn how to leverage your automation tool with simple implementation techniques to create robust test suites.
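As one possible reading of the defensive programming idea, the sketch below wraps a raw tool action in a guard that waits for its precondition and fails with a clear message; the `UiDriver` class is a hypothetical stand-in for a real automation tool's object model, not an API from the talk.

```python
# Defensive wrapper sketch: verify a control exists (with retries)
# before acting on it, so simple recorded scripts become robust.
import time

class UiDriver:
    """Placeholder for an automation tool's object model (assumed)."""
    def exists(self, locator: str) -> bool: ...
    def click(self, locator: str) -> None: ...

def safe_click(driver: UiDriver, locator: str,
               timeout: float = 10.0, interval: float = 0.5) -> None:
    """Click only after the control appears; otherwise fail clearly."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if driver.exists(locator):
            driver.click(locator)
            return
        time.sleep(interval)
    raise AssertionError(f"Control never appeared: {locator}")
```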

Jamie Mitchell, BenchmarkQA
Advanced Data Driven Testing (ADDT)

Learn how the Convergys Test Automation Team developed an Advanced Data Driven Testing (ADDT) approach using a test automation engine. Gain insight into how this technique was successfully implemented to improve the reliability and quality of their software products and reduce the number of testing man-hours. Shakil Ahmad gives a high-level description of the engine design, functionality, and benefits as he shares his company's successes and frustrations.
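The Convergys engine design itself is not reproduced here, so the following is only a generic sketch of the data-driven pattern the session names: test steps live in data rows (keyword plus arguments) and a small engine dispatches each row to a handler. The keywords and data are invented for illustration.

```python
# Data-driven test engine sketch: the engine reads rows of test data
# and dispatches each keyword to a registered handler function.
import csv
import io

def do_login(user, password):
    print(f"logging in as {user}")

def do_verify_balance(account, expected):
    print(f"checking {account} == {expected}")

KEYWORDS = {"login": do_login, "verify_balance": do_verify_balance}

# Illustrative test data; in practice this would come from a file
# or spreadsheet maintained by testers, not embedded in the script.
TEST_DATA = io.StringIO(
    "keyword,arg1,arg2\n"
    "login,alice,secret\n"
    "verify_balance,ACCT-1,100.00\n"
)

def run(table):
    for row in csv.DictReader(table):
        action = KEYWORDS[row["keyword"]]
        action(row["arg1"], row["arg2"])

run(TEST_DATA)
```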

Shakil Ahmad, Convergys
Automated Testing and Monitoring of Large Application Services

Large application services are highly dynamic in their functionality, with some of the business rules they host changing on a daily basis. This presentation discusses one company's experience in developing a new methodology and test infrastructure for automated testing and nonstop QA monitoring of large application services with high requirements churn. Learn how this method lets you get a handle on quality even though the requirements for these services remain a moving target.
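A minimal sketch of the nonstop-monitoring idea, assuming a hypothetical HTTP endpoint and a rules file that testers update as requirements churn; none of these names or formats come from the Telcordia work.

```python
# Nonstop QA monitoring sketch: periodically probe a service and compare
# responses against externally maintained expectations, so checks can
# change as fast as the business rules do. Endpoint and rules assumed.
import json
import time
import urllib.request

RULES_FILE = "expected_rules.json"  # updated whenever requirements change
ENDPOINT = "http://localhost:8080/api/status"  # hypothetical service

def probe() -> dict:
    with urllib.request.urlopen(ENDPOINT, timeout=10) as resp:
        return json.load(resp)

def check_once() -> list[str]:
    with open(RULES_FILE) as f:
        expected = json.load(f)
    actual = probe()
    return [f"{key}: expected {want!r}, got {actual.get(key)!r}"
            for key, want in expected.items() if actual.get(key) != want]

if __name__ == "__main__":
    while True:
        for failure in check_once():
            print("ALERT:", failure)
        time.sleep(60)  # monitoring interval
```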

Ashish Jain and Siddhartha Dalal, Telcordia Technologies
Removing Requirement Defects and Automating Test

Organizations face many problems that impede rapid development of the software systems critical to their operations and growth. This paper discusses model-based development and test automation methods that reduce the time and resources necessary to develop high-quality systems. The focus is on how organizations have implemented this model-based verification approach to reduce requirements defects, manual test development effort, and development rework, achieving significant cost and schedule savings.
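To make the model-based idea concrete, here is a small sketch under assumed details: requirements are captured as a state machine, and one test per transition is generated via breadth-first search. The model itself is invented for illustration, not taken from the Consortium's method.

```python
# Model-based test generation sketch: the requirements model is a
# state machine, and tests are event sequences covering each transition.
from collections import deque

# state -> {event: next_state}; an invented example model
MODEL = {
    "logged_out": {"login_ok": "logged_in", "login_bad": "logged_out"},
    "logged_in": {"logout": "logged_out", "purchase": "checked_out"},
    "checked_out": {"logout": "logged_out"},
}

def path_to(model, start, target):
    """BFS: shortest event sequence from start to the target state."""
    queue, seen = deque([(start, [])]), {start}
    while queue:
        state, path = queue.popleft()
        if state == target:
            return path
        for event, nxt in model.get(state, {}).items():
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [event]))
    return None

def tests_covering_all_transitions(model, start="logged_out"):
    """Yield one test per transition: reach its source state, fire it."""
    for state, events in model.items():
        for event in events:
            prefix = path_to(model, start, state)
            if prefix is not None:
                yield prefix + [event]

for i, test in enumerate(tests_covering_all_transitions(MODEL), 1):
    print(f"test {i}: {' -> '.join(test)}")
```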

Mark Blackburn, Software Productivity Consortium
Mentors, Models, and the Making of Managers: Special Panel Discussion

Each of us has a story about how we came to be managers in software organizations. Many of us became managers because we were good developers. Some of us studied management in school. A few of us were groomed and mentored by the companies we work for, and some were tapped for management because we were the only warm body available. But now that we're here, what does it take to become an effective manager? Is being mentored and developed as a manager considered a luxury? Join this interactive panel and discuss the real-life issues and challenges of developing ourselves, and others, as software managers.

Moderator: Esther Derby (Esther Derby Associates, Inc.)
A Practical Framework for Software Measurement

Measurement is often defined in terms of collecting data, distinguishing it from analysis, the interpretation and use of that data. Clearly, the collection of data must be driven by its intended use. In this presentation, David Card presents a framework that treats measurement and analysis as an integrated process. Discover the four basic components of this framework, and learn how to use it to ensure that all important perspectives and potential users of measurement are considered in the measurement planning process.

David Card, Software Productivity Consortium
