STAREAST 2002 - Software Testing Conference
PRESENTATIONS
Outsourcing Trials and Traps
Sometimes outsourcing can help your business, but there's too much at stake to take outsourcing lightly. This presentation teaches you the importance of identifying your critical needs before you even begin the task of selecting an outsourcing partner, because the risks are far too great for you to try to fix a major problem with chewing gum and baling wire.
Steve Jeffery, PowerQuest Corporation, Inc.
Proactive User Acceptance Testing
User Acceptance Testing (UAT) tends to take a lot of effort, yet still often fails to find what it should. Rather than being an afterthought subset of system test, effective UAT needs to be systematically planned and designed independently of technical testing. In this session, Robin Goldsmith shows how going from reactive to proactive UAT can make users more confident, cooperative, and competent acceptance testers.
Robin Goldsmith, Go Pro Management, Inc.
Problems with Vendorscripts: Why You Should Avoid Proprietary Languages
Most test tools come bundled with vendor-specific scripting languages that I call vendorscripts. They are hard to learn, weakly implemented, and most importantly, they discourage collaboration between testers and developers. Testers deserve full-featured, standardized languages for their test development. Here's why.
Bret Pettichord, Pettichord Consulting
Retrospectives: They're Not Just For Developers Anymore
Traditional methods for improving testing include training, hiring, adding new processes, building infrastructure, and buying new tools. But what about increasing the capability of the team? Author Aldous Huxley said, "Experience is not what happens to a man; it is what a man does with what happens to him." The same is true for software teams: It's what we do with our experience that matters. Too often, we don't do much, if anything, to squeeze learning out of our experience.
Esther Derby, Esther Derby Associates, Inc.
Risk Analysis for Web Testing
All Web sites take risks in some areas; your job is to minimize your company's exposure to those risks. Karen Johnson takes you through a step-by-step analysis of a Web site to determine possible exposure points. By reviewing the functionality and other site considerations, such as supported browsers or anticipated loads, risk areas can be accurately determined.
Karen Johnson, Baxter Healthcare Corporation
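A step-by-step risk analysis like the one described above often boils down to rating each site area and ranking the results. The sketch below shows one common scoring scheme (likelihood times impact); the site areas and ratings are hypothetical examples, not taken from the presentation.

```python
# Hypothetical sketch: score each reviewed site area by likelihood * impact
# (both rated 1-5), then rank so test effort goes to the biggest exposure
# first. The area names and ratings below are illustrative only.
areas = {
    "browser compatibility": (4, 3),  # (likelihood, impact)
    "peak-load capacity": (3, 5),
    "checkout workflow": (2, 5),
}

# Sort areas by descending risk score.
ranked = sorted(areas.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
for name, (likelihood, impact) in ranked:
    print(f"{name}: risk score {likelihood * impact}")
```

The product of two coarse ratings is crude, but it gives reviewers a defensible ordering to discuss rather than a gut-feel list.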
Robust Design Method for Software Testing
This session presents a robust design method based on the Taguchi Approach. A new and powerful way to improve reliability and productivity, this method has been applied in diverse areas such as network optimization, audio and video compression, error correction, engine control, safety systems, calibration, and operating system optimization. Learn the basics of the robust design method for software testing, and experience the principles through case studies like Unix system performance tuning.
Madhav Phadke, Phadke Associates
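Orthogonal arrays are the core tool of the Taguchi approach to robust design. As a rough illustration (not drawn from the session itself), the sketch below shows the classic L4 array: for three two-level factors it covers every pair of factor settings in just 4 runs instead of the 8 needed for exhaustive testing.

```python
from itertools import combinations, product

# A minimal sketch of the idea behind Taguchi-style orthogonal arrays:
# the L4 array covers every pairwise combination of three two-level
# factors in only 4 test runs (versus 2**3 = 8 exhaustive runs).
L4 = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]

def covers_all_pairs(array, levels=2):
    """Check that every pair of columns exhibits every level combination."""
    n_cols = len(array[0])
    for i, j in combinations(range(n_cols), 2):
        seen = {(row[i], row[j]) for row in array}
        if seen != set(product(range(levels), repeat=2)):
            return False
    return True

print(covers_all_pairs(L4))  # True: 4 runs give full pairwise coverage
```

The savings grow quickly with more factors, which is why the method scales to areas like configuration and performance-tuning tests.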
Software Documentation Superstitions
Do you need credible evidence that disciplined document reviews (a.k.a. inspections) can keep total project costs down while helping you meet the schedule and improve quality? The project documentation we actually need should meet predetermined quality criteria, but organizations are often superstitious about writing this documentation, and they let their superstitions inhibit their efforts.
Gregory Daich, Software Technology Support Center
Software Inspection: A Failure Story?
Even the most successful inspections can fail if team members aren't vigilant. A large financial institution has agreed to allow their story to be told (anonymously) for the purpose of illustrating how a program that was a classic success could fall into disuse. Specifically, you'll see how the company built up a very successful inspection program, and was achieving significant benefits, until four years later when inspections were no longer being done. How did this happen? Is it unique?
Dorothy Graham, Grove Consultants
A Case Study in Automating Web Performance Testing
Key ideas from this presentation include: define meaningful performance requirements; recognize that changing your site (hardware or software) invalidates all previous predictors; reduce the number of scripts through equivalence classes; and don't underestimate the hardware.
Lee Copeland, Software Quality Engineering
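Reducing scripts through equivalence classes means collapsing the many concrete URLs a site serves into a handful of representative request patterns, each covered by one load-test script. The sketch below is a hypothetical illustration of that grouping step; the URL patterns and rules are invented for the example, not taken from the presentation.

```python
import re

# Hypothetical sketch: collapse logged request URLs into equivalence
# classes so one load-test script can stand in for many concrete
# requests. The regex rules below are illustrative only.
RULES = [
    (re.compile(r"^/product/\d+$"), "/product/<id>"),
    (re.compile(r"^/search\?q=.+$"), "/search?q=<term>"),
]

def classify(url):
    """Map a concrete URL to its equivalence class, if a rule matches."""
    for pattern, cls in RULES:
        if pattern.match(url):
            return cls
    return url  # no rule matched; the URL is its own class

urls = ["/product/17", "/product/9002", "/search?q=disk", "/home"]
classes = sorted({classify(u) for u in urls})
print(classes)  # ['/home', '/product/<id>', '/search?q=<term>']
```

Four logged requests reduce to three classes here; on a real access log the reduction, and therefore the scripting savings, is far larger.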
How to Break Software
Trying to figure out how you can become a more effective tester? That's easy: Become a better bug hunter. This presentation uncovers the root cause of software failure and offers techniques you can use to find bugs. James Whittaker shows you examples of real bugs and diagnoses their cause, while giving you the opportunity to ask questions based on your own determinations. He also describes a set of testing techniques designed to help find bugs in your applications more quickly.
James Whittaker, Florida Institute of Technology