Conference Presentations

STAREAST 2006: Apprenticeships: A Forgotten Concept in Testing

The system of apprenticeship was first developed in the late Middle Ages. The uneducated and inexperienced were employed by a master craftsman in exchange for formal training in a particular craft. So why does apprenticeship seldom happen within software testing? Do we subconsciously believe that just about anyone can test software? Join Lloyd Roden and discover what apprenticeship training is and, even more importantly, what it is not. Learn how this practice can be easily adapted to suit software testing. Find out about the advantages and disadvantages of several apprenticeship models: Chief Tester, Hierarchical, Buddy, and Coterie. With personal experiences to share, Lloyd shows how projects will benefit immediately from the rebirth of the apprenticeship system in your test team.

  • Four apprenticeship models that can apply to software testers
  • Measures of the benefits and return on investment of apprenticeships
Lloyd Roden, Grove Consultants
Using Production Failures to Jump Start Performance Test Plans

Learning from a production system failure is not a model MassMutual Financial Group would have chosen. However, when one of their key applications failed under load in production, they turned on a dime and changed their performance testing approach, focus, and capabilities. Let’s set the scene: They ran large numbers of transactions through a performance test tool and then went live with a new application that was to be used by all their key users. Within hours, the application had ground to a virtual halt under normal production load. What went wrong? Join Sandra Bourgeois to find out not only what went wrong but also what they learned from failure and how they set about improving their knowledge, skills, and tools. This is your chance to learn from their mistakes and avoid repeating them in your organization.

  • Lessons learned from the performance failure of a mission-critical application
Sandra Bourgeois, Massachusetts Mutual Life Insurance Company
Hallmarks of a Great Tester

As a manager, you want to select and develop people with the talents to become great testers, the ability to learn the skills of great testers, and the willingness to work hard in order to become great testers. As an individual, you aspire to become a great tester. So, what does it take? Michael Hunter reveals his twenty hallmarks of a great tester, from personality traits (curiosity, courage, and honesty) to skills (knowing where to find more bugs, writing precise bug reports, and setting appropriate test scope). Measure yourself and your team against other great testers, and find out how to achieve greatness in each area. Learn how to identify the great testers you don’t know that you already know!

  • The personality traits a person needs to become a great tester
  • The talents a person needs to become a great tester
  • The skills you need to develop to become a great tester
Michael Hunter, Microsoft Corporation
Trends, Innovations and Blind Alleys in Performance Testing

Join experts Scott Barber and Ross Collard for a lively discussion/debate on leading edge performance testing tools and methods. Do you agree with Scott that performance testing is poised for a great leap forward or with Ross who believes that these "silver bullets" will not make much difference in resolving the difficulties performance testing poses? Scott and Ross will square off on topics including commercial vs. open source tools; compatibility and integration of test and live environments; design for performance testability; early performance testing during design; test case reuse; test load design; statistical methods; knowledge and skills of performance testers; predicting operational behavior and scalability limits; and much more. Deepen your understanding of the new technology in performance testing, the promises, and the limitations.

  • The latest tools and methods for performance testing
Scott Barber, PerfTestPlus, and Ross Collard, Collard & Company
Diagnosing Performance Problems in Web Server Applications

Many application performance failures are episodic, leading to frustrated users calling help desks, frantic troubleshooting of production systems, and rebooting of systems. Often these failures are a result of subtle interactions between code and the configuration of multiple servers. On the other hand, well-designed applications should demonstrate gradual performance degradation and advance warning of the need to add hardware capacity. Join Ron Bodkin as he discusses the patterns of application failure, some common examples, and testing techniques to help reduce the likelihood of episodic failures in production. Learn about the tools and techniques needed to instrument the application, monitor the infrastructure, collect systems data, analyze it, and offer insight for corrective actions.

Ron Bodkin, Glassbox Software
Performance Testing Early in Development Iterations

When the software architecture is emerging and many features are not yet ready, performance testing is a challenge. However, waiting until the software is almost finished is too risky. What to do? Neill McCarthy explores how performance testing can be made more Agile and run starting in the early iterations of development. Learn how to implement early performance automation using appropriate tools in build tests and the requirements for early performance testing of user stories. Neill presents lessons learned at the "coal face" of performance testing on Agile projects and shares ideas on how you can add more agility to your performance testing.

Neill McCarthy, BJSS
STARWEST 2005: Testing Dialogues - Technical Issues

Is there an important technical test issue bothering you? Or, as a test engineer, are you looking for some career advice? If so, join experienced facilitators Esther Derby and Elisabeth Hendrickson for "Testing Dialogues - Technical Issues." Practice the power of group problem solving and develop novel approaches to solving your big problem. This double-track session takes on technical issues such as automation challenges, model-based testing, testing immature technologies, open source test tools, testing web services, and career development. You name it! Share your expertise and experiences, learn from the challenges and successes of others, and generate new topics in real time. Discussions are structured in a framework so that participants receive a summary of their work product after the conference.

Esther Derby, Esther Derby Associates, Inc.
It's 2005, Why Does Software Still Stink?

We've now been writing software for an entire human generation. Yet software is arguably the least reliable product ever produced. People expect software to fail, and our industry has developed a well-deserved and widely accepted reputation for its inability to deliver quality products. James Whittaker explores the history of software development over the last generation to find out why. He uncovers several attempts to solve the problem and exposes their fatal flaws. James then looks forward to a world without software bugs and offers a roadmap (practical techniques that can be implemented today) for how to get there from here. Join James on this journey through the past and into the future, and be sure to bring something to scrape the bugs off your windshield.

James Whittaker, Florida Institute of Technology
Agile Software Development: The Home of 31 Flavors

You've heard of eXtreme Programming (XP) and perhaps Scrum. How about Crystal Clear, Adaptive Software Development, Dynamic Systems Development Method, Rational Unified Process for Agile Development, and Feature Driven Development? These are some of the many variations of Agile development methods. Join Jeff McKenna as he explores the many flavors of Agile development methods and explains their similarities and differences. Find out what aspects of Agile development can help your organization’s development team in its particular environment. If you are considering Agile development and need to decide in which direction to go, this session is for you. Although a one-hour session cannot provide all the information you will need, you can explore what is common (the philosophy, the values, the characteristics) and what is different (the methods, the coverage, the costs) about the various Agile approaches.

Jeff McKenna, Agile Action
Rapid Bottleneck Identification for Successful Load Testing

Rapid bottleneck identification is a methodology that allows QA professionals to very quickly uncover Web application limitations and determine what impact those limitations have on the end-user experience. Starting with the premise that every application has a scalability limit, this approach sets out to quickly uncover where those limitations are and to suggest corrective action.
Learn details about the most common application scalability limits (spanning network, application server, database server, and Web server) and how to quickly uncover them by focusing first on throughput and then on concurrency. With a modular, iterative approach to load testing, you focus on isolating and resolving each bottleneck in turn.
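The throughput-then-concurrency idea described above can be sketched in a few lines of Python. This is an illustrative assumption, not Empirix's actual methodology: the `sample_request` stand-in, the concurrency step levels, and the 1.2x gain threshold are all hypothetical, and a real load test would drive an actual HTTP endpoint rather than a simulated delay.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def sample_request():
    # Stand-in for a real HTTP call; simulates a fixed 10 ms service time.
    time.sleep(0.01)

def measure_throughput(workload, concurrency, requests_per_worker=20):
    """Run `workload` across `concurrency` workers; return requests/sec."""
    total = concurrency * requests_per_worker
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        for _ in range(total):
            pool.submit(workload)
        # Exiting the `with` block waits for all submitted requests.
    elapsed = time.perf_counter() - start
    return total / elapsed

def find_knee(workload, levels=(1, 2, 4, 8, 16), gain_threshold=1.2):
    """Step concurrency upward and report the first level where throughput
    stops improving by at least `gain_threshold`x -- a crude scalability knee,
    i.e. the point at which to start hunting for the bottleneck."""
    prev = None
    for level in levels:
        tput = measure_throughput(workload, level)
        if prev is not None and tput < prev * gain_threshold:
            return level, tput
        prev = tput
    return levels[-1], prev
```

In this sketch, each iteration of the outer loop is one modular test run; once the knee is found, you would fix the limiting tier (network, app server, database, or Web server) and repeat.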

Joe Fernandes, Empirix
