ALM & SCM Tools

Conference Presentations

STAREAST Dig In: Get Familiar with the Code to Be a Better Tester
Slideshow

Maybe you’ve been testing the same application for a while and your rate of finding new bugs has slowed. Or maybe you’re looking for more ways to understand what your developers are doing day to day. You have the tools at your disposal; you just need to dig in! Hilary Weaver-Robb will share tools and techniques you can use to take your testing to the next level. See everything the developers are changing, and learn how to find the most vulnerable parts of the code. These strategies can help you focus your testing and track down those pesky bugs! Take away a better understanding of tools that conduct static analysis on the code, and use their results to find potential bugs. You'll discover ways to use commit logs to figure out what’s being changed and understand why it’s helpful to dig into the code of the application under test.
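One way to mine commit logs, as the abstract suggests, is "hotspot" analysis: files that change most often are frequently the riskiest places to focus testing. A minimal sketch in Python, which parses the kind of file list produced by `git log --format= --name-only` (the sample paths are hypothetical):

```python
# Hotspot analysis sketch: count how many commits touched each file.
# Feed it the output of `git log --format= --name-only`, which emits one
# changed file path per non-empty line. Sample data below is made up.
from collections import Counter

def parse_churn(log_text: str) -> Counter:
    """Count commit touches per file from git log --name-only output."""
    return Counter(line.strip() for line in log_text.splitlines() if line.strip())

sample = """\
src/app.py
src/app.py
tests/test_app.py
src/util.py
src/app.py
"""

# The most frequently changed files are candidates for focused testing.
hot = parse_churn(sample).most_common(2)
```

Sorting by churn (and optionally weighting by file size or complexity) gives a quick, data-driven shortlist of where bugs are most likely to hide.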

Hilary Weaver-Robb
STAREAST Excuse-Free Testing: An Open Source Tool for Simpler CI Integration
Slideshow

The goal of continuous testing is to find defects earlier in the development lifecycle and release software to market faster. This can be achieved by integrating open source functional and performance testing tools in the early stages of your software delivery lifecycle. Klaus Neuhold will explain how to integrate the open source test automation framework Taurus, along with tools such as JMeter and Selenium, as a CI step in Jenkins pipelines, so that these tools can be triggered as part of everyday code commits or builds. Taurus can run a large variety of tests and has reporting features that help agile teams dodge defects and nasty surprises before the software is released. It's a sophisticated yet easy-to-use framework that requires no programming skills; test scenarios can be described in a natural-language DSL and triggered from any command-line tool.
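As a sketch of the natural-language DSL the abstract describes, a Taurus scenario is a small YAML file (the URL, timings, and scenario name here are hypothetical placeholders, not from the talk):

```yaml
# smoke.yml -- a minimal Taurus config; run with: bzt smoke.yml
execution:
- executor: jmeter        # Taurus generates the JMeter test plan for you
  concurrency: 10         # 10 virtual users
  ramp-up: 1m
  hold-for: 2m
  scenario: smoke

scenarios:
  smoke:
    requests:
    - url: http://localhost:8080/health
```

A Jenkins pipeline stage could then run `bzt smoke.yml` as an ordinary shell step, making the performance check part of every build.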

Klaus Neuhold
STAREAST The Dell EMC Journey in the Age of Smart Assistants
Slideshow

Dell EMC is driving to optimize and reimagine its testing practices with data-driven smart assistants, powered by analytics and machine learning. At a macro level, Geoff Meyer will highlight the opportunities across the product engineering and testing landscapes that are ripe for the application of analytics and AI. Key ingredients in moving toward solutions that matter are the identification of organization-specific pain points, their prioritization, and the availability and cleanliness of essential data. Geoff will share how his team approached these new opportunities through experimentation, staffing, and implementation, and then delve into the smart assistants they’ve created to automate deep-thinking, cognitive testing tasks.

Geoff Meyer
STAREAST Where Does Data Come From?
Slideshow

With all the tools available on the market, it can be overwhelming to determine which ones might meet your needs and which will work best in your environment to create a high-performing team. Join Jennifer Bonine as she explains the relationship between the DevOps cycle and your environment, and how a hub-and-spoke model can link all your different data sets and tools together. Jennifer will identify opportunities for applying test data analytics across the engineering and test landscape, ranging from high-value test cases to dynamically generated regression test suites. She will review ways to collaborate, show results in a way that clearly demonstrates progress, and present a visual dashboard to leadership and stakeholders in the organization.

Jennifer Bonine
STAREAST The AI Testing Singularity
Slideshow

Most basic software testing will soon be done by a few individual, large systems. But today, software testing is a fragmented world of test creators, test automators, vendors, contractors, employees, and even “pizza Fridays” where developers roll up their sleeves and test the build themselves. When teams start testing their apps, they dream up the same positive, negative, and edge test cases as every other team before them. Most software testing is either manually tapping an application or manually creating and maintaining detailed automation scripts—from scratch! AI will soon change all that. AI isn’t just better, faster, cheaper automation; applying AI to testing brings two major disruptive changes to the field: reuse and scaling. Join Jason Arbon for a glimpse into this future and discover how you can leverage AI to focus on the more important aspects of testing.

Jason Arbon
Agile DevOps East Service Virtualization: How to Test More by Testing Less
Slideshow

Agile teams tend to struggle to keep development and testing in sync. Many teams run mini-waterfalls, where testers get working code only a few days before the end of the sprint, and tools usually can't help. But service virtualization is one of those rare tools that can make a huge impact and accelerate software delivery by limiting the dependencies needed for testing. Join Paul Merrill for an introductory demonstration of service virtualization with a freely available, open source tool. Learn the five modes of service virtualization: capture, simulate, spy, synthesize, and modify. Return to your workplace with one more tool in your tool chest: Paul will walk through a common scenario for service virtualization and show how you can test more, faster, by testing less!
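To make the "simulate" mode concrete, here is a minimal sketch using only the Python standard library: a stub server stands in for a real downstream service so tests can run without the dependency. The endpoint, payload, and service are hypothetical; real service virtualization tools add the other four modes (capture, spy, synthesize, modify) on top of this idea.

```python
# Sketch of service virtualization's "simulate" mode: a canned-response
# stub replaces a real payment service during testing. All names and
# payloads are hypothetical examples.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED = {"/payments/42": {"status": "settled", "amount": 19.99}}

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = CANNED.get(self.path)
        if body is None:
            self.send_response(404)
            self.end_headers()
            return
        payload = json.dumps(body).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # keep test output quiet

def start_stub(port=0):
    """Start the stub on an ephemeral port; returns the running server."""
    server = HTTPServer(("127.0.0.1", port), StubHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

With the stub running, the application under test is pointed at `http://127.0.0.1:<port>` instead of the real dependency, so testing can start before that dependency is built or available.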

Paul Merrill
Agile DevOps East Lessons Learned Implementing DevOps: A Discussion
Slideshow

DevOps is fundamentally about collaboration, communication, and effective teamwork across the entire software supply chain. But in practice, DevOps is much more than that. Tools and technology are used to speed up delivery, but organizational change often must be facilitated for DevOps to take root. Join Lee Eason as he facilitates a peer-to-peer session to help DevOps practitioners share their lessons learned while implementing DevOps. Come to this session with your DevOps challenges and get help from peers who have dealt with similar issues, and bring solutions as well, so you can help others improve. After this dynamic, engaging, and collaborative session, you'll leave with new ideas for how to best implement DevOps, along with the satisfaction of knowing you helped others in the process.

Lee Eason
STARCANADA Strategies for Selecting the Right Open Source Framework for Cross-Browser Testing
Slideshow

Organizations today are required to test their web application across browsers and mobile devices. Choosing the right framework is a matter of organizational as well as technical fit. With a plethora of test frameworks that span across practices such as behavior-driven development, unit...

Eran Kinsbruner
STARWEST 2018 Managing BDD Automation Test Cases inside Test Management Systems
Slideshow

Behavior-driven development (BDD) has been around for a while and is here to stay. However, the added abstraction levels pose a technical problem for writing and managing tests. While BDD does a great job of marrying the nontechnical aspect of test writing to the technical flow of an application under test, keeping this information under source control becomes problematic. Frameworks such as JBehave, Cucumber, or Robot give subject matter experts the ability to write tests, but those experts are often locked out of them: because teams treat test cases as code, the tests get stored in source control repositories. Additionally, these given-when-then steps can soon grow to the point where they are difficult to manage without an IDE, and nontechnical people lose interest. Max Saperstone shows how to manage these nontechnical steps in test management tools while keeping them in sync with the automation stored in tools such as Git.
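For readers unfamiliar with the given-when-then steps the abstract refers to, this is what they look like in Cucumber's Gherkin syntax (the scenario itself is a made-up illustration, not from the talk):

```gherkin
Feature: Checkout
  Scenario: Successful purchase with a saved card
    Given a shopper with an item in their cart
    When they pay with their saved credit card
    Then the order is confirmed
```

The plain-language steps are what subject matter experts write; each step is bound to automation code behind the scenes, which is why keeping the two in sync across a test management system and a Git repository becomes the management problem the session addresses.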

Max Saperstone
STARWEST 2018 Risk-Based Testing: Communicating WHY You Can't Test Everything
Slideshow

The idea of testing everything is a popular one; in fact, many stakeholders think that’s exactly what their quality teams do. It usually isn’t, and can’t be, so how can teams communicate this? Join Jenny Bramble as she helps pave the way using the language of risk-based testing. By defining risk in two simple parts, the team and project gain a tangible, usable metric. She shares how to apply this metric to determine where the team should focus testing, making it more effective and efficient, and how to communicate that effort through a risk matrix. As a result, risk becomes the right language for the team to communicate clearly and concisely with everyone involved in the project, using agreed-upon words and definitions. Take away a set of tools that can facilitate both better testing and better communication through precise use of language and risk matrices.
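A common way to define risk "in two simple parts" is as the product of likelihood and impact; the sketch below assumes that interpretation (the feature names and 1-3 ratings are hypothetical examples, not Jenny Bramble's actual scheme):

```python
# Risk as two parts: likelihood of failure and impact of failure,
# each rated 1 (low) to 3 (high). All ratings below are invented examples.
def risk_score(likelihood: int, impact: int) -> int:
    """Combine the two parts into a single comparable metric."""
    return likelihood * impact

features = {
    "checkout":  (3, 3),  # changes often; a failure loses revenue
    "search":    (2, 2),
    "help page": (1, 1),
}

# Rank features so testing effort goes to the highest-risk areas first.
ranked = sorted(features, key=lambda f: risk_score(*features[f]), reverse=True)
```

Plotting the same scores on a likelihood-vs-impact grid gives the risk matrix: a one-glance artifact for explaining to stakeholders why the team tests the top-right cells heavily and the bottom-left cells lightly.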

Jenny Bramble


CMCrossroads is a TechWell community.

Through conferences, training, consulting, and online resources, TechWell helps you develop and deliver great software every day.