November 2011 QASIG Meeting

Google’s Approach to Test Engineering

Google takes a holistic approach to quality involving developers (SWEs), software engineers in test (SETs), and a lesser-known role called test engineer (TE). This talk outlines the role and responsibilities of the Google Test Engineer and details the tools and techniques they use to ship high-quality software to millions of users on a daily basis. Learn about the process of Attribute, Component, Capability (ACC) analysis and risk assessment used to ship a number of Google products, and find out how you can get access to the tools Google TEs use.

About our speaker:

Dr. Whittaker is currently the Engineering Director over engineering tools and testing for Google’s Seattle and Kirkland offices, where he manages the testing of Chrome, Chrome OS, Google Maps, and other web applications. He holds a PhD in computer science from the University of Tennessee and is the author or coauthor of four acclaimed textbooks: How to Break Software, How to Break Software Security (with Hugh Thompson), and How to Break Web Software (with Mike Andrews). His latest is Exploratory Software Testing: Tips, Tricks, Tours and Techniques to Guide Test Design, and he has authored over fifty peer-reviewed papers on software development and computer security. He holds patents on various inventions in software testing and defensive security applications and attracted millions in funding, sponsorship, and license agreements while a professor at Florida Tech. He has also served as a testing and security consultant for dozens of companies and spent three years as an architect at Microsoft. He is currently writing How Google Tests Software.

Meeting Slide Deck: TestEngineeringatGoogle

September 2011 QASIG Meeting

Sabotaging Quality

Presented by: Joy Shafer, Consulting Test Lead, Quardev, Inc.

From the very beginning of the project there have been discussions about quality. The team has unanimously agreed it’s important, a top priority. Everyone wants to deliver a high-quality product of which they can be proud. Six months after the project kick-off, you find yourself neck-deep in bugs and fifty percent behind schedule. The project team decides to defer half of the bugs to a future release. What happened to making quality a priority?

One of your partners is discontinuing support for a product version on which your online service is dependent. You have known this was coming for years; in fact, you are actually four releases behind your partner’s current version. For some reason an update project has been put off repeatedly, until now, the last possible moment. Now the team is scrambling to get the needed changes done before your service is brought down by the drop in support. You are implementing the minimum number of features required to support the newer version. In fact, you’re not even moving to the most current version; it was deemed too difficult and time-consuming to tackle at this point. You are still going to be a release behind. Are you ever going to catch up? Is minimal implementation always going to be the norm? Where is your focus on quality?

Do these scenarios sound familiar? Why is it sometimes so difficult to efficiently deliver a high-quality product? What circumstances sabotage our best intentions for quality? And, more importantly, how can we deliver quality products in spite of these saboteurs?

One of the most common and insidious culprits is the habit of sacrificing long-term goals for short-term goals. This can lead to myriad long-standing issues on a project. It is also one of the most difficult problems to eradicate. Other saboteurs can take the form of competing priorities, resource deprivation, dysfunctional team dynamics, and misplaced reward systems.

Joy will show you how to recognize these saboteurs and assess the damage they are causing. She will discuss practical strategies for eliminating these troublesome quality-squashers, or at least mitigating their effects.

Joy Shafer is currently a Consulting Test Lead at Quardev Laboratories on assignment at Washington Dental Services. She has been a software test professional for almost twenty years and has managed testing and testers at diverse companies, including Microsoft, NetManage and STLabs. She has also consulted and provided training in the area of software testing methodology for many years. Joy is an active participant in community QA groups. She holds an MBA in International Business from Stern Graduate School of Business (NYU). For fun she participates in King County Search and Rescue efforts and writes Fantasy/Sci-fi.

Meeting Slide Deck: Sabotaging Quality Slide Deck

July 2011 QASIG Meeting

6 Considerations for Mobile Device Testing

Presented by: Jacob Stevens, Senior Test Lead at Quardev, Inc. and Matt Pina, Manager of IT Applications Quality Assurance at Alaska Airlines

New to mobile? Unsure how the unique dynamics and constraints of this lucrative platform might shape your test design? This primer to testing on mobile devices introduces the aspects of any mobile test scenario that may warrant consideration for your testing. We’ll look at the 6 key considerations to help you ensure thoughtful design and proper coverage, from carrier issues to the myriad hardware configurations.

About our speakers:

Jacob Stevens is a Senior Test Lead at Quardev, Inc. and a ten-year QA veteran with over 40 industry-leading clients. The scope and scale of the projects, and the types of platforms, technologies, and methodologies he’s worked with, have varied widely. Jacob studied under Jon Bach to adopt a context-driven approach to test design. One of Jacob’s favorite topics in QA is epistemology: How do we know that our test results are accurate? How do we ensure inherent biases in our test execution methodologies do not manifest in false positives or false negatives? Jacob enjoys talking technology and many other subjects on Twitter @jacobstevens. Jacob is also a little uncomfortable writing about himself in the third person.

Matt Pina is the Manager of IT Applications Quality Assurance at Alaska Airlines, Inc. He is an IT professional with over 25 years in the industry. His passion is to create quality models that adapt to multiple platforms. He currently leads the Alaska QA Center of Excellence with project work supporting all platforms, but primarily focusing on mobile and web applications. Matt is a graduate of the Central Washington University School of Business, holding a B.S. in Business Administration/Finance. Additionally, he holds several computing certifications from various local universities.

May 2011 QASIG Meeting

Testing in Production: Your Key to Engaging Customers

Presented by: Seth Eliot, Senior Test Manager at Microsoft, Bing

Feature lists do not drive customer attachment; meeting key needs does. Learn how Testing in Production (TiP) works synergistically with customer scenarios to put real product in front of real users in their own environment to get direct, actionable feedback from users and uncover features that meet their key needs. TiP also allows us to quantify the impact of these features, which is crucial since evidence shows that more than half of the ideas that we think will improve the user experience actually fail to do so, and some actually make it worse.

Learn about tools and techniques like Online Experimentation, Data Mining, and Exposure Control that you can use to Test in Production (TiP) to get direct, actionable feedback from real users and learn what works and what doesn’t. Of course, production can be a dangerous place to test, so these techniques must also limit any potential negative impact on users. Hear several examples from within Microsoft, as well as others, showing how Testing in Production with real users will enable you to realize better software quality.

About our speaker: Seth Eliot is a Senior Test Manager at Microsoft, where his team solves exabyte-scale storage and data processing challenges for Bing. Previously he was Test Manager for the Microsoft Experimentation Platform, which enables developers to innovate by testing new ideas quickly with real users “in production.” Testing in Production (TiP), software processes, cloud computing, and other topics are ruminated upon on Seth’s blog. Prior to Microsoft, Seth delivered high-quality software services, leading the Digital QA team to release Amazon MP3 download, Amazon Video on Demand Streaming, and support systems for Kindle.

Presentation Slide Deck: Testing in Production

March 2011 QASIG Meeting

To Test or Not To Test

Presented by: Adam Yuret, Sr. Test Engineer, VolunteerMatch

Ostensibly, the goal of testing is to provide test coverage of a software product to uncover and document bugs. What if a stakeholder doesn’t want you to report bugs? What if they want you to test less? How would you implement a new testing process for a team transitioning to a new methodology without over-reaching your mandate as a tester? Now, add to that the additional challenge of never having met the team in person.

Let’s discuss scenarios where the tester is explicitly asked to ignore most bugs, not because the product is so polished that the only probable defects are minor, but because the opposite is true: there are so many problems that to document them all and act on them would have a crippling effect on the project. What would you do in this scenario? Come join Adam Yuret and hear how he’s handled this type of dilemma in current and past projects. Likewise, share your contexts and ways in which you may have faced these challenges.

About our speaker: After 8 years at WebTrends testing an enterprise-level SaaS data warehousing product, which included building and maintaining a large-scale testing environment, Adam currently works as an “army of one” tester for VolunteerMatch. VolunteerMatch is a national nonprofit organization dedicated to strengthening communities by making it easier for good people and good causes to connect. Adam is a relative newcomer to the context-driven community and is currently working to build a testing process for a project that is transitioning to an agile/Scrum methodology.

Presentation Slide Deck: To Test or Not To Test Slide Deck

January 2011 QASIG Meeting

Teaching the New Software Testers – an Experiment

The many facets of what it meant to introduce a software testing course into UW Bothell – professor and students give an experience report and look to the future

Presented by David Socha, Assistant Professor, University of Washington, Bothell

As a new professor at the University of Washington Bothell, I decided to put together a new course on software testing: a course that had never been taught here, and that I had never taught. Nor had I ever explicitly acted in the role of software tester during my 19 years working in the software industry.

Why do such a crazy thing? How did the course turn out? What were the good and the bad, the memorable and the inspiring from this course? How did it change the students, and the instructor? What did it “mean” for this course to be taught at the University of Washington Bothell? What might this “mean” for the software testing community in this area?

If you are interested in any of these questions, or might want to be involved in the co-design and execution of such a course next year, come hear David Socha and some of his students give an experience report on this autumn quarter 2010 course at the University of Washington Bothell.

About our speaker: As of September 2010, David Socha is an Assistant Professor at the University of Washington Bothell in the Computing and Software Systems program. He grew up on a farm in Wisconsin, and in the process became eco-literate (i.e., a systems thinker). This deepened as an undergraduate at the University of Wisconsin, Madison, where he made his own luck, spent a year in the Galapagos Islands as a research assistant studying the Galapagos land iguanas, and then earned a BS in Zoology. After earning a PhD in Computer Science and Engineering at the University of Washington, Seattle, David spent 19 years working in a variety of software development organizations, moving from being an individual contributor, to managing groups of software developers, to being an agile coach. Recognizing the Teacher in him, he now is a tenure-track professor at UW Bothell working to make a difference. His areas of interest focus on the human side of software development, innovation, design, biomimicry… and perhaps even software testing.

Presentation Slide Deck: Teaching the New Software Testers

November 2010 QASIG Meeting

My Crazy Plan to Respond to Change

Presented by Jon Bach, Manager for Corporate Intellect, Quardev, Inc.

During a given workday, we either focus on testing activities or we have to focus on the things that interrupt testing activities. Some of us lament the interruptions; others find them energizing because they’re consistent with notions of Agile, like responding to change over following a plan. I’ve got an idea on how to use interruptions to focus on testing activities. Multi-tasking is not evil, nor does it have to torpedo your project, but it may require a management technique to help you stay sane. Enter Thread-Based Test Management, a way to let interruptions guide you to where you’re needed most. Using a kanban-style dashboard is one way to go with the flow without losing your way. Using a spreadsheet with rows to track topics you followed in a day is another. In this talk, Jon Bach describes his experiences with a new technique he and his brother James invented that might help you get rid of the shame of feeling undisciplined for letting yourself be interrupted while being more response-able to the important things that need your time and attention.

About our speaker: Jon has over 15 years of experience, including Fortune 500 and start-up companies, serving in the Quality Assistance and Software Testing domain. He is a co-author of a Microsoft Patterns and Practices book on Acceptance Testing (2010) and is an award-winning keynote speaker for major testing conferences (STAR, QAI, BCS SIGIST). He has served as the vice president of conferences for the Association for Software Testing; invented a method for managing and measuring exploratory testing, a method for coaching testers (Open-Book Testing), and a method for categorizing risk (Color-Aided Test Design); and has published over 50 articles, whitepapers, and presentations on the value of exploration and rapid, risk-based testing. He has been with Quardev for the past 6 years and serves as the speaker chairman for the acclaimed Seattle non-profit QASIG testing forum.

September 2010 QASIG Meeting

Beyond Six Sigma: Agile/Lean/Scrum/XP/Kanban for physical manufacturing – building a 100 MPG road car in 3 months

Presented by Joe Justice, a Seattle area lean software consultant and entrepreneur.

Our September speaker, Joe Justice, presented an experience report about taking on the Progressive Insurance Automotive X Prize using Lean software techniques. The report covered how Joe and his team did it; pictures and information about the car; the methods, methodology, and secrets; questions and next steps; and the future of this approach.

About our speaker: Joe Justice, a Seattle area lean software consultant and entrepreneur, ported lean software practices back to manufacturing in order to compete in the Progressive Insurance Automotive X Prize, a $10,000,000 race for 100 MPG cars built to road-legal requirements. Joe will walk through the take-aways, wins, and lessons learned from building an ultra-fuel-efficient hypercar using TDD, Scrum, and pair development.

July 2010 QASIG Meeting

Lightning Talks

Presented by 8-10 of your fellow QASIG participants

The format was the “lightning talk” where each of the speakers knew that they only had five minutes to present their idea (one at a time). Each speaker had a severely limited time-frame, but could speak about anything related to testing in that time – a new idea for status reporting or measurement, a lesson they learned on a project, an experience report about a new technique they tried, or a general test principle they want to share.

Lightning Talk Speakers, Topics, and Slides (where submitted):

Jim Hancock, Director of Quality, IT and Customer Delivery at Geospiza – “Creating a Quality Index”: Most software development has become “agile,” but CMM, CobiT, PMBOK, Malcolm Baldrige, and other quality process metric deployments are still large and complicated. Jim will discuss how to deploy software quality metrics that meet the following criteria: objective, non-obtrusive, flexible, repeatable, consistent, holistic, efficient, and inexpensive. Most importantly, the Quality Index will provide an accurate assessment of risk and quality within any SDLC, upon which logical deployment decisions can be made.
Jeremy Lightsmith, Seasoned Agile Coach, Trainer and Facilitator – “What’s Your Conflict Management Style?”: There is a model of the way people respond to conflict that looks like the attached picture. I thought it would be cool to introduce it and talk about why it’s helpful and how to use it. There’s also a related questionnaire that I can leave for everyone to see how they score.
Jeff Brewster, Grameen Foundation – “Untethering a Test Lab”: Do you have a performance test lab that can only be used onsite or has expensive software? Been forced to share lab resources due to cost or space concerns? We’ll share the steps the Mifos testing team used to move their performance test lab to the cloud.
Tim Tapping, SDET, Expedia – “How to Sprint as Agile/Lean QA on the Feature Team”: QA faces challenges if we are used to building out test plans from extensive documentation, which is not present in agile/lean development environments. We need to engage early in the iteration and help the team understand that ‘quality deliverables’ are a team activity. Drawing on my recent experience at an acquired startup that was an agile shop, I can share techniques that worked for us and pitfalls to avoid.
Jon Bach, Manager for Corporate Intellect, Quardev, Inc. – “Tested”: I’m working on a talk for Ignite (Seattle) that has a likelihood of being chosen. How much of a likelihood depends on how well I can define this talk about our industry to the technical masses. I could use some feedback.
Lanette Creamer, Test Lead, Adobe Systems – “Agile Testing Ninjas”: If Adam Goucher learned everything he knows about testing from Pirates, why does Lanette insist that Ninjas hold the key? What is it about stealth, speed, discipline, and skill that translates well to Agile testing?
Jim Benson, CEO, Modus Cooperandi – “Kaizen or Death”: If we don’t improve, we stagnate. If we stagnate, we perish. Life requires change, renewal, and innovation. Evolution is innovation. In life, in business, in everything, we must continuously improve. In 5 short minutes, Jim Benson will change your life through the Japanese concept of Kaizen (continuous improvement).
Gary Knopp, SDET, Attachmate – “Chow: Expand Your Idea of Dogfood”: This talk will be a quick case study of how we were able to break out of what seemed like an obvious test strategy (using or extending our existing test system) and move to a lighter-weight, component-based test approach. This new approach includes testing the component in isolation, the usage of the component in isolation, and then the system as a whole. Ultimately we were successful in using this strategy and even completed the testing two weeks early.
Michael Wolf, Perl trainer, evangelist, programmer, community member, town crier… – “TAP – Test Anything Protocol”

May 2010 QASIG Meeting

What is QA?

Presented by Jim Benson, CEO of Modus Cooperandi

When we seek Quality, we are looking for something more than “Does it break if I do this?” We are looking for improvement, for ascendancy, for something better.

Scripts, exploration, observation.

We look for patterns, we look for exceptions, we look for jarring transitions.

In Personal Kanban and Lean, we look for patterns, exceptions and transitions in the creation of value and the flow of work. Think of it as QA for your workload and your processes. QA for your QA.

On the 12th of May, we discussed how Lean techniques like Personal and Organizational Kanban can be applied to the highly variable workload of a tester, and how understanding the variation in work, and making judgments based on that understanding, helps testers, managers, and clients schedule testing and gauge the impact of unreasonable demands on QA.

We were, at last, able to define “unreasonable” and provide meaningful service level agreements.

About our speaker: Jim Benson is a collaborative management consultant. He is CEO of Modus Cooperandi, a consultancy which combines Lean, Agile Management, and Social Media principles to develop sustainable teams. Clients include the United Nations, World Bank, Microsoft, and NBC Universal. Jim created the Personal Kanban technique, which applies lean thinking to individual and small-team work. He blogs at Evolving Web and is @ourfounder on Twitter.

Quality Assurance Special Interest Group