All posts by Shelly

January 2012 QASIG Meeting

Pairing Developers with Non-Developers

Presented by: Lanette Creamer, Independent Software Testing Consultant and Coach

Pairing in the world of software development traditionally brings up an image of two developers working together in person, creating code at the same time. The practice is common on agile teams and teams doing extreme programming. Many teams already pair testers who can code with the coders working on product development. The vocabulary gap is much smaller when everyone speaks the same language, code, but what happens when the business needs aren’t understood or well communicated? For the last three years, Lanette has been working with software testers, experimenting on the fringes of pairing. Come learn ways to pair programmers with non-programmers at specific strategic times to improve collaboration and efficiency. Learn how handoffs, bug demos, and pseudocode can offer new types of pairing, along with tips for making them practical rather than mandated.

About our speaker:

Lanette Creamer is an independent software testing consultant and coach from Seattle, WA. She is known in the blogging community as TestyRedhead, and her blog can be found at http://blog.testyredhead.com/. Her presentations are known for being candid, including cat photos, and focusing on the human side of software testing. Recently she has focused on Agile Testing and Pairing with Developers; in the past her papers and presentations have covered combining automated checks with exploratory charters and group collaborative testing techniques.

Pairing Presentation Slide Deck: PairingWDevelopers_QASIGJan11

November 2011 QASIG Meeting

Google’s Approach to Test Engineering

Google takes a holistic approach to quality involving developers (SWEs), developers in test (SETs), and a less-talked-about role called test engineering (TE). This talk outlines the role and responsibility of the Google Test Engineer and details the tools and techniques they use to ship high-quality software to millions of users on a daily basis. Learn about the process of Attribute, Component, Capability (ACC) analysis and risk assessment used to ship a number of Google products, and find out how you can get access to the tools Google TEs use.

About our speaker:

Dr. Whittaker is currently the Engineering Director over engineering tools and testing for Google’s Seattle and Kirkland offices, where he manages the testing of Chrome, Chrome OS, Google Maps, and other web applications. He holds a PhD in computer science from the University of Tennessee and is the author or coauthor of four acclaimed textbooks, including How to Break Software, How to Break Software Security (with Hugh Thompson), and How to Break Web Software (with Mike Andrews). His latest is Exploratory Software Testing: Tips, Tricks, Tours and Techniques to Guide Test Design, and he has authored over fifty peer-reviewed papers on software development and computer security. He holds patents on various inventions in software testing and defensive security applications and attracted millions in funding, sponsorship, and license agreements while a professor at Florida Tech. He has also served as a testing and security consultant for dozens of companies and spent three years as an architect at Microsoft. He is currently writing How Google Tests Software.

Meeting Slide Deck: TestEngineeringatGoogle

September 2011 QASIG Meeting

Sabotaging Quality

Presented by: Joy Shafer, Consulting Test Lead, Quardev, Inc.

From the very beginning of the project there have been discussions about quality. The team has unanimously agreed it’s important, a top priority. Everyone wants to deliver a high-quality product of which they can be proud. Six months after the project kick-off, you find yourself neck-deep in bugs and fifty percent behind schedule. The project team decides to defer half of the bugs to a future release. What happened to making quality a priority?

One of your partners is discontinuing support for a product version on which your online service is dependent. You have known this was coming for years; in fact, you are actually four releases behind your partner’s current version. For some reason an update project has been put off repeatedly, until now: the last possible moment. Now the team is scrambling to get the needed changes done before your service is brought down by the drop in support. You are implementing the minimum number of features required to support the newer version. In fact, you’re not even moving to the most current version; it was deemed too difficult and time-consuming to tackle at this point. You are still going to be a release behind. Are you ever going to catch up? Is minimal implementation always going to be the norm? Where is your focus on quality?

Do these scenarios sound familiar? Why is it sometimes so difficult to efficiently deliver a high-quality product? What circumstances sabotage our best intentions for quality? And, more importantly, how can we deliver quality products in spite of these saboteurs?

One of the most common and insidious culprits is the habit of sacrificing long-term goals for short-term goals. This can lead to myriad long-standing issues on a project, and it is also one of the most difficult problems to eradicate. Other saboteurs can take the form of competing priorities, resource deprivation, dysfunctional team dynamics, and misplaced reward systems.

Joy will show you how to recognize these saboteurs and assess the damage they are causing. She will discuss practical strategies for eliminating these troublesome quality-squashers, or at least mitigating their effects.

Joy Shafer is currently a Consulting Test Lead at Quardev Laboratories on assignment at Washington Dental Services. She has been a software test professional for almost twenty years and has managed testing and testers at diverse companies, including Microsoft, NetManage and STLabs. She has also consulted and provided training in the area of software testing methodology for many years. Joy is an active participant in community QA groups. She holds an MBA in International Business from Stern Graduate School of Business (NYU). For fun she participates in King County Search and Rescue efforts and writes Fantasy/Sci-fi.

Meeting Slide Deck: Sabotaging Quality Slide Deck

July 2011 QASIG Meeting

6 Considerations for Mobile Device Testing

Presented by: Jacob Stevens, Senior Test Lead at Quardev, Inc. and Matt Pina, Manager of IT Applications Quality Assurance at Alaska Airlines

New to mobile? Unsure how the unique dynamics and constraints of this lucrative platform might shape your test design? This primer on testing mobile devices introduces the aspects of any mobile test scenario that may warrant consideration in your testing. We’ll look at the 6 key considerations to help you ensure thoughtful design and proper coverage, from carrier issues to the myriad hardware configurations.

About our speakers:

Jacob Stevens is a Senior Test Lead at Quardev, Inc. and a ten-year QA veteran with over 40 industry-leading clients. The scope and scale of his projects, and the platforms, technologies, and methodologies he’s worked with, have varied widely. Jacob studied under Jon Bach to adopt a context-driven approach to test design. One of Jacob’s favorite topics in QA is epistemology: How do we know that our test results are accurate? How do we ensure inherent biases in our test execution methodologies do not manifest as false positives or false negatives? Jacob enjoys talking technology and many other subjects on Twitter @jacobstevens. Jacob is also a little uncomfortable writing about himself in the third person.

Matt Pina is the Manager of IT Applications Quality Assurance at Alaska Airlines, Inc. He is an IT professional with over 25 years in the industry. His passion is creating quality models that adapt to multiple platforms. He currently leads the Alaska QA Center of Excellence, with project work supporting all platforms but focusing primarily on mobile and web applications. Matt is a graduate of the Central Washington University School of Business, holding a B.S. in Business Administration/Finance. Additionally, he holds several computing certifications from various local universities.

May 2011 QASIG Meeting

Testing in Production: Your Key to Engaging Customers

Presented by: Seth Eliot, Senior Test Manager at Microsoft, Bing

Feature lists do not drive customer attachment; meeting key needs does. Learn how Testing in Production (TiP) works synergistically with customer scenarios to put real product in front of real users in their own environment, getting direct, actionable feedback from users and uncovering features that meet their key needs. TiP also allows us to quantify the impact of these features, which is crucial since evidence shows that more than half of the ideas we think will improve the user experience fail to do so, and some actually make it worse.

Learn about tools and techniques like Online Experimentation, Data Mining, and Exposure Control that you can use to Test in Production (TiP) to get direct, actionable feedback from real users and learn what works and what doesn’t. Of course, production can be a dangerous place to test, so these techniques must also limit any potential negative impact on users. Hear several examples from within Microsoft, Amazon.com, and others showing how Testing in Production with real users will enable you to realize better software quality.

About our speaker: Seth Eliot is a Senior Test Manager at Microsoft, where his team solves exabyte-scale storage and data processing challenges for Bing. Previously he was Test Manager for the Microsoft Experimentation Platform (http://exp-platform.com), which enables developers to innovate by testing new ideas quickly with real users “in production”. Testing in Production (TiP), software processes, cloud computing, and other topics are ruminated upon at Seth’s blog at http://bit.ly/seth_qa. Prior to Microsoft, Seth applied his experience delivering high-quality software services at Amazon.com, where he led the Digital QA team to release Amazon MP3 download, Amazon Video on Demand streaming, and support systems for Kindle.

Presentation Slide Deck: Testing in Production

March 2011 QASIG Meeting

To Test or Not To Test

Presented by: Adam Yuret, Sr. Test Engineer, Volunteermatch.com

Ostensibly the goal of testing is to provide test coverage of a software product to uncover and document bugs. What if a stakeholder doesn’t want you to report bugs? What if they want you to test less? How would you implement a new testing process for a team transitioning to a new methodology without over-reaching your mandate as a tester? Now, add to that the additional challenge of never having met the team in person.

Let’s discuss scenarios where the tester is explicitly asked to ignore most bugs, not because the product is so polished that the only probable defects are minor, but because the opposite is true: there are so many problems that documenting and acting on them all would have a crippling effect on the project. What would you do in this scenario? Come join Adam Yuret and hear how he’s handled this type of dilemma in current and past projects. Likewise, share your contexts and ways in which you may have faced these challenges.

About our speaker: After 8 years at WebTrends testing an enterprise-level SaaS data warehousing product, which included building and maintaining a large-scale testing environment, Adam currently works as an “army of one” tester for VolunteerMatch. VolunteerMatch is a national nonprofit organization dedicated to strengthening communities by making it easier for good people and good causes to connect. Adam is a relative newcomer to the context-driven community and is currently working to build a testing process for a project that is transitioning to an agile/scrum methodology.

Presentation Slide Deck: To Test or Not To Test Slide Deck

January 2011 QASIG Meeting

Teaching the New Software Testers – an Experiment

The many facets of what it meant to introduce a software testing course at UW Bothell – professor and students give an experience report and look to the future

Presented by David Socha, Assistant Professor, University of Washington, Bothell

As a new professor at the University of Washington Bothell, I decided to put together a new course on Software Testing – a course that had never been taught here and that I had never taught. Nor had I ever explicitly acted in the role of software tester during my 19 years working in the software industry.

Why do such a crazy thing? How did the course turn out? What were the good and the bad, the memorable and the inspiring from this course? How did it change the students, and the instructor? What did it “mean” for this course to be taught at the University of Washington Bothell? What might this “mean” for the software testing community in this area?

If you are interested in any of these questions, or might want to be involved in the co-design and execution of such a course next year, come hear David Socha and some of his students give an experience report on this autumn quarter 2010 course at the University of Washington Bothell.

About our speaker: As of September 2010, David Socha is an Assistant Professor at the University of Washington Bothell in the Computing and Software Systems program. He grew up on a farm in Wisconsin, and in the process became eco-literate (i.e., a systems thinker). That perspective deepened as an undergraduate at the University of Wisconsin, Madison, where he made his own luck, spent a year in the Galapagos Islands as a field assistant studying the Galapagos land iguanas, and then earned a BS in Zoology. After earning a PhD in Computer Science and Engineering at the University of Washington, Seattle, David spent 19 years working in a variety of software development organizations, moving from being an individual contributor, to managing groups of software developers, to being an agile coach. Recognizing the teacher in him, he is now a tenure-track professor at UW Bothell working to make a difference. His areas of interest focus on the human side of software development, innovation, design, biomimicry… and perhaps even software testing.

Presentation Slide Deck: Teaching the New Software Testers

November 2010 QASIG Meeting

My Crazy Plan to Respond to Change

Presented by Jon Bach, Manager for Corporate Intellect, Quardev, Inc.

During a given workday, we either focus on testing activities or we have to focus on the things that interrupt testing activities. Some of us lament the interruptions; others find them energizing because they’re consistent with notions of Agile, like responding to change over following a plan. I’ve got an idea on how to use interruptions to focus on testing activities. Multi-tasking is not evil, nor does it have to torpedo your project, but it may require a management technique to help you stay sane. Enter Thread-Based Test Management: a way to let interruptions guide you to where you’re needed most. Using a kanban-style dashboard is one way to go with the flow without losing your way. Using a spreadsheet with rows to track the topics you followed in a day is another. In this talk, Jon Bach describes his experiences with a new technique he and his brother James invented that might help you shed the shame of feeling undisciplined for letting yourself be interrupted, while being more response-able to the important things that need your time and attention.
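
Thread-Based Test Management itself is a management idea rather than a tool, but as a rough, hypothetical illustration of the “spreadsheet with rows” approach described above (the field names and sample threads below are invented for this sketch, not taken from Jon’s material), a day’s threads might be tracked like this:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List

@dataclass
class Thread:
    """One 'thread' of work: a topic you picked up (or were pulled into) today."""
    topic: str
    notes: List[str] = field(default_factory=list)
    touched: List[datetime] = field(default_factory=list)

    def touch(self, note: str) -> None:
        """Record another visit to this thread, whether planned or interrupted."""
        self.touched.append(datetime.now())
        self.notes.append(note)

# The day's log: an interruption becomes a new row instead of lost work.
day: Dict[str, Thread] = {}

def work_on(topic: str, note: str) -> None:
    thread = day.setdefault(topic, Thread(topic))
    thread.touch(note)

# Hypothetical day of testing, punctuated by an interruption.
work_on("release smoke test", "started pass on the new build")
work_on("escalated crash report", "interrupted: reproduced crash, filed bug")
work_on("release smoke test", "resumed pass, roughly 60% done")

# End of day: each row shows where your attention actually went.
for t in day.values():
    print(f"{t.topic}: {len(t.touched)} touches; last note: {t.notes[-1]}")
```

However the rows are kept, the point of the approach as described in the talk abstract is that the record of threads, not a fixed plan, shows where your attention went and where you are needed next.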

About our speaker: Jon has over 15 years of experience in the Quality Assistance and Software Testing domain, serving Fortune 500 and start-up companies. He is a co-author of a Microsoft Patterns and Practices book on Acceptance Testing (2010) and is an award-winning keynote speaker for major testing conferences (STAR, QAI, BCS SIGIST). He has served as the vice president of conferences for the Association for Software Testing; invented a method for managing and measuring exploratory testing, a method for coaching testers (Open-Book Testing), and a method for categorizing risk (Color-Aided Test Design); and has published over 50 articles, whitepapers, and presentations about the value of exploration and rapid, risk-based testing. He has been with Quardev for the past 6 years and serves as the speaker chairman for the acclaimed Seattle non-profit QASIG testing forum.

September 2010 QASIG Meeting

Beyond Six Sigma: Agile/Lean/Scrum/XP/Kanban for physical manufacturing – building a 100 MPG road car in 3 months

Presented by Joe Justice, a Seattle-area lean software consultant and entrepreneur.

Our September speaker, Joe Justice, presented an experience report about taking on the Progressive Insurance Automotive X Prize using lean software techniques. The report covered how Joe and his team did it; pictures and information about the car; the methods, methodology, and secrets; questions and next steps; and the future of this approach.

About our speaker: Joe Justice, a Seattle-area lean software consultant and entrepreneur, ported lean software practices back to manufacturing in order to compete in the Progressive Insurance Automotive X Prize, a $10,000,000 race for 100 MPG cars built to road-legal requirements. Joe will walk through the takeaways, wins, and lessons learned from building an ultra-fuel-efficient hypercar using TDD, Scrum, and pair development.