WSA Quality Assurance Special Interest Group

Past QA SIG Meetings

from March 12, 2014

Testing Like Batman

Presented by: Zephan Schroeder, Software Development Engineer in Test at Philips Healthcare

Zephan's Presentation Slides

Software Testing Tools Mind Map

What's in your utility belt? What is in your colleague's utility belt? Where do you get your tools and information? Which hero do you test like?

We will touch on many test tools currently in use and outline categories of tools. We will also explore which tools are effective and when they are appropriate. Along the way we will discuss a few heroes and how your mindset is a critical factor for both success and personal happiness.

About our speaker:
Zephan Schroeder worked at Microsoft for over 15 years doing technical support, technical editing, program management, and software testing. He currently works at Philips Healthcare as a Senior Software Engineer testing remote service solutions for Philips medical imaging devices located around the world. Zephan also manages the TFS (Microsoft Team Foundation Server) instance providing version control, work item tracking, defect tracking, and a release repository for over thirty users across four-plus product teams. Additionally, Zephan is responsible for ISO 27001 audit compliance for the Remote Service Solutions development team.

When not chasing bugs, Zephan enjoys raising a family of two cats, two dogs, one teenage boy, one teenage girl, and an amazing wife. When time permits he does mentoring, tech coaching, casual volleyball, and online chess (zephans on chess.com).

from January 8, 2014

Testing Science: Breaking the Fourth Wall of Engineering

Presented by: Curtis Stuehrenberg, Software Quality Assurance Manager, Climate Corporation

The modern test engineer has a wide variety of tools aiding them in their quest to not just verify computer software but help make sure it's "providing perceived value to someone at some time." However, what do you do when your personas, your user stories, and your field trips are simply not enough?

This is a question I've found myself facing again and again in my career. I first faced it while helping to build software designed to aid bond traders at the Federal Home Loan Bank of Seattle. I'm currently facing it as I work with agronomists, statisticians, meteorologists, climatologists, and actuarial risk analysts building products to turn the industry of crop insurance on its ear.

Please join me for an evening of talk about the problems we face when asked to test a value proposition for which we have no context or experience and how I've addressed and am currently addressing them.

About our speaker:

Curtis is currently helping the world's people and businesses adapt to a changing climate as the Software Test and Quality Assurance Manager for the Climate Corporation, with engineering offices in San Francisco and Seattle. Prior to joining Climate this past October, Curtis experimented with big data collection and machine learning algorithms at Electronic Arts, helped build Accelrys's industry-leading small molecule chemical lab management software, and tried disrupting how phase II and phase III clinical pharma trials are designed and executed with the SF startup Medrio.

from November 13, 2013

High Volume Automated Testing for Software Components

Presented by: Harry Robinson and Doug Szabo, Microsoft

High Volume Automated Testing Slide Deck

Note from Harry: During the presentation, we showed sequences that expose bugs in sort routines. For those who would like to try their luck, here is the URL that hosts the Sorting Demo: http://www.brian-borowski.com/Software/Sorting. The algorithm we showed is called Shearsort. To get people started, the sequence "8 7 6 5 4 3 2 1 0" succeeds; the sequence "8 1 6 3 4 5 2 7 0" fails.

"Bugs are more insidious than ever we expect them to be." - Boris Beizer

Would you expect to find bugs in an award-winning library of sorting routines written by professional coders and featured in the 2006 O'Reilly book, Windows Developer Power Tools?

Or, to phrase it differently, which of the following inputs will expose a bug in this well-regarded sorting library?

A. 0 1 2 3 4 5 6 7 8

B. 1 0 3 2 5 6 4 7 8

C. 8 1 6 3 4 5 2 7 0

D. 1 0 3 2 5 4 7 6 8

E. 8 7 6 5 4 3 2 1 0

The bug-exposing input turns out to be input C.

Would you have chosen that sequence for your unit testing? Probably not.

Let a relentless tester and an enlightened developer show you how simple, high-volume automation found insidious bugs that eluded a bevy of well-crafted unit tests. See the results, ask questions, get answers, and find out whether this technique should be part of your toolkit.
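The technique in the talk can be sketched in a few lines: generate thousands of random inputs and check the routine under test against a trusted oracle (here Python's built-in sorted). The almost_sort routine below is a hypothetical stand-in with a deliberate off-by-one bug, not the actual library from the presentation:

```python
import random

def almost_sort(seq):
    """Hypothetical routine under test: an insertion sort with a
    deliberate off-by-one bug (the inner loop stops at index 1, not 0,
    so the first element can never be displaced)."""
    a = list(seq)
    for i in range(1, len(a)):
        j = i
        while j > 1 and a[j - 1] > a[j]:  # bug: should be j > 0
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return a

def high_volume_test(sut, trials=10_000, size=9, seed=42):
    """Run the system under test against thousands of random
    permutations, using sorted() as the oracle; return failing inputs."""
    rng = random.Random(seed)
    failures = []
    for _ in range(trials):
        case = rng.sample(range(size), size)  # random permutation of 0..size-1
        if sut(case) != sorted(case):
            failures.append(case)
    return failures

failures = high_volume_test(almost_sort)
```

A harness like this churns through inputs no one would think to hand-pick, which is exactly how insidious bugs escape well-crafted unit suites.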

About our speakers:

Harry Robinson has been working on and thinking about software testing for a long time, pioneering advanced test generation approaches at Bell Labs, HP, Google and Microsoft over the past 20 years. He currently focuses on test techniques that combine human and machine intelligence. He is Principal SDET for Microsoft's Windows Embedded team.

Doug Szabo has been developing and breaking software for 20 years across a range of applications from geodesy to 3-D hyperbolic graphs to automated mapping and facilities management systems. His 3-D work provided the visualization interface for Test Model Toolkit, Microsoft's first model-based testing tool. Doug is a big fan of using programmatic test generation to get machines to do the heavy lifting in test.

from September 11, 2013

Human-Scale Test Automation

Presented by: Michael Hunter, Senior SDET, Microsoft

Michael will give a modified version of his workshop, Human-Scale Test Automation, recently presented at CAST 2013 - with audience guidance and input.

Michael has spent the last ten years implementing automation stacks of one form or another. Most of them have been useful. Some have even continued to be useful after he left the team. In helping all these teams converge on a stack that works for them, Michael found two constants: every stack is different, and finding the right stack is hard!

All those implementation details get in the way, even when we're confident we've abstracted them all away. In this workshop we'll experience this firsthand: we'll figure out the "right" set of customer actions, implement them in an automation stack where we are the various components, and then execute a few test cases and see what we learn.

About our speaker: While studying architecture in Chicago, IL, I took an internship updating CAD drawings at a major Chicago bank. My desire to make the computer do most of the work turned that internship into a full-time job writing applications for the CAD system as well as for other areas of the bank. At the same time, a major CAD company was looking for people fluent in both CAD and programming - a perfect fit with my experience. The collaboration proved fruitful for both parties; I found lots of issues with the APIs, and the expertise I developed with those APIs led to my first published articles.

My work on AutoCAD brought me a job offer from a competitor and my first full-time testing job. A later acquisition of that company by Microsoft made me a Microsoftie, and I'm somewhat bemused to have now spent thirteen years helping Microsoft test better.

My "You Are Not Done Yet" checklist and other good stuff are at http://www.thebraidytester.com.

from July 10, 2013

Lightning Talks

Presented by: Dave Mozealous, Samantha Kalman, and Jacob Stevens

Lightning Talk Speakers and Topics:

Dave's Presentation Slides

Samantha's Presentation Slides

Jacob's Presentation Slides

Dave Mozealous, Amazon - Using Image Comparison to Improve UI Testing
Dave will present on how tools like Selenium WebDriver and ImageMagick can aid and improve the testing of UI changes for the Web.

Samantha Kalman - Design-Level Testing
Prototypes can be an effective tool in evaluating the quality of a user experience in the modern, multi-device landscape.
Samantha is an independent game developer, designer, and prototyper with an extensive background in testing practices, including positions at Quardev, Unity Technologies, and Amazon.

Jacob Stevens - Intro to Mobile Test Automation Using Trade Federation
Jacob will give a brief overview of how Trade Federation works, show what it can do for you, and highlight its limitations.
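In a setup like the one Dave describes, Selenium WebDriver captures the screenshots and ImageMagick's compare does the diffing; the core idea - a pixel-level diff against a known-good baseline, with a tolerance threshold - can be sketched in pure Python. The tiny 2x2 "screenshots" below are invented for illustration:

```python
def image_diff_ratio(baseline, candidate):
    """Fraction of pixels that differ between two same-sized screenshots,
    each represented as a 2-D list of (R, G, B) tuples."""
    total = differing = 0
    for base_row, cand_row in zip(baseline, candidate):
        for base_px, cand_px in zip(base_row, cand_row):
            total += 1
            differing += base_px != cand_px
    return differing / total

# A known-good baseline and a new screenshot with one changed pixel.
baseline = [[(255, 255, 255), (0, 0, 0)],
            [(0, 0, 0), (255, 255, 255)]]
candidate = [[(255, 255, 255), (0, 0, 0)],
             [(0, 0, 0), (200, 200, 200)]]

ratio = image_diff_ratio(baseline, candidate)  # 1 of 4 pixels differs
THRESHOLD = 0.01  # fail the UI check if more than 1% of pixels changed
ui_test_passed = ratio <= THRESHOLD
```

The threshold keeps the check from failing on benign rendering noise (anti-aliasing, font hinting) while still catching real layout regressions.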

from May 8, 2013

Scripted Manual Automated Exploratory Testing

Presented by: Keith Stobie, TiVo

Keith's Presentation Slides

Manual versus automated is a well-known continuum. Less known explicitly is the scripted versus exploratory dimension and its interaction with manual versus automated.

Join us for the May QASIG to learn about the forces that influence when automation or manual testing is most appropriate and when confirmatory (scripted) or bug finding (exploratory) is most appropriate. Keith Stobie will show the role and benefit of each type (manual scripted, automated scripted, manual exploratory, automated exploratory).

About our presenter: Keith Stobie is a Senior Quality Engineering Architect at TiVo who specializes in web services, distributed systems, and general testing, especially test design. Previously he was Test Architect for Bing Infrastructure, where he planned, designed, and reviewed software architecture and tests, and worked in the Protocol Engineering Team on the Protocol Quality Assurance Process, using model-based testing (MBT) to develop test frameworks, harnesses, and model patterns. With three decades of distributed systems testing experience, Keith's interests are in testing methodology, tools technology, and quality process.

Check out his blog (http://testmuse.wordpress.com) to learn more about his work. Keith is a volunteer with SASQAG.org and PNSQC.org and a member of AST, ASQ, ACM, and IEEE. He holds a BS in computer science from Cornell University, is ISTQB Foundation Level certified and an ASQ Certified Software Quality Engineer, and is a BBST Foundations graduate. Keith keynoted at CAST 2007 and MBT-UC 2012 and has spoken at many other international conferences.

from March 13, 2013

Anyone can be a test innovator - why not you?

Presented by: Alan Page, Microsoft

Alan's Presentation Slides

Software testers' aptitude for systems thinking and for identifying problems and patterns makes them well suited for innovation, yet few testers take the time to apply their skills and experience to this end. Successful innovation is not purely a matter of skill, intelligence, or luck. Innovation begins with careful identification and analysis of a problem, obstacle, or bottleneck, followed by a solution that not only solves the problem, but frequently solves it in a way that has widespread benefit - or in a way that changes the basic nature of the problem entirely.

Alan Page breaks down the cogs and wheels of innovation and shows examples of how some testers are applying game-changing creativity to discover new ways to improve tests, testers, and testing in their organizations. Problems, solutions, tips, tricks, and more are all on the radar for this whirlwind tour of pragmatic test innovation. Best of all, you'll walk away knowing that anyone, especially you, can be a test innovator.

About our presenter: Alan Page is currently a Principal SDET (yet another fancy name for tester) on the Xbox console team at Microsoft. He has previously worked on a variety of Microsoft products including Windows, Windows CE, Internet Explorer, and Office Lync. He also spent some time as Microsoft's Director of Test Excellence, where he developed and ran technical training programs for testers across the company.

Alan is edging up on his 20th anniversary of being a software tester. He was the lead author of the book How We Test Software at Microsoft and contributed chapters on large-scale test automation to Beautiful Testing (Adam Goucher/Tim Riley) and Experiences of Test Automation: Case Studies of Software Test Automation (Dorothy Graham/Mark Fewster). You can follow him on his blog (http://angryweasel.com/blog) or on twitter (@alanpage).

from January 9, 2013

Agile and Quality - How Can They Work Together? A Panel Discussion.

Moderated by: Jacob Stevens, Quardev, Inc.

Panel Members: Joy Shafer, Quardev, Inc.; Uriah McKinney, Deloitte Digital; and Shawn Henning, Deloitte Digital

Join us for a panel discussion on Agile practices and quality - hear from industry professionals from various company sizes who work in Agile environments on how they mitigate risk and incorporate quality best practices. The panel will take questions from the moderator and audience and is sure to be a great discussion!

About our moderator:

Jacob Stevens is a Senior Test Lead at Quardev, Inc. and a ten-year QA veteran with over 40 industry-leading clients. The scope and scale of the projects, and the types of platforms, technologies and methodologies he's worked with have been widely varied. Jacob studied under Jon Bach to adopt a context-driven approach to test design. One of Jacob's favorite topics in QA is epistemology. How do we know that our test results are accurate? How do we ensure inherent biases in our test execution methodologies do not manifest in false positives or false negatives? Jacob enjoys talking technology and many other subjects on Twitter @jacobstevens. Jacob is also a little uncomfortable writing about himself in the third person.

About our panel members:

Joy Shafer is currently a Consulting Test Lead at Quardev on assignment at Alaska Airlines. She has been a software test professional for almost twenty years and has managed testing and testers at diverse companies, including Microsoft, NetManage and STLabs. She has also consulted and provided training in the area of software testing methodology for many years. Joy is an active participant in community QA groups. She holds an MBA in International Business from Stern Graduate School of Business (NYU). For fun she participates in King County Search and Rescue efforts and writes Fantasy/Sci-fi.

Uriah McKinney has been deeply involved in mobile quality assurance since the beginning of the 3rd mobile revolution (circa 2008). Throughout his tenure with Deloitte Digital (formerly, Übermind), Uriah has balanced client engagements on iOS, Android, and mobile web projects with developing a methodological framework for quality assurance specifically tailored to the intersection of mobile and agile development. Uriah is one of the founding members of the Center of the Agile Universe meetup (http://centeroftheagileuniverse.com/); the Product Owner of the upcoming Mobile Agile Quality Conference (http://maqconference.com/); and apparently not above shameless cross-promotion.

Shawn Henning is part of the Agile transformation at Deloitte Digital (formerly Übermind). As both a Senior QA engineer and Scrum Master he helps teams iteratively deliver world-class mobile software. He is passionate about working closely with clients to regularly deliver working code, organically growing a completed product through constant feedback and iteration. Shawn has over fifteen years of experience in Quality Assurance in both desktop and mobile software. He attended his first Open Space Technology conference a year ago and was struck by the power of this format to foster conversations that resulted in real and practical answers to participants' problems. He has since attended many OST and LEAN Coffee events and helped to organize last year's highly successful Mobile, Agile, Quality conference: MAQCon.

from November 14, 2012

Leaping into "The Cloud": Rewards, Risks, and Mitigations

Presented by: Ken Johnston and Seth Eliot, Microsoft

Seth and Ken's Presentation Slides

The cloud has rapidly gone from “that thing I should know something about” to the “centerpiece of our corporate IT five-year strategy.” However, cloud computing is still in its infancy. Sure, the marketing materials presented by cloud providers tout huge cost savings and service level improvements—but they gloss over the many risks such as data loss, security leaks, gaps in availability, and application migration costs. Ken Johnston and Seth Eliot share new research on the successful migrations of corporate IT and web-based companies to the cloud. Ken and Seth lay out the risks to consider and explore the rewards the cloud has to offer when companies employ sound architecture and design approaches. Discover the foibles of poor architecture and design, and how to mitigate these challenges through a novel Test Oriented Architecture (TOA) approach. Take back insights from industry leaders—Microsoft, Amazon, Facebook, and Netflix—that have jumped into the cloud so that your organization does not slam to the ground when it takes the leap.

About our speakers:

Seth Eliot is Senior Knowledge Engineer for Microsoft Test Excellence focusing on driving best practices for services and cloud development/testing across the company. He previously was Senior Test Manager, most recently for the team solving exabyte storage and data processing challenges for Bing, and before that enabling developers to innovate by testing new ideas quickly with users “in production” with the Microsoft Experimentation Platform (http://exp-platform.com). Testing in Production (TiP), software processes, cloud computing, and other topics are ruminated upon at Seth's blog at http://bit.ly/seth_qa and on Twitter (@setheliot). Prior to Microsoft, Seth applied his experience at delivering high quality software services at Amazon.com where he led the Digital QA team to release Amazon MP3 download, Amazon Instant Video Streaming, and Kindle Services.

Ken Johnston is a frequent presenter, blogger, and author on software testing and services. Currently he is the Principal Group Program Manager for the Bing Big Data Quality and Measurements team. Since joining Microsoft in 1998, Johnston has filled many other roles, including test lead on Site Server and MCIS and test manager on Hosted Exchange, Knowledge Worker Services, Net Docs, MSN, Microsoft Billing and Subscription Platform service, and Bing Infrastructure and Domains. Johnston has also been the Group Manager of the Office Internet Platforms and Operations team (IPO), and for two and a half years (2004-2006) he served as the Microsoft Director of Test Excellence. He earned his MBA from the University of Washington in 2003. He is a co-author of "How We Test Software at Microsoft" and contributing author to "Experiences of Test Automation: Case Studies of Software Test Automation." To reach Ken contact him through twitter @rkjohnston.

from September 12, 2012

QA in Scrum: Beyond mere hand waving

Presented by: Uriah McKinney, Deloitte Digital

QA in Scrum, Presentation Slide Deck

Scrum is a tremendously powerful framework for prioritizing tasks, exposing risks, and generally getting things done. However, it has very little to say with respect to quality assurance and testing. While not a problem in and of itself, this lack of guidance can result in any number of dysfunctions in team dynamics and role expectations.

This session will use our current approach to QA integration as a backdrop to discuss some of the most significant challenges we've faced in this area and how we overcame them (or didn't).

About our speaker:

Uriah McKinney has been deeply involved in mobile quality assurance since the beginning of the 3rd mobile revolution (circa 2008). Throughout his tenure with Deloitte Digital (formerly, Übermind), Uriah has balanced client engagements on iOS, Android, and mobile web projects with developing a methodological framework for quality assurance specifically tailored to the intersection of mobile and agile development. Uriah is one of the founding members of the Center of the Agile Universe meetup (http://centeroftheagileuniverse.com/); the Product Owner of the upcoming Mobile Agile Quality Conference (http://maqconference.com/); and apparently not above shameless cross-promotion.

from July 11, 2012

On Combinatorial Testing

Presented by: James Bach, Satisfice, Inc.

NOTE: James will be presenting from Orcas Island and streaming live - you can catch it at our usual location at Quardev where we'll be providing pizza and beverages as usual or access remotely.

Combinatorial testing is the process of testing the interactions between multiple variables in a system. But few testers know how to approach it systematically. I will talk about how to do that, touching on some of the mathematics while focusing mostly on the pragmatics. I may also include information on Gray code, de Bruijn sequences, and all-pairs coverage along the way.
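As a taste of the pragmatics, all-pairs coverage can be sketched with a minimal greedy generator (a sketch for illustration, not James's method, with made-up parameters): from the full cartesian product it repeatedly picks the test covering the most not-yet-covered value pairs.

```python
from itertools import combinations, product

def all_pairs(params):
    """Greedy pairwise reduction: cover every pair of parameter values
    with far fewer tests than the full cartesian product."""
    names = list(params)
    idx_pairs = list(combinations(range(len(names)), 2))
    # Every (param_i, value, param_j, value) combination to be covered.
    uncovered = {(i, va, j, vb)
                 for i, j in idx_pairs
                 for va in params[names[i]]
                 for vb in params[names[j]]}
    candidates = list(product(*(params[n] for n in names)))
    suite = []
    while uncovered:
        # Pick the candidate test covering the most uncovered pairs.
        best = max(candidates,
                   key=lambda t: sum((i, t[i], j, t[j]) in uncovered
                                     for i, j in idx_pairs))
        suite.append(dict(zip(names, best)))
        uncovered -= {(i, best[i], j, best[j]) for i, j in idx_pairs}
    return suite

tests = all_pairs({"os": ["Windows", "Mac", "Linux"],
                   "browser": ["IE", "Firefox", "Chrome"],
                   "network": ["wifi", "wired"]})
```

For this 3x3x2 space the full product is 18 tests; the greedy suite still hits every pairwise interaction with noticeably fewer.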

About our speaker:

James Bach is founder and principal consultant of Satisfice, Inc., a software testing and quality assurance company. In the eighties, James cut his teeth as a programmer, tester, and SQA manager in Silicon Valley in the world of market-driven software development. He is a pioneer of agile, rapid, and exploratory approaches to software testing. He is the author of Lessons Learned in Software Testing and Secrets of a Buccaneer-Scholar.

from May 9, 2012

Managing Quality Debt

Presented by: Chris Sterling, Founder and CTO of Agile Advantage, Inc.

Software debt slowly creeps into applications and platforms when integrity is not asserted and verified on a frequent basis. Quality debt is a type of software debt that can be managed and monitored separately from the other types (technical, configuration management, design, and platform experience debt). This session will cover some processes and practices to help manage quality debt effectively such as:

  • Acceptance Test-Driven Development
  • Test-Driven Development
  • Behaviour-Driven Development (BDD)
  • The Three Amigos Pattern (named by Bob Payne and George Dinwiddie)
  • Push Button Release
  • Asserting quality with a Definition of Done
  • Identifying Quality Debt earlier with Tools and Dashboards
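To illustrate the first practice: in acceptance test-driven development the test is written before the feature and pins down "done" in executable form. This sketch uses plain asserts and an invented shopping-cart example; real teams typically drive such tests through tools like FitNesse, Cucumber, or pytest.

```python
class Cart:
    """Minimal feature code, written to make the acceptance test pass."""
    def __init__(self):
        self._lines = []  # (sku, unit_price, qty)

    def add(self, sku, unit_price, qty=1):
        self._lines.append((sku, unit_price, qty))

    def total(self):
        return sum(price * qty for _, price, qty in self._lines)

def test_cart_totals_multiple_line_items():
    # Given an empty cart
    cart = Cart()
    # When two line items are added
    cart.add("apple", 0.50, qty=4)
    cart.add("bread", 2.25)
    # Then the total reflects both line items
    assert cart.total() == 4.25

test_cart_totals_multiple_line_items()
```

Because the acceptance test encodes the Definition of Done, a green run asserts quality on every build instead of deferring it to a test phase - which is precisely how quality debt is kept from accruing.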

About our speaker:

Chris Sterling is founder and CTO of Agile Advantage Inc. where he works with clients as a Technology Consultant, Agile Coach, and Certified Scrum Trainer and creates tools to help Agile teams improve their performance. Chris is author of the book "Managing Software Debt: Building for Inevitable Change" and writes about his real world adventures in technology on the popular "Getting Agile" blog. As a trainer and speaker, Chris enlivens technical topics with his deep passion for software development and a touch of humor. In his spare time, he is a regular contributor to multiple open source projects.

Chris Sterling's Managing Quality Debt Slide Deck (also available on SlideShare)

from March 14, 2012

The Science of Being Happy and Productive at Work

Presented by: Scott Crabtree

Grounded in solid scientific data, this award-winning presentation delivers steps everyone can act on to be happier on the job. Various studies show that happier people are more productive, creative, insightful, engaged, resilient, healthy, and more. This presentation covers dozens of techniques to increase job happiness, organized around themes of goals, relationships, and attitude.

About our speaker:

Scott Crabtree earned a B.A. in Cognitive Science from Vassar College in 1988. Immediately afterward he worked on artificial intelligence software including expert systems. He started working at the first of several game development companies in 1996. Serving as a Software Engineer, Game Designer, Producer, and Entrepreneur, Scott is proud to have worked on game development with companies including Microsoft, Mattel, Disney, LEGO, Nike and more. He’s published games for PS2, Xbox, PC, and mobile phones including the iPhone. He joined Intel in 2005 as an Engineering Manager focused on video game developers. He is currently Tech Strategist for the Intel Atom Developer Program. He is fascinated by and passionately studies organizational development, human psychology, neuroscience, and the science of happiness and well-being.

While happier than he used to be, Scott is NOT one of those over-the-top always bubbly happy people that can be so annoying to the rest of us! :) Scott lives in Portland, Oregon with his wife, young daughter, and mutt. He loves spending time with them, especially in nature, and also enjoys playing with his band Mister Fisk.

Please see Scott Crabtree's Website for more information and to contact him for slides or with questions: http://www.happybrainscience.com/

from January 11, 2012

Pairing Developers with Non-Developers

Presented by: Lanette Creamer, Independent Software Testing Consulting and Coach

Pairing in the world of software development traditionally brings up an image of two developers working together in person, creating code at the same time. This practice is often used by agile teams and teams doing extreme programming. Many teams today pair coding testers with coders who are working on product development. The vocabulary gap is much smaller when everyone is working in the same language (code), but what happens when the business needs aren't understood or well communicated? For the last three years, Lanette has been working with software testers, experimenting on the fringes of pairing. Come learn some ways to pair programmers with non-programmers at specific strategic times for the purpose of more collaboration and efficiency. Learn how handoffs, bug demos, and pseudocode can offer new types of pairing, and get tips for making them practical rather than mandated.

About our speaker:

Lanette Creamer is an independent software testing consultant and coach from Seattle, WA. Known in the blogging community as TestyRedhead, her blog can be found at http://blog.testyredhead.com/. Her presentations are known for being candid, including cat photos, and being focused on the human side of software testing. Recently focused on Agile Testing and Pairing with Developers, in the past her papers and presentations have focused on combining automated checks with exploratory charters, and group collaborative testing techniques.

Pairing Presentation Slide Deck

from November 2, 2011

Google's Approach to Test Engineering

Presented by: James Whittaker, Google

Google takes a holistic approach to quality involving developers (SWEs), software engineers in test (SETs), and a lesser-known role called test engineer (TE). This talk outlines the role and responsibilities of the Google Test Engineer and details the tools and techniques they use to ship high-quality software to millions of users on a daily basis. Learn about the process of Attribute, Component, Capability (ACC) analysis and risk assessment used to ship a number of Google products, and find out how you can get access to the tools Google TEs use.
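At its core the ACC grid is just a matrix: attributes (the qualities users care about) across the top, components (the major parts of the product) down the side, and capabilities (what each component does to deliver each attribute) in the cells, with a risk score guiding where to test first. A rough sketch of the data structure, using invented attributes, components, and scores:

```python
# Attributes: adjectives the product should earn.
# Components: major parts of the product.
attributes = ["Fast", "Secure", "Accurate"]
components = ["Search box", "Results page", "Index"]

# Capabilities live at each (component, attribute) intersection.
capabilities = {
    ("Search box", "Fast"): ["Suggestions appear as the user types"],
    ("Search box", "Secure"): ["Query input is sanitized"],
    ("Results page", "Accurate"): ["Top result matches the query intent"],
    ("Index", "Fast"): ["Lookups complete within the latency budget"],
}

# Risk per cell = estimated failure frequency x user impact (1-5 each);
# the highest-risk cells get tested first.
risk = {cell: freq * impact
        for cell, (freq, impact) in {
            ("Search box", "Fast"): (2, 2),
            ("Search box", "Secure"): (2, 5),
            ("Results page", "Accurate"): (3, 4),
            ("Index", "Fast"): (4, 3),
        }.items()}

test_order = sorted(capabilities, key=lambda cell: -risk[cell])
```

The point of the exercise is the ranking: test effort flows to the capabilities where failure is both likely and costly, not evenly across the grid.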

About our speaker:

Dr. Whittaker is currently the Engineering Director over engineering tools and testing for Google's Seattle and Kirkland offices, where he manages the testing of Chrome, Chrome OS, Google Maps and other web applications. He holds a PhD in computer science from the University of Tennessee and is the author or coauthor of four acclaimed textbooks: How to Break Software, How to Break Software Security (with Hugh Thompson), and How to Break Web Software (with Mike Andrews). His latest is Exploratory Software Testing: Tips, Tricks, Tours and Techniques to Guide Test Design, and he's authored over fifty peer-reviewed papers on software development and computer security. He holds patents on various inventions in software testing and defensive security applications and has attracted millions in funding, sponsorship, and license agreements while a professor at Florida Tech. He has also served as a testing and security consultant for dozens of companies and spent 3 years as an architect at Microsoft. He is currently writing How Google Tests Software.

See the slides: Presentation Slide Deck

from September 14, 2011

Sabotaging Quality

Presented by: Joy Shafer, Consulting Test Lead, Quardev, Inc.

From the very beginning of the project there have been discussions about quality. The team has unanimously agreed it's important - a top priority. Everyone wants to deliver a high-quality product of which they can be proud. Six months after the project kick-off, you find yourself neck-deep in bugs and fifty percent behind schedule. The project team decides to defer half of the bugs to a future release. What happened to making quality a priority?

One of your partners is discontinuing support for a product version on which your online service is dependent. You have known this was coming for years; in fact, you are actually four releases behind your partner's current version. For some reason an update project has been put off repeatedly, until now-the last possible moment. Now the team is scrambling to get the needed changes done before your service is brought down by the drop in support. You are implementing the minimum number of features required to support the newer version. In fact, you're not even moving to the most current version-it was deemed too difficult and time-consuming to tackle at this point. You are still going to be a release behind. Are you ever going to catch up? Is minimal implementation always going to be the norm? Where is your focus on quality?

Do these scenarios sound familiar? Why is it sometimes so difficult to efficiently deliver a high-quality product? What circumstances sabotage our best intentions for quality? And, more importantly, how can we deliver quality products in spite of these saboteurs?

One of the most common and insidious culprits is the habit of sacrificing long-term goals for short-term goals. This can lead to myriad long-standing issues on a project. It is also one of the most difficult problems to eradicate. Other saboteurs can take the form of competing priorities, resource deprivation, dysfunctional team dynamics, and misplaced reward systems.

Joy will show you how to recognize these saboteurs and assess the damage they are causing. She will discuss practical strategies for eliminating these troublesome quality-squashers, or at least mitigating their effects.

Joy Shafer is currently a Consulting Test Lead at Quardev Laboratories on assignment at Washington Dental Services. She has been a software test professional for almost twenty years and has managed testing and testers at diverse companies, including Microsoft, NetManage and STLabs. She has also consulted and provided training in the area of software testing methodology for many years. Joy is an active participant in community QA groups. She holds an MBA in International Business from Stern Graduate School of Business (NYU). For fun she participates in King County Search and Rescue efforts and writes Fantasy/Sci-fi.

See the slides: Presentation Slide Deck

from July 13, 2011

6 Considerations for Mobile Device Testing

Presented by: Jacob Stevens, Senior Test Lead at Quardev, Inc. and Matt Pina, Manager of IT Applications Quality Assurance at Alaska Airlines

New to mobile? Unsure how the unique dynamics and constraints of this lucrative platform might shape your test design? This primer to testing on mobile devices introduces the aspects of any mobile test scenario that may warrant consideration for your testing. We'll look at the 6 key considerations to help you ensure thoughtful design and proper coverage, from carrier issues to the myriad hardware configurations.

About our speakers:

Jacob Stevens is a Senior Test Lead at Quardev, Inc. and a ten-year QA veteran with over 40 industry-leading clients. The scope and scale of the projects, and the types of platforms, technologies and methodologies he's worked with have been widely varied. Jacob studied under Jon Bach to adopt a context-driven approach to test design. One of Jacob's favorite topics in QA is epistemology. How do we know that our test results are accurate? How do we ensure inherent biases in our test execution methodologies do not manifest in false positives or false negatives? Jacob enjoys talking technology and many other subjects on Twitter @jacobstevens. Jacob is also a little uncomfortable writing about himself in the third person.

Matt Pina is the Manager of IT Applications Quality Assurance at Alaska Airlines, Inc. He is an IT professional with over 25 years in the industry. His passion is to create Quality Models that adapt to multiple platforms. He currently leads the Alaska QA Center of Excellence with project work supporting all platforms, but primarily focusing on Mobile and Web applications.
Matt is a graduate of Central Washington University School of Business, holding a B.S. in Business Administration/Finance. Additionally, he holds several computing certifications from various local universities.

See the slides: Presentation Slide Deck

from May 11, 2011

Testing in Production: Your Key to Engaging Customers

Presented by: Seth Eliot, Senior Test Manager at Microsoft, Bing

Feature lists do not drive customer attachment; meeting key needs does. Learn how Testing in Production (TiP) works synergistically with customer scenarios to put real product in front of real users in their own environment to get direct, actionable feedback from users and uncover features that meet their key needs. TiP also allows us to quantify the impact of these features, which is crucial since evidence shows that more than half of the ideas we think will improve the user experience actually fail to do so, and some actually make it worse.

Learn about tools and techniques like Online Experimentation, Data Mining, and Exposure Control that you can use to Test in Production (TiP) to get direct, actionable feedback from real users and learn what works and what doesn't. Of course, production can be a dangerous place to test, so these techniques must also limit any potential negative impact on users. Hear several examples from within Microsoft, Amazon.com, and others showing how Testing in Production with real users will enable you to realize better software quality.

About our speaker: Seth Eliot is Senior Test Manager at Microsoft where his team solves Exabyte storage and data processing challenges for Bing. Previously he was Test Manager for the Microsoft Experimentation Platform (http://exp-platform.com) which enables developers to innovate by testing new ideas quickly with real users "in production". Testing in Production (TiP), software processes, cloud computing, and other topics are ruminated upon at Seth's blog at http://bit.ly/seth_qa. Prior to Microsoft, Seth applied his experience at delivering high quality software services at Amazon.com where he led the Digital QA team to release Amazon MP3 download, Amazon Video on Demand Streaming, and support systems for Kindle.

See the slides: Seth Eliot's Presentation Slide Deck

from March 9, 2011

To Test or Not To Test

Presented by: Adam Yuret, Sr. Test Engineer, Volunteermatch.com

Ostensibly, the goal of testing is to provide test coverage of a software product to uncover and document bugs. What if a stakeholder doesn't want you to report bugs? What if they want you to test less? How would you work to implement a new testing process for a team transitioning to a new methodology without over-reaching your mandate as a tester? Now, add to that the additional challenge of never having met the team in person.

Let's discuss scenarios where the tester is explicitly asked to ignore most bugs, not because the product is so polished that the only probable defects are minor, but because the opposite is true: there are so many problems that documenting and acting on them all would have a crippling effect on the project. What would you do in this scenario? Come join Adam Yuret and hear how he's handled this type of dilemma in current and past projects. Likewise, share your contexts and ways in which you may have faced these challenges.

About our speaker: After 8 years at WebTrends testing an enterprise-level SaaS data warehousing product, which included building and maintaining a large-scale testing environment, Adam currently works as an "army of one" tester for VolunteerMatch. VolunteerMatch is a national nonprofit organization dedicated to strengthening communities by making it easier for good people and good causes to connect. Adam is a relative newcomer to the context-driven community and is currently working to build a testing process for a project that is transitioning to an agile/scrum methodology.

See the slides: Adam Yuret's Presentation Slide Deck

See the archived video: http://www.ustream.tv/channel/qasig

from January 12, 2011

Teaching the New Software Testers - an Experiment

The many facets of what it meant to introduce a software testing course into UW Bothell - professor and students give an experience report and look to the future

Presented by David Socha, Assistant Professor, University of Washington, Bothell

As a new professor at the University of Washington Bothell I decided to put together a new course on Software Testing - a course that had never been taught here, and that I had never taught. Nor had I ever explicitly acted in the role of software tester during my 19 years working in the software industry.

Why do such a crazy thing? How did the course turn out? What were the good and the bad, the memorable and the inspiring from this course? How did it change the students, and the instructor? What did it "mean" for this course to be taught at the University of Washington Bothell? What might this "mean" for the software testing community in this area?

If you are interested in any of these questions, or might want to be involved in the co-design and execution of such a course next year, come hear David Socha and some of his students give an experience report on this autumn quarter 2010 course at the University of Washington Bothell.

About our speaker: As of September 2010, David Socha is an Assistant Professor at the University of Washington Bothell in the Computing and Software Systems program. He grew up on a farm in Wisconsin, and in the process became eco-literate (e.g. a systems thinker). This deepened as an undergraduate at the University of Wisconsin, Madison, where he made his own luck, spent a year in the Galapagos Islands as a research assistant studying the Galapagos land iguanas, and then earned a BS in Zoology. After earning a PhD in Computer Science and Engineering at the University of Washington, Seattle, David spent 19 years working in a variety of software development organizations, moving from being an individual contributor, to managing groups of software developers, to being an agile coach. Recognizing the Teacher in him, he is now a tenure-track professor at UW Bothell working to make a difference. His areas of interest focus on the human side of software development, innovation, design, biomimicry... and perhaps even software testing.

See the slides: David Socha Presentation Slide Deck

See the archived video: http://www.ustream.tv/channel/qasig

from November 10, 2010

My Crazy Plan to Respond to Change

Presented by Jon Bach, Manager for Corporate Intellect, Quardev, Inc.

During a given workday, we either focus on testing activities or we have to focus on the things that interrupt testing activities. Some of us lament the interruptions, others find it energizing because it's consistent with notions of Agile - like responding to change over following a plan. I've got an idea on how to use interruptions to focus on testing activities. Multi-tasking is not evil, nor does it have to torpedo your project, but it may require a management technique to help you stay sane. Enter Thread-Based Test Management — a way to let interruptions guide you to where you're needed most. Using a kanban-style dashboard is one way to go with the flow without losing your way. Using a spreadsheet with rows to track topics you followed in a day is another. In this talk, Jon Bach describes his experiences with a new technique he and his brother James invented that might help you get rid of the shame of feeling undisciplined for letting yourself be interrupted while being more response-able to the important things that need your time and attention.

See the Quardev Website for a link to the slides and the archived presentation video http://www.quardev.com/events/2010-11-09-2069561390

About our speaker: Jon has over 15 years of experience, including Fortune 500 and start-up companies, serving in the Quality Assistance and Software Testing domain. He is a co-author of a Microsoft Patterns and Practices book on Acceptance Testing (2010) and is an award-winning keynote speaker for major testing conferences (STAR, QAI, BCS SIGIST). He has served as the vice president of conferences for the Association for Software Testing; invented methods for managing and measuring exploratory testing, for coaching testers (Open-Book Testing), and for categorizing risk (Color-Aided Test Design); and has published over 50 articles, whitepapers, and presentations on the value of exploration and rapid, risk-based testing. He has been with Quardev for the past 6 years and serves as the speaker chairman for the acclaimed Seattle non-profit QASIG testing forum.

from September 8, 2010

Beyond Six Sigma: Agile/Lean/Scrum/XP/Kanban for physical manufacturing - building a 100 MPG road car in 3 months

Presented by Joe Justice, a Seattle area lean software consultant and entrepreneur.

Our September speaker, Joe Justice, presented an experience report about taking on the Progressive Insurance Automotive X Prize using Lean software techniques. The report included how Joe and his team did it; pictures and information about the car; the methods, methodology, and secrets; questions and next steps; and the future of this approach.

Check out the team Website: http://www.wikispeed.com/

See the slides: http://tiny.cc/iqt6f

About our speaker: Joe Justice, a Seattle area lean software consultant and entrepreneur, ported lean software practices back to manufacturing in order to compete in the Progressive Insurance Automotive X Prize, a $10,000,000 race for 100 MPG cars built to road-legal requirements. Joe walked through the take-aways, wins, and lessons learned when building an ultra-fuel-efficient hyper car using TDD, Scrum, and pair development.

from July 14, 2010

Lightning Talks

Presented by 8-10 of your fellow QASIG participants

The format was the "lightning talk" where each of the speakers knew that they only had five minutes to present their idea (one at a time). Each speaker had a severely limited time-frame, but could speak about anything related to testing in that time – a new idea for status reporting or measurement, a lesson they learned on a project, an experience report about a new technique they tried, or a general test principle they want to share.

Lightning Talk Speakers, Topics, and Slides (where submitted):

Jim Hancock, Director of Quality, IT and Customer Delivery at Geospiza Creating a Quality Index Most software development has become "agile," but CMM, CobiT, PMBOK, Malcolm Baldrige and other quality process metric deployments are still large and complicated. Jim will discuss how to deploy software quality metrics that meet the following criteria: objective, non-obtrusive, flexible, repeatable, consistent, holistic, efficient, and inexpensive. Most importantly, the Quality Index will provide an accurate assessment of risk and quality within any SDLC upon which logical deployment decisions can be made. Creating a Quality Index Slides (pdf)
Jeremy Lightsmith, Seasoned Agile Coach, Trainer and Facilitator What's Your Conflict Management Style? There is a model of the way people respond to conflict that looks like the attached picture. I thought it would be cool to introduce it and talk about why it's helpful and how to use it. There's also a related questionnaire that I can leave for everyone to see how they score. Jeremy's slides (Website link): http://jeremylightsmith.com/classes.html#how_do_you_handle_conflict
Jeff Brewster, Grameen Foundation Untethering a Test Lab Do you have a performance test lab that can only be used onsite or has expensive software? Been forced to share lab resources due to cost or space concerns? We'll share the steps the Mifos testing team used to move their performance test lab to the cloud. Untethering a Test Lab Slides (pdf)
Tim Tapping, SDET, Expedia How to sprint as Agile/Lean QA on the feature team QA has challenges if we are used to building out test plans from extensive documentation, which is not present in agile/lean development environments. We need to engage early in the iteration and help the team understand that 'quality deliverables' are a team activity. Drawing on my recent experience at an acquired startup which was an agile shop, I can share techniques that worked for us and pitfalls to avoid. Agile/Lean QA on the Feature Team Slides (pdf)
Jon Bach, Manager for Corporate Intellect, Quardev, Inc. Tested I'm working on a talk for Ignite (Seattle) that has a likelihood of being chosen. How much of a likelihood depends on how well I can define this talk about our industry to the technical masses. I could use some feedback. Tested Slides (pdf)
Lanette Creamer, Test Lead, Adobe Systems Agile Testing Ninjas If Adam Goucher learned everything he knows about testing from Pirates, why does Lanette insist that Ninjas hold the key? What is it about stealth, speed, discipline, and skill that translates well to Agile testing? Agile Testing Ninjas Slides (pdf)
Jim Benson, CEO, Modus Cooperandi Kaizen or Death If we don't improve, we stagnate. If we stagnate, we perish. Life requires change, renewal, and innovation. Evolution is innovation. In life, in business, in everything, we must continuously improve. In 5 short minutes, Jim Benson will change your life through the Japanese concept of Kaizen (continuous improvement).
Gary Knopp, SDET, Attachmate Chow, Expand Your Idea of Dogfood This talk will be a quick case study of how we were able to break out of what seemed like an obvious test strategy (using or extending our existing test system) and move to a lighter-weight, component-based test approach. This new approach includes testing the component in isolation, the usage of the component in isolation, and then the system as a whole. Ultimately we were successful in using this strategy and even completed the testing 2 weeks early. Chow, Expand Your Idea of Dogfood Slides (png files): Overview, PkidIsolation, EndToEnd, ClientIsolation
Michael Wolf, Perl Trainer, evangelist, programmer, community member, town crier... TAP - Test Anything Protocol TAP - Test Anything Slides (pdf)

from May 12, 2010

Presented by Jim Benson, CEO of Modus Cooperandi

What is QA?

When we seek Quality, we are looking for something more than "Does it break if I do this?" We are looking for improvement, for ascendancy, for something better.

Scripts, exploration, observation.

We look for patterns, we look for exceptions, we look for jarring transitions.

In Personal Kanban and Lean, we look for patterns, exceptions and transitions in the creation of value and the flow of work. Think of it as QA for your workload and your processes. QA for your QA.

MetaQA.

On the 12th of May, we discussed how Lean techniques like Personal and Organizational Kanban can be applied to the highly variable workload of a tester, and how understanding the variation in work, and making judgments based on that understanding, will help testers, managers, and clients schedule for testing and weigh the impacts on QA of unreasonable demands.

We will, at last, be able to define "unreasonable" and provide meaningful service level agreements.

About our speaker: Jim Benson is a collaborative management consultant. He is CEO of Modus Cooperandi, a consultancy which combines Lean, Agile Management and Social Media principles to develop sustainable teams. Clients include the United Nations, World Bank, Microsoft, and NBC Universal. Jim created the Personal Kanban technique that applies lean thinking to individual and small team work. He blogs at evolving web and is @ourfounder on twitter.

from March 10, 2010

Click to see Lanette's Slides (PDF)

Presented by Lanette Creamer

We may think we are ready to move on to new and innovative features. However, if we do not deal with the past, it can easily come back to haunt us, slowing down new projects and robbing our testing time unexpectedly, often to the point that testing becomes the bottleneck that slows innovation to a crawl.

For those of us who work on software that already exists, exciting new functionality and improvements, along with compatibility with new platforms, are the main things that drive upgrades. However, if end users cannot trust the quality of the legacy features they rely on, they will be reluctant to upgrade, or worse, your new versions will get a reputation for being unstable, harming overall adoption. In some cases users may downgrade their software to an earlier version because they are unhappy with the quality of the newer release, or request an earlier version they feel is reliable.

This paper presentation is about the subjective and difficult part of testing, which has no provably correct mathematical answer. It is about risk management, test planning, cost, value, and being thoughtful about which tests to run in the context of your specific project. The discussion covers identifying and reducing test case bloat: when it can be done, who does it, along with a few examples used in practice. Further, it covers one theory still under test, drawn from experience in significantly reducing test cases to cover more than three times the applications after the test team shrank from sixteen to four testers. When facing increasingly complex and growing software, we must balance testing existing features that customers rely on every day with new features and interactions. When balanced in a sensible way, the best of the legacy test cases can be maintained, using existing knowledge to reduce risk as much as possible.

About our speaker: Lanette Creamer is a test lead with 10 years of industry experience, ranging from product feature testing on early versions of InDesign to leading collaborative end-to-end workflow testing across the Adobe Creative Suites. Most recently Lanette has been testing on an agile team that is automating the software production process between a product build and actual shipment, to be used on all Adobe shipping products in 2010. Lanette has published two technical papers: "Testing for the User Experience", voted best paper by presentation attendees at PNSQC 2008, and "Reducing Test Case Bloat", PNSQC 2009. Her magazine article "9 Ways to Recruit Testers for Collaborative Test Events" was published in Software Test & Performance Magazine, in the January 2010 issue. Lanette started writing a testing blog at http://www.testyredhead.com in 2006 and has been writing and collaborating with the online testing community non-stop since.

from January 13, 2010

Presented by Detective Brian Stampfl

Detective Stampfl from the Seattle CSI unit will be joining us to discuss the business of Crime Scene Investigation - a discipline not unlike bug investigation, with processes often used in software testing. We'll talk about parallels and get some insight into what Crime Scene Investigation is really like, straight from the source.

Detective Stampfl will cover the following:

  • The make-up of the Seattle Police Crime Scene Investigation's Unit
  • The type of crimes that they respond to investigate
  • Steps necessary to properly document a crime scene
  • The differences between the real world and how investigations are portrayed on television, and a look at current and future forensic technology

Detective Brian Stampfl has been involved in law enforcement for over eighteen years. He began his career in 1991 in Southern California with the San Bernardino Police Department, and later joined the Seattle Police Department in 1995. Since joining SPD, Detective Stampfl has worked in patrol and was a field training officer, tasked with training new officers who had just graduated from the academy. Detective Stampfl went on to become an academy instructor and, when the opportunity arose, became a detective, working for over three years in the Sexual Assault and Child Abuse Unit. Then in 2005, the opportunity of a lifetime presented itself as the Seattle Police Department sought to create its own Crime Scene Investigations Unit. Detective Stampfl was one of seven detectives chosen not only to staff this new unit but to help build it from the ground up, starting from what was simply an idea. In addition to hours of training in Crime Scene Investigation, Detective Stampfl is a graduate of the National Forensic Academy in Knoxville, Tennessee and is an adjunct faculty member of Seattle University, where he teaches a course in Crime Scene Investigation.

from November 11, 2009

Click to see Jon's Slides (PDF)

Presented by Jon Bach

Whether you are a tester or a test manager, Jon Bach assumes you have no time to do the things you want to do. Knowing that even the list of things you absolutely must do today is its own set of competing priority-1 items, he has ideas on how to cope. They are truly "half-baked," as in, they are still in the oven, still being tested. It is not about time management; it is about where you and your team focus your energy. This presentation will share some of his half-baked ideas that seemed valuable to him as a test manager, even if some of them didn't really work. His ideas are meant to solve common problems in test execution, reporting, measurement, and personnel - all of them low or no cost and relatively easy to implement.

Jon Bach has been a software tester for 14 years, and is currently a freelance consultant for Quardev. He speaks frequently about exploratory and rapid testing, and is the co-inventor of Session-Based Test Management - a way to manage and measure exploratory testing. See his blog at http://jonbox.wordpress.com

from September 9, 2009

Presented by Harry Robinson

Remember the old MacGyver TV series? Each week, the hero solved difficult problems by combining his knowledge of applied science with everyday items such as baking soda, paper clips, and chewing gum. Test automation is currently in a rut and needs some of that outside-the-box thinking. When most testers think about automation, they think of regression scenarios and capture/replay tools. However, regression testing is the weakest, most insipid form of automation around! Regression tests are costly to maintain, rarely find important bugs, and are often obsolete by the time they are checked into the test case manager. If you are interested in approaching test automation from a fresh direction, join Harry Robinson in this innovative session and learn to create "freestyle test automation" from simple elements such as idle test machines, free programming tools and, yes, a bit of creative test thinking.

from July 8, 2009

Improving Testing Collaboration in Agile Development

Presented by Bruce Winegarden

The goal of this program is to stimulate a dialog on how QA testing professionals can best collaborate with agile development to form a quality continuum for enduring sustainable software.

Example cases from two different software products will give contrasting before-and-after snapshots of agile adoption. The "before" case used a traditional waterfall approach with extended code, test, and fix cycles. The "after" agile case used extreme programming practices such as test-first development (TDD), acceptance testing (FIT), and pair programming. These examples will outline the range of QA testing strategies employed and set up discussion around topics such as:

  1. Why moving testing earlier in the process, and making it more integral to guiding the development effort, is a good idea.
  2. What quality strategies we can learn from other industries, such as statistical process control from Lean/Six Sigma.
  3. How the QA testing role can better complement agile development methods such as Scrum.
  4. What concepts agile development and Exploratory Testing have in common, and how we can leverage both the similarities and the differences.

About Our Speaker:

Bruce Winegarden has extensive business and technical knowledge of software and systems development, product management, and innovation. He enjoys guiding teams of people from different disciplines to discover sustainable practices that smooth and accelerate their delivery. As principal consultant at Eagle Eye View, he delivers consulting, modeling, training, and coaching services applying Lean and Agile principles, enabling organizational changes that streamline product development, improve business workflows, and deploy more meaningful and effective information technology.

from May 13, 2009

Data Governance - Protecting Proprietary Information from Unauthorized Release

Presented by Det. David Dunn and Seaton M. Daly III, Esq.

The most valuable asset to any organization is its "information." Quality Assurance professionals, with an emphasis in software development, critically examine a software program for functionality, compatibility, ease-of-use, stability, and integrity. This allows the QA professional to garner an understanding of a client's (or employer's) software strengths and weaknesses. But once that information is shared with a third-party, what controls or mechanisms are in place to ensure the security of that information? Are contractual provisions enough? How is the integrity of the QA industry affected by a data leak? How can the testing industry align its goals with the goals of its client(s)?

According to two of the largest security software firms, cyber-related crimes are now more lucrative than the international drug trade. While a virtual cyber-attack of a 9/11-style magnitude has not yet occurred, the need to adequately safeguard mission-critical data places network infrastructures on the frontlines. What is law enforcement doing to protect businesses from cyber-extortion and theft of proprietary information? Learn what mechanisms mitigate the chance of an unauthorized release, or disclosure, of proprietary information and how you can help investigate cyber-related crimes.

About Our Speakers:

Det. David Dunn (Detective, Seattle Police Department, U.S. Secret Service Electronic Crimes Task Force) - Det. Dunn has been with the Seattle P.D. for 8 years and has worked computer forensics and electronic crimes for 4 years. He is currently assigned to the U.S. Secret Service to assist them in investigating network intrusion and computer hacking cases at the regional, national, and international level.

Seaton M. Daly III, Esq. (Law Office of Seaton M. Daly III, P.L.L.C.) - Seaton Daly has extensive experience working in the Information Protection Industry, both as an attorney and non-attorney, implementing disaster recovery plans, and performing risk assessment audits, in both the public and private sectors. Mr. Daly owns a General Corporate Transaction and Privacy law firm in downtown Seattle.

from March 11, 2009

How To Be A Test Pilot

Rob Bach has an extensive background and expertise in the world of aviation. Parallel to his current job as a pilot for a major airline, he studies Human Factors and Ergonomics in the cockpit, where they intersect with post-accident/incident investigations, and applies those results to preventative programs. Rob will explain a Technicolor thought process for what is basically a black-and-white world, one that has many parallels to the domain of software testing. He will demonstrate why he believes that it is not discipline, but critical thinking skill, that matters most in this mostly procedural world. This interactive workshop will be an exercise for the nimble brain.

About Our Speaker: Rob Bach has been a licensed pilot and flight instructor for 32 years in all types of flying machines, powered and unpowered. He is a member of the Antique Airplane Association, a member of the Experimental Aircraft Association, has recently restored an antique airplane called a Pietenpol, and considers himself an inventor and philosopher.

Click to see Rob's Slides (PDF)

from November 12, 2008

Color-Aided Test Design

Color coding theory is a method of test creation that aims to give value by shaping, finding relevance in, and organizing tests based on overarching and broad testing concepts. By categorizing tests via high-level test concepts (designated by color), testers can quickly and easily focus their aim and maximize reliable test coverage in a collaborative way that exposes risks in the test plan. In this interactive session, Ben Brodsky, a QA Engineer at LexisNexis, talks about his experiences using a method pioneered by testing expert and McGill University professor Robert Sabourin to use color to aid test design and reporting.

About Our Speaker: Ben Brodsky is a software tester at LexisNexis in Bellevue. He started software testing in 2005 with a background in Applied Philosophy and Criminal Defense Investigations. He found that software testing contained many of the same skills that are used in the legal domain. Since moving into software investigations, he has constantly been on the hunt for ways to further the exploration and effect of test design.

Click to see Ben's Slides (PDF)

from September 10, 2008

Acceptable Acceptance Testing

This is the tale of a team of software professionals in Microsoft's patterns & practices group who wrote a book on software acceptance testing. Jon Bach was one of the writers, ensuring that the project team used its own methods so the book would be acceptable to you, the reader. To develop the book, the team employed key ideas of agile projects - creating a backlog using story cards, working in short iterations, exploring requirements and expectations, building customer trust through iterative acceptance, and staying connected to the customer community through frequent preview releases, surveys, and interviews. They created a heuristic acceptance testing model for knowing when they had reached enough "acceptability," but the book is only in Alpha release, so they won't really be sure it's acceptable until they talk to potential readers like you.

This will be an interactive talk where Jon will not only describe the process and content they produced, but will also facilitate a group activity to discover how acceptable you might find the content of the book and whether you apply acceptance testing methodologies in similar ways.

To get a free online preview of the book, go to: http://www.codeplex.com/TestingGuidance/Release/ProjectReleases.aspx

from May 14, 2008

Secrets of a Buccaneer Tester

Education has been the key to my success. Early on I chose not to outsource it. I quit high school in 1982, and have lived without the benefits and drawbacks of institutionalized learning ever since. There are no undergraduate degrees in software testing. The certification programs and graduate degree programs I've seen are disturbingly content-free. The leaders in our craft are all self-educated to some degree.

I develop and sell a methodology for software testing that I call Rapid Software Testing. At the time I created it, my ideas contradicted most of the conventional wisdom (and still do, although rapid testing concepts are somewhat less controversial today). This talk will present not my testing methodology, but rather some of the methods by which I arrived at it. I believe these methods are fundamental to self-education on the job, and that self-education is fundamental to competing in a difficult and diverse economy. You can benefit from these methods even if you completely disagree with me about testing, and even if you are Harvard-educated thrice over and wear powdered wigs.

About our speaker: James Bach is a local self-educated software tester and author of the unpublished book How I Learn Stuff: Secrets of a Buccaneer Scholar, as well as the published book Lessons Learned in Software Testing.

James' Slides (PDF)

from March 12, 2008

Distributed Agile

Presented by: Joy Shafer, Senior Test Lead, Microsoft

This presentation reported on the challenges and successes Joy's team experienced while developing software as a service with a globally distributed team using Agile methodologies. We concurrently developed two small, first-version, online services using a development team that was located in Redmond and Moscow, and a test team that was located in Redmond and India. We encountered many interesting and difficult challenges, but were able to successfully overcome them and release high quality software on time, delighting our stakeholders. This experience report highlights our learnings, focusing on several critical success factors, such as well-defined processes, cultural awareness, the importance of relationships and communication, and continuous integration.

About our speaker:

Joy Shafer has been working at Microsoft for three years. She is currently a Senior Test Lead in the Mobile, Voice and Partner Services team. Prior to Microsoft, she tested and managed testers at Quardev, NetManage, STLabs, and Aldus Corp. She also was her own boss for many years, teaching and consulting in the area of software testing methodology, providing services to companies as well as colleges. Joy is an active participant in community QA groups and occasionally speaks on the topic of software quality. Joy holds an MBA in International Business from Stern Graduate School of Business (NYU).

Joy's Slides (PDF)

from January 9, 2008

The Strange World of Problem Solving

Presented by: Dr. John Medina

Regardless of what you may have heard, brain science has very little to say to the practical worlds of business and education, problem-solving included. The very little it can address, however, suggests that current practices haven’t moved very far from their 15th-century roots. Recent findings about human memory, sleep cycles, and physical activity actually suggest a few things business people and educators might try to bring their learning environments into the 21st century. And no, it’s not what your family doctor usually tells you.

About our speaker:

DR. JOHN J. MEDINA is a developmental molecular biologist focused on the genes involved in human brain development and the genetics of psychiatric disorders. He has spent most of his professional life as a private research consultant, working primarily in the biotechnology and pharmaceutical industries on research related to mental health. Medina holds joint affiliate faculty appointments at Seattle Pacific University, where he is the director of the Brain Center for Applied Learning Research, and at the University of Washington School of Medicine, in its Department of Bioengineering.

Medina’s books include: Brain Rules (March 18, 2008), The Genetic Inferno, The Clock of Ages, Depression, What You Need to Know About Alzheimer’s, The Outer Limits of Life, Uncovering the Mystery of AIDS, and Of Serotonin, Dopamine and Antipsychotic Medications.

To see John in action, view his short videos on www.brainrulesbook.com

from November 14, 2007

Please Break My Stuff!

Presented by: Brian Snyder, Vice President of Delivery Technology at Worktank, an integrated Advertising agency

A talk about how a developer learned to sleep well at night by looking at testing as a partnership, not a medieval battle. The aim is to give you, the tester, insight into one developer's mind, and to let you walk away with a better understanding of how to help uncover and report the nasty, lurking bugs that sometimes escape a developer's view.

About the Speaker: Brian Snyder is Vice President of Delivery Technology at Worktank, an integrated Advertising agency. Brian has had a lifelong passion for technology, gadgets, and general geekery, putting in time at Microsoft before moving to the ad agency life. He now leads a small but effective team of developers with a focus on Web sites and services. He is responsible for the company's testing strategy and implementation, and enjoys the challenge of finding ways to carve out time for testing in an insanely deadline driven industry.

from September 12, 2007

SBT Lite: Components of Session-Based Testing

Presented by: Sam Kalman, Director of Quality Assurance at OverTheEdge

Session-Based Test Management (SBTM) is an approach for managing and measuring exploratory testing. Session-Based Testing Lite (SBT Lite) is a streamlined, customizable version of SBTM. This talk will explore all the SBT Lite elements or "Components" which can be mixed and matched to optimize the test team's time. This talk does not require experience or familiarity with SBTM, as every Component will be detailed with real-world examples. This talk is intended for test leads and engineers who currently perform exploratory or ad-hoc testing on their current projects and are looking for ways to improve their testing processes or increase their accountability.
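A core idea of SBTM is that each time-boxed session is debriefed and its minutes are tallied into a task breakdown (test design and execution, bug investigation and reporting, and setup), which makes exploratory work measurable. As an illustrative sketch only (not part of Sam's materials; the field names and sample numbers are invented), the accounting might look like:

```python
# Illustrative SBTM-style session accounting. The Test/Bug/Setup task
# breakdown is the standard one from Session-Based Test Management;
# the session data below is invented for the example.

def summarize_sessions(sessions):
    """Aggregate the Test/Bug/Setup task breakdown across sessions.

    Each session is a dict of minutes spent on test design & execution
    ("test"), bug investigation & reporting ("bug"), and setup
    ("setup"), plus a count of bugs found.
    """
    total = {"test": 0, "bug": 0, "setup": 0, "bugs_found": 0}
    for s in sessions:
        for key in total:
            total[key] += s[key]
    minutes = total["test"] + total["bug"] + total["setup"]
    return {
        "sessions": len(sessions),
        "hours": round(minutes / 60, 1),
        "test_pct": round(100 * total["test"] / minutes),
        "bug_pct": round(100 * total["bug"] / minutes),
        "setup_pct": round(100 * total["setup"] / minutes),
        "bugs_found": total["bugs_found"],
    }

if __name__ == "__main__":
    sessions = [
        {"test": 60, "bug": 20, "setup": 10, "bugs_found": 3},
        {"test": 45, "bug": 30, "setup": 15, "bugs_found": 5},
    ]
    print(summarize_sessions(sessions))
```

The percentages give a lead a quick read on where session time actually went - for instance, a high setup percentage can flag environment problems eating into testing time.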

About the Speaker: Sam Kalman is the Director of Quality Assurance for OverTheEdge, creators of the Unity 3D game engine. For more than two years he worked at Quardev Laboratories with Jon Bach, co-creator of SBTM. During this time he performed both test engineer and test lead duties on two separate SBT Lite projects. Sam is currently active in using SBT Lite for his testing and technical writing duties.

Session-Based Test Lite paper

Sam's Slide Deck (Zip file of information and individual slides)

from May 9, 2007

Pre-STAREAST Presentation:
Top Ten Tendencies that Trap Testers

Presented by: Jon Bach, Manager for Corporate Intellect and Technical Solutions, Quardev, Inc.

A trap is an unidentified problem that limits or obstructs us in some way. We don't intentionally fall into traps, but our behavioral tendencies aim us toward them. For example, have you ever found a great bug and celebrated only to have one of your fellow testers find a bigger bug just one more keystroke away? A tendency to celebrate too soon can make you nearsighted. Have you ever been confused about a behavior you saw during a test and shrugged it off? The tendency to dismiss your confusion as unimportant or irrelevant may make you farsighted — limiting your ability to see a bug right in front of you. In this talk, Jon Bach will demonstrate other limiting tendencies like Stakeholder Trust, Compartmental Thinking, Definition Faith, and more. Testers can't find every bug or run every possible test, but identifying these tendencies can help us avoid traps that might compromise our effectiveness and credibility.

  • How you might be susceptible to traps
  • Ways through (or around) the ten most common traps
  • Exercises that test your situational awareness

Ten Tendencies That Trap Testers Slides (PDF)

from March 14, 2007

Panel Discussion: Steer Your Quality Career with a Career Portfolio

As Quality Professionals, we're all experienced in managing project risk. But, how good are we at managing career risk in the face of offshoring, marketplace shifts, downsizing, and distasteful company politics? One tool that you can use to provide you and your family with job security is creating a career portfolio. Just as a financial portfolio can be used to create wealth, a career portfolio can be used to develop your career assets to earn and learn throughout your life. Also, you can diversify risk so that you always have a number of career choices in front of you.

Panelists:
Brian Branagan, QA Director at Klir Technologies, will lead the panel. He will show how career portfolios are far superior to traditional resumes. He will also show how they can be used to create a personal career path and diversify risk.
Lisa Buffmire, Career Management Consultant at Right Management, will describe the basic elements of a career portfolio based on her work coaching people with the leading provider of career transition services.
Rebecca Warriner, Owner of Woodland Recruiting, will discuss her experience in placing candidates who use a career portfolio approach in their dream jobs.

Related Links:
Tips on Building a Career Portfolio: http://www.quintcareers.com/your_career_portfolio.html
Johanna Rothman article on how to develop a Career Portfolio: http://hiring.inc.com/columns/jrothman/20051130.html
Text of Rebecca Warriner's Talk (PDF): Developing a Career Portfolio

from January 17, 2007

CSI Seattle

Detective Hanf from the Seattle CSI unit will be joining us to chat about crime scene investigation - a study not far from bug investigation, and one that correlates surprisingly closely to software testing. We hope to find some parallels and learn some new techniques - straight from a CSI unit.

Detective Hanf will briefly cover all of the following areas:

  • Describe the make-up of the Seattle Police Crime Scene Investigation Unit;
  • Explain the types of crimes that they respond to and investigate;
  • Outline the steps that are necessary in properly documenting a crime scene;
  • Provide some examples of how it is different from what is shown on TV; and
  • Explain some of the various areas of forensic science and the technology that they deal with.


About our speaker: Detective Hanf is a twenty-six year veteran of law enforcement, with the last twenty-three years being with the Seattle Police Department. He has been an investigator in the Special Assault Unit, Domestic Violence Unit, Robbery Unit, and the Homicide & Assault Unit. He is currently an investigator in the Seattle Police Department’s Crime Scene Investigations (CSI) Unit. Throughout his career, he has gained both the experience and specialty training in the processing of various types of major crime scenes. He is a graduate of the National Forensic Academy in Knoxville, TN. In the Spring of 2004, he was tasked with the formation of the Seattle P.D. CSI Unit, which is now a permanent entity within the department.

Jon Bach's Notes (PDF)

from November 8, 2006

Lightning Talks

On November 8, there was no single speaker… there were 10!

The catch was the format: the “lightning talk,” where each speaker knew they had only five minutes to present their idea (one at a time). When each speaker was invited, they were told that although their time was severely limited, they could speak about anything related to testing in that time – a new idea for status reporting or measurement, a lesson they learned on a project, an experience report about a new technique they tried, or a general test principle they wanted to share.

Strictly timed by a moderator, each speaker spoke either extemporaneously or with slides. When the last “lightning talk” ended, the floor opened to three more audience members who were inspired enough to present their own lightning talk. After that, there was an open networking session between speakers and attendees.

Lightning Talk Speakers and Presentation Materials

Noel Nyman, Microsoft – Finding Software Test Ideas in Other Boxes (PPT Slide Deck)
Pamela Perrott, Construx Software – Personal Measurement (PPT Slide Deck)
Jay Bakst, Technologies Services Group – Applying the Theory of Constraints to Testing (PPT Slide Deck)
Mike Meade, Quardev, Inc. – Experience Report: Testing a Real "virtual" Cockpit (PPT Slide Deck)
James Bullock, Rare Bird Enterprises – The Big Book of Perfect Testing (The Big Book Files, Zip)
Keith Stobie, Microsoft – Metamorphic Testing (PPT Slide Deck)
Jon Bach, Quardev, Inc. – We Need More Storytelling in Testing (PPT Slide Deck)

from September 13, 2006

An Integral Approach to Creating Quality Software

Presented by Brian Branagan and John Forman.

This presentation discussed how software quality professionals can master the complexity of projects and organizational politics by making it possible to identify, organize, and effectively operate upon all the truly critical factors they face.

About our speakers:
Brian Branagan is a software quality management professional with over 19 years' experience in assuring products meet customer and business requirements. He has used an integral approach to his software testing and quality engineering projects to develop his cross-functional expertise in project definition, planning, execution, and customer delivery for both start-ups and Fortune 500 corporations, including Adobe Systems, Getty Images, and RealNetworks.

John Forman is a partner in Integral Development Associates, a Seattle-based consultancy specializing in organizational leadership and management development. He is particularly knowledgeable about the role of organizational culture and values, and the many ways that people make meaning of their work lives. His work has been included in the curriculum of graduate-level classes at the University of Washington and the National Defense University, where he has been a guest lecturer. John is also a founding member of the Integral Institute in Boulder, Colorado.

For more information on Integral Theory see the following links (provided by Brian and John):

Links about Integral Theory
http://integralinstitute.org/public/static/abtapproach.aspx
http://integraldevelopment.com/theory.html

A bibliography of books about Integral Theory and Integral Business
http://integraldevelopment.com/resources_biography.html

Articles about Integral Theory in Practice
http://integraldevelopment.com/resources_articles.html

from May 10, 2006

Measurement Pitfalls

Presented by Steve Smith, Technical Business Consultant for EMC (a manufacturer of intelligent storage systems, software, and services) and independent consultant.

Steve has experienced many measurement program failures as well as a few successes. In this talk he explored with you what separates the successful programs from the unsuccessful. As part of his presentation, Steve led an experiential exercise to demonstrate a typical pitfall in thinking about measurements. He debriefed the exercise to show how to navigate around the pit or, if we are already in it, how to climb out.

About the speaker: Steve Smith works at the intersection of people, management and systems, helping organizations to become more productive.

Although most of his career has been technical — starting in 1975 as a Systems Programmer — over the years, his context has widened. Today, he tackles the most difficult problem businesses face -- managing technical teams so they produce results, rather than merely justifying and deploying the latest technology.

Steve is a consultant, author, speaker, and expert facilitator and coach who honors and enhances the wisdom that lies dormant in groups. He works as a Technical Business Consultant for EMC, a manufacturer of intelligent storage systems, software, and services, and sometimes freelances as an independent consultant to lead project retrospectives. He is also a host and session leader for the Amplifying Your Effectiveness conference (AYE, www.ayeconference.com), held each November in Phoenix, Arizona.

from March 8, 2006

Breaking Down (and Building Up) Exploratory Testing Skill

Presented by Jonathan Bach, Manager, Corporate Intellect and Technical Solutions, Quardev Laboratories

Scenario #1: TesterX is a superhero. For the third time today, he is hot on the trail of a great bug that may save the company millions. The Bug Council, the Triage Team, those War Room generals, they're about to make a decision today on whether to ship. At his desk TesterX works without a net, honing his critical thinking skills and scientific mind to find those last buzzer-beater bugs. Then, he finds it -- the severity 1 stop-ship crash he suspected was there -- and he files it like a scoop reporter on deadline. Other testers look on in awe. "How does he do it so reliably?"

Scenario #2: Program Manager Michelle has been in the triage meeting all day. There are 43 new bugs to look at (the day started with 103) and the minutia and nuance of bugs needs to be hashed out. It is tiring and rigorous, but rewarding. As the list narrows to 42 bugs, up pops a new one to bump it back to 43. She sees it was filed by TesterX, and immediately there is a hush in the room. He has earned a lot of credibility on the project for his abilities to find not only severe bugs, but also many less severe bugs that lead to rich discussions about quality. The new bug is read and the staff in the War Room groans collectively. This one is a meaningful issue that will take time to discuss, as they wonder, "How does he do that?" while thanking the Fates that he is on staff to protect the company from shipping unknown bugs.

In this talk, I will explain the factors that make exploratory testers valuable. It's not that they luckily "stumble" onto a bug or find it "randomly". Exploration is both craft and science -- a teachable set of skills and techniques that we all can learn. It makes no difference if we're a PhD in Advanced Computer Science from CalTech or a Journalism major from a small Maine college. We have these skills within us and use them in our everyday lives. I will describe what these skills and techniques are as well as strategies to practice them.

About the Speaker: Jon Bach is the Corporate Intellect Manager and Senior Test Consultant for Quardev Laboratories, a Seattle test lab specializing in rapid, exploratory testing. He is most known for being co-inventor (with brother James) of Session-Based Test Management (a way to manage and measure exploratory testing) and most recently the developer of Open-Book testing. In his ten-year career, he has led projects for many corporations, including Microsoft, where he was a test manager on Systems Management Server 2.0 and feature lead on Flight Simulator 2004. He has presented at many national and international conferences and is a Program Chair for the 2006 Conference for the Association for Software Testing. He is a featured keynote speaker at this year's STAREast Conference in Orlando (www.sqe.com/stareast).

from January 11, 2006

See the Slides

Evolution of Structured Exploratory Testing at Philips Ultrasound

Presented by Doug Carlton, System Integration Test Manager, and LeAnn Coker, Integration Test Lead, Philips Ultrasound.

Philips Ultrasound is a subsidiary of Royal Philips Electronics, producing consumer electronics, lighting, semiconductors, and medical devices. Philips Ultrasound has facilities in Bothell, Washington and Andover, Massachusetts for the research, design, development, and testing of ultrasound imaging systems.

This presentation will discuss the evolution of exploratory testing starting with the original company, Advanced Technology Laboratories (ATL).

In 1997, exploratory testing was "in the closet" and referred to as "random stress testing." As an FDA regulated medical device company, Philips was required to have an independent Verification and Validation team that focused its efforts on providing documented evidence that top level system requirements were covered. The released ultrasound systems were safe and effective, but quite often, full of latent defects.

In 1999, Doug Carlton approached the Director of Software Development with a proposal to embed software testing professionals in the software development department. This proposal was accepted and two testers were hired. The charter of this group was still focused on generating manual test scripts from requirements, but exploratory testing was now out-of-the-closet as a secondary directive. This Software Integration Test (SIT) team added another tester and became very successful, identifying 75% of the severity 2 or greater defects in the tracking system.

In 2001, the SIT team had pretty much abandoned the requirements-based test scripts in favor of full-time exploratory testing. Doug Carlton and SIT lead Joe Germani teamed up to hire a tester who had a strong background in exploratory testing. Doug contacted James and Jon Bach who forwarded the name of LeAnn Coker.

By 2005, most of the original SIT team had left the company. The two remaining team members were transferred to the formal System Verification and Validation team. Immediately upon transitioning, one of the original SIT members joined a software team as a developer. This last year, the team was re-organized and, with some training by Jon Bach, is now formally devoted to exploratory testing using the following charter:

The Integration Test team emphasizes heuristic exploration of ultrasound features as opposed to stringent coverage of system requirements. The IT team uses a structured exploratory testing methodology that is driven by emerging and evolving device functionality.

The testing target focus is risk-based, using the following priorities:

  1. Identified system and software risks along with derived mitigation requirements.
  2. Identified negative impacts on business goals defined in the business plan.

For each project, the team:

  • Collects and analyzes system and software “intelligence” data for use in identifying targets for exploratory testing
  • Coordinates with the System Engineering Team to identify Feature level and Feature Interaction Matrix (FIM) focus test targets
  • Creates, maintains, and publishes a testing results Dashboard.

Doug Carlton will provide the history of Philips' exploratory testing and LeAnn Coker will describe the new Process for Integration Testing.

About the Speakers:
Doug Carlton holds degrees in Electronic Technology and Computer Science. He has 27 years of hardware, software, and test engineering experience in the Boston, Silicon Valley, and Seattle high-tech locales. During that time, Doug has designed, developed, and tested mass storage, process control, and medical imaging systems. Prior to joining Philips Ultrasound 8 years ago, Doug spent 9 years at a start-up medical imaging company developing system software and test automation. Currently, Doug manages two teams at the Bothell, Washington, USA campus: Automation Test and Integration Test. Doug received R&D excellence awards in 2000 and 2001 and was named a Philips Ultrasound Technical Fellow in 2003.

LeAnn Coker is a Senior Development Engineer and Integration Test Team Lead at Philips Ultrasound in Bothell, WA. She has more than ten years of experience in the IT industry, including time spent as an instructional designer, computer-based training developer, and development engineer. During her six-year testing career, she has primarily focused on exploratory testing, including four years integrating exploratory software testing at a medical device company.


from November 9, 2005

Powerpoint Slides (PDF)

Anatomy of an Attack: Learning testing techniques to help secure your applications against hacker attacks.

Presented by Joe Basirico, Manager, Technology & Security Services with Security Innovation (http://www.securityinnovation.com)

This talk will discuss the basic methodology a hacker would use to compromise a server. Each step will be outlined to explain why the hacker takes it and what information or access the hacker hopes to obtain. After each step has been outlined, the techniques and tools used to complete it will be covered: the techniques describe, in detail, what the hacker can discover by that method, and the tools tie in directly with the techniques to aid the hacker in the discovery of sensitive information.

There are six categories: Data Gathering, Exploitation, Elevating Privileges, Covering Tracks, Installing Tools, and finally Gathering Sensitive Information from the target machine. Each attack category will be covered in depth, arming the tester with the education and tools it takes to help secure software from attack.

About our speaker: Joe has spent the majority of his educational and professional career studying security and developing tools that assist in the discovery of security vulnerabilities and general application problems.

At Security Innovation his primary responsibility is to train testers and developers from HP, EMC, Microsoft, and other reputable Fortune 100 organizations to find security and reliability flaws in applications and to write secure code. Basirico has delivered “Building Secure Applications in an Insecure World” at Compuware’s annual application development conference, OJ.X. He was also chosen to give two presentations at Microsoft’s Professional Developers Conference this year. He is the author and content provider of the company’s “SI Security Report”, a quarterly intelligence report that provides an in-depth analysis of the techniques and tools an attacker could use to compromise enterprises. He has written numerous security whitepapers that focus on vulnerabilities at the source-code level, including a detailed 20-page “Static Analysis Tools” report.


from September 14, 2005

Powerpoint Slides (PDF)

It's Too Darn Big To Test

Presented by Keith Stobie, Test Architect at Microsoft Corporation

Structuring test designs and prioritizing your test effort for large and complex software systems are daunting tasks, ones that have beaten many very good test engineers. If you add concurrency issues and a distributed system architecture to the mix, some would simply throw up their hands.

At Microsoft, where Keith Stobie plies his trade, that is not an option. Keith and others have reengineered their testing, employing dependency analysis for test design, model property static checking, “all pairs” configuration testing, robust unit testing, and more. They employ coverage to successfully help select and prioritize tests and make effective use of random testing including fuzz testing security. Finally, models of their systems help them generate good stochastic tests and act as test oracles for automation.
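
The “all pairs” technique mentioned above cuts an exhaustive configuration matrix down to a small suite in which every pair of parameter values still appears together at least once. A minimal sketch (the parameter names, values, and hand-picked suite are invented for illustration; real projects would generate the suite with a pairwise tool such as PICT):

```python
# Illustrative all-pairs (pairwise) configuration testing. The
# hand-picked suite below covers every value pair in 6 configurations
# instead of the 12 exhaustive combinations.
from itertools import combinations, product

params = {
    "os":      ["Win2003", "Linux", "Solaris"],
    "browser": ["IE6", "Firefox"],
    "db":      ["SQLServer", "Oracle"],
}

suite = [
    ("Win2003", "IE6",     "SQLServer"),
    ("Win2003", "Firefox", "Oracle"),
    ("Linux",   "IE6",     "Oracle"),
    ("Linux",   "Firefox", "SQLServer"),
    ("Solaris", "IE6",     "SQLServer"),
    ("Solaris", "Firefox", "Oracle"),
]

def covers_all_pairs(suite, params):
    """Check that every value pair from every two parameters appears
    together in at least one test configuration."""
    names = list(params)
    for (i, a), (j, b) in combinations(enumerate(names), 2):
        needed = set(product(params[a], params[b]))
        seen = {(row[i], row[j]) for row in suite}
        if needed - seen:
            return False
    return True

print(len(list(product(*params.values()))))          # 12 exhaustive configs
print(len(suite), covers_all_pairs(suite, params))   # 6 True
```

The payoff grows quickly: with ten parameters of four values each, exhaustive testing needs over a million configurations, while a pairwise suite typically needs a few dozen.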

About the speaker: Mr. Stobie plans, designs, and reviews software architecture and tests for Microsoft. His current project, Indigo, involves XML Messaging and Web Services using SOAP. Keith is also active in the Web Services Interoperability organization’s (WS-I.org) Test Working Group creating test tools for Profile analysis and conformance.

Keith directed and instructed in QA and Test process and strategy at BEA Systems where he worked on BEA WebLogic Collaborate, and WebLogic Enterprise. Keith was Test Architect at Informix, designing tests for the Parallel Server product, and Manager of Quality and Process Improvement. With over 20 years in the field, Keith is a leader in testing methodology, tools technology, and quality process. He is a qualified instructor for Systematic Software Testing and software inspections. Keith has been active in the software task group of ASQ, participant in IEEE 2003 and 2003.2 standards on test methods, published several articles, and presented at many quality and testing conferences. Keith has a BS in computer science from Cornell University.


from May 11, 2005

A Theory of Variation in Software Development, Architecture and Project Management

Presented by David Anderson of Microsoft Corporation

Traditional software engineering methods are built on an assumption that software engineering is deterministic and can be accurately planned in advance. Recent agile methods rebel against planning and adopt reactive adaptation to change. However, there is a better way - the introduction of a theory of variation into software engineering. By using lessons from Shewhart, Deming and Wheeler, it is possible to create predictive methods for software development, architecture and project management which embrace uncertainty and absorb change gracefully. The result is a system of quality assurance and continuous improvement for software engineering through the reduction of variation.
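
The Shewhart/Wheeler lineage the talk draws on is usually expressed through process behavior (XmR) charts, which separate routine variation from signals worth investigating. A sketch of the standard limit calculation (the weekly defect counts are invented; the 2.66 constant is Wheeler's scaling factor for individuals charts):

```python
# Sketch of Wheeler-style process behavior (XmR) chart limits applied
# to a software metric. Data values are invented for illustration.

def xmr_limits(values):
    """Compute natural process limits for an individuals (XmR) chart.

    Center line is the mean; the limits are mean +/- 2.66 times the
    average moving range (Wheeler's constant for individual values).
    """
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * avg_mr, mean, mean + 2.66 * avg_mr

# Weekly defect counts: points outside the limits signal a special
# cause worth investigating; points inside are routine variation and
# should not trigger a reaction.
defects = [12, 15, 11, 14, 13, 16, 12, 14]
lo, center, hi = xmr_limits(defects)
for week, count in enumerate(defects, 1):
    flag = "routine" if lo <= count <= hi else "investigate"
    print(f"week {week}: {count} ({flag})")
```

The practical point for quality work is knowing when a change in a metric is noise and when it is a signal, so teams stop chasing ordinary week-to-week fluctuation.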


from March 9, 2005

Powerpoint Slides

Agile Product Management

Presented by Greg Patrick of TechMeth, Inc.

Does the narrow bandwidth, high focus, and short-term cycling of "Agile" development methods undermine the continuity of product management and effective market planning? Experiences with several "Agile" product development environments are examined to identify critical success factors that may help when Product Managers need to survive and thrive with these dynamic methods. One thing seems certain: Product Managers can have nightmares trying to positively influence an "Agile" product development process. One way to deal with this tendency is to shift more responsibility for detailed product definition and vision into the Product Management realm. However, this adjustment creates issues for deriving product specifications that work well for "Agile" processes and product validation.


from January 12, 2005

Powerpoint Slides

The Revolution in Manufacturing Quality - What can we learn?

Presented by Larry Schuiski, President/CEO, AGILEAN Corporation, Productivity for Performance™

There are many stories coming out of manufacturers recently about 90% improvements in quality, 70% reductions in cycle times, 30% increases in productivity. And these revolutionary results are from manufacturers that everyone thought were already pretty efficient. Agile, Lean, Six Sigma, Theory of Constraints, and Total Quality Management are more than buzz words for these manufacturers; they are radically changing how they do business. The topic of this discussion is to review what is happening out on the shop floor and what does, and does not, have a direct impact on quality software manufacturing.


from November 10, 2004

TeamTest 2005 Demo

Presented by Tom Arnold and Jason Anderson

Visual Studio 2005 Team System is an extensible life-cycle tools platform that helps software teams collaborate to reduce the complexity of delivering modern service-oriented solutions and ensure their applications are “Designed For Operations.”

Microsoft’s offerings now include a comprehensive set of proven process frameworks, best practices, prescriptive architecture guidance, and integrated life-cycle tools that enable IT organizations to successfully deliver custom solutions on the Windows Server System.

With Visual Studio 2005 Team System, organizations can:

  • Reduce the complexity of delivering modern service-oriented solutions that are designed for operations.
  • Facilitate collaboration among all members of a software team, including business users, project managers, architects, developers, testers, and operations managers.
  • Customize and extend the Team System with their own internal tools and process frameworks, or choose from over 400 supplemental products from over 200 partners.

In this session, Tom Arnold and Jason Anderson, Program Managers on the team, will drill down on the new development and testing tools being added to Visual Studio 2005.


from September 8, 2004

Powerpoint Slides
Triangle: (the DLL and OCX files in your Windows dir) - Zip file
TaskMaster 1.0
http://workroom-productions.com - James Lyndsay's "exploratory" machines
When You're Tested - Article by Jonathan Bach
Meeting Review - by Kevin Dargie

When *You’re* Tested: Techniques for the Testing Interview

Presented by Jonathan Bach

Software testing is a process of information-discovery. But because not all problems are easily found in the user interface, it takes a broad combination of skills to unearth and report them -- creative, technical, scientific, diplomatic, and communicative, to name a few.

The same is true with job interviews. Finding a person with all of these skills takes, well… the same combination of skills! Just like software, candidates come to the interview, the bulk of their problems hidden, and the interviewer has to use their skills to quickly obtain good information about the candidate's strengths and weaknesses. Jon Bach, a managing test lead at Quardev Laboratories, tells of his experience interviewing hundreds of candidates over the past 10 years and will demonstrate a variety of proven interactive techniques (and a few unproven ones) to help you make the most of the testing interview, no matter which side of the clipboard you're on.

from July 14, 2004

Powerpoint Slides
Rapid Test Planning Zip Files

Rapid Test Planning with Quardev's Joy Shafer

Presented by Joy Shafer

Joy's talk focused on how to plan and execute a reasonable testing effort for short-term or very time-critical projects where there is no time or budget for formal planning. Very valuable and excellent presentation!

from May 12, 2004

Presentation Zip File

Exploratory Testing Bootcamp with Microsoft's Noel Nyman

Presented by Noel Nyman

Noel Nyman will reprise his semi-popular “Exploratory Testing” presentation. An avid exploratory tester, Nyman brings a unique perspective to the field, building on the seminal work of Cem Kaner, James Bach and others to provide unusual insights. He addresses such burning questions as, “How many tests do we really need for Myers’ triangle problem?” “How do I test Windows Notepad?” And, “What does the state of Kansas have to do with Exploratory Testing?” Be sure to put this meeting on your calendar and come join us for pizza, even if you decide to leave before the presentation starts.
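Myers' triangle problem mentioned above is the classic test-design exercise: given three side lengths, classify the triangle as equilateral, isosceles, or scalene, or reject the input. A minimal sketch of the program under test (an illustration, not code from the presentation):

```python
def classify_triangle(a, b, c):
    """Classify three side lengths per Myers' classic exercise."""
    # Reject non-positive sides and violations of the triangle inequality.
    if a <= 0 or b <= 0 or c <= 0:
        return "invalid"
    if a + b <= c or a + c <= b or b + c <= a:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or a == c or b == c:
        return "isosceles"
    return "scalene"
```

The point of the exercise is how many distinct tests even this tiny function needs: boundary values, permutations of the unequal side, zero and negative inputs, non-numeric inputs, and so on.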

About our speaker
Many years ago, Noel Nyman discontinued his exhaustive study of the undocumented op-codes in 65xx series microprocessors and devoted his unusual talents to creating a digital gas engine throttle controller for QS Products. The company declared bankruptcy after selling three units. Noel next moved to Aldus Corporation, the creator of PageMaker. Within six months the company was acquired by Adobe. Undaunted, he moved to Microsoft and joined the Semantic Platforms group, testing proof-of-concept applications using cutting edge technology. The group disbanded a year later. Noel then moved to the Application Compatibility group, on special assignment to produce the test frameworks for the Certified for Windows 2000 logo program. The test plans, called the most rigorous testing outside the military, proved too rigorous for most ISVs and very few apps received the logo. When the logo program terminated, Noel became a regular member of AppCompat. The team immediately started a complex series of reorgs which continue to this day. In April of 2003, Noel joined the Microsoft Identity Integration Server group as a test lead. He uses minimalist heuristic, synergistic, managerial paradigms to enable the Test team to make effective contributions to the quality of MIIS. In his spare time Noel counts cars on the New Jersey Turnpike using traffic Webcams and listens to his extensive collection of Marcel Marceau vinyl records.

from March 10, 2004

Presentation slides

Model-based testing in the key of C#

Presented by Harry Robinson and Michael Corning

What would you give to be able to run zillions of distinct tests on your applications around the clock?

Model-based testing lets you generate software tests from descriptions of an application's behavior. Creating and maintaining a model is easier and more productive than writing individual tests.

We will show you how to create models and then generate and execute tests from those models, using nothing other than the C# language!
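The talk itself used C#; the core idea—derive many tests from one behavioral model instead of writing them individually—can be sketched in a few lines of Python (the on/off switch model and all names here are illustrative, not from the presentation):

```python
import itertools

# Behavioral model of a simple switch: expected next state
# for each (state, action) pair.
TRANSITIONS = {
    ("off", "press"): "on",
    ("on", "press"): "off",
    ("off", "hold"): "off",  # holding from off does nothing
    ("on", "hold"): "off",   # holding always powers down
}

class Switch:
    """Implementation under test (trivially correct here)."""
    def __init__(self):
        self.state = "off"
    def press(self):
        self.state = "on" if self.state == "off" else "off"
    def hold(self):
        self.state = "off"

def generate_tests(length):
    """Enumerate every action sequence of the given length from the model."""
    return itertools.product(["press", "hold"], repeat=length)

def run_test(actions):
    """Walk the model and the implementation in lockstep; fail on divergence."""
    sut, model_state = Switch(), "off"
    for action in actions:
        getattr(sut, action)()
        model_state = TRANSITIONS[(model_state, action)]
        if sut.state != model_state:
            return False
    return True

failures = [seq for seq in generate_tests(4) if not run_test(seq)]
```

Because the sequences are generated rather than hand-written, extending the model (a new state or action) automatically extends the test suite—this is the maintenance advantage the abstract claims over writing individual tests.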

About the speakers:
Harry Robinson is Test Architect for Microsoft's Enterprise Management Division. He also writes a regular column for StickyMinds.com.

Michael Corning is Harry's minion, whose day job includes testing Microsoft Operations Manager and developing model-based testing tools for general use. The publisher of his regular column, "Confessions of a Modeling Bigot" (Wrox Press), has gone out of business.

from January 14, 2004

Presentation Overview
Presentation slides
Presentation Zip File
Meeting Review - by Jonathan Bach

The Three R's of Software Testing

Presented by James Bullock and Brian Branagan

Sometimes software testing goes sideways, with nothing good happening and everyone involved simply pissed-off. Why is that? Can we do anything about it as testers, test managers, people in development, or even as people who depend on software and want more, better and sooner?

"The Three R's of Software Testing" follows the story of John, a test manager in a difficult situation, to illustrate how moods like the "three r's"—resignation, resentment and righteousness—influence behavior in software testing. The talk outlines some ways to manage mood, along with why software testing is particularly prone to these problems. James and Brian will discuss examples of language, thinking, and behavior that can help with the three r's, and will provide a "crib sheet" of tools for managing your own mood while on a project. Because these examples are not silver bullets, references are included for investigating the ideas behind the examples and techniques.

The session will include some simple exercises to work with your experience, connecting the "three r's" directly to what is going on for you.

About the speakers:
James Bullock has been successfully building systems for over 20 years. His experience includes building high-volume embedded control software for heat pumps, plant-floor manufacturing automation, enterprise-scale data warehouses, and software engineering tools for managing multi-million SLOC code bases, as well as creating testing frameworks and tools and testing large-scale IT and web deployments. He has also run small software development projects, major IT deployments, conversion projects, development tool teams, project-specific QA teams, QA departments supporting multiple projects, and an automated testing consulting group.

James is the lead editor for Dorset House's Roundtable on Project Management and co-editor of the Roundtable on Technical Leadership.

James lives in Seattle, where he offers consulting on software engineering processes and methods, project assessment, management or recovery, and IT department planning or reviews. You can email him at: jbullock@rare-bird-ent.com.

Brian Branagan is the Director of Quality Assurance for Getty Images in Seattle. He has successfully led test teams to deliver great products under tight timelines at Aldus, Adobe, and PictureIQ over his 16-year history in software testing and quality engineering.

Brian has studied with Jerry Weinberg and completed his year-long Systems Effectiveness Management training in 1998. He completed Enterprise Performance's two-year "Action in Management" Executive Development program in 2003. Brian is a member of the Project Management Institute, a certified Project Management Professional, and a Certified Software Test Engineer through the Quality Assurance Institute.


from November 11, 2003

Presentation slides
Meeting Review - by Jonathan Bach

When All is Said and Done—hosting a meaningful project retrospective

Presented by Christina Chen
Microsoft Corporation

During the life of a project, teams make mistakes and create good practices, but nobody has the time to analyze and understand them. Too often after the project ends, people don't learn from their experience and repeat earlier mistakes on subsequent projects. A well-moderated retrospective helps a team constructively examine their recently completed project to carry forward knowledge and prevent problems from recurring. This presentation discusses how to effectively moderate a retrospective to help teams get the most out of their experiences.

About the speaker: Christina Chen has been a Program Manager at Microsoft since 1997, working on Combat Flight Simulator and the Midtown Madness series. Her hobbies include playing video games, cooking and moderating retrospectives.


from September 23, 2003

Measuring up: QA and the FDA

Presented by Andrew Jelen
Encompass International Inc.- www.encompassintl.com

Computers and computer software play a critical role in the pharmaceutical industry and are used extensively in the discovery, manufacture, and delivery of drug products. This industry is heavily regulated by the Food and Drug Administration (FDA) and, in the case of computer technology, each firm is accountable for accurate and reliable computer systems. Making this especially challenging is that these companies are increasingly dependent on external suppliers for their computer systems including both hardware and software.

As a prerequisite for FDA compliance, companies providing computer products and services to regulated manufacturers, including commercial off-the-shelf (COTS) software, are required to undergo audits of their product development process. The Parenteral Drug Association (PDA, www.pda.org), an international non-profit pharmaceutical industry association, has established the Audit Repository Center (ARC, www.auditcenter.com) to help both manufacturers and suppliers meet this audit requirement more efficiently and cost-effectively, avoiding the cost and duplicate effort associated with carrying out multiple audits of a company's software development operations.

Andrew Jelen, Managing Director of Encompass International (www.encompassintl.com), a WSA member company, will provide an overview of ARC and the criteria used from PDA Technical Report #32 (TR#32) for evaluating a software supplier's product development process.

Andrew has over 20 years of domestic and international experience in Quality Engineering, Regulatory Compliance, and Process Improvement in technical and management positions with major companies in the Medical, Software and Electronics industries. His undergraduate studies were in Engineering Technology and Business Management and his graduate work is in Information Systems. In addition to being a PDA TR#32 certified auditor, he is also an ASQ Certified Quality Engineer, a Certified Software Quality Engineer, and an IRCA Certified Quality Management Systems Lead Auditor.


from July 8, 2003

Presentation slides
Meeting Review - by Jonathan Bach

It's a Beta... What Do You Expect?

Presented by Hal Bryan
Microsoft Corporation

Beta testing is one of the most frequently misunderstood aspects of software development. Many companies put too much trust in their Beta testers; others don't put nearly enough. In either case, a Beta test will provide little value and prove distracting at best.

However, when managed effectively, a Beta program can be equal parts quality assurance, market research, and public relations, significantly impacting quality at very low cost. Hal Bryan, software test lead at Microsoft, and Beta Coordinator for the long-running Flight Simulator franchise, provides an overview of their Beta programs of the past five years. Hal will discuss working with Beta testers, how to set their expectations and yours, what to do with their feedback, and provide an opportunity to learn from some of his most spectacular mistakes.

About the speaker: Hal Bryan is a software test lead for Microsoft Flight Simulator. He's been testing software for 6 years, as both a contractor and full-time employee at Microsoft, working on Windows 98, Flight Simulator, Combat Flight Simulator, and other game/simulation titles. For 5 years, he's been known for transforming Beta programs into valuable partnerships with some of Microsoft's long-standing customers. In addition, Hal wrote the "Beta Best Practices" document that is being adopted by test leads in the Microsoft Games Group.


from May 13, 2003

Presentation slides
Full article
Meeting Review - by Andrew Custer

Testing In Session: A Way to Manage Exploratory Testing

Presented by Jonathan Bach

Exploratory testing is unscripted and unrehearsed. Like the music in a jam session, its agile and unrestricted nature makes it a widely-used test method. But that nature also carries a stigma—it's often mistaken for "random testing" because it appears to have no structure, which can make it seem an inappropriate method to use on test projects that invite intense scrutiny or require extensive test documentation.

Jonathan Bach is a senior test lead at Quardev Laboratories in Seattle. He's been testing software for 8 years, half of which was for Microsoft as both contract employee and full-time test manager on Systems Management Server. Additionally, Bach worked with Satisfice, Inc. where he was lab manager and test consultant. Bach has published testing articles in STQE and Computer magazines, as well as presented testing talks at Microsoft, the University of Washington, STAR West 2000 in San Jose, and the 2001 Testing Computer Software conference in Washington, DC.


from March 11, 2003

Defect Management: Building a Better Bug Trap

Presented by BJ Rollison
Microsoft Corporation

A discussion about bug tracking best practices — building a taxonomy for bug classification in order to derive better metrics. Mr. Rollison will present ideas on how to write better bug reports and improve the defect tracking system. He will cover general ideas for improving the defect management database, discuss a few common problems with defect tracking systems, outline common problems with bug reports, and provide guidelines for creating effective and meaningful defect reports that can be addressed quickly and accurately. He will also outline a typical bug lifecycle and discuss resolution negotiation.

BJ Rollison is currently the test training manager at Microsoft. He has been with the company for 9 years. He started in '94 as the setup test lead on International versions of Windows 95, then became a test manager for the Internet client division before moving into Microsoft's internal training department about 4 years ago. Prior to Microsoft he worked for an OEM company in Japan building and testing Japanese computer and network solutions. He has published several articles on DOS memory management and Windows optimization in Japan and Canada, and is currently working on a book on International Software Testing.


from January 14, 2003

Meeting Review by Chris Bridenbaugh

The Revolution in Manufacturing Quality – What can we learn?

Presented by Larry Schuiski
President/CEO
Agilean Corporation
www.agilean.com

There are many stories coming out of manufacturers recently about 90% improvements in quality, 70% reductions in cycle times, 30% increases in productivity. And these revolutionary results are from manufacturers that everyone thought were already pretty efficient. Agile, Lean, Six Sigma, Theory of Constraints, and Total Quality Management are more than buzz words for these manufacturers; they are radically changing how they do business. The topic of this discussion is to review what is happening out on the shop floor and its impact on the future of quality software manufacturing.
Please contact Larry Schuiski if you would like to receive a copy of the presentation.

Mr. Schuiski is the founder and CEO of Agilean Corporation. Agilean's consulting services deliver common sense, low-cost, low-tech improvements to business processes. Prior to Agilean, he worked for Attachmate Corporation as the SVP of Worldwide Products and Support. Managing an organization of 400 people, he was responsible for all aspects of the company's strategic planning, product development, and technical support. Mr. Schuiski brings over 20 years of experience in business process improvement, enterprise system implementation, and management consulting. He is the current Chairman of the Board for the WSA. Mr. Schuiski graduated with honors from the University of Washington with a Bachelor of Science in Physics.
