Suppose you find a bug in a website you use frequently. What do you do? For most websites, the only option is to call customer support and tell them about the issue. But what happens then? Will anyone other than the person you spoke to ever hear that the problem exists? Will they prioritize the issue, or ignore it? Will the person responsible for maintaining that code ever hear about your call?
Most web companies maintain a team of Testers whose job it is to find bugs and get them fixed. Testers have vast knowledge of the products they work on: they know which developers wrote which code, how to explain the seriousness of a bug’s impact, whom to evangelize the fix to, and how to get the fix deployed to production as quickly as possible. Wouldn’t it be great if you knew that the bug you reported went directly to one of these Test people?
Unfortunately, most organizations don’t work this way. Your issue will likely get filed along with other customer complaints, boiled down into a consolidated report at the end of the month, and appear as one of many data points on a spreadsheet presented to a senior manager at a quarterly review. Sometimes this report finds its way back to the engineering team, but that happens less often than most companies would care to admit.
Other web businesses architect expensive solutions into their products designed to catch the bugs you see and send a report directly to the engineering team. This is fantastic—when it works. Inevitably, those systems will only catch the bugs the engineering team is expecting to see and has designed the system to catch. They’ll catch a crash easily enough, but something as simple as a dialog box that won’t go away might not be recognized by their system, and then you’re back to calling support if you want the bug fixed.
What never seems to happen is one of the most obvious solutions: someone from the Test team walking over to the support agents and asking them, “What’s up?”
I’ve always believed that Testers should go to where the bugs are, wherever that path leads. Shouldn’t Test organizations be interested in the issues customers report? Shouldn’t they be more proactive in finding out which issues customers are complaining about?
When I joined the Test team at Zoosk, I was determined to find a way to build a bridge between the customer support team (called Zoosk User Operations) and the Test team. I wanted to know what issues we were missing, and what customers cared about that we might not be testing well enough.
I found that Zoosk already had a culture that was very supportive of user operations. Unlike other organizations I was familiar with, bugs reported by customers would move to the engineering team quickly, and would be resolved within a few days. This is a reflection of Zoosk’s culture of valuing all employees and listening carefully to any issues that might impact customers.
But while the customers’ issues were being addressed, the Test team still wasn’t part of the conversation. We would see the customer complaints only when they were entered into the bug database, or when we were asked to investigate an issue a customer had reported. I was certain we could do better.
I contacted our Director of User Operations, Sejal. She was happy to set up a weekly meeting with me to go over customer reported issues, and have me open bug tickets for them. Before long, she was pinging me on our instant messenger product whenever her team found an issue, and I would drop what I was doing and investigate. We had a handful of occasions where Sejal would report an issue to me, I would file the bug and engage the developer, and the fix would be deployed to production a couple hours later. Those were my favorite moments!
The support agents were thrilled. One of them told me he had worked in support for 20 years and, in all the companies he’d worked for, had never seen customer issues addressed so quickly. Sejal tells me it gives the entire user ops team a huge morale boost to know that engineering is actively listening to their issues.
The developers were also thrilled; customers were responding to their features better, and they had a clearer idea of how to architect solutions that would be less likely to bother customers in the future. As I wrote this post, I asked one of our developers for his thoughts. He told me, “With this feedback from customers, I get a feeling that my code is really being used and someone is actually waiting for *me* to improve it! The customer is not a PM or QA or manager; he/she is a user who actually uses my code and can share feedback easily! Providing these chains all the way from the customer to the engineer creates a feeling of cooperation for everyone that is so damn unique! And I love it.”
I was happiest of all; I hate missing any bug, and I take it personally when something gets by me. Becoming part of the process to fix those bugs gave me a tremendous sense of satisfaction: I could personally see them fixed, and ensure that our processes were in place to prevent them from recurring.
After a few months of this, something I hadn’t predicted happened: I was hearing from Sejal less and less. The customers just weren’t reporting as many bugs as they had before. One day I asked Sejal if she was holding out on me. She told me no, there just weren’t as many software issues. Nowadays it’s not unusual for weeks to go by without customers reporting software bugs. Customers still call in, of course, but with more common issues like billing questions or clarification on how our service works. Sejal tells me that our Test team must be doing some miraculous work.
I don’t think we’re doing anything miraculous. I think we learned from our customers how to test better. Our customers taught us which issues to prioritize, how to better anticipate the kinds of problems they would encounter, and how to generally do a better job of being the kind of customer advocates we aspire to be.
In talking to Testers in other organizations, I’ve found that many use different techniques to gather customer data. Some build systems that scrape the call database for trending issues; others talk directly to the occasional customer or sit in on user testing sessions. Yet, according to Practitest’s 2013 “State of Testing” survey, only 28% of Test teams are in touch with their customers through support or other means. (It’s worth noting that Practitest’s survey is global, and most of the respondents were from outside the U.S. I suspect that if the survey were U.S.-only, the numbers would be even lower.)
It’s amazing to me that more Test professionals don’t make use of this obvious and easily reachable resource. If you are in Test and are looking for a way to improve your product quality, I encourage you to try this as soon as possible. It doesn’t require any special coding skills, infrastructure knowledge, heuristics, or certifications. All you need to do is walk down to your customer support team, smile, and ask them, “What’s up?”
Read Michael Larsen’s response to this post on the TestHead blog: Two Great Tastes that Go Great Together 🙂