March 24th marked our first official (and virtual) Release Event. We showed off new features like Teams and our upcoming Productboard Integration, shared great insights and best practices from Clubhouse customers, heard our CEO's vision for the future of Product Collaboration, and previewed some of the exciting things that are coming to Clubhouse in 2021.
One of the customers we heard from was Matt Sheaffer, a Senior Engineering Manager at Guru. He spoke with Sidd Penakalapati, a Product Manager here at Clubhouse about how he used our Google Sheets Integration to build a Customer Pain Score that helps teams across Guru strike a balance between shipping features and tackling technical debt and bugs.
You can watch the recording of his 30-minute session below, and find the key questions and points from it below that.
Matt, tell us about Guru and what you do there.
Thanks for having me. My name is Matt, and I'm originally from Philly. I've been part of the Philly startup scene my entire career, and it's been a blast. I came out of college with an engineering background and quickly moved over to project management, partly because I'm probably a little bit of a control freak in that sense. Since then, I've been focusing on how to help teams become highly performant and make good decisions as those teams grow.
Most recently, I shifted at Guru from project manager to engineering manager. It's been a great experience so far.
Today we're talking about making decisions with deeper reporting insights, but before we get there, I'd love to set the stage with how you and your team use Clubhouse today.
All our teams use Clubhouse to track and plan all their work. They're really in there every day in their daily huddles, talking about what they're going to be working on next. That's really what we try to focus on: making it easy to decide what the next thing to do is, so we're not spending a whole bunch of time trying to figure that out, and every day feels easy. That's what we strive for. Try.
The key word. Could you describe the approach you take to leveraging data to improve the efficiency and outcomes for your teams?
Yes. I'll talk about this, but we try to take a decentralized approach to decision-making. All our various teams change frequently; we're regularly spinning up new teams and winding down old ones. We want to empower them to make good decisions, and we also want the leadership team not to have to worry about things going off the rails. There's always that balance of empowering teams and making sure we all have the confidence to move forward with those decisions without second-guessing ourselves.
Talk to us about the reporting insights you put to use with your teams and with Clubhouse.
My mantra is to have data-driven decisions and that data should bring people together to make informed decisions, not make them for you. This actually aligns very well with Guru's approach to knowledge. We're not trying to automate the whole thing. For better or worse, AI and machine learning can't make all these decisions for us, and we still require people to make them, but that doesn't mean that we can't gather the data to make us better at it.
Really, what does this mean? It means a lot of different things, but for me, mostly, it means we're supposed to be deciding what the metrics are. The metrics are the tool in all this and we're the carpenters, not the other way around. That's one of the things I've always loved about Clubhouse: it's so flexible that we can make it work for us and not the other way around.
I'll try to use an example that's near and dear to my heart because I've run into it so many different times. As a project manager, again, you're trying to figure out what to work on next. You might go and talk to an engineering lead: "Hey, this project's about to wrap up. I'm so excited, but what are we going to be working on next?" They're going to say, "We have a lot of technical debt, and we have bugs coming in. We need to focus on that." In that conversation, you agree to set aside 50% of the time for technical debt. That's great.
Then, you happen to have a different conversation with the product manager. It seems like you're now on a different planet. They're saying, "We're behind on our roadmap and we really need to ramp things up in these feature areas, and we really need to do this." Now, you realize you have a problem. If there are any project managers out there listening right now, I can't see you, but I imagine everyone raising their hands and saying, "Oh. Yes." What do you do? You get the engineers and product managers together and then you watch them fight, and then whoever can scream the loudest ends up getting their way.
Why is that happening, and what can we do about it? The first thing is, you need to be conscious of what they're saying and what their reasoning is. If they're just saying, "Oh, well, a couple of bugs came in today, and therefore, that's why we need to make this long-term decision," they're not looking at the data. We're not making data-driven decisions at that point. That's recency bias and so on and so forth.
In another case, you might get, "Okay, well, we got 30 bugs last month, and that's high for us." The other person might go and say, "Yes, we got 30 bugs, but they were all very low priority, small impact and in the previous three days we had two major outages." You can't really compare the two. Well, now we're talking data, but we're not looking at the same data. We don't have a contract of how we're going to be looking at this data. In many ways, that can be more dangerous because you think that you're talking the same language, but you're really not.
Ultimately, what you want to be able to do is get those people on the same page and have everyone agree this is how we look at this stuff and this is how we're going to talk about it. Therefore, when I go back to saying, "Hey, how do we try to make things easy where we're not trying to fight about every little thing, we're moving in motion together," data-driven decisions can really help the team move forward.
What are the ingredients to this? I'll start to tie it in with what we've done with the Clubhouse reporting. You need to start with the desired outcomes. Those two people, I'm picking on product managers and picking on the engineering leads, but really it's an entire company. Those people work for the same company. They should have the same goals, you think, but the product team might have their own metrics and engineering might have their other metrics. That's a bad sign.
We actually tried to break down the term "engineering team" into what we now call PDE, which is product, design, and engineering. We're trying to build a product, we're trying to serve our customers, so let's focus on that. If you can agree on the desired outcomes, what you want to do in terms of how you serve your customers or how you serve your revenue, that's the first building block.
The second is well-aligned metrics. Once you pick these metrics that are tied to your outcome, how well tied together are they? People try to game the system. I'll call out velocity. I can make the velocity go up. I can bump up numbers on my estimates or do all sorts of different things. At the end of the day, that's not going to help anyone. How well is the metric tied to the outcomes, and how can you make sure that everyone's coming in with the proper incentives to make sure that this happens?
Education and buy-in.
Everyone needs to have the same understanding of how these numbers come to be, how we define them, what they are and what they aren't. Then everyone has to be on board because if people ignore them, then it's a worthless metric.
Probably the most overlooked ingredient, in my experience, has been accountability. You can put these things in place, but they're worthless if you never come back and actually have to be answerable to them, to say, "We hit these numbers," or, "We didn't."
Did the team feel part of that win when it does happen? Or when they missed, are they walking away feeling like, "Wow, I have a strong desire to get better and not have this feeling again"? Where you almost walk around with a chip on your shoulder. That instills a sense of drive in everyone that everyone can get behind and either celebrate or be bummed out and go to the bar afterwards and say, "This sucks, let's do better next time."
Then, of course, with anything, feedback. We've gone through multiple iterations of these reporting tools at Guru. Some of the feedback has been very spirited, like, "Hey, this is great, but I really want to be able to do this in addition to it." It takes multiple iterations to really get there. Some teams look at it differently too, so you need to work closely with them. I'm going to use a specific example of how we use these reports at Guru, unless you wanted to hop in, Sidd.
Those are great insights and high-level strategies. I'd love to have a specific example of how you're executing on this at Guru.
Here's the backstory. Back in 2019, which I refer to as 1 BC (before COVID), we were bursting at the seams in terms of decision-makers being able to handle all the stuff we'd been working on. Our goal for the next year was to double in size after doubling in size the previous year, and we were really stressed out over it. We really said, "Okay, well, something needs to shift here. We can't be making all the decisions anymore, and we need to give away our Legos," which is something that we like to say.
We identified our problem, which is how can we successfully scale and adapt in the midst of continued hyper-growth while giving away this decision-making power without freaking out that something terrible is going to happen, even though it probably wouldn't. To skip over a lot of research and part of the decision-making process, we decided to build autonomous teams that we call pods. They're cross-functional teams. Basically, they each have a focus. You can have a search pod, an editing pod, a billing pod, and so on.
They would have product managers, engineers, data science members, whatever they needed to go and do the things they needed to do. They were really responsible for building out their roadmap based on minimal input from company leadership, which was only saying, "Hey, here are some general KPIs we want to be able to move. Ultimately, we want to be able to say we're doing this for our customers in this particular area." They take that and they run with it.
Obviously, there's communication along the way, but we're really trusting them to go and really explore these areas because they really should be the subject matter experts in that field once they get going. They're really going to know better than anyone on the leadership team. When we're constantly spinning up and winding down these teams, how can we avoid massive ramp-ups and chaotic decision making?
Or if you're an engineer, and you switch teams, it's like, "Whoa, I just walked into a totally different company." How do we tie those things together and make sure that we're looking at the decision-making process in the same way? I was really struggling over how to do this. Everyone's looking at me, and I'm like, "Yes." I'm trying to point to someone else. It just kept coming back to me. Then I don't remember how I caught wind of it, but I saw this Google Sheets integration coming. I'm like, "This might be a godsend. This might really save us from this hell that we're wallowing in."
Going back to those main ingredients, we said, "Okay, you know what? We really want to explore this Google Sheets integration. We want to focus on quality first." It was clear that that was our main outcome. Sure, we want to move fast and all that stuff, but before we do any of that, we want to make sure quality is on lockdown. "Matt, let's go build a quality dashboard and make sure that everyone minds their P's and Q's as we go through this process."
We try to figure out, "Okay, well, what's the right metric to properly incentivize people and get them gassed up to actually care about quality and monitoring?" We came up with what we call the customer pain score. That was introduced to put a face on what we were trying to optimize for maintaining a good experience for our customers. It introduces a severity weight score for customer-reported bugs in the team's area of responsibility.
I talked about each one of those teams having an area of responsibility, such as search. We take in those bugs and give each one a priority: critical, high, medium, or low. Based on that, it gets a certain weighting. Then you add all those up and you have your customer pain score. That simplifies all the conversations I just talked about. You get those two people in the room and ask, "Hey, how is quality going?" It's not a debate. It's, "Okay, our customer pain score is 27." "Okay. Cool." What led to that? How do you feel about that? What are you going to do about it? How does it tie into what you want to do with the new product area? It just simplifies the conversation every time you need to have it.
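As a rough illustration, the scheme he describes (a weight per priority level, summed over a team's open customer-reported bugs) might look like the following. The specific weight values and the example bugs are hypothetical; the talk doesn't give Guru's actual numbers.

```python
# Hypothetical severity weights for customer-reported bugs.
# The talk confirms four priority levels but not the weight values.
SEVERITY_WEIGHTS = {"critical": 10, "high": 5, "medium": 2, "low": 1}

def customer_pain_score(bug_priorities):
    """Sum the severity weights of a team's open customer-reported bugs.

    `bug_priorities` is a list of priority strings, e.g. ["high", "low"].
    """
    return sum(SEVERITY_WEIGHTS[p] for p in bug_priorities)

# Example: one critical, two high, and three low-priority bugs.
print(customer_pain_score(["critical", "high", "high", "low", "low", "low"]))
# -> 23 with these hypothetical weights
```

The point of collapsing everything to a single number is exactly what he describes: two people can say "27" and immediately be talking about the same thing.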
I really love that customer pain score because, like you said, it standardizes how each team talks about the issues they're facing and what they need to work on next. I'm curious if you could talk a little bit about what it took to get buy-in from the various teams to settle on this KPI.
Yes. Buy-in is something that I always worry about. You can create a whole bunch of processes that you think are great, but if no one picks up on them, it's all a moot point. That's always something that I'm super worried about. You can do presentations, and if that doesn't work, then you have to massage and meet with the different teams and things like that. Quite honestly, this customer pain score has been the simplest thing for me to roll out in recent memory. It just immediately took off. It connected with people, and they were immediately coming back to me and saying, "This is great. This is something that I didn't even know I needed." They were already fighting over how to get this stuff down.
This is the end result, the quality dashboard here. Up here, you see the quality score has a nice bright color, and this team has managed to keep it in the green, but this also shows historically what's happened with that score. Looking at it here, it's telling me, "Hey, back in December, it was getting pretty gnarly. It was getting pretty high. You should be having a conversation." If I saw that, I might pop in to talk with the team and say, "Hey, how are things going? You realize your score's at like 37 right now. Do you need any help? What's going on? What's the story behind that?"
Then usually they'll say, "We're aware. We think it's coming from here, and this is what we want to do about it." Then hopefully, you see it start to wind down. Even though we're here, you can see this is the stuff that I love to see. It's hovering in the yellow right here; the threshold's 25. They're in Slack talking about it, saying, "Hey, how do we get our pain score down?"
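The green/yellow/red bands on the dashboard could be sketched like this. The yellow threshold of 25 comes from the talk; the red cutoff is an assumption added for illustration.

```python
# Color-band thresholds for the quality dashboard. The yellow threshold
# of 25 is mentioned in the session; the red cutoff of 35 is a guess.
YELLOW_THRESHOLD = 25
RED_THRESHOLD = 35

def score_band(score):
    """Map a customer pain score to a dashboard color."""
    if score < YELLOW_THRESHOLD:
        return "green"
    if score < RED_THRESHOLD:
        return "yellow"
    return "red"

print(score_band(20), score_band(27), score_band(37))  # green yellow red
```

In a spreadsheet this would typically be conditional formatting rather than code, but the thresholds play the same role: a shared definition of when a team should start having a conversation.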
Then this is someone from another team popping in being jealous. There's a little bit of competition, right? Saying, "Hey, you Photoshopped this." That's what makes it all fun too. This is our CTO coming in here and celebrating the fact that it went down to the green. I'm surprised, but buy-in has not been an issue here, which I'm super thankful for.
Well, the teams immediately tried to get it to drop, and they did, they were successful, but then, as bugs tend to do, they start to creep up. We build new features and create technical debt as a result, inevitably. Now, teams are starting to be like, "Oh, dude, we're in the yellow. We're in the red again. How did this happen?"
There's an elevated sense of awareness: "Okay, maybe our wave shouldn't be getting choppy like this, and what can we do to smooth it out?" It goes back to, how do we make every day feel as easy as possible, instead of, "Okay, fix bugs, fix bugs, okay. Now, go build product a hundred percent." "Oh, crap. We just created a whole bunch of bugs." Don't do that. It's a vicious cycle. The rest of these views, which I won't get into in too much detail now, help support the story and the conversations that we want those teams to be having.
Our customer pain score is this, and it's trending in this direction. Why is that? Where are the bugs coming from? These are the different feature areas they're coming from, a certain card UI, the editor, and teams can identify where their stuff is coming from. What's the volume? Our score might be steady, but we might be spending half our time working on bugs just to keep it flat. Where are they coming from over time? That can help you identify where the debt is coming from and keep that line in view.
What was the experience like using the Google Sheets integration to build this? Did you have any hiccups, any problems?
No. In fact, the updating and the stability of it have been absolutely great. I think the biggest challenge for me, coming from a background in Excel, was trying to figure out how you do all that stuff in Google Sheets. Usually, it's a Google search away, luckily.
There are a couple of things we needed to create custom fields for. For instance, when a customer reports a bug, we put a label on it. I would just create a couple of extra custom fields that look for the existence of that label, to make it easier to report on here. In terms of the data that Clubhouse is providing, it's been an excellent experience.
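A rough sketch of the kind of derived field he describes: flagging exported stories whose labels include a customer-reported tag. The label name, the column layout, and the semicolon separator are all assumptions for illustration, not Clubhouse's actual export format.

```python
# Hypothetical label marking customer-reported bugs; the actual label
# name and export column layout aren't given in the talk.
CUSTOMER_LABEL = "customer-reported"

def flag_customer_bugs(rows):
    """Add an 'is_customer_bug' field based on a row's labels.

    Each row is a dict with a semicolon-separated 'labels' column,
    mimicking one flattened row of an exported story.
    """
    for row in rows:
        labels = {label.strip() for label in row.get("labels", "").split(";")}
        row["is_customer_bug"] = CUSTOMER_LABEL in labels
    return rows

flagged = flag_customer_bugs([
    {"name": "Search crash", "labels": "bug; customer-reported"},
    {"name": "Refactor indexer", "labels": "tech-debt"},
])
print([row["is_customer_bug"] for row in flagged])  # [True, False]
```

In the actual dashboard this would be a spreadsheet formula over the synced data rather than a script, but the idea is the same: a derived boolean column that makes customer-reported bugs easy to filter and count.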
What has the impact been since you've implemented this customer pain score?
We doubled down on quality. We had a couple of incidents where we just said, no, never again. We need to make sure all this stuff is on lockdown. You can see that after the creation of these autonomous teams, it was easy for teams to be hyper-focused on building new product, and you can see this pain score grow and grow. The score didn't even exist at this point; this is looking at it historically.
When we introduced the concept of the customer pain score, you can see it starting to reverse and melt away. It really gave visibility into what was actually happening and allowed people to attack it in a specific way, and celebrate it at the same time. This was a huge success in that sense.
Do you have any advice for others who may be curious about building out a similar program or dashboarding to implement for their teams?
First, ask yourself what outcomes you want, and experiment. I built several light versions of this and put them in front of people. Some people's eyes lit up and some people's eyes rolled into the back of their heads. There was a lot of good insight and understanding in that. I could have gone and spent tens of hours building something only to find out that it sucked, instead of just being iterative and getting feedback along the way. It also helps the team feel a sense of ownership in it as well.