
Product design is all about tradeoffs, and when we designed I Done This 2.0, we had a lot to consider. We added new functionality, like blockers. But we also noticed a few patterns in our user behavior data that we weren’t quite sure what to do with.

We find, for example, that a higher volume of short entries helps people feel great about their work, and it’s more interesting for their co-workers to read. Does that mean we should encourage this behavior, and cap entries after a certain number of characters?

Ultimately, we set our default in I Done This 2.0 to shorter entries, but we added an optional button to allow longer entries. We don’t want to fall down the rabbit-hole of offering too many configuration options, but we also don’t want to lose customers who find our product useful. When it comes to exact entry length, we’re passing the baton to those who know their team’s needs best: team leaders. Our goal is to help teams communicate asynchronously, and we’ll always be tinkering with the best ways to achieve that.

But this probes at a bigger question. How do you balance competing factors like behavioral data and customer requests for the product?

It’s (mostly) about your data

Keeping your product development user-centric is key. If it’s not user-centric, it’s like your team is driving without a map. And you have to make lots of small, iterative decisions to stay on track.

The fact is, you need to listen to your customers. The surefire way to make sure your results will be accurate is to see what users actually do.

Data measures what users do in your product, and for how long. It’s a quantitative sign that you’re (hopefully) on the right track.

There are a number of ways to access your users’ behavioral data. Tools like Amplitude give you detailed analytics on how customers use your product, and FullStory actually lets you record and replay user sessions. These tools are life-saving when it comes to understanding user behavior.
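To make that concrete, here’s a minimal sketch of what instrumenting a behavioral event could look like with Amplitude’s Browser SDK. The package and the init/track calls follow Amplitude’s documented JavaScript API, but the event name, its properties, and the API key are hypothetical placeholders, not anything I Done This actually tracks.

```typescript
// Minimal sketch: recording a behavioral event with the Amplitude Browser SDK.
// The "Entry Created" event and its properties are hypothetical examples.
import * as amplitude from '@amplitude/analytics-browser';

// Initialize once, early in your app's lifecycle, with your project API key.
amplitude.init('YOUR_AMPLITUDE_API_KEY');

// Record what the user actually did, along with the context you'll want later
// (for example, how long the entry was and which team it belongs to).
export function trackEntryCreated(entryLength: number, teamId: string): void {
  amplitude.track('Entry Created', {
    entryLength,
    teamId,
    createdAt: new Date().toISOString(),
  });
}
```

Once events like this are flowing, you can segment them by user, team, or cohort, which is what makes the questions below answerable.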

But even if you’re armed with great tools, you still can’t take your data at face value. You always need context for your numbers, and that requires working backwards. It’s great to ask yourself questions like:

- How many steps does it take a user to accomplish a task? Can we reduce that number going forward?
- How do our most reliable customers use our product? Can we make that path easier to follow for others?
- How do customers that churn use the product? What are they missing?

Above all, use data, and the context around it, to learn what your customers could be getting out of your product. You can even use it to ground your conversations with customers, since you’ll have a clear picture of how they use your product.

The customer (data) is almost always right

Looking for patterns in customer feedback can point you towards things that need to be changed. But are your data and customer feedback truly equal?

Behavioral data ≠ customer feedback

The reality is, customer feedback alone is just one part of the “user-centric development” equation. When customers give feedback, things inevitably get lost in translation. Definitely don’t believe what people predict they may do in the future. As you figure out where to go next with your product, behavioral data is salient, especially since self-reported claims are often subjective.

Feedback is your lifeline in product design

That’s not to say you should ignore feedback from customers. Feedback is most effective when it correlates with your data. After all, you’re building the product for their use!

You can’t only react to those who cry the loudest. Sometimes, we find ourselves reacting to the noisiest customers from our support pages and social media. We constantly need to balance those requests with the rest of our customers’ needs.
