Our best tutorial was actually broken (we just couldn't see it)
12 user complaints revealed what our analytics couldn't
The tutorial looked perfect in the analytics.
5-minute average session time. Best retention of any page. During our review call, the product lead pulled up the dashboard: “See? Users love this one.”
The team wanted to move on to other priorities.
I suggested we add a feedback widget first. Just a small button in the corner. “Was this helpful?”
They were hesitant. “We already know it’s working. The data proves it.”
I pushed anyway.
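(For context, a widget like this is only a few lines of client code. Here is a rough sketch in TypeScript, assuming a hypothetical /api/feedback endpoint and payload shape for illustration; it is not the exact widget from this story.)

```ts
// Minimal "Was this helpful?" widget sketch.
// Assumption: a /api/feedback endpoint that accepts this JSON payload.

type FeedbackPayload = {
  page: string;
  helpful: boolean;
  comment?: string;
};

function sendFeedback(payload: FeedbackPayload): Promise<void> {
  // Fire-and-forget POST; swap in whatever feedback service you actually use.
  return fetch("/api/feedback", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  }).then(() => undefined);
}

function mountFeedbackWidget(container: HTMLElement): void {
  const label = document.createElement("span");
  label.textContent = "Was this helpful?";

  const makeButton = (text: string, helpful: boolean): HTMLButtonElement => {
    const button = document.createElement("button");
    button.textContent = text;
    button.addEventListener("click", () => {
      // Ask for a short comment when the answer is "No".
      const comment = helpful ? undefined : prompt("What went wrong?") ?? undefined;
      void sendFeedback({ page: location.pathname, helpful, comment });
      container.textContent = "Thanks for the feedback!";
    });
    return button;
  };

  container.append(label, makeButton("Yes", true), makeButton("No", false));
}

// Usage: drop a <div id="feedback"></div> in the page corner and call:
// mountFeedbackWidget(document.getElementById("feedback")!);
```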
One week later, a Slack message came through: “Um... you need to see this.”
12 users had left feedback. Not one was positive:
“Step 3 is missing”
“This example doesn’t work”
“I don’t understand this part”
“Where’s the actual code?”
“Is there supposed to be a screenshot here?”
“I’ve read this four times and I’m still confused”
Those 5 minutes weren’t engagement. They were users scrolling up and down, re-reading, trying to figure out what was missing.
The analytics showed what users did. The feedback showed why.
They fixed everything in 48 hours.
Added the missing step. Replaced the broken example. Clarified the confusing parts. Added code snippets where users expected them.
Two weeks later, the feedback changed:
“Finally makes sense”
“This saved me hours”
“Exactly what I needed”
The session time actually dropped to 3 minutes. But now users were finishing successfully instead of giving up confused.
Here’s what I’ve learned: good-looking metrics don’t always mean good experiences. This tutorial had great numbers but frustrated users.
The companies that ship the best products don’t wait for problems to show up in dashboards. They collect user feedback from day one.
Quantitative data shows behavior. Qualitative data explains it.
Your analytics will tell you users spent 5 minutes on a page. Your users will tell you they spent 5 minutes confused.
You need both to see the full picture.

