We added an AI assistant to a docs site. Support tickets went up.
Here’s why that’s actually a good thing.
We added an AI assistant to a client’s docs site. First week, support tickets went up. Not down.
The assistant kept responding “I couldn’t find information about this” to basic questions. Export data. Reset password. Delete account.
Support had been answering these verbally for years. Everyone assumed it was documented, but it wasn’t.
Why basic flows go undocumented
When a team ships something new, they want to document the shiny parts. The feature they spent months building. The complex integration that’s hard to understand without a guide.
Nobody writes a page about how to reset a password. It feels too obvious.
Then support starts getting questions. They answer them in 30 seconds over chat, and writing a whole doc page for that feels like overkill. So it stays verbal. Months pass, new support agents join, and they learn the answers from colleagues, not from docs. At some point everyone just assumes it’s written down somewhere.
An AI assistant breaks this cycle. It can only answer what’s documented. It doesn’t know what support told a user last week, and it doesn’t have access to Slack threads or old tickets. In our case, that’s intentional. We only train it on curated, reviewed documentation.
So when a user asks “How do I export my data?” and the assistant says “I couldn’t find information about this,” that gap is no longer invisible. It shows up in the logs, every day. You can’t ignore it.
Use the gaps
Once we documented the missing flows, tickets dropped. The assistant started answering questions it couldn’t before.
But here’s the part we didn’t expect: the “I don’t know” responses told us exactly what to write next. We just looked at what users asked most and started there.
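That triage step is easy to automate. Here’s a minimal sketch of the idea: scan the assistant’s logs, count the questions that hit the fallback answer, and rank them. The JSON log format, the `top_gaps` helper, and the exact fallback string are assumptions for illustration, not any particular product’s schema.

```python
import json
from collections import Counter

# Assumed fallback phrasing; match whatever your assistant actually returns.
FALLBACK = "I couldn't find information about this"

def top_gaps(log_lines, n=5):
    """Rank the most frequent user questions that hit the fallback answer.

    Each log line is assumed to be JSON like:
      {"question": "How do I export my data?", "answer": "..."}
    """
    gaps = Counter()
    for line in log_lines:
        entry = json.loads(line)
        if FALLBACK in entry["answer"]:
            # Normalize lightly so near-identical questions group together.
            gaps[entry["question"].strip().lower().rstrip("?")] += 1
    return gaps.most_common(n)

# Example with made-up log entries:
logs = [
    '{"question": "How do I export my data?", "answer": "I couldn\'t find information about this."}',
    '{"question": "How do I export my data", "answer": "I couldn\'t find information about this."}',
    '{"question": "What plans do you offer?", "answer": "We offer three plans..."}',
]
print(top_gaps(logs))  # the export question ranks first, with 2 hits
```

The output is your writing backlog, already sorted by demand.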
AI assistants don’t fix documentation problems. They make them visible.