Illustrative case study
Users signed up but never activated. Here is what the founder changed
120 people were signing up every week. Only 18 ever did anything inside the product. The landing page was fine. Acquisition was working. The problem was everything that happened after the signup button.
- Activation rate: 15% → 34%
- Time to first value: 12 min → 4 min
- Day-7 retention lift: +60%
This is an illustrative case study. It is based on a common pattern seen in early-stage SaaS products where signup numbers mask an activation problem. The numbers and context are modeled from typical scenarios, not drawn from a specific named customer. If your product is getting signups but struggling with activation, the diagnostic and fix approach here applies directly.
Quick summary
A solo founder built a lightweight collaboration tool for small remote teams. Traffic was growing. The landing page was converting at 12%. Every week, 120 new users were creating accounts.
On paper, this looked like a healthy early-stage product.
But very few of those signups were actually using the product. Only about 18 per week completed the first meaningful action. The rest created accounts, looked around for a few minutes, and left. Most never came back.
The problem was not traffic. It was not the landing page. It was not the product itself. It was the path between signup and first value. Onboarding had six steps, took about twelve minutes, and never clearly told users what to do first. The founder simplified it to two steps, added a guided first action, and moved all the setup work to after the value moment.
Activation rose from 15% to 34%. Time to first value dropped from twelve minutes to four. Day-7 retention among activated users improved by 60%.
The signup numbers looked like success
The product had been live for about eight months. The founder had spent the first four months building features and the last four focused on distribution. That distribution work was paying off. Organic search traffic was growing. A few community posts had been well received. Weekly traffic was holding steady around 1,000 visitors, and the signup rate was sitting at about 12%.
120 signups per week. For a bootstrapped product with no paid acquisition, this felt like real traction. The founder was tracking traffic and signups as the primary metrics. Both were moving in the right direction.
There was no obvious reason to worry. The dashboard showed green numbers. The product was adding new accounts every day. The monthly growth chart looked like progress.
But the founder had a lingering question that the dashboard was not answering: were these users actually doing anything?
What the activation data revealed
The founder had defined an activation event months earlier but had never tracked it seriously. It was simple: creating the first shared workspace, the product's core action and the point where the tool became genuinely useful.
When they finally pulled the numbers, the gap was stark.
Of 120 weekly signups, only 18 completed that first action. That is a 15% activation rate.
85% of new users were creating accounts and leaving before they experienced the product doing anything useful. They were signing up, looking at the interface, and leaving without a clear impression of what the tool actually did for them.
This is the pattern described in detail in what is user activation: the gap between a signup (which measures intent) and activation (which measures experienced value). Intent is necessary but not sufficient. If the product does not deliver on the intent quickly, the user moves on.
The founder had been watching signups as a proxy for success. Signups were measuring how many people were interested. Activation was measuring how many of them actually got to the part where the product is useful. Those two numbers told very different stories.
Where users were getting stuck
The founder ran through the product's onboarding flow as if they were a new user. They used an incognito window, created a fresh account, and timed every step.
The experience looked like this:
1. Account creation. Email, password, company name. About 90 seconds. Fine.
2. Welcome screen. A large page with a short greeting, an overview of the product's three main features, and a "Get started" button. The button led to step 3, not to the product itself. This felt like a delay, not a welcome.
3. Preferences setup. A form asking about team size, industry, primary use case, and notification preferences. The founder had built this to personalize the experience, but in practice, a new user has no context for answering these questions yet. They have not seen the product work. They do not know what their "primary use case" will be. The form took about two minutes and produced no visible change in the product.
4. Feature tour. A five-screen slideshow showing the product's main features with annotated screenshots. The slides were well designed. They were also the wrong thing to show a user who has not yet touched the product. Reading about features is not the same as experiencing them. About 40% of users abandoned during this step.
5. Team invite. A prompt to invite team members by email. For a solo user evaluating the product, this was irrelevant. For someone on a small team, it was premature. They had not seen what the tool does yet. Asking them to bring other people into it felt like a commitment before a demonstration.
6. Workspace configuration. A settings page for the first workspace: naming conventions, default views, color scheme. This is useful eventually. It is not useful before the user has created their first workspace.
After all six steps, the user finally landed in an empty workspace. No sample data. No guidance on what to do first. Just a blank screen with a sidebar and some icons.
Total time: about twelve minutes. Twelve minutes of setup before seeing the product do anything.
Why onboarding was failing
The problem was structural, not cosmetic. Each individual step made sense in isolation. Preferences help personalize the experience. A feature tour helps users understand capabilities. Team invites help adoption. Configuration helps the workspace feel right.
But the sequence was wrong. Every one of those steps was placed before the user experienced any value. The product was asking for investment before demonstrating return. This is the classic time to first value problem: every step between signup and the moment the product does something useful is overhead that competes with the user's patience.
The user's mental model at the time of signup is: "I want to see what this does." The onboarding's mental model was: "Let me set everything up for you first." Those two models are in direct conflict.
The result was predictable. Users started the flow, got through two or three steps, realized they still had not seen the product work, and left. They did not decide the product was bad. They just ran out of patience before the product had a chance to prove itself.
This is also why users do not complete signup in so many products. The path between "I am interested" and "I can see this is useful" is where most early-stage products lose the majority of their new users.
What the founder changed
The fix took about a week of focused work. The core principle was simple: get users to the first value moment as fast as possible, and move everything else to after that moment.
Change 1: Removed the welcome screen, preferences form, and feature tour entirely from the initial flow.
These were not deleted. They were moved. Preferences became available in account settings, accessible anytime. The feature tour became an optional link in the sidebar labeled "Quick tips." The welcome screen was replaced by a single line of text on the first workspace page.
Three steps removed from the path to first value.
Change 2: After account creation, users were taken directly to a workspace with one prompt.
The prompt said: "Create your first shared workspace. Give it a name and add one task." Below the prompt was a text field for the workspace name, a button, and nothing else.
One clear action. One described outcome. No competing choices.
Change 3: The workspace was pre-populated with sample content.
Instead of an empty workspace after creation, the user landed in a workspace with three example tasks, one team member placeholder, and a clear note explaining what each element represented. The product looked alive. The user could see what the tool would look like with real content before adding any of their own.
This addressed the empty state problem. An empty product feels like it is not working. A populated product feels like it is already useful.
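A minimal sketch of what pre-populating a workspace could look like. This is hypothetical: the field names, sample tasks, and data shapes are invented for illustration, not taken from the product described above.

```python
# Hypothetical sketch: seed a new workspace so the first screen is never
# blank. Field names and sample content are invented for illustration;
# a real implementation would use the product's own data models.
def seed_sample_content(workspace: dict) -> dict:
    workspace["tasks"] = [
        {"title": "Example: add your first task", "is_sample": True},
        {"title": "Example: assign a task to a teammate", "is_sample": True},
        {"title": "Example: mark a task as done", "is_sample": True},
    ]
    workspace["members"] = [
        {"name": "Sample teammate", "is_placeholder": True},
    ]
    workspace["banner"] = (
        "These are examples of how your workspace will look. "
        "Delete them whenever you like."
    )
    return workspace

ws = seed_sample_content({"name": "My first workspace"})
```

Flagging sample items explicitly (here via an assumed `is_sample` field) makes it easy to hide or remove them once the user adds real content, so the demonstration never competes with their own data.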
Change 4: Team invites and configuration were moved to a gentle nudge after the first session.
24 hours after signup, users who had created a workspace received a short email: "Your workspace is set up. Want to invite your team?" This was the same prompt that had been step 5 of onboarding. Moving it to after the first session meant users had context for the question: they had already seen the product work and could make an informed decision about whether to bring others in.
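The selection logic for that nudge can be sketched in a few lines. This is an assumption-laden illustration, not the founder's actual code: the 24-to-48-hour window (to avoid re-sending on a daily run) and all field names are invented.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of selecting users for the 24-hour invite nudge.
# A user qualifies if they signed up 24-48 hours ago, created a workspace,
# and has not yet invited anyone. Field names and the window are invented.
def due_for_invite_nudge(users: list[dict], now: datetime) -> list[str]:
    due = []
    for user in users:
        age = now - user["signed_up_at"]
        if (timedelta(hours=24) <= age < timedelta(hours=48)
                and user["created_workspace"]
                and not user["invited_team"]):
            due.append(user["email"])
    return due

now = datetime(2024, 6, 2, 9, 0)
users = [
    {"email": "a@example.com", "signed_up_at": now - timedelta(hours=30),
     "created_workspace": True, "invited_team": False},   # gets the nudge
    {"email": "b@example.com", "signed_up_at": now - timedelta(hours=30),
     "created_workspace": False, "invited_team": False},  # never activated: skip
]
```

Note the key design choice from the case study encoded in the condition: only users who already created a workspace (i.e. experienced the product) are asked to invite others.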
The new flow was: create account, name your workspace, see it populated, start using it. Two steps, about four minutes, and the user experiences the product's core value before being asked to configure anything.
Results after the changes
The founder measured results over three weeks after the new flow went live.
Activation rate went from 15% to 34%. More than twice as many signups were completing the first meaningful action. The product was reaching users before their patience expired.
Time to first value dropped from 12 minutes to about 4 minutes. This was the most direct measurement of the structural change. The path was shorter, clearer, and produced the value moment before the setup overhead.
Day-7 retention among activated users improved by 60%. This was a downstream effect the founder had not expected. Users who activated in the new flow returned at a higher rate than users who had activated in the old flow. The likely explanation: users who reached first value in four minutes formed a stronger mental model of what the product does for them. That model is what brings users back. A user who activated after twelve minutes of friction may have formed a weaker impression.
The total number of weekly signups was unchanged. The landing page had not been touched. Traffic sources were the same. The only change was what happened after the signup button, and that change doubled the number of users who actually experienced the product.
What other founders can learn from this
Signups are not a success metric. They are an intent metric. A signup tells you someone was interested enough to create an account. It does not tell you they experienced the product, understood its value, or have any reason to return. If you are tracking signups as your primary growth number, you may be missing the real problem entirely.
Measure activation separately from conversion. Conversion rate (visitors to signups) and activation rate (signups to meaningful first action) are different metrics that diagnose different problems. A product can have a healthy conversion rate and a terrible activation rate. That combination means your marketing works and your onboarding does not. You cannot see this if you only track the top of the funnel.
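The distinction is easy to make concrete. A minimal sketch, using the funnel numbers from this scenario (1,000 weekly visitors, 120 signups, 18 activations): the two rates divide by different denominators, which is why they diagnose different problems.

```python
# Conversion and activation are separate ratios over different
# denominators. Numbers below are the ones from this scenario.
def conversion_rate(visitors: int, signups: int) -> float:
    """Visitors -> signups: measures marketing and the landing page."""
    return signups / visitors

def activation_rate(signups: int, activations: int) -> float:
    """Signups -> first meaningful action: measures onboarding."""
    return activations / signups

weekly_visitors, weekly_signups, weekly_activations = 1000, 120, 18

print(f"Conversion: {conversion_rate(weekly_visitors, weekly_signups):.0%}")    # 12%
print(f"Activation: {activation_rate(weekly_signups, weekly_activations):.0%}")  # 15%
```

A healthy 12% conversion rate sat directly on top of a 15% activation rate for months; tracking only the first number is exactly how the gap stayed invisible.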
Every step before first value is overhead. This is the principle that drives onboarding design: the only steps that should sit between signup and the first value moment are the ones strictly required to produce that value. Everything else belongs after. Preferences, team invites, feature tours, configuration screens, and welcome pages all feel helpful, but they all delay the moment the user sees the product working.
Empty states kill activation. A blank dashboard or an empty workspace communicates "this product has nothing in it yet." That is technically true but experientially devastating. Users do not have the imagination to project what a populated product will look like from a blank screen. Pre-populate with sample data. Show the product as it will look when it is working. Let the user see the output before they invest in the input.
The first session is the entire product for most new users. If a user does not experience value in their first session, they have almost no reason to return. The first session is not a preview. It is the audition. Everything about your product's long-term retention starts with whether the first three to five minutes answered the question: "Is this worth my time?"
Time the flow yourself. Open an incognito window, create a new account, and time yourself from signup to the moment the product does something useful. If it takes more than five minutes, users with less patience than you (which is most of them) are leaving before they get there. This exercise costs ten minutes and reveals more than a week of dashboard analysis.
FAQ
What is a good activation rate for an early-stage SaaS product? It depends on product type, but for most web-based SaaS tools, 25% to 40% within the first 7 days is a reasonable range. Below 20% is a strong signal that onboarding has a structural problem. Above 50% means your path to first value is working well. The more useful comparison is your own trend over time: is the number improving as you simplify the path?
How do I define my activation event? Choose the first action that delivers the product's core value to the user. Not a setup step, not a configuration action, and not a passive event like "viewed the dashboard." It should be the moment the user got something useful from the product. For a project tool, it might be creating a task. For an analytics product, it might be receiving the first insight. For more detail, what is user activation covers how to choose and measure this event.
Should I remove onboarding steps entirely? Not necessarily. The goal is to move non-essential steps to after the first value moment, not to delete them. Preferences, invites, and configuration are useful. They just should not block the user from experiencing the product. Put them in settings. Offer them via email the next day. Make them available without making them mandatory.
How long should time to first value be? Under five minutes is strong for most web-based products. Five to ten minutes is acceptable. Over ten minutes is risky, and over twenty means you are relying on exceptionally patient users. The time to first value guide covers benchmarks and reduction strategies in detail.
What if my product requires real data before it can show value? Products like analytics tools, monitoring platforms, and data processors face this constraint directly. The solution is sample data: show users what the output will look like using fictional but representative content. They see the product working immediately, understand what they are setting up, and the wait for real data feels purposeful rather than empty.
See this kind of insight in your own product
The gap between signups and activation in this scenario went unnoticed for months. The founder was watching the right traffic chart but the wrong success metric. The data was available the entire time. Nothing was surfacing it.
Muro is built to flag this kind of gap automatically. When signups are healthy but activation is low, Muro tells you. When time to first value is climbing, Muro tells you. You do not need to remember to check the right split. The right split is checked for you.
If you are not sure what your activation rate looks like right now, that is the question worth answering first.