Trust Is the Funnel
Doubling conversion without creating healthcare trust debt
4 min read
At a glance
- Role: Growth Product Manager (via ADK Group)
- Problem: Acquisition worked, onboarding leaked
- Solution: Instrumentation, funnel redesign, lifecycle recovery
- Impact: 3.8% → 8.4% conversion, 500+ customers

TL;DR
In healthcare, conversion is never “just UX”. It is trust, clarity, and follow-through. When the experience is confusing or feels risky, users bail even if the underlying product is strong.
A digital health startup needed scalable acquisition and a significantly better onboarding conversion rate, but the constraints were real: privacy, compliance, and a user mindset shaped by stigma and skepticism.
I built an evidence-driven growth system: event taxonomy and instrumentation to make the funnel legible, channel-level performance analysis tied to downstream conversion, and lifecycle interventions to recover abandoned intent.
Impact:
- Conversion improved from 3.8% → 8.4%.
- Acquired 500+ paying customers.
- Supported expansion into 14 additional states.
Industry Primer
Healthcare growth tends to fail when teams optimize the wrong thing. Clicks and sign-ups are easy to measure. Trust is not, but it is the real bottleneck.
- Regulated data: Even seemingly “simple” analytics questions can touch sensitive data.
- High-consideration behavior: Users evaluate credibility and safety, not just price and convenience.
- Operational reality: Improvements that increase volume but degrade clarity can overwhelm clinical and support teams.
Context
The engagement sat at an uncomfortable intersection: the founder needed speed (prove traction, refine positioning, expand distribution), but the product required rigor (privacy constraints, careful messaging, and an onboarding experience that didn’t feel like a trap).
The team’s initial instinct was to focus on acquisition and creative. My instinct was different: before “more traffic,” we needed a clear model of where users were dropping and why.
Problem
The funnel wasn't legible enough to optimize
Traffic and spend existed across channels, but the team could not reliably answer:
- Which channels produced users who completed onboarding (not just clicks)?
- Where did users abandon, and what abandonment was recoverable?
- What parts of the flow were “necessary friction” vs “accidental friction”?
Guessing carried trust and compliance risk
In healthcare, low-quality optimizations don’t just miss a metric. They can create downstream damage: confused users, lower confidence, and higher support burden. That risk made stakeholder debates sticky, because “shipping a change” without measurement felt unsafe.
Solution
Decision 1: Instrument first, then optimize
I set up a measurement foundation that made the funnel observable end-to-end:
- a clear event taxonomy aligned to the onboarding steps
- channel-level funnel visibility (impressions → clicks → onboarding progression)
- a repeatable cadence for experiment readouts
This is unglamorous work, but it turns conversion from a debate into a system.
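To make this concrete, here is a minimal sketch of what an onboarding event taxonomy and funnel readout might look like. The event names and funnel steps are illustrative assumptions, not the product's actual schema:

```python
from enum import Enum

# Hypothetical onboarding events, in funnel order. Real step names
# would come from the product's actual onboarding flow.
class OnboardingEvent(Enum):
    LANDING_VIEWED = "landing_viewed"
    SIGNUP_STARTED = "signup_started"
    EMAIL_CAPTURED = "email_captured"
    INTAKE_COMPLETED = "intake_completed"
    PAYMENT_COMPLETED = "payment_completed"

# Ordered steps used to compute step-to-step drop-off.
FUNNEL = list(OnboardingEvent)

def funnel_report(events_by_user: dict[str, set[OnboardingEvent]]) -> list[tuple[str, int]]:
    """Count how many users reached each step: a user 'reaches' a step
    if that event appears anywhere in their history."""
    return [
        (step.value, sum(1 for evs in events_by_user.values() if step in evs))
        for step in FUNNEL
    ]
```

Once every step emits a named event like this, "where do users abandon?" stops being a debate and becomes a query.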
Decision 2: Prioritize channels that behaved like intent
Once the funnel was measurable, we compared performance by channel based on downstream conversion. The data showed Instagram materially outperforming other channels, so we shifted focus toward the channel that produced completions, not just attention.
This improved iteration speed: tighter learning loops and fewer false positives.
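The comparison itself is simple once clicks and completions are joined per channel. A sketch, with hypothetical record fields (the real pipeline would read from the event store):

```python
from collections import defaultdict

# Each row is (channel, clicked, completed_onboarding). Field names
# are illustrative assumptions, not the actual data model.
def conversion_by_channel(rows: list[tuple[str, bool, bool]]) -> dict[str, float]:
    """Downstream conversion per channel: completions / clicks,
    rather than ranking channels by raw click volume."""
    clicks: dict[str, int] = defaultdict(int)
    completions: dict[str, int] = defaultdict(int)
    for channel, clicked, completed in rows:
        if clicked:
            clicks[channel] += 1
            if completed:
                completions[channel] += 1
    return {ch: completions[ch] / clicks[ch] for ch in clicks}
```

The design choice that matters is the denominator: dividing completions by clicks per channel is what separates channels that produce intent from channels that only produce attention.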
Decision 3: Recover abandoned intent with lifecycle interventions
When the flow is complex, conversion work can’t be “remove steps and pray.” It has to be systems-oriented.
We introduced lifecycle improvements that treated abandonment as recoverable:
- moved email capture earlier to enable follow-up
- built a re-engagement series targeted at sign-up drop-off points
This lifted completion meaningfully without undermining clarity.
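As a sketch, the recovery rule can be as simple as "email captured, intake not completed, stalled for more than a threshold." The step names and the 24-hour threshold below are illustrative assumptions, not the actual lifecycle configuration:

```python
from datetime import datetime, timedelta

# Hypothetical selection rule for the first re-engagement email:
# users who gave an email but stalled before completing intake.
def due_for_reengagement(
    users: list[dict],
    now: datetime,
    stall_after: timedelta = timedelta(hours=24),
) -> list[str]:
    return [
        u["id"]
        for u in users
        if u["email_captured"]
        and not u["intake_completed"]
        and now - u["last_seen"] >= stall_after
    ]
```

Moving email capture earlier in the flow is what makes a rule like this possible at all: without an address, abandoned intent is unrecoverable.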
Results
Validated impact
- Conversion improved from 3.8% → 8.4%.
- The re-engagement series improved completion by roughly 37%.
Durable capability
The most important output was not one “winning test”. It was a growth operating system:
- a measurement model the team could reuse
- a shared definition of “success” that respected trust constraints
- a repeatable experiment cadence based on downstream outcomes
What I'd Do Differently
I would formalize a “trust guardrail” earlier: a lightweight set of qualitative signals (confusion, perceived risk, “I don't know what happens next”) tracked alongside conversion. In healthcare, conversion lifts that erode trust often show up later as churn, poor adherence, or support burden.
Collaborators
I partnered with the founder/CEO, engineering, and clinical operations stakeholders to align growth work with compliance constraints and the realities of care delivery.