Scaling From Stealth to Unicorn
Turning documentation and pricing truth into a growth engine
5 min read
At a glance
- Role: Growth Product Manager (via ADK Group)
- Problem: Traffic without clear conversion mechanics
- Solution: Intent mapping, SEO strategy, experimentation
- Impact: 100x organic traffic, +63% trial conversion

TL;DR
I discovered the highest-converting part of this company’s website was not the paid “offer” pages. It was the documentation and pricing truth that technical evaluators kept coming back to.
As a Growth Product Manager, I helped build a measurable growth program: a reporting cadence that separated “traffic volume” from “conversion efficiency”, an SEO strategy tied to buyer intent, and an experimentation loop that made website changes safer and faster.
Impact:
- Increased free-trial conversion from 2.7% to 4.4% (+63%) and grew organic traffic 100x (100 to 10,000 monthly visitors).
- Identified and corrected major intent mismatches (high-traffic pages that produced almost no trials versus “boring” pages that reliably converted evaluators).
- Established a repeatable experimentation cadence, including tests where obvious optimizations underperformed the control.
"Sam, you're an excellent partner and you pushed our marketing team to new heights. Hope we get to work together again soon." — CMO
Industry Primer
If you sell infrastructure, your website is not “just marketing”. For many buyers, it is the evaluation environment.
- S3-compatible object storage means buyers can use Amazon S3-style APIs and tooling, but store data on a different provider.
- “Hot” storage is positioned for frequently accessed data (vs deep archival) and is often judged on reliability, performance, and total cost (including egress fees).
- Enterprise evaluators don’t convert because a hero banner is pretty. They convert when the site answers the questions they would ask in a vendor call: pricing mechanics, migration path, security posture, and proof the product will work in their stack.
Context
This company competed in a category dominated by hyperscalers. The bar for credibility was high: technical buyers expected deep documentation, clear pricing models, and content that felt engineered rather than “marketed”.
At the same time, the marketing team invested across multiple channels (paid, organic, partners, content syndication). The risk was predictable: the dashboard showed activity, but it did not explain which pages and channels drove free trials.
Problem: “More Traffic” Was Not the Same as “More Growth”
The team had two failure modes that looked successful on the surface:
1) Paid Volume Masked Funnel Health
Certain campaigns increased sessions and even increased conversions, while quietly degrading conversion efficiency.
When we broke performance down by segment, we found cases where a campaign landing page drew attention, but the users who started there converted orders of magnitude worse than the site average. That is not a “tune the copy” problem. It is a mismatch between the campaign and the buyer’s evaluation mindset.
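The segmented view can be sketched in a few lines. All of the figures below are hypothetical, chosen only to illustrate how a high-volume campaign cohort can convert far below the site average while still padding the blended totals:

```python
# Hypothetical session/trial counts by entry page (illustrative only).
cohorts = {
    "campaign-landing": {"sessions": 12_000, "trials": 6},
    "docs":             {"sessions": 3_000,  "trials": 90},
    "pricing":          {"sessions": 1_500,  "trials": 60},
}

site_sessions = sum(c["sessions"] for c in cohorts.values())
site_trials = sum(c["trials"] for c in cohorts.values())

# The blended number looks fine; the breakdown does not.
print(f"site average:     {site_trials / site_sessions:.2%}")
for name, c in cohorts.items():
    print(f"{name:17s} {c['trials'] / c['sessions']:.2%}")
```

In a breakdown like this, the campaign cohort converts well over an order of magnitude worse than the documentation and pricing cohorts, which is invisible in the blended average.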
2) Local Optimization Without a Shared Model
Website changes were often evaluated by intuition or aesthetics, not by an agreed definition of success.
That created a predictable pattern: a stakeholder would propose a change, the team would debate opinions, a change would ship, then we would argue about what happened (or avoid the argument entirely).
The high-consideration buyer journey also made simple last-click funnel thinking misleading: evaluators bounced between documentation, pricing, and comparison pages, then returned later to start a trial.
Solution: Treat the Website Like a System
Decision 0: Define “Good Traffic” and “Good Conversions”
Before changing anything, I aligned stakeholders on a practical definition of success.
- North star: free-trial submissions
- Guardrail: conversion rate alongside absolute conversions
This prevented a common trap: buying more traffic and celebrating raw conversion counts, even when conversion efficiency was falling.
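As a sketch, the north-star-plus-guardrail check amounts to a few lines. The period numbers here are hypothetical, chosen to show the trap the guardrail catches:

```python
def assess_period(trials, sessions, prev_trials, prev_sessions):
    """North star: free-trial submissions. Guardrail: conversion rate."""
    rate, prev_rate = trials / sessions, prev_trials / prev_sessions
    return {
        "trials_up": trials > prev_trials,        # north star moving?
        "rate_held": rate >= prev_rate,           # guardrail intact?
        "healthy": trials > prev_trials and rate >= prev_rate,
    }

# Hypothetical: a paid push grows trials, but only by buying dilutive traffic.
report = assess_period(trials=300, sessions=20_000,
                       prev_trials=270, prev_sessions=10_000)
print(report)  # trials up, rate not held -> growth that masks falling efficiency
```

The point of the pairing is that neither number alone tells the story: absolute trials can rise while the guardrail quietly fails.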
Decision 1: Map Conversion Intent (Not Just Pageviews)
I built a reporting cadence that answered one question every two weeks:
“What do converters do that non-converters don’t?”
That meant:
- Identifying which pages reliably produced trials versus which pages burned traffic.
- Segmenting by channel and intent (paid vs organic vs direct behaved differently; averaging hid the truth).
- Treating documentation and pricing paths as core conversion paths, not afterthoughts.
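The page-level part of that cadence can be sketched as a small aggregation over session events. The pages, channels, and counts below are illustrative, not real analytics data:

```python
from collections import defaultdict

# Hypothetical session events: (entry page, channel, started_trial).
events = [
    ("/pricing", "organic", True), ("/pricing", "organic", False),
    ("/docs/s3-api", "organic", True), ("/docs/s3-api", "direct", True),
    ("/campaign-lp", "paid", False), ("/campaign-lp", "paid", False),
    ("/campaign-lp", "paid", False), ("/campaign-lp", "paid", True),
]

stats = defaultdict(lambda: [0, 0])  # page -> [trials, sessions]
for page, channel, converted in events:
    stats[page][1] += 1
    stats[page][0] += converted

# Rank pages by trial rate: which pages produce trials vs burn traffic?
for page, (trials, sessions) in sorted(
        stats.items(), key=lambda kv: -kv[1][0] / kv[1][1]):
    print(f"{page:15s} {trials}/{sessions} = {trials / sessions:.0%}")
```

In practice the same aggregation runs per channel as well, since averaging paid, organic, and direct together hides exactly the differences that matter.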
Decision 2: Turn High-Intent Paths Into a Scalable SEO and Content Strategy
Once we had evidence on where evaluators converted, we invested accordingly:
- Pricing and comparison intent (where decision-makers asked cost and risk questions).
- S3-compatibility intent (where technical evaluators validated feasibility).
- UX hygiene and findability (small friction in a high-consideration journey compounds fast).
I also used a simple SEO prioritization framework to force explicit trade-offs:
- Defend: already ranking in positions 1-3 for high-value topics; protect those positions.
- Optimize: ranking in positions 4-15; often the fastest wins.
- Conquest: net-new topics that matter, where authority and intent are plausible.
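The framework reduces to a simple classifier over current ranking positions. The cutoffs and example topics below are illustrative, not the actual keyword set:

```python
def seo_bucket(position):
    """Classify a topic by its current ranking position (hypothetical cutoffs)."""
    if position is None:
        return "conquest"   # not ranking yet: net-new topic
    if position <= 3:
        return "defend"     # protect an existing top position
    if position <= 15:
        return "optimize"   # often the fastest wins
    return "conquest"       # ranking too deep to matter; treat as net-new

# Hypothetical topics with current positions (None = not ranking).
topics = {"s3 compatible storage": 2, "hot storage pricing": 8, "egress fees": None}
print({t: seo_bucket(p) for t, p in topics.items()})
```

The value is less in the code than in the forcing function: every topic lands in exactly one bucket, so trade-offs between defending, optimizing, and conquesting become explicit.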
Decision 3: Build an Experimentation Loop (and Accept Counterintuitive Results)
We treated conversion work like engineering: make a hypothesis, test it, keep or revert.
Some tests worked. Others surprised us. The key was the new norm: changes were reversible and evidence-based.
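The keep-or-revert decision can be sketched as a two-proportion z-test. This is a simplified stand-in for the actual testing stack, and the counts are hypothetical:

```python
from math import erf, sqrt

def keep_or_revert(conv_a, n_a, conv_b, n_b, alpha=0.05):
    """Keep the variant (b) only if it beats control (a) with significance.

    One-sided two-proportion z-test using a pooled standard error.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)               # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))      # pooled standard error
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))        # one-sided: variant > control
    return ("keep" if p_value < alpha and p_b > p_a else "revert", p_value)

# Hypothetical test: an "obvious" optimization that underperformed the control.
decision, p_value = keep_or_revert(conv_a=135, n_a=5_000, conv_b=110, n_b=5_000)
print(decision)  # the variant loses, so it gets reverted
```

The specific statistic matters less than the norm it enforces: every change ships behind a reversible decision rule, so "obvious" losers get rolled back instead of argued about.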
Results
The most durable outcome was not a single test win. It was a capability:
- A repeatable reporting cadence the team could use to prioritize changes.
- A shared mental model of buyer intent (and where the website needed to earn trust).
- A healthier relationship with experimentation: “test, learn, keep, revert” instead of “ship and hope”.
What I'd Do Differently
If I could redo the project, I would introduce direct “voice of customer” inputs earlier into the SEO roadmap. Search demand is useful, but it is not a substitute for an accurate model of the buyer’s day-to-day questions and constraints.
Collaborators
I partnered with marketing leadership, a growth marketing manager, web engineers, and analytics stakeholders to translate data into roadmap decisions and to ship site changes safely.