Measure Lean Startup Metrics
After getting hooked on Eric Ries' Lean Startup methodology, I figured we needed to go back to basics at Convert Insights: figure out what makes our clients happy clients, learn faster and more objectively, and find the perfect product/market fit.
The problem with this Lean approach is that even though Eric Ries' new book The Lean Startup is good, it does not give me a template or format for measuring our growth. It says you have to pick metrics and monitor changes, but which metrics should we pick? Here are my thoughts on what we could use as metrics for Convert Insights.
Dave McClure’s AARRR metrics, the development cycle, feature use over time, viral coefficient (V-Co), Viral Cycle Time (CT), Customer Lifetime Value (LTV), Customer Acquisition Cost (CAC) and Churn (how many people stop paying). I think we should connect every feature to hard metrics like these to see its impact. Going forward, we need a few key metrics that show how much closer we are getting to the moment of product/market fit.
Before diving in: every metric needs an owner. Each of these metrics will be the responsibility of someone on the team to watch and improve.
First, AARRR. In the product/market fit phase, activation and retention are the most important, followed by revenue to prove a viable business in the B2B market.
- 1: Visit Site: all visits
- 2: Does not abandon: 2 views or stay 10 seconds or 2 clicks
- 1: Happy 1st Visit: 5 views or stays 20 seconds or 10 clicks
- 2: Email us or Whitepaper download
- 1: Open weekly email
- 2: Repeat visitor of app (3+ visits on separate days in last 30 days)
- Refer 1+ users who visit the site. The viral coefficient (V-Co) is connected to this, since it measures referrals per active user; anything over 1 means viral growth.
- Refer 1+ users who sign up for trial
- User generates minimal revenue
- User generates break-even revenue.
- Customer Acquisition Cost (CAC)
- Customer Lifetime Value (LTV), where in the beginning we still need to figure out the average customer lifetime.
- Viral Cycle Time (CT): the shorter, the better, to gain maximum benefit from viral growth. Our focus is a 1-3 day CT. The cycle to shorten is Register -> Tell Friends -> Use Product -> Evaluate. We could even experiment with paying for the trial with a tweet instead of giving it away free.
- Churn: what percentage of clients stop paying
- Measure every feature in use to discover the key features.
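To show why V-Co and Viral Cycle Time matter together, here is a toy growth model (the function name, numbers and the model itself are my own illustration, not our real data): each cycle, every current user refers V-Co new users, so a shorter cycle means more compounding rounds in the same number of days.

```python
def users_after(days, initial_users, v_co, cycle_days):
    """Toy viral growth model (illustrative, not our real data):
    every cycle_days, each user brings in v_co new users."""
    users = float(initial_users)
    for _ in range(days // cycle_days):
        users += users * v_co  # existing users plus their referrals
    return users

# Same V-Co of 1.2 over 30 days: a 3-day cycle compounds 10 times,
# a 5-day cycle only 6 times, so the shorter cycle wins big.
print(users_after(30, 10, 1.2, 3))
print(users_after(30, 10, 1.2, 5))
```

This is why the 1-3 day CT target above is worth real effort: the same referral rate with a shorter cycle produces dramatically more users.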
Everything is analyzed with cohort graphs, since placing customers and trial users in cohorts gives clear evidence of how each metric develops over time.
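The raw numbers behind a cohort graph can be sketched like this (the visit log and user IDs are made-up examples): group users by the month they signed up, then count how many of each cohort are still active in each later month.

```python
from collections import defaultdict

# Hypothetical visit log: (user_id, signup_month, visit_month)
visits = [
    ("u1", "2011-11", "2011-11"), ("u1", "2011-11", "2011-12"),
    ("u2", "2011-11", "2011-11"),
    ("u3", "2011-12", "2011-12"), ("u3", "2011-12", "2012-01"),
]

def cohort_retention(visits):
    """Count distinct active users per month for each signup cohort --
    the table you would plot as a cohort (retention) graph."""
    table = defaultdict(lambda: defaultdict(set))
    for user, cohort, month in visits:
        table[cohort][month].add(user)
    return {cohort: {month: len(users) for month, users in months.items()}
            for cohort, months in table.items()}

print(cohort_retention(visits))
```

Reading across a row shows retention of one cohort over time; reading down a column compares cohorts, which is exactly what tells you whether a change actually moved a metric.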
What do we do with these metrics?
You need to measure to see if you're moving forward or backward. Just accumulating more clients does not tell you anything interesting. In the search for the real problem, and to see if our solution fixes that problem, we need to know if we are getting closer with each change.
- We measure them in two cohorts (trial and paid users) and try to get results before the next development cycle finishes (hopefully every 4-5 days from January onward).
- When each additional experiment leads to less progress, we pivot (as suggested by Eric Ries).
- When 40% of clients say they would be very disappointed without our product, we have hit the growth phase (Dan Martell).
- When do we have product/market fit? When 50% of 30 cold-called clients return our call (Nail It then Scale It).
- LTV should be greater than CAC, and preferably 3x CAC, to have a sustainable SaaS business.
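The LTV-vs-CAC rule above is easy to turn into a quick check (the dollar amounts and churn rate below are made-up examples; the simple LTV formula assumes constant monthly churn):

```python
def ltv(monthly_revenue_per_user, monthly_churn):
    """Simple SaaS lifetime value: average monthly revenue divided by
    monthly churn rate (assumes churn stays constant)."""
    return monthly_revenue_per_user / monthly_churn

def is_sustainable(ltv_value, cac):
    """The rule of thumb above: LTV should be at least 3x CAC."""
    return ltv_value >= 3 * cac

# Example: $50/month per user at 5% monthly churn -> LTV of $1000.
example_ltv = ltv(50, 0.05)
print(example_ltv, is_sustainable(example_ltv, 300))
```

Note how churn drives the whole calculation: halving churn doubles LTV, which is another reason churn gets its own owner on the metrics list.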
Since we now know the above, we will not roll out any new feature without connecting one of the above metrics to it. We have to know why we are rolling out a feature or changing anything in our company, and be able to measure it. So from now on we will build triggers from Convert Insights into our product that measure feature use, and form a hypothesis before we even start building.
Since applying this approach this week, we have already blocked one feature from being developed, because we had no stats on the before situation and so could never have measured the feature's impact afterwards.
Measure feature use
We are blessed with our own tool, since we can use the “Click and track as goal” feature from our A/B testing tool to track all feature use within our app. Not only can we measure URLs loaded, but also clicks on drop-down boxes, links, tabs, images, radio buttons and checkboxes, so we can map which features are most used/clicked. Then we can measure the impact on AARRR and the other metrics if we change anything.
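Once the click events come in, ranking features by use is a one-liner. A minimal sketch, assuming the tracked events arrive as simple records (the event shape and feature names here are hypothetical, not our actual data format):

```python
from collections import Counter

# Hypothetical click events from "Click and track as goal" tracking
events = [
    {"user": "u1", "feature": "ab-test-editor"},
    {"user": "u2", "feature": "ab-test-editor"},
    {"user": "u1", "feature": "report-export"},
    {"user": "u3", "feature": "ab-test-editor"},
]

def feature_ranking(events):
    """Count clicks per feature, most used first."""
    return Counter(event["feature"] for event in events).most_common()

print(feature_ranking(events))
```

A ranking like this is what lets us call something a key feature based on evidence rather than gut feeling.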
We added a kanban board from kanbanery.com to our toolset to make sure we have a maximum of 3 features in development at any time, and each round ends with a validation moment (testing) to see if the metric we hoped would improve actually improved, and to measure the impact on the other metrics as well.