The article's point doesn't seem right.
It's measuring an essentially arbitrary quantity in the bigger picture.
Retention percentage doesn't matter as much as the number of users retained, which is what drives total revenue. But then you have to factor in the cost of acquisition, and the cost of infrastructure to serve non-paying users (toy numbers in the sketch below).
My take on it: they ran a multivariate analysis, plugged in the details of their own business model, came up with a result, and then published it as if it were advice applicable to all kinds of companies.
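To put toy numbers on the retained-users point (every value here is assumed for illustration; nothing comes from the article), a quick Python sketch:

    # All numbers are made up to illustrate the trade-off, not taken
    # from the article.
    low_friction  = {"signups": 1000, "retention": 0.14}  # lower rate, more users
    high_friction = {"signups": 300,  "retention": 0.20}  # higher rate, fewer users

    arpu  = 50  # assumed monthly revenue per retained user
    cac   = 3   # assumed acquisition cost per signup
    infra = 1   # assumed monthly infra cost per non-retained signup

    for name, v in [("low friction", low_friction), ("high friction", high_friction)]:
        retained = v["signups"] * v["retention"]
        net = retained * arpu - v["signups"] * cac - (v["signups"] - retained) * infra
        print(f"{name}: retained={retained:.0f}, net={net:.0f}")
    # low friction: retained=140, net=3140; high friction: retained=60, net=1860

Bump the assumed CAC from 3 to 10 and the ranking flips, which is exactly why one company's numbers don't generalize.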
If you make me enter a credit card during a trial, I am going somewhere else. The amount of effort isn't worth it. I have to log on to Ramp, request a virtual card, and explain to finance why I need a new virtual card for a service I haven't even trialed or demoed. Yeah, no thanks.
The irony is that the Equals product would probably be useful for our org, since we use Salesforce and Snowflake heavily.
>"Your goal is for people to get their first moments of value from your product, commonly known as “activation”. All the talk about eliminating friction and removing steps is totally detached from this goal." This is the gist I Guess. Remove as much friction as possible without compromising the "moments of value".
This feels a lot like content marketing rather than anything of value. It's really vague on details and numbers. The effect might not even be about friction; it could easily be that a blank spreadsheet does poorly compared to one with sample data loaded in.
Credit card vs. none: is that actually better retention, or just eliminating those who would never have paid, so retention looks higher? (See the sketch below.)
Just seems like fluff with a clickbait title.
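On the credit-card question, a minimal sketch of that selection effect (cohort sizes entirely made up):

    # Suppose the same 80 "serious" users show up either way, and the
    # credit-card wall mostly screens out casual signups up front.
    signups_no_card = 1000  # 80 serious + 920 casual
    signups_card    = 120   # wall deters most casual users
    retained        = 80    # the serious users retain in both flows

    print(retained / signups_no_card)  # 0.08 -> 8% retention, no card wall
    print(retained / signups_card)     # ~0.67 -> 67% retention, same 80 users

Retention "improves" more than 8x and not a single extra user was retained.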
This advice is absolutely spot on for B2B SaaS businesses. For B2C? Not really.
That line graph is bad and it should feel bad.
I have come to appreciate downloadable apps that work out of the box. That immediate gratification lowers friction because users can walk away if it's not right for them. Don't waste your users' time.
I made a decision early on with my Electron app (Label LIVE, for designing and printing labels from Excel/CSV) to make it fully functional on first start. The only catch is that prints have a randomly placed watermark. If it works for the user, they can sign up for a free trial to remove the watermark for 14 days. If after two weeks they have found value in the product, they can purchase a perpetual or subscription license.
We have always taught that friction is filtering, and that filtering should be used wisely.
I'm doubtful this is causation. But if it is, isn't this basically a clickbait-y way of saying you're trying to create "sunk cost"?
I wish the writer would let us know if the total number of activations went up or down.
...or maybe instead of friction (which gives you burns), let's add engagement triggers. Clickbait title.
Sure, dumbification of software has to stop somewhere. Emacs is not easy to learn and it takes quite a while to see the point of all the complexity and flexibility over just using Notepad. But, at the end of the day, it can still hold its own over modern IDEs for many tasks. Notepad has its uses, but I am not going to pay $999/seat for Notepad. To convince me to pay, you need to explain to me everything I am getting and give me enough skills to have an initial idea of how I can map my unique needs to what is being offered. That's going to take some time.
Minimum viable friction is a good sweet spot to aim for when working out that balance.
I just want a tool. Just let me use it. Thanks!
I wish I could upvote this post more than once. Some good, clear thinking.
That 2nd graph is awful. You're comparing the 8-week retention of two groups with a single line graph?
There aren't even 8 points on that line.
The article is disingenuous. They say retention is down 30%, but we don't know how many signups there were.
The article mentions the retention rate being down 30% but makes no mention of the conversion rate. Was it up by the same amount? Was it up by more? Did MRR change? There's not enough information here to draw any kind of conclusion (see the sketch below).
They also seemed to give up very quickly on the low-friction version. Maybe they could have iterated to reduce time-to-value, by pre-populating some example dummy data or something.
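A back-of-the-envelope sketch of why the 30% figure alone is underdetermined (the decomposition and every number below are assumptions, not figures from the article):

    # Treat MRR as signups x trial conversion x retention x ARPU
    # (an assumed decomposition for illustration).
    def mrr(signups, conversion, retention, arpu=50):
        return signups * conversion * retention * arpu

    with_card    = mrr(signups=300,  conversion=0.30, retention=0.20)  # 900
    without_card = mrr(signups=1000, conversion=0.30, retention=0.14)  # 2100
    print(with_card, without_card)  # retention fell 30%, MRR still more than doubled

Without the signup and conversion numbers, a 30% retention drop is compatible with the low-friction flow making more money, less money, or the same.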