Ever since Apple opened up subscription monetization to more apps in 2016 — and enticed developers with an 85/15 split on revenue from customers who remain subscribed for more than a year — subscription monetization and retention have felt like the Holy Grail for app developers. So much so that Google quickly followed suit, in what appeared to be an example of healthy competition for developers in the mobile OS duopoly.
But how does that split actually work out for most apps? Turns out, the 85/15 split — which Apple is keen to mention anytime developers complain about the App Store rev share — doesn’t have a meaningful impact for most developers. Because churn.
No matter how great an app is, subscribers are going to churn. Sometimes it’s because of a credit card expiring or some other billing issue. And sometimes it’s more of a pause, and the user comes back after a few months. But the majority of churn comes from subscribers who, for whatever reason, decide that the app just isn’t worth paying for anymore. If a subscriber churns before the one-year mark, the developer never sees that 85% split. And even if the user resubscribes, Apple and Google reset the clock if a subscription has lapsed for more than 60 days. Rather convenient… for Apple and Google.
Top mobile apps like Netflix and Spotify report churn rates in the low single digits, but they are the outliers. According to our data, the median churn rate for subscription apps is around 13% for monthly subscriptions and around 50% for annual. Monthly subscription churn is generally a bit higher in the first few months, then it tapers off. But compound an average monthly churn of 13% over twelve months and just under 20% of subscribers are left crossing that magical 85/15 threshold.
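The compounding math above can be sketched in a few lines. This is a simplified model, assuming a flat 13% monthly churn rate (the median cited above); real-world churn is typically front-loaded in the first few months, so this is an approximation, not a forecast:

```python
# Sketch: how monthly churn compounds over a subscriber's first year.
# Assumes a constant monthly churn rate, which is a simplification —
# churn is usually higher early on, then tapers off.

def survival_rate(monthly_churn: float, months: int) -> float:
    """Fraction of a cohort still subscribed after `months` billing cycles."""
    return (1 - monthly_churn) ** months

# Share of monthly subscribers who reach the 85/15 threshold at month 12:
print(f"{survival_rate(0.13, 12):.1%}")  # → 18.8%
```

Even a seemingly modest improvement compounds the same way: dropping churn from 13% to 10% a month raises year-one survival from roughly 19% to about 28%.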
In practice, this means that, for all the hype around the 85/15 split, very few developers will see a meaningful increase in revenue: