Comparing the growth and user retention of Arbitrum with Polygon and Optimism: Lessons for the ZKsync and Starknet airdrops and other user incentive strategies.
It’s been more than half a year since the Arbitrum airdrop, and the whole web3 community has learned a lot from it. Proof of that is the number of Dune dashboards replicating its criteria for other protocols. The Arbitrum airdrop can be seen as a success since, as we will show below, it delivered a substantial boost in both user acquisition and on-chain activity.
However, there is always room for improvement. Using Arbitrum as an example, this article serves as a general airdrop guide, exploring the effect that airdrops have on protocol users. We will also show that the selection of users for the Arbitrum airdrop could have been optimized to extend its positive impact over time, a lesson that applies to any reward system, including the potentially upcoming Starknet and ZKsync airdrops.
The eligibility for the Arbitrum airdrop was determined via a point-based system, where users could earn points based on their on-chain activity. Once they reached a specified threshold of points, users became eligible for the airdrop. Check out this post for further details.
During the analysis, the Nansen analytics team went through the process of:
They arrived at the following categories:
Quoting the Nansen team:
Each “organic” activity earned positive (behaviors to encourage) or negative points (behaviors to discourage). The amount of tokens that a wallet received in the airdrop was a function of how many points it collected. In order to participate, a wallet had to hit a minimum threshold of three points. The more points earned, the higher the allocation claim.
And the points awarded were the following:
Positive criteria, one point each:
Negative criteria, minus one point each:
The criteria, as we can see, focus on rewarding users who participated in the Arbitrum network consistently over time and were early adopters of new features.
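To make these mechanics concrete, here is a minimal sketch of how such a point-based eligibility system could be implemented. Only the +1/−1 scoring, the three-point threshold, and the idea that allocation grows with points come from the criteria above; the specific predicates, field names, and token-per-point rate are our own illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical wallet activity record; field names are illustrative,
# not Arbitrum's actual data schema.
@dataclass
class WalletActivity:
    months_active: int
    tx_count: int
    bridged_value_usd: float
    funded_only_by_faucet: bool  # example of a behavior to discourage

# Each positive criterion awards +1 point, each negative one subtracts 1,
# mirroring the scheme described above. The predicates are made up.
POSITIVE_CRITERIA: list[Callable[[WalletActivity], bool]] = [
    lambda w: w.months_active >= 2,
    lambda w: w.tx_count >= 10,
    lambda w: w.bridged_value_usd >= 10_000,
]
NEGATIVE_CRITERIA: list[Callable[[WalletActivity], bool]] = [
    lambda w: w.funded_only_by_faucet,
]

MIN_POINTS = 3  # the minimum threshold mentioned in the Nansen quote

def score(wallet: WalletActivity) -> int:
    points = sum(1 for crit in POSITIVE_CRITERIA if crit(wallet))
    points -= sum(1 for crit in NEGATIVE_CRITERIA if crit(wallet))
    return points

def allocation(wallet: WalletActivity, tokens_per_point: float = 1_000) -> float:
    """Allocation grows with points; zero below the eligibility threshold."""
    points = score(wallet)
    return points * tokens_per_point if points >= MIN_POINTS else 0.0

wallet = WalletActivity(months_active=6, tx_count=40,
                        bridged_value_usd=25_000, funded_only_by_faucet=False)
print(score(wallet), allocation(wallet))  # 3 points -> 3000.0 tokens
```

A simple linear allocation is used here for brevity; the real distribution curve can be any monotonic function of points.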
The airdrop had two primary goals:
Here, we will focus on understanding how well the airdrop achieved its second goal of fostering a bond between the project and its users. For this, we will study the effects of the Arbitrum airdrop in two distinct dimensions: 1) user acquisition and 2) user retention.
To understand the impact of the Arbitrum airdrop, we need to compare its results with similar projects. For this reason, we present data for Arbitrum, Polygon, and Optimism pre- and post-airdrop.
In the chart, we can see that Optimism and Arbitrum originally had a comparable number of new users each month. This trend changed considerably at the time of the airdrop, when Arbitrum quadrupled its user acquisition, reaching numbers almost comparable to Polygon's.
Indeed, there is a significant boost in user acquisition that lasts roughly three months and then slowly decays until it again matches Optimism's numbers.
The impact is clear, and one question that arises from this data is: could a project prolong the effect of the boost over time by offering continuous rewards rather than a one-time airdrop?
User acquisition is worthless if those users do not stick with the project afterwards. For this reason, we extracted the retention data from Token Terminal's cohort analysis for each of the projects.
In this cohort analysis, we observe the percentage of users returning to the project each month, segmented by the month they joined. This makes it ideal for our analysis, as we can compare the behavior of users who joined at different times.
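For readers who want to reproduce this kind of analysis on their own data, here is a minimal pandas sketch that builds a monthly retention matrix. The input schema and the toy rows are our assumptions; any wallet-level activity export (e.g., from Dune) with a wallet column and an active-month column would work.

```python
import pandas as pd

# Assumed input: one row per (wallet, month) in which the wallet was active.
activity = pd.DataFrame({
    "wallet": ["0xa", "0xa", "0xa", "0xb", "0xb", "0xc"],
    "month":  pd.to_datetime(["2023-01-01", "2023-02-01", "2023-04-01",
                              "2023-03-01", "2023-04-01", "2023-03-01"]),
})

# Cohort = the month a wallet was first seen on the chain.
first_seen = activity.groupby("wallet")["month"].min().rename("cohort")
activity = activity.join(first_seen, on="wallet")

# Months elapsed since the wallet's cohort month.
activity["age"] = (
    (activity["month"].dt.year - activity["cohort"].dt.year) * 12
    + (activity["month"].dt.month - activity["cohort"].dt.month)
)

# Retention matrix: share of each cohort still active N months later.
cohort_sizes = activity.groupby("cohort")["wallet"].nunique()
retention = (
    activity.groupby(["cohort", "age"])["wallet"].nunique()
    .unstack(fill_value=0)
    .div(cohort_sizes, axis=0)
)
print(retention)  # rows: joining month, columns: months since joining
```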
Now, we've categorized the data into pre-airdrop, during-airdrop, and post-airdrop periods.
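Continuing the sketch above, the cohorts can then be bucketed by period. The Arbitrum airdrop took place in March 2023; treating exactly that month as "during-airdrop" is our own simplifying choice:

```python
# Boundaries are illustrative: one cohort month marked "during-airdrop".
AIRDROP_MONTH = pd.Timestamp("2023-03-01")

def period(cohort: pd.Timestamp) -> str:
    if cohort < AIRDROP_MONTH:
        return "pre-airdrop"
    if cohort == AIRDROP_MONTH:
        return "during-airdrop"
    return "post-airdrop"

# Average retention curve per period: mean across the cohorts in each bucket.
retention_by_period = retention.groupby(retention.index.map(period)).mean()
```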
By calculating the average retention for each project across these periods, we obtain the following:
The data reveals the following:
Several questions emerge from this, but two, in particular, resonate with us:
The truth is that, given how the airdrop criteria were designed, the Arbitrum team successfully amplified activity on the platform during the airdrop period. This achievement was possible thanks to a detailed analysis of which KPIs/metrics were aligned with the project's objectives.
Nevertheless, the airdrop results show that none of the criteria directly improved the likelihood of someone staying on the platform. By contrast, modern web2 consumer companies diligently monitor their user base and build churn models, an analysis we ourselves have done in the past. This approach allows companies to predict which users are likely to leave the platform, and which will depart regardless of any intervention.
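As a sketch of what such a churn model could look like for on-chain users, here is a minimal logistic-regression example. The features, the toy data, and the "inactive in the following months" label are our assumptions, not a production model:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Assumed wallet-level snapshot taken at airdrop time. Feature names are
# illustrative; "churned" marks wallets that went inactive afterwards.
wallets = pd.DataFrame({
    "tx_count":       [120, 3, 45, 1, 80, 7],
    "months_active":  [9, 1, 5, 1, 7, 2],
    "distinct_dapps": [14, 1, 6, 1, 9, 2],
    "churned":        [0, 1, 0, 1, 0, 1],
})

X = wallets.drop(columns="churned")
y = wallets["churned"]

model = LogisticRegression().fit(X, y)
# Probability that each wallet will leave the platform.
wallets["churn_prob"] = model.predict_proba(X)[:, 1]
```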
Perhaps merging the analysis of the value a user added in the past with their probability of returning is the key to optimal incentivization. We could decrease the number of tokens distributed to users unlikely to contribute to the platform in the future. This more token-efficient approach might enable not just one, but perhaps three or four airdrops. This scenario is advantageous for everyone:
This would be what's called smart incentivization.
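Here is a minimal sketch of what smart incentivization could look like, combining a wallet's point score with its predicted churn probability from a model like the one above. The linear scaling rule is our own illustrative choice:

```python
def smart_allocation(points: int, churn_prob: float,
                     tokens_per_point: float = 1_000,
                     min_points: int = 3) -> float:
    """Scale the point-based allocation by the probability the wallet
    stays, so tokens flow toward users likely to keep contributing."""
    if points < min_points:
        return 0.0
    return points * tokens_per_point * (1.0 - churn_prob)

# Two wallets with the same past value but different retention outlooks.
print(smart_allocation(points=5, churn_prob=0.1))  # 4500.0 tokens
print(smart_allocation(points=5, churn_prob=0.8))  # 1000.0 tokens
```

Wallets that added the same value historically but are unlikely to return receive a smaller share, freeing tokens for future reward rounds.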
We have examined the Arbitrum airdrop criteria and eligibility requirements. Thanks to the thorough analysis behind their airdrop design, the Arbitrum team was able to:
Overall, it's been a success, with commendable alignment of the right metrics. However, there's still much room for improvement in terms of continuous engagement and increasing user retention. This leaves us wondering what the outcome might have been had smart incentivization been applied.
We will watch closely to see whether potential upcoming airdrops, such as the ZKsync airdrop or the Starknet airdrop, implement the lessons learned from this experience or even surpass the original results.
PS: If you are a company looking to optimize your reward system, contact us!