![DWF Ventures: ETH Denver Recap](/images/research/Threads-Intro-Images/Resize/DWFVentures-ETH-Denver-Thread-intro-960x540px.png)
Spanning from February 22 to March 3, ETHDenver 2024 attracted industry leaders, investors, and innovators, fostering exciting discussions and momentum across key blockchain narratives. As the crypto conference wraps up, DWF Ventures takes a closer look at the event, its standout moments, and the overarching market vibes it brought to the crypto community.
Overall Market Vibes
ETHDenver coincided with a major market upswing, as Bitcoin surged from $52,000 to $62,000 during the event period, signalling increased market activity and growing investor confidence. The bullish sentiment was palpable across the crypto conference, with attendees and projects feeling the momentum as capital flowed into the ecosystem. The crypto event provided strong validation of market optimism heading into Q2 2024.
Side Events
ETHDenver wasn't just about the mainstage; side events played a crucial role in shaping the experience. DWF Labs hosted the DWF Labs Haus, providing an exclusive venue for networking, collaboration, and knowledge sharing.
Notable gatherings such as the AI Summit, MoveCon, Modular Day, Bitcoin Renaissance, and community-driven crypto events like HotpotDAO and Berapalooza featured an impressive lineup of speakers and panels.
Adding to the vibrant atmosphere, it was completely normal to see speakers donning quirky outfits—such as bear heads—while discussing serious topics on official panels, reflecting the unique culture of the crypto conference.
![](/images/research/2024/2024-03-04-ethdenver/2024-03-04-ethdenver-1.webp)
Key Narratives: LRTs, AI, Bitcoin L2s, and DePIN
From a crypto venture capital perspective, the dominant narratives emerging at ETHDenver were Liquid Restaking Tokens (LRTs), AI x Crypto, Bitcoin Layer 2s, and DePIN.
While crypto VCs remain eager to deploy capital, many are proceeding with caution given the ongoing bull market. The prevailing trend suggests that VCs are increasingly targeting projects that are scheduled for Token Generation Events (TGE) within the next 6-8 months. This shift reflects a gradual convergence between venture investing and trading-driven narratives, ensuring that capital is deployed strategically to align with market cycles.
Fundraising Frenzy
The fundraising landscape at ETHDenver showcased an unprecedented level of activity, with numerous major projects securing substantial capital inflows. Some of the most notable fundraising announcements included:
- EtherFi, a leading LRT protocol, which announced substantial funding rounds to further its expansion.
- Avail Project, a modular blockchain solution, which raised significant funds to enhance scalability and interoperability.
Interest in sectors such as LRTs, AI, Bitcoin L2s, and DePIN was at an all-time high, with investors and industry leaders lining up to engage with project teams and explore opportunities.
Networking
ETHDenver provided unparalleled networking opportunities, making it easier than ever to meet and interact with prominent figures in the crypto space. Attendees had the chance to connect with popular influencers like @kelxyz_ and @naruto11eth, among many others. The crypto event's inclusive environment fostered meaningful conversations, partnerships, and collaborations, reinforcing the value of in-person gatherings in the crypto industry.
Overall
ETHDenver 2024 has set a high standard for upcoming crypto events this year. With VCs eager to deploy more capital and projects successfully securing funding, the crypto conference has reaffirmed the industry's bullish outlook.
If you’re building the next big thing in Web3 and blockchain, feel free to reach out to DWF Ventures to discuss partnership opportunities.
![DWF Ventures: February 2024 Recap](/images/research/Threads-Intro-Images/Resize/DWFVentures-February-Recap-Thread-intro-960x540px.png)
As February 2024 comes to a close, the crypto market continues to heat up, driven by several evolving narratives and significant developments across various sectors. In this recap, DWF Ventures takes a closer look at some of the most promising trends of the month, along with insights into what lies ahead.
Restaking
Restaking has emerged as one of the strongest narratives this year, driven by exponential TVL growth and numerous airdrop opportunities. Despite concerns voiced by Ethereum co-founder Vitalik Buterin regarding the potential risks of restaking, user engagement continues to surge as participants seek yield opportunities in the evolving ecosystem.
With the upcoming EigenLayer mainnet launch, attention is shifting toward liquid restaking (LRT) and LRT-geared protocols, which are expected to introduce greater capital efficiency and liquidity options within the Ethereum ecosystem. This area remains a critical one to watch as new financial primitives and infrastructure emerge.
Bitcoin Layer 2s
Bitcoin Layer 2 solutions have seen impressive growth, with several high-profile launches and major liquidity inflows. Notably, Merlin, which introduced its points program on February 8, 2024, has surpassed expectations by accumulating over $2 billion in TVL, overtaking Ethereum’s Blast network. Meanwhile, BSquared Network's Buzz deposit campaign, launched earlier in the month, has already secured $440 million in TVL, underscoring the increasing demand for Bitcoin-native scaling solutions.
As the much-anticipated Bitcoin halving in April 2024 approaches, more liquidity is expected to flow into Bitcoin-related projects. Beyond BRC-20 tokens, BTC DeFi is expected to expand, with projects like Omega Infrastructure leading the charge by enabling native Bitcoin to be used as collateral across EVM-based protocols, further bridging Bitcoin with the broader DeFi ecosystem.
AI x Crypto: Growing Mindshare with Bittensor Leading the Charge
The AI sector within crypto has been gaining substantial traction, primarily led by Bittensor (TAO). A recent proposal from the Bittensor Foundation introduces a transition to a multi-token mechanism, where each subnet will have its own token that can only be acquired by staking TAO. This shift aims to enhance decentralisation and better align incentives across the network’s AI-focused subnets.
Beyond Bittensor, numerous new AI x Crypto projects are entering the market, offering innovative solutions in data, compute, and decentralised AI applications. The increasing overlap between AI and blockchain technology continues to present new opportunities for automation, decentralised intelligence, and enhanced decision-making across various sectors.
Blockchain Gaming
The Web3 gaming sector has seen a surge in activity with several major launches taking place in early 2024. Projects such as XAI Games, Pixels Online, and Mavia Game have captured significant attention, while long-term builders like Overworld, Ultiverse, and Nyan Heroes are positioned to capitalise on the growing demand for blockchain-integrated gaming experiences.
With the integration of blockchain into gaming ecosystems, play-to-earn models and community-driven economies are gaining renewed interest, presenting significant opportunities for sustained adoption and engagement.
Emerging Narratives
Several notable narratives have also gained traction, pointing to potential long-term growth areas within the crypto ecosystem.
- The first relates to Farcaster, where the launch of Frames, an innovative feature improving user experience and unlocking on-chain use cases, has led to a notable spike in daily active users (DAUs). While the initial excitement has tapered off, continued development and ecosystem growth make Farcaster a noteworthy SocialFi player to watch.
- Another theme is the ERC-404 token standard. Introduced by the Pandora team, ERC-404 took the crypto community by storm by enabling native fractionalisation for NFT projects, offering a potential solution to longstanding liquidity challenges in the NFT market. However, as the standard is still in its infancy, newer variations such as DN-404 are emerging to address some of ERC-404’s limitations and further refine the concept.
Key Catalysts in March and April 2024
Overall, the market is maintaining a positive outlook as it heads toward significant near-term catalysts, including:
- EIP-4844 (Proto-Danksharding): Expected in March, the Ethereum Dencun upgrade aims to drastically reduce gas fees for rollups and enhance network scalability.
- Bitcoin Halving (April 2024): Historically a key driver of bullish sentiment, the halving event is expected to bring heightened interest and increased inflows into BTC-related assets.
While these major events will continue to dominate market attention, other narratives such as restaking, Bitcoin L2s, AI, gaming, and NFTfi innovations are expected to keep gaining traction in parallel.
Conclusion
February 2024 has been a dynamic month for the crypto market, with several narratives already taking shape and gaining traction. From restaking protocols to the rapid expansion of Bitcoin Layer 2s, AI innovations, and NFT fractionalisation, the market is showing resilience and enthusiasm. As we move forward, monitoring these trends alongside upcoming major catalysts will be crucial in navigating the evolving landscape.
We remain committed to tracking these developments closely and investing in projects with the potential to onboard massive communities and redefine the crypto space. If you want to get support from our crypto venture capital fund, contact DWF Ventures directly.
![ERC-404: A New Era of NFTfi](/images/research/2024/2024-02-09-404/2024-02-09-404-cover-intro.webp)
The emergence of the experimental ERC-404 token standard is opening up a new “Pandora’s box” for NFT finance (NFTfi) with its blend of fungible and non-fungible token elements.
DWF Ventures explored the new standard, which merges key aspects of ERC-20 (fungible) and ERC-721 (non-fungible) tokens, offering native fractionalisation and liquidity, and has the potential to revolutionise the way NFTs are bought, sold, and owned.
What is ERC-404?
Developed by the Pandora team, ERC-404 is an unofficial token standard designed to bridge the gap between fungible tokens and NFTs. Unlike previous NFT fractionalisation protocols, which required locking an NFT and issuing tradable shares against it, ERC-404 enables native fractionalisation within NFT projects, opening up new prospects for experimentation and trading, with increased liquidity and accessibility for collectors and investors alike.
Why is ERC-404 Unofficial?
Unlike formal Ethereum token standards, ERC-404 has skipped the standard Ethereum Improvement Proposal (EIP) and Ethereum Request for Comments (ERC) processes. As a result, it has not been formally community-vetted for security vulnerabilities. This introduces potential risks, as undiscovered flaws or exploits could arise due to the lack of rigorous peer review and formal auditing processes. While the standard is gaining traction, projects adopting ERC-404 must carefully consider the associated risks.
How ERC-404 Works
ERC-404 creatively integrates elements from both the ERC-20 and ERC-721 token standards, which traditionally operate independently. Its unique approach to NFT ownership and transferability involves a mint-and-burn mechanism for fractional transfers. Specifically:
- When a full ERC-20 token is purchased, an associated NFT is minted to the buyer’s wallet.
- Conversely, when the token is sold, the corresponding NFT is burned. This dynamic model ensures a direct one-to-one relationship between fungible tokens and unique NFTs, enabling seamless trading on both decentralised exchanges (DEXs) and NFT marketplaces; a simplified sketch of this bookkeeping follows below.
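The mint-and-burn flow can be illustrated with a minimal Python sketch. This is a toy model of the idea only, not the actual Pandora contract (which lives on-chain in Solidity); the class and variable names are hypothetical.

```python
class ERC404Sketch:
    """Toy model of ERC-404-style accounting: one NFT per whole token held.

    Illustrative simplification only; not the actual Pandora contract.
    """

    def __init__(self, decimals: int = 18):
        self.unit = 10 ** decimals            # one whole token in base units
        self.balances: dict[str, int] = {}    # fungible balances in base units
        self.nfts: dict[str, list[int]] = {}  # NFT ids held per address
        self.next_id = 1

    def _sync_nfts(self, addr: str) -> None:
        """Mint or burn NFTs so the NFT count matches whole tokens held."""
        target = self.balances.get(addr, 0) // self.unit
        owned = self.nfts.setdefault(addr, [])
        while len(owned) < target:       # crossed a whole-token threshold: mint
            owned.append(self.next_id)
            self.next_id += 1
        while len(owned) > target:       # dropped below a threshold: burn
            owned.pop()

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        assert self.balances.get(sender, 0) >= amount, "insufficient balance"
        self.balances[sender] = self.balances.get(sender, 0) - amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        self._sync_nfts(sender)
        self._sync_nfts(receiver)
```

For example, seeding an address with a few whole tokens and transferring 1.5 tokens to a buyer mints one NFT to the buyer; transferring 0.6 tokens back drops the buyer below a whole token and burns that NFT, mirroring the behaviour described above.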
Why ERC-404 Matters
ERC-404 represents a paradigm shift for NFT finance by offering native liquidity and fractionalisation, directly embedded within the token standard itself. Traditional NFT fractionalisation required custodial protocols to lock assets and issue fractional ownership in the form of fungible tokens. ERC-404, however, eliminates the need for such intermediaries, allowing NFT projects to natively adopt fractionalisation, enhancing market accessibility and utility.
With this new standard, NFT projects can benefit from:
- Enhanced liquidity: Tokens can now be traded more freely without losing the connection to their underlying NFT assets.
- Broader market participation: Fractionalisation allows smaller investors to own portions of high-value NFTs, opening up new investment opportunities.
- Increased utility: Projects can integrate ERC-404 for gamification, DeFi applications, and social platforms.
Pandora: The First ERC-404 Project
The first project to adopt the ERC-404 standard, Pandora, introduced an innovative model featuring 10,000 ERC-20 tokens and 10,000 associated “Replicant” NFTs. When users purchase a full PANDORA token, a corresponding Replicant NFT is automatically minted to their wallet. If they sell the token, the associated NFT is burned, creating an intrinsic link between the fungible and non-fungible assets.
Despite its experimental nature, Pandora has gained rapid traction, with trading activity surging on popular platforms like Uniswap (for ERC-20 token trading), Blur and OpenSea (for NFT marketplace transactions).
ERC-404's Rapid Growth and Market Impact
Since its launch, ERC-404 has seen explosive market growth, with its total market capitalisation reaching nearly $300 million within the first week, according to CryptoRank.
In addition, the rapid adoption of ERC-404 has led to the launch of several projects within the first few days, further driving interest and innovation within the NFTfi sector. More initiatives are expected to explore the ERC-404 standard, pushing for wider platform integrations across the Ethereum ecosystem.
![A watchlist of tokens in the ERC-404 space. Source: CryptoRank](/images/research/2024/2024-02-09-404/2024-02-09-404-1.webp)
Challenges and Considerations
While ERC-404 presents exciting opportunities, it is not without challenges. Key considerations include:
- Security Risks: Without formal Ethereum community vetting, potential vulnerabilities could lead to financial losses or exploits.
- Market Adoption: Many existing platforms may require significant updates to support ERC-404's unique functionalities.
- Regulatory Uncertainty: As fractionalisation introduces new financial complexities, regulatory scrutiny could impact the widespread adoption of the standard.
Overall
With its rapid adoption and market excitement, ERC-404 can change the NFT landscape by offering unprecedented liquidity and utility. However, its long-term success will depend on further security audits, community validation, and the development of robust infrastructure to support its integration.
As a crypto venture capital firm, we support projects working in the NFT market. If you are building in this space, feel free to reach out to DWF Ventures to discuss potential collaboration.
![The Growth of Farcaster](/images/research/Threads-Intro-Images/Resize/DWFVentures-Farcaster-Thread-intro-960x540px.png)
Over the past weekend (February 3-4, 2024), Farcaster captured significant attention in the SocialFi space, with its daily active users (DAUs) surpassing 20,000, a remarkable 10x increase in just two weeks.
Farcaster is a decentralised social network (DeSoc) protocol that empowers users with greater control over their data while enabling communication and connectivity through a network of decentralised hubs.
This growth has sparked considerable hype and curiosity about the platform’s unique approach and interesting features. What makes Farcaster stand out? DWF Ventures dives in below.
The Evolution of Farcaster
Farcaster was founded in 2020 and operated in an invite-only phase until October 2023. Since opening its doors to the public, the platform has gained traction among tech enthusiasts and SocialFi proponents.
One notable supporter is Ethereum co-founder Vitalik Buterin, who has expressed excitement about the protocol and has been using Warpcast, Farcaster's client akin to X. His public endorsements have further boosted Farcaster’s credibility and adoption within the Web3 community.
The Technology and Architecture Behind Farcaster
Farcaster employs a network-based model. It leverages decentralised hubs to store both on-chain and off-chain data, ensuring seamless connectivity among users. To minimise costs, only essential data is stored on-chain, while users have the option to cover their own storage expenses to prevent hubs from becoming overloaded.
![A layered architecture showcasing the technology behind Farcaster, integrating on-chain components, off-chain hubs, middleware, and apps for seamless functionality](/images/research/2024/2024-02-05-farcaster/fullfarcaster-thumbnail-compressed1.webp)
- On-Chain Infrastructure: Comprises components like the FID Contract (manages user identifiers), Storage Contract (handles essential on-chain data), Signer Contract (enables cryptographic authentication), and Ethereum Name Service (ENS), which provides human-readable decentralized domain names.
- Off-Chain Infrastructure: Includes the Fname Server, which manages user namespaces and username-to-address mapping efficiently in an off-chain environment.
- Hubs Layer: The central infrastructure of the protocol that stores and manages off-chain data. Hubs facilitate data exchange between middleware and apps, ensuring decentralized connectivity.
- Middleware Layer: Features components like Neynar, Airstack, Farcaster PostgreSQL Indexer, and Searchcaster API that act as bridges between the hubs and apps, offering data indexing, processing, and search capabilities.
- Apps Layer: Includes decentralised applications for end-users built on top of the Farcaster protocols. Apart from Warpcast, these include Fiids, Jam, Opencast, Searchcaster and Yup. These apps provide different social networking functionality, such as content discovery, social interaction and communication, and data aggregation.
As of early February 2024, over 560 geographically dispersed hubs contribute to the platform’s robust decentralisation. This approach not only enhances scalability but also improves trust within the ecosystem by distributing control.
Farcaster’s Ecosystem and Community
Beyond its technical foundation, Farcaster has rapidly established itself as a leading player in the SocialFi vertical. Its success is attributed to:
- Continuous feature development. The SocialFi platform’s agility in rolling out new functionalities has kept users engaged. In recent months, Farcaster has received improved content discovery tools, enhanced user profile features, and more seamless integration with ENS.
- Growing app ecosystem. Farcaster’s integrations with third-party apps expand its utility and appeal. By the end of 2023, more than a dozen apps had been launched within the ecosystem.
- Community engagement. The creative use of memes and organic retention strategies have cultivated a loyal user base.
Frames
One of Farcaster’s standout features is Frames, which offers a seamless one-click user experience for activities like payments, voting, and subscriptions—all without leaving the app. Frames’ intuitive design allows users to easily iterate and build upon it, making it a versatile tool for creators and developers alike.
In a recent example from late January 2024, it took less than nine hours for a user to develop a working Girl Scout cookie shopping cart using Frames, showcasing the feature’s potential for real-world applications.
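Part of the reason iteration is so fast is that a basic Frame is just a web page whose meta tags tell a Farcaster client which image to render and where to POST button clicks. The sketch below generates such a page in Python; the `fc:frame` tag names follow our reading of the early "vNext" Frames specification and the URLs are placeholders, so treat it as illustrative rather than authoritative.

```python
def build_frame_html(image_url: str, post_url: str, button_label: str) -> str:
    """Return a minimal HTML page advertising a single-button Farcaster Frame.

    Tag names follow the early "vNext" Frames spec; treat them as illustrative.
    """
    return f"""<!DOCTYPE html>
<html>
  <head>
    <meta property="og:image" content="{image_url}" />
    <meta property="fc:frame" content="vNext" />
    <meta property="fc:frame:image" content="{image_url}" />
    <meta property="fc:frame:button:1" content="{button_label}" />
    <meta property="fc:frame:post_url" content="{post_url}" />
  </head>
  <body>Farcaster Frame example</body>
</html>"""


# Hypothetical usage: a one-click "Add to cart" button, as in the cookie-cart example.
if __name__ == "__main__":
    print(build_frame_html(
        image_url="https://example.com/cookies.png",   # placeholder image
        post_url="https://example.com/api/frame",      # placeholder click handler
        button_label="Add to cart",
    ))
```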
Expanding Use Cases
Farcaster continues to expand its ecosystem through:
- Gamified Experiences: Platforms like FarQuest offer gamified onboarding and marketplaces, introducing Layer 2 solutions to reduce fees.
- Prediction Markets: Tools such as Perl let users bet on outcomes and earn rewards, driving engagement.
- Tokenised Streaming: Live-streaming apps like Unlonely integrate tokenisation for community voting, adding new layers of interactivity.
- Memecoins for Onboarding: Projects like DegenTokenBase have pioneered airdropping tokens with in-app utility for tipping creators, incentivising user retention and higher-quality content.
Addressing Farcaster’s Challenges
While Farcaster’s growth has been impressive for the SocialFi sector, it also faces challenges.
The first one is the increasing storage demand. Farcaster’s reliance on hubs to store copies of off-chain data presents a significant scalability challenge, as the volume of data grows with increased user activity. If storage costs are not effectively optimized, hubs risk becoming prohibitively expensive to operate, potentially limiting the network’s ability to scale and accommodate more users.
Secondly, there is the centralisation risk. While Farcaster aims to decentralise social networks, the reliance on a limited number of hubs could concentrate power and control over data in certain hands. The protocol community should make sure that hubs are widely distributed and incentivised to remain independent, to preserve Farcaster’s decentralisation while maintaining operational efficiency.
The Future of Farcaster and SocialFi
Farcaster’s rapid adoption and innovation mark a significant milestone in the SocialFi landscape, bringing the industry closer to realising the vision of DeSoc. Though achieving mass adoption will require overcoming challenges, Farcaster’s progress in early 2024 is a great example of the potential of Web3 for improving social network services.
At DWF Ventures, we’re looking to support the future of decentralised finance and social networks. If you’re building in this space, we’d love to hear from you: reach out to our crypto venture capital fund directly.
![Developments in the DeSoc Space](/images/research/Threads-Intro-Images/Resize/DWFVentures-Tako-Protocol-Thread-intro-960x540px.png)
Crypto projects in the space of Decentralised Social Networks (DeSoc) have seen substantial advancements recently, offering promising benefits such as censorship resistance, enhanced monetisation opportunities for creators, and greater user control. Censorship resistance is one of the strongest value propositions of DeSoc: it defines the ability of said platforms to prevent a single entity from arbitrarily controlling or removing content, which is a growing concern in traditional social networks.
Yet, despite their potential, DeSoc protocols have not achieved widespread adoption, a challenge that we believe may shift in 2024.
Leading DeSoc Projects
At the forefront of the DeSoc ecosystem are two dominant social graph protocols: Lens Protocol and Farcaster.
Social graph protocols serve as the foundational infrastructure for mapping relationships and interactions between users in a decentralised manner, enabling individuals to retain ownership of their data while preserving interoperability. These protocols leverage blockchain technology or other decentralised architectures to create transparent, tamper-proof, and portable records of social connections, activities, and preferences. By doing so, they eliminate reliance on centralised intermediaries, giving users control over their online identities and how their data is accessed or shared.
Both Lens and Farcaster have cultivated thriving ecosystems of dapps, yet they differ in how developers build on them. This divergence often forces developers to commit to one protocol early on, limiting interoperability and user reach.
Tako Protocol: Bridging the Gap
Tako Protocol is an emerging force in DeSoc.
Tako is designed to help developers build apps that can work across different social networks rather than being limited to just one. It offers easy-to-use resources for developers such as SDKs and APIs to simplify the process of adding features like monetisation and improving the way apps function and interact with users.
By integrating with Tako, developers can bypass the limitations of building exclusively on one protocol, enabling a more flexible and interoperable approach to creating decentralised social platforms. Interoperability, in this context, means that applications can work seamlessly across different protocols, enhancing user accessibility and reach.
An example of Tako’s potential lies in Jamfrens, a protocol allowing creators to tokenise their social influence on both Lens and Farcaster. By tokenising influence, creators can assign value to their social interactions, which can then be traded or monetised.
Tako’s APIs drive monetisation by incentivising meaningful engagement, amplifying the network effects—a phenomenon where a service becomes more valuable as more people use it—that creators rely on to grow their communities.
Customisation and Expansion
![Overview of supported and upcoming integrations with Web3 social networks](/images/research/2024/2024-02-01-desoc/fulldesoc-2-compressed.webp)
Tako’s adaptable smart contracts make it suitable for various genres of social protocols, extending its applicability beyond Lens and Farcaster. This flexibility allows developers to tailor features to specific use cases. Upcoming integrations with platforms such as Deso Protocol promise to expand its target user base further, contributing toward a more unified and robust DeSoc ecosystem.
Challenges to Overcome
While DeSoc networks look promising, several barriers remain:
- Interoperability. The fragmented nature of current protocols can hinder user adoption and limit dApp scalability. Tackling this issue will involve creating middleware solutions, such as Tako, that bridge the gaps between systems.
- Adoption Rates. Competing with established centralised platforms like Twitter and Facebook requires significant improvements in user experience and onboarding processes to attract non-technical users.
- Scalability. Ensuring the infrastructure can handle mass adoption without compromising decentralisation or performance is a critical hurdle. This will involve advancements in blockchain technology to reduce costs and improve efficiency.
The Path Forward for DeSoc
Despite these challenges, the momentum in the DeSoc space is undeniable. As the ecosystem matures, we anticipate greater adoption, interoperability, and creativity to pave the way for a decentralised internet where users truly own their data and social experiences.
We’re committed to supporting projects in the DeSoc space. If you’re building solutions to redefine how social networks operate and are looking for support from a crypto venture capital fund, feel free to contact DWF Ventures directly.
![](/images/research/2024/2024-01-31-perp/2024-01-31-perp-cover-intro.webp)
In one of the previous articles, DWF Ventures covered the specifics and landscape of decentralised perpetual futures, including the evolution of existing perpetual DEXs and the potential developments in this space.
In this article, we will explore the current tokenomics landscape of perpetual DEXs, analyse the different mechanisms utilised by the protocols and discuss potential future developments.
Why Is Tokenomics Important?
Tokenomics is crucial to the growth and stability of a protocol. As the “DeFi Summer” experience showed, liquidity mining was successful in bootstrapping yield protocols in their early stages but proved unsustainable in the long run. The mechanism attracted mercenary capital, fostering a vicious cycle of ‘farming and dumping’, with farmers constantly searching for the next protocol offering higher yields while leaving the farmed protocol decimated.
One example of this was Sushiswap’s vampire attack on Uniswap, which was successful in attracting significant TVL at first but eventually proved unsustainable. Meanwhile, protocols like Aave and Uniswap managed to attract and retain users sustainably; their product-first focus and sustainable tokenomics helped solidify their positions as market leaders, a status they continue to maintain to this day.
![](/images/research/2024/2024-01-31-perp/2024-01-31-perp-1.webp)
While product-led growth is important, tokenomics serves as a distinguishing factor for crypto perpetual DEXs in a competitive market. Tokens represent how much users value a protocol based on its activity, similar to how stocks reflect the predicted performance of the companies. Unlike traditional markets, token prices often precede widespread awareness and growth for crypto projects.
Thus, it is important to have tokenomics that accrues value from the growth of the protocol, while ensuring a sustainable token economy that provides sufficient incentives for new users to join. Overall, good tokenomics is key to achieving long-term growth and retaining value for the protocol.
Recap: The Perpetual DEXs Landscape
In our article about decentralised perpetual futures, we covered the evolution and mechanisms of perpetual DEXs extensively. Going into the tokenomics of these protocols: dYdX was one of the first to introduce perpetual contracts on-chain in 2020, and launched its token in September 2021. The token was generally known for high levels of inflation due to emissions from staking, LP, and trading rewards, as little utility was provided to its holders apart from trading fee discounts.
Targeting the unsustainability of emissions was GMX, which entered the scene in September 2021. GMX was one of the first to introduce a Peer-to-Pool model and a fee-sharing mechanism that pays users yield from trading fees in majors and the native token. Its success also led to the creation of more Peer-to-Pool systems, such as Gains Network, which differed in its collateralisation model and revenue-sharing parameters, placing lower risk on users but producing lower yields as well.
Synthetix is another original DeFi protocol in the space, which supports multiple front-ends of perpetual and options exchanges, such as Kwenta, Polynomial, Lyra, dHEDGE, etc. It utilises a synthetic model whereby users must stake their SNX tokens as collateral to borrow sUSD for them to trade. Stakers receive sUSD fees that are generated from trading on all front-ends.
Mastering the Tokenomics of a Perp DEX
Designing good tokenomics involves careful consideration of various factors to create a system that aligns the incentives of participants and ensures the long-term sustainability of the token. We discuss the various factors below based on the current landscape of perpetual DEX tokenomics.
The table below shows the comparison between different crypto perpetual protocols and their tokenomics:
![The comparison of tokenomics in various crypto perpetual protocols. Source: DWF Ventures](/images/research/2024/2024-01-31-perp/2024-01-31-perp-2.webp)
1. Incentives and Rewards
Incentives and rewards play an important role in encouraging desired behaviours. This includes staking, trading, or other mechanisms that encourage users to contribute to the protocol.
Staking
Staking is a mechanism that gives out yield in exchange for depositing the native token with the protocol. Users receive yield either through revenue sharing from fees, paid in larger-cap tokens/stablecoins, or through native token emissions. Among the protocols we analysed, there are three main types of staking:
- Fee sharing in majors or stablecoins.
- Fee sharing in native token.
- Emissions shared in native token.
![](/images/research/2024/2024-01-31-perp/2024-01-31-perp-3.webp)
As shown in the table, fee sharing has been effective in incentivising users to stake their tokens. For dYdX and Synthetix, the table reflects the recent changes made to their tokenomics, including the introduction of 100% fee sharing for dYdX v4 and the elimination of inflationary SNX emissions.
Previously, dYdX v3 had safety and liquidity staking pools that emitted inflationary DYDX rewards, as the pools did not directly benefit from the volume on the platform. Both pools were deprecated in September and November 2022 after community votes, as they did not really serve their purpose and were not an efficient use of the DYDX token. With v4, fees accrued from trading volume are given back to stakers, incentivising users to stake for yield.
GMX utilises a mix of two types of staking for its rewards, sharing fees in majors (ETH/AVAX) as well as its native token. GMX, Gains Network, and Synthetix have very high rates of tokens staked, signalling that the rewards are sufficient to incentivise users to provide upfront capital and maintain their stake within the protocol. It is hard to determine the ideal mechanism, but having a portion of fees paid out in majors/stables and introducing vesting for native-token emissions has been effective so far.
Overall, staking has the following benefits:
- Reduced circulating supply of tokens (and selling pressure). This only works sustainably if the yield generated is not pure emissions. Having the yield paid in majors or stablecoins further reduces selling pressure, as users do not have to sell to “realise” their yield.
- Value accrual to the staked token. The token can indirectly grow in value as the protocol grows and fees generated per token increase. A stable yield can also attract non-traders to enter just to earn yield.
Nonetheless, there are several factors to consider for staking, depending on the protocol’s goals. The first is the longevity and type of rewards. Stable rewards are important so that more risk-averse users need not sell off tokens to ‘realise’ their yield. The rate of emission also matters, to ensure that the yield users receive is not too volatile and can be sustained over a period of time.
The second factor is rewarding the right users. Having a lower barrier to entry and ease of earning rewards (no upfront capital, no vesting, etc.) will likely attract mercenary users that will dilute the rewards of active users (active traders, long term stakers etc.).
Here are our thoughts. Staking is common in most protocols as a way to reduce the circulating supply of the token. It is a good way to align interests with users, especially if staking is required as collateral (e.g. SNX), since returns in yield reduce the volatility of users’ positions. The effects of staking will be more positive and long-lasting if rewards are paid out of a portion of fees and in majors/stables, which would be suitable for most perpetual DEXs doing decent volumes.
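As a back-of-the-envelope illustration of the fee-sharing mechanics discussed above, the sketch below estimates a staking APR from protocol fee revenue and the staker fee share. All figures, and the split between fee yield and emissions, are hypothetical.

```python
def staking_apr(
    annual_protocol_fees_usd: float,
    staker_fee_share: float,       # fraction of fees routed to stakers, e.g. 0.30
    staked_value_usd: float,       # total USD value of the staked native token
    emissions_apr: float = 0.0,    # extra yield paid purely in native-token emissions
) -> dict:
    """Rough staking yield estimate for a fee-sharing perp DEX (illustrative only)."""
    fee_yield = annual_protocol_fees_usd * staker_fee_share / staked_value_usd
    return {
        "fee_apr": fee_yield,                  # "real" yield, payable in majors/stables
        "emissions_apr": emissions_apr,        # inflationary component
        "total_apr": fee_yield + emissions_apr,
    }


# Hypothetical example: $100m of annual fees, 30% to stakers, $250m staked.
print(staking_apr(100e6, 0.30, 250e6, emissions_apr=0.05))
# -> 12% fee APR plus 5% in emissions, 17% in total
```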
Liquidity Providers (LPs)
Liquidity Providers (LPs) are essential to a perpetual DEX, especially for Peer-to-Pool models as it would allow them to support more volume on the platform. For Peer-to-Pool models, LPs become counterparties to traders on the platform. As a result, the yield shared from fees must be sufficient to offset the risk of losing to traders.
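To make the counterparty relationship concrete, here is a simplified, hypothetical accounting of a Peer-to-Pool LP's return over one period: the pool earns a share of trading fees but absorbs traders' net profit and loss.

```python
def lp_period_return(
    pool_tvl_usd: float,
    period_fees_usd: float,
    lp_fee_share: float,          # e.g. 0.70 for GLP under GMX V1
    traders_net_pnl_usd: float,   # positive = traders won against the pool
) -> float:
    """Fractional return for Peer-to-Pool LPs over one period (simplified model)."""
    fee_income = period_fees_usd * lp_fee_share
    counterparty_pnl = -traders_net_pnl_usd   # the pool loses when traders win
    return (fee_income + counterparty_pnl) / pool_tvl_usd


# Hypothetical month: $500m pool, $6m fees at a 70% LP share, traders net +$1m.
print(f"{lp_period_return(500e6, 6e6, 0.70, 1e6):.2%}")  # ~0.64% for the month
```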
For orderbook models like dYdX, LPs serve as a way for users to earn rewards. However, most TVL still came from market makers, and rewards emitted in DYDX were pure inflation; thus, the LP module was deprecated in October 2022. An exception is Synthetix, whereby stakers are technically LPs on the platforms integrated with it (Kwenta, Polynomial, dHEDGE, etc.) and earn fees from trading volume.
GMX and Gains Network both utilise Peer-to-Pool models that require LPs as counterparties to the trades performed on the platform. Comparing both protocols:
- GLP has significantly higher TVL than gDAI, possibly due to higher yield.
- gDAI users have lower risk of losing with trader’s wins being backstopped by GNS minting, while GMX pays out users’ funds from GLP.
- Users who are less risk-averse will be attracted to GLP’s higher yields, while more risk-averse users can deposit into gDAI despite its lower yields.
The mechanism of Gains Network is similar to that of GMX’s predecessor, Gambit Financial, on BNB Chain. Gambit generated quite a lot of volume and TVL upon launch. While Gambit and GMX shared similar characteristics in their Peer-to-Pool model and revenue-sharing mechanism, their parameters differed from one another.
![Trading on Gambit vs. GMX V1. Source: DWF Ventures](/images/research/2024/2024-01-31-perp/2024-01-31-perp-4.webp)
While Gambit gained decent traction, GMX on Arbitrum exploded in terms of trading volume and users after making changes to its tokenomics and structure. Based on these examples, we have noticed the following key changes:
- Providing most revenue shares to stakers/LPs. Gambit only allocated 40% of revenue share (20% to USDG + 20% to Gambit stakers) back to holders/stakers, while GMX v1 allocated 100% (70% to GLP + 30% to GMX stakers). Increasing the amount of yield directed to stakers/LPs generated a good narrative that appealed to a wider audience apart from just pure traders.
- Transferring risks towards LPs as the direct counterparty. Gambit only allocated 20% revenue share to USDG while GMX allocated 70% revenue share to GLP. Both USDG and GLP are minted by depositing whitelisted assets to provide liquidity for platform trades. The reason Gambit offered lower yield to LP is that USDG is a stablecoin and the platform directed 50% revenue share as collateralisation to ensure the redemption of funds for LPs. In contrast, GMX shifts the risk to LPs who bear the brunt of traders’ wins or losses.
The case studies of Gains Network, Gambit, and GMX show that increasing LP yields can incentivise greater liquidity compared to having the protocol absorb part of the risk. With GMX V2, slight changes were made to the tokenomics, reducing the fee share to stakers and GLP holders by 10%. Details on the adjustment (a quick worked example follows the list below):
- GMX V1: 30% allocated to GMX stakers, 70% to GLP providers.
- GMX V2: 27% allocated to GMX stakers, 63% to GLP providers, 8.2% to the protocol treasury, and 1.2% to Chainlink, which has been approved by a community vote.
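As referenced above, the adjustment can be made concrete with a quick calculation. The $1,000,000 fee figure is hypothetical; the percentages are taken directly from the list above.

```python
# Distribution of a hypothetical $1,000,000 of fees under GMX V1 vs V2
FEES_USD = 1_000_000

V1_SPLIT = {"GMX stakers": 0.30, "GLP providers": 0.70}
V2_SPLIT = {"GMX stakers": 0.27, "GLP providers": 0.63,
            "protocol treasury": 0.082, "Chainlink": 0.012}

for name, split in (("GMX V1", V1_SPLIT), ("GMX V2", V2_SPLIT)):
    print(name)
    for recipient, share in split.items():
        # e.g. GMX stakers receive $300,000 under V1 but $270,000 under V2
        print(f"  {recipient}: ${FEES_USD * share:,.0f}")
```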
Community members were largely supportive of the vote, and the continued growth of GMX V2’s TVL signals that the change was positive for the protocol.
There are benefits to rewarding LPs significantly, especially for Peer-to-Pool models whereby they are one of the key stakeholders:
- Enhanced stickiness towards the protocol through stable yield. Stable yield reduces LPs’ risk of losing their initial capital and lowers their incentive to shift positions, as they do not need to liquidate tokens to ‘realise’ yield. This mechanism did not work for dYdX v3, given the volatility of yield emitted in the native token.
- Value accrual to native token. For GMX, the growth of GLP and volume on the platform indirectly increases the value of GMX in terms of fees generated per token, which was a great driver of demand for the token.
Among the factors to consider is the adjustment of LP risk by the protocol. LPs can face risk in both Peer-to-Pool and orderbook models if parameters are not adjusted to risk and market conditions. SNX stakers recently lost $2m to a market manipulation incident on TRB, as OI caps were set as an amount of TRB tokens rather than a USD amount. GLP holders have largely benefited from traders’ losses in the past, but there are questions about the sustainability of the mechanism. With the change in tokenomics, it is possible that significant traders’ wins could be backstopped by the protocol treasury.
We think that this mechanism is crucial for Peer-to-Pool models as incentivising crypto liquidity from users is needed for growth. GMX has been effective in doing so over time with a high percentage of revenue share and accruing traders’ losses. While there are risks for LPs when traders win, we believe that the yield of a midsize protocol can offset that risk significantly. Thus, we believe that incentivising LPs sufficiently will be important for bootstrapping a strong user base.
Trading
Trading rewards are mainly used to incentivise volume as they are mostly emissions, usually in the protocol’s native token. Rewards are usually calculated as a percentage of trading volume/fees against the total rewards scheduled for a certain period.
For dYdX v3, 25% of the total supply was reserved for trading rewards, which were the main source of emissions in the first two years. As a result, the amount of trading rewards was often greater than the fees paid by traders, meaning that token emissions were highly inflationary. With little incentive to hold (mainly just trading fee discounts), this resulted in large selling pressure on DYDX over time. This changed with dYdX v4, as explored in the section below.
Kwenta has also enabled trading rewards for traders on the platform, capped at 5% of the total supply. It requires users to stake KWENTA and carry out trades on the platform to be eligible. Rewards are determined by a percentage of staked KWENTA and trading fees paid, multiplied together, which means that rewards do not exceed users’ upfront cost (staked capital + trading fees). Rewards are subject to a 12-month escrow and can be reduced by up to 90% if users choose to vest early.
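To illustrate the mechanics described above, the sketch below combines a simple pro-rata emission model with a Kwenta-style early-vesting haircut. Both functions are heavy simplifications of the actual on-chain formulas, and all numbers are hypothetical.

```python
def prorata_trading_rewards(trader_fees_usd: float,
                            total_fees_usd: float,
                            epoch_emission_tokens: float) -> float:
    """Simple pro-rata emission model (dYdX v3 style, heavily simplified)."""
    return epoch_emission_tokens * trader_fees_usd / total_fees_usd


def early_vest_payout(reward_tokens: float, haircut: float = 0.90) -> float:
    """Kwenta-style early vesting: up to a 90% reduction if claimed before escrow ends."""
    return reward_tokens * (1.0 - haircut)


# Hypothetical epoch: a trader pays 2% of all fees while 1,000,000 tokens are emitted.
reward = prorata_trading_rewards(20_000, 1_000_000, 1_000_000)
print(reward, early_vest_payout(reward))  # -> 20,000 tokens, roughly 2,000 if vested early
```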
Overall, a clear benefit of introducing trading rewards is the additional volume they incentivise in the short term. With dYdX v3 rewards, traders were essentially getting paid to trade, which helped drive volume up.
However, it is worth considering which type of users the protocol wants to attract. For dYdX, the ease of qualifying for rewards and the lack of vesting terms likely attracted many short-term users who diluted rewards away from real users. For Kwenta, the requirement of upfront capital and the vesting terms would have made it unattractive for short-term users, reducing the dilution of rewards for long-term users.
We think that trading rewards can be an effective way to bootstrap a protocol at the start, but they should not be used indefinitely, given that constant emissions reduce the value of the token. They should also not take up a significant percentage of supply or monthly inflation, and vesting is important for spreading selling pressure out over time.
Value Accruals to the Chain
Let us examine this in detail through a case study of dYdX. The launch of dYdX Chain has marked a new milestone for the protocol. On January 18, 2024, dYdX Chain even briefly surpassed Uniswap as the biggest DEX by volume.
![](/images/research/2024/2024-01-31-perp/2024-01-31-perp-5.webp)
Going forward, we could see more perpetual DEXs following in its footsteps. The main changes in the updated tokenomics for the dYdX Chain include staking to support security, not only to generate yield. In V3, rewards in the Safety Pool were given out as DYDX token emissions but were eventually wound down after the community voted in favour of DIP 17. In V4, the dYdX Chain requires the staking of dYdX tokens for validators to run and secure the chain. Delegation (staking) is an important process whereby a staker delegates to validators to perform network validation and block creation.
Additionally, 100% of transaction fees are distributed to delegators and validators. In V3, all fees generated were collected by the dYdX team, which had been a concern for some in the community. In contrast, V4 distributes all fees, including trading fees and gas fees, to delegators (stakers) and validators. This new mechanism is much more decentralised and aligned with the interests of network participants. PoS stakers (delegators) can choose the validator to stake their dYdX tokens with and, in turn, receive a revenue share from their validators. The commission charged to delegators (stakers) ranges from a minimum of 5% to a maximum of 100%. Currently, according to Mintscan, the average validator commission rate on the dYdX Chain is 6.82%.
![](/images/research/2024/2024-01-31-perp/2024-01-31-perp-6.webp)
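The flow of fees to an individual delegator can be approximated with the arithmetic below. The stake and fee figures are hypothetical; the 6.82% commission is the average rate cited above, and the model ignores validator self-stake and any slashing.

```python
def delegator_fee_income(delegator_stake: float,
                         total_staked: float,
                         period_fees_usd: float,
                         validator_commission: float = 0.0682) -> float:
    """Simplified dYdX Chain style fee distribution to a single delegator.

    100% of protocol fees flow to validators and delegators; the validator
    keeps its commission and the remainder is shared pro rata by stake.
    """
    stake_share = delegator_stake / total_staked
    return period_fees_usd * stake_share * (1.0 - validator_commission)


# Hypothetical: 100,000 DYDX delegated out of 200,000,000 staked, $1m of fees in a period.
print(delegator_fee_income(100_000, 200_000_000, 1_000_000))  # ≈ $465.90
```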
On top of these changes, the new trading incentives also ensure that rewards do not exceed fees paid. This is an important factor, as many concerns around V3 centred on its inflationary and unsustainable tokenomics, which were updated along the way but with little impact on token performance. The issue of being able to “gamify” rewards was raised by Xenophon Labs and other community members, and has been discussed several times in the past.
In V4, users can only earn trading rewards up to 90% of the net trading fees paid to the network. This leads to an improved demand (fees) vs supply (rewards) balance and controls token inflation. Rewards are also capped at 50,000 DYDX per day and will last for six months, ensuring that inflation remains modest.
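A simplified model of the two caps described above is sketched below. The real on-chain reward formula is more involved (and fees are paid in USDC rather than DYDX), so treat this purely as an illustration of how the 90%-of-fees rule and the 50,000 DYDX daily cap interact.

```python
def capped_daily_rewards(net_fees_paid_dydx: float,
                         daily_cap_dydx: float = 50_000,
                         fee_cap_ratio: float = 0.90) -> float:
    """Illustrative dYdX v4 style cap: rewards never exceed 90% of net fees paid
    (expressed here in DYDX terms for simplicity), and network-wide emissions
    are limited to 50,000 DYDX per day."""
    return min(net_fees_paid_dydx * fee_cap_ratio, daily_cap_dydx)


# Hypothetical days: a low-fee day is bound by the 90% rule, a high-fee day by the daily cap.
print(capped_daily_rewards(10_000))   # -> ~9,000 DYDX (the 90% rule binds)
print(capped_daily_rewards(200_000))  # -> 50,000 DYDX (the daily cap binds)
```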
We believe that dYdX Chain is at the forefront of the industry’s move towards greater decentralisation. The validation process plays several crucial roles in the new chain: securing the network, voting on on-chain proposals, and distributing staking rewards to stakers. Coupled with the 100% fee distribution back to stakers and validators, it ensures that rewards are aligned with the interests of network participants.
Value Accruals to the Liquidity Hub
Synthetix serves as a liquidity hub for multiple front-ends of perpetuals and options exchanges, such as Kwenta, Polynomial, Lyra, dHEDGE. These integrators have created their own custom features, built their own communities, and provide users with trading frontends.
![](/images/research/2024/2024-01-31-perp/2024-01-31-perp-7.webp)
Among all integrators, Kwenta is the leading perpetual exchange, driving most of the volume and fees to the entire Synthetix platform. Synthetix is able to capture the value brought by Kwenta and other exchanges, and this success is underpinned by Synthetix’s tokenomics. The key reasons are:
- Staking SNX is the first required step to trade on integrators. Even with its own governance token, Kwenta quotes its asset prices only against sUSD, which can only be minted by staking SNX tokens. Apart from Kwenta, other integrators like Lyra and 1inch & Curve (Atomic Swaps) also utilise sUSD and thus SNX tokens. Therefore, Synthetix’s front-end integrators drive value accrual to the SNX token.
- The liquidity hub’s rewards distribution to integrators. In April 2023, Synthetix announced that it would distribute its sizeable Optimism (OP) token allocation to traders. For 20 weeks, Synthetix gave out 300,000 OP tokens per week, while Kwenta gave out 30,000 OP per week. The protocol was able to attract higher trading volume and fees from Q2 to Q3 2023, which was a major catalyst for SNX’s price growth.
![](/images/research/2024/2024-01-31-perp/2024-01-31-perp-8.webp)
We can see from the chart that the amount of staked SNX TVL has been highly correlated with the price performance of SNX. As of 22 January 2024, Synthetix had around $832 million worth of staked TVL and the highest percentage of tokens staked (81.35%) compared to dYdX, GMX, and Gains Network.
In our opinion, for a liquidity hub, relationships should be mutually beneficial. While Synthetix provides liquidity to these integrators, it also receives value from them in the form of fees from trading volume, which indirectly drives Synthetix’s TVL. Greater demand for trading on its integrators leads to increased demand for, and lower selling pressure on, SNX, indirectly driving up the value of the token. Therefore, it is beneficial for Synthetix and its token holders to collaborate with more front-end integrators.
2. Buybacks and Burn
Buybacks involve using a portion of protocol revenue to purchase the token on the open market, either to directly support the price or to burn the tokens. Burning reduces the circulating supply of the token, which carries the allure of higher prices in the future due to reduced supply.
Gains Network has a buyback and burn program whereby depending on the collateralisation of gDAI, a percentage of trader’s losses can be directed to buy back and burn GNS. The mechanism has resulted in the burn of over 606,000 GNS tokens, equivalent to about 1.78% of the current supply. Due to the dynamic nature of GNS supply through its mint and burn mechanism, it is unclear whether the buyback and burn has a significant impact on the price of the token. Nonetheless, it is one way to counter the inflation of GNS which has kept supply hovering between 30–33m in the past year.
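As a stylised illustration of this kind of mechanism (not Gains Network's exact parameters), the sketch below routes a share of trader losses into a buyback and converts it into burned tokens at the prevailing price. All numbers are hypothetical.

```python
def buyback_and_burn(trader_losses_usd: float,
                     buyback_share: float,
                     token_price_usd: float,
                     circulating_supply: float) -> dict:
    """Stylised buyback-and-burn: a share of trader losses buys the native token
    on the market and burns it, shrinking circulating supply (illustrative only)."""
    spend = trader_losses_usd * buyback_share
    burned = spend / token_price_usd
    return {
        "usd_spent": spend,
        "tokens_burned": burned,
        "supply_reduction_pct": 100 * burned / circulating_supply,
    }


# Hypothetical month: $2m of trader losses, 25% routed to buybacks, $5 token, 32m supply.
print(buyback_and_burn(2_000_000, 0.25, 5.0, 32_000_000))
# -> $500k spent, 100k tokens burned, ~0.31% of supply removed
```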
Synthetix has recently voted to introduce a buyback-and-burn mechanism as part of its Andromeda upgrade. The proposal has sparked renewed interest in SNX, as stakers benefit from a double-sided effect on their position: earning fees from staking while holding a deflationary token. This also reduces the risk of pure staking, as the buyback-and-burn allocation can serve as a backstop for incidents such as what happened with TRB.
Key benefits of the mechanism:
- Ability to control/reduce supply. This ensures that token holders are not diluted over time by emissions from rewards or other stakeholders’ tokens.
- Incentivises users to hold onto token. Holders and stakers of the token get additional utility of earning rewards combined with holding an asset that is ‘deflationary’.
However, the effect of buybacks is also largely dependent on the protocol’s revenue to maintain a meaningful burn. Without a steady stream of revenue, the mechanism will be unsustainable, and its reduced impact can disincentivise users from continuing to hold the token.
We think that the burning mechanism may not have a direct impact on price, but it can promote the narrative of buying a deflationary token. This has been largely effective for protocols that generate strong revenues and have a large percentage of their total supply already circulating (e.g. RLB). Thus, it is best suited to protocols like Synthetix that are already established and do not have much supply inflation.
3. Token Allocations and Vesting Schedule
Taking note of the token allocations and vesting schedules of different stakeholders is important to ensure that parameters are not skewed towards certain stakeholders. For most protocols, the main split is between investors, the team, and the community. Community tokens include airdrops, public sales, rewards, DAO treasury tokens, etc.
![SNX allocation is calculated based on the increased supply of tokens for rewards emission in their monetary policy change in February 2019. Source: DWF Ventures](/images/research/2024/2024-01-31-perp/2024-01-31-perp-9.webp)
Some observations as we looked through the allocations and vesting for these protocols:
- GMX and Gains Network are outliers in having fundraised only through public token sales. Having a ‘community-owned’ protocol could reduce users’ concerns about investor overhang and incentivise them to hold the token and engage with the protocol.
- dYdX and Synthetix both had significant supply reserved for investors — 27.7% and 50% (at the start before supply changes). However, dYdX had a long cliff of around 2 years while Synthetix had a 3 months cliff with 4 quarterly unlocks after TGE.
- GMX and Gains Network both converted from another token to the current one, which meant that the majority of the supply was already unlocked upon launch. This means that further emissions from rewards in the future will be minimal as a percentage of the circulating supply.
- dYdX and Synthetix both had a substantial amount of supply (>=50%) reserved for rewards. However, dYdX rewards were pure emissions while Synthetix gave out a part of fees + emissions that were vested over 12 months. This reduced the inflation of SNX in comparison to DYDX.
There is no clear formula for token allocation or vesting as the stakeholders and mechanisms employed differ largely from protocol to protocol. Nonetheless, we believe the factors below would generally lead to favourable tokenomics for all stakeholders.
In our opinion, the community should be allocated the most tokens. Token allocation for the team should not be too much, and the vesting schedule should be longer than most stakeholders as it could signal their conviction in the project. Investors’ tokens should have the smallest allocation and vest for a substantial amount of time. Emissions should be spread out over a period of time and include some form of vesting to prevent significant inflation at any point in time.
4. Governance and Voting
Governance is important for perpetual DEXs, given it empowers token holders to participate in the decision-making processes and influence the direction of the protocol. Some decisions that can be made by governance include:
- Protocol upgrades and maintenance. Perpetual DEXs are often subject to upgrades and improvements to enhance functionality, scalability, and growth, ensuring the protocol remains up-to-date and competitive. For example, in GMX’s recent snapshots, governance passed proposals to create a BNB market on GMX V2 (Arbitrum) and to adjust the GMX V2 fee split.
- Risk management and security. Token holders can collectively decide on collateral requirements, liquidation mechanisms, bug bounties, or emergency measures in the event of a breach or exploit. This helps to protect user funds and build trust in the protocol. The recent incident encountered by Synthetix due to TRB price volatility led to a $2 million loss for stakers, highlighting the importance of consistently reviewing parameters, adding volatility circuit breakers, and increasing the sensitivity of skew parameters to spikes in pricing volatility.
- Liquidity and user incentives. Token holders can propose and vote on strategies to incentivise liquidity providers, adjust fee structures, or introduce mechanisms that enhance liquidity provision. For example, dYdX governance passed the V4 Launch Incentives proposal.
- Decentralised community with transparency. Governance is supposed to build decentralised communities with transparency and accountability. Publicly accessible governance processes and on-chain voting mechanisms provide transparency in decision-making. For example, DEXs like dYdX, Synthetix, GMX adopt on-chain voting mechanisms to foster decentralisation.
From our point of view, governance helps create a robust and inclusive community for stakeholders involved in the decentralised perpetual exchanges. Having on-chain voting mechanisms and transparency in decision making builds trust between stakeholders and the protocol as the process is fair and accountable to the public. Thus, governance is a key feature of most crypto protocols.
Experimenting with New Mechanisms
Apart from the factors mentioned above, we believe there are many other innovative ways to introduce additional utility and incentivise demand for tokens. Protocols will need to prioritise which new mechanisms to introduce based on the stakeholders they are targeting and what matters most to them. The table below shows the key stakeholders and their key concerns:
![Key concerns of crypto perpetual protocols' stakeholders, their importance and the ways of addressing. Source: DWF Ventures](/images/research/2024/2024-01-31-perp/2024-01-31-perp-10.webp)
Given the wide range of concerns, it would not be possible to cater to all stakeholders. Hence, it is important for a protocol to reward the right group of users to ensure sustained growth. We believe that there is room to introduce new mechanisms that can better balance the interests of different stakeholders.
Conclusion
Tokenomics is a core part of any crypto protocol. There is no clear formula for determining successful tokenomics, as numerous factors influence performance, including some that lie beyond a project’s control. Regardless, the crypto market is fast-moving and constantly changing, which highlights the importance of being responsive and able to adapt. The examples above show that experimenting with new mechanisms can also be very effective at achieving exponential growth.
We believe there is further room for growth in experimenting with tokenomics, and, as a crypto venture capital firm, DWF Ventures is dedicated to supporting the teams building in this space. Interested parties can also pitch their project to DWF Ventures using the on-page form.
![DWF Ventures: 2024 Narratives](/images/research/Threads-Intro-Images/Resize/DWFVentures-2024-Narratives-Thread-intro-960x540px.jpg)
The year 2023 has been a rollercoaster for the crypto market, characterised by challenges and opportunities. As the bear market prompted a focus on improving infrastructure to drive wider adoption, unexpected developments like the resurgence of memecoins fuelled an early bullish sentiment.
Now, as the year closes, DWF Ventures shares its outlook for 2024 and the narratives we believe will shape the market.
Crypto Market Trends and Emerging Narratives
The current market landscape is undergoing intriguing shifts. While 2023 prioritised building resilient infrastructure, 2024 brings a wave of compelling narratives:
- Bitcoin.
- Memecoins.
- Crypto derivatives.
- Restaking.
- Zero-Knowledge (ZK) use cases.
- Consumer dapps.
- Gaming.
- Decentralised artificial intelligence.
- Decentralised Physical Infrastructure Networks (DePIN).
Each of these trends has the potential to drive significant innovation and adoption in the coming year.
2024: The Year of Bitcoin
Bitcoin remains at the forefront of crypto market discussions, with 2024 set to be an important year for its ecosystem. Key developments include:
- More Scaling Solutions: Rapid advancements in BTC scaling solutions are enhancing Bitcoin’s DeFi and NFT capabilities, while maintaining the security layer. New initiatives could unlock more use cases and attract broader participation.
- ETF Approvals: The anticipated approval of Bitcoin spot ETFs could trigger substantial liquidity inflows into the market. Analysts predict that crypto ETFs may become some of the most successful launches in financial history, potentially capturing a significant share of the $7.2 trillion U.S. ETF market.
- BTC Halving: Scheduled for April 2024, the Bitcoin halving event is expected to drive increased activity in the ecosystem. Projects that prioritise seamless user experiences and reduce onboarding friction will likely capture the greatest benefits from the projected market growth.
Beyond Bitcoin: Other Crypto Narratives
While Bitcoin takes centre stage, several other narratives are set to shape 2024:
- Memecoins. Meme tokens have proven their ability to catalyse market movements, and their cultural resonance continues to drive engagement and activity.
- Crypto derivatives. The derivatives market is expanding, offering new ways to hedge risks and leverage opportunities in crypto trading.
- Restaking. Restaking is an emerging trend that enhances staking mechanisms, creating additional layers of utility and security for blockchain networks.
- ZK use cases. Zero-knowledge technology is revolutionising privacy and scalability, unlocking applications across DeFi, identity, and more.
- Consumer crypto. The rise of user-friendly decentralised applications is making blockchain technology more accessible to everyday users.
- Gaming. Blockchain gaming and the GameFi sector continue to evolve, integrating immersive experiences with innovative economic models.
- AI. Often coupled with crypto, artificial intelligence is playing a growing role in optimising blockchain applications and enabling smarter decision-making processes.
- DePIN. Decentralised Physical Infrastructure Networks represent an exciting frontier, leveraging blockchain to manage and monetise real-world assets and infrastructure.
Looking Ahead
2024 promises to be a great year for the market, with Bitcoin leading the charge and other narratives gaining momentum.
As a crypto venture capital firm, we’re committed to supporting projects building in the space. If you’re building in the above-mentioned areas, we’d love to hear from you: contact DWF Ventures via the form on our page.
![](/images/research/2023/2023-11-17-depin/2023-11-17-depin-cover-intro.webp)
- Details
In the ever-evolving landscapes of Web3 and the Internet-of-Things (IoT), a revolutionary concept has emerged, challenging the traditional paradigms of connectivity and paving the way for a decentralised future: Decentralised Physical Infrastructure Networks, or DePIN.
In this article, we will delve into the distinctive value proposition offered by DePIN in contrast to existing solutions. Additionally, we will explore some key challenges that projects may face, as well as the key elements that we look for in projects building in this vertical.
![](/images/research/2023/2023-11-17-depin/2023-11-17-depin-1.webp)
Understanding DePIN: Origins and Market Potential
Messari coined the term DePIN, short for Decentralised Physical Infrastructure Network, in late 2022. This term accurately describes Web3 protocols that bring together and provide services or resources from a decentralised network of physical machines and devices, incentivised via a token model.
To understand the significance of DePIN, it is essential to dive into the origins and evolution of the IoT itself. The inception of the IoT dates back as early as the 1980s. The foundational concept revolved around integrating computing capabilities into ordinary objects, enabling them to communicate with each other. As the internet grew rapidly from the early 2000s, the surge in connected devices prompted a demand for standardisation. Led by the Internet Engineering Task Force (IETF), the Internet Protocol (IP) became the framework for data transmission across networks. Newer iterations of IP have been developed over time, with the latest, IPv6, pushing the boundaries of the current infrastructure to support the ever-evolving demands of the internet.
The rapid expansion of the internet has also brought various challenges, with security and privacy emerging as some of the key concerns. The interconnected nature of devices poses a significant risk whereby one vulnerable device can potentially compromise the entire network. Data breaches are among the most common manifestations of IoT vulnerability and have seen a significant uptick in the U.S. from the mid-2010s onwards. This has provided compelling reasons for the exploration of decentralised alternatives to protect the dynamic and evolving IoT landscape.
While the IoT space is advancing, the pursuit of enhanced efficiency persists. This drive gave rise to Helium, a decentralised wireless network designed to improve connectivity and equitable user participation. In an oligopolistic domain, the emergence of decentralised alternatives presents a valuable opportunity for both users and potential founders, sparking healthy competition and the possibility of better products. The IoT market is growing fast, expected to double in revenue within five years and potentially exceed the $2 billion mark by 2028. Thus, these decentralised alternatives, or what we know as “DePINs”, certainly have incredible potential to grow over the next few years.
Navigating the DePIN Spectrum: From Physical Machines to Digital Resources in Web3 Evolution
In the dynamic landscape of DePIN, the definition spans a broad spectrum. Blockchains themselves could be considered in line with the DePIN concept: operating through miners or validators within a decentralised network, they ultimately provide the crucial resource known as “Consensus.” The notion of DePIN has arguably been present since the theorisation of Bitcoin and the beginning of decentralised networks, only gaining official recognition recently.
![](/images/research/2023/2023-11-17-depin/2023-11-17-depin-2.webp)
To offer clarity, mapping DePINs across a spectrum becomes essential. This spectrum spans from decentralised networks of physical machines, highlighting hardware specificity and software agnosticism, to decentralised networks of digital resources, emphasising software specificity and hardware agnosticism.
![](/images/research/2023/2023-11-17-depin/2023-11-17-depin-3.webp)
Many of these decentralised resource networks trace back to a Web2 counterpart, as they aim to resolve existing issues more efficiently. However, some were created to fulfil on-chain specific needs. Such solutions were either not possible or not needed in the Web2 world. For example, Oracles like Chainlink are product offerings unique to Web3. They enable smart contracts to execute based on data from the real world, by using a decentralised network of independent oracle node operators to retain the trustlessness of the system.
The debate arises on whether solutions like Chainlink can be considered DePINs. This hinges on one’s interpretation of the physicality of nodes/service providers in the network, often run on independent machines. However, the more crucial consideration is the value these decentralised networks bring over their centralised counterparts. By focusing on the strengths of decentralisation, the hope is to guide Web3’s evolution into creating real substantial value for the masses.
DePIN’s Value Proposition
Now that we have covered the What, it’s time to explore the Why. There are three main stakeholders in every DePIN protocol, and their perspectives highlight its value propositions:
- Demand side users (consumers of the resource).
- Supply side users (providers of the resource).
- Protocols themselves (teams and investors).
Taking all involved parties into account, the decentralisation effect brings forth three distinctive value propositions of DePINs:
1. Economic Efficiency from Underutilised Resources
The primary and most crucial benefit of DePINs lies in the economic advantages they offer, benefiting all stakeholders involved. The significant economic efficiency derived from decentralisation comes from the ability to harness globally idle resources that would otherwise go to waste.
In the case of decentralised computing: owners of idle servers, GPUs, and CPUs (supply side) around the globe now have an avenue to monetise these assets as they depreciate or become obsolete over time. At the same time, DePIN protocols (middlemen) can curate these computing resources globally at a lower cost of integration and operation, and offer these aggregated resources to users who need them (demand side) at a lower price. This also means that fungible resources like computing power and bandwidth can be more uniformly available across the globe, not only because of lower costs but also because there is no need to set up infrastructure everywhere.
The global cloud computing market stood at $633 billion in 2023, with an anticipated compound annual growth rate (CAGR) of 16% projected from 2023 to 2032. Companies spend from hundreds of thousands to millions of dollars on cloud computing alone, depending on the use case, and these costs can be drastically reduced via distributed computing.
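For context, a quick back-of-the-envelope projection from those two figures (simply compounding the 2023 base at the stated 16% CAGR, not an independent forecast) puts the 2032 market at roughly $2.4 trillion:

```python
# Back-of-the-envelope projection using the figures cited above:
# $633B in 2023, compounding at a 16% CAGR through 2032.
base_2023 = 633          # market size in USD billions
cagr = 0.16
years = 2032 - 2023      # 9 compounding periods

projected_2032 = base_2023 * (1 + cagr) ** years
print(f"Projected 2032 market size: ~${projected_2032:,.0f}B")
# => roughly $2,407B, i.e. about $2.4 trillion
```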
![](/images/research/2023/2023-11-17-depin/2023-11-17-depin-4.webp)
For example, Akash is a decentralised network of cloud service providers that can aggregate and provide this required cloud computing power to developers at a lower cost. Through decentralisation, this service can be efficiently curated and offered at around an 80% discount compared to traditional centralised players like AWS, Google Cloud, and Microsoft Azure.
![Comparison of prices for Akash Network and leading cloud providers. Source: Akash Network](/images/research/2023/2023-11-17-depin/2023-11-17-depin-5.webp)
Another potential source of economic efficiency is the ability to identify performance differences across nodes for different use cases. Certain nodes may specialise in producing/procuring certain resources and the DePIN protocol can intelligently allocate work to match each node’s strength. This efficiency is analogous to the economic efficiency brought about by trade globalisation, whereby each country naturally specialises in producing a set of resources, then trades with one another for resources they do not specialise in, creating an overall more efficient global economy. This once again lowers costs for the protocol and demand side users, as well as allows supply side users to focus on providing what their asset does best instead of trying to optimise for everything.
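A toy sketch of this specialisation idea is shown below: nodes advertise a capability score per resource type, and a naive allocator routes each job to the best-suited node. The node names and scores are hypothetical, and real DePIN schedulers would also weigh price, latency, and reputation.

```python
# Toy sketch of matching work to node strengths. Nodes advertise a score per
# resource type; the allocator routes each job to the best-suited node.
# Real DePIN schedulers would also weigh price, latency, and reputation.

nodes = {
    "node_a": {"gpu_compute": 0.9, "storage": 0.2},
    "node_b": {"gpu_compute": 0.3, "storage": 0.8},
    "node_c": {"gpu_compute": 0.6, "storage": 0.6},
}

def allocate(job_type: str) -> str:
    """Pick the node with the highest advertised score for this job type."""
    return max(nodes, key=lambda name: nodes[name].get(job_type, 0.0))

print(allocate("gpu_compute"))  # -> node_a, which specialises in GPU work
print(allocate("storage"))      # -> node_b, which specialises in storage
```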
2. Lowering Barriers for Stakeholders and Geographic Entry
DePINs revolutionise market entry dynamics by significantly reducing barriers for both resource providers and new projects, fostering a more inclusive and dynamic ecosystem:
- Ease of Resource and Service Entry: DePINs simplify the process for resource and service providers to join, eliminating unnecessary hurdles and welcoming those meeting the DePIN criteria. This streamlined approach has proven effective, drawing in diverse contributors and strengthening the DePIN network. Storj’s success, boasting a more substantial provider base than Filecoin, exemplifies the positive impact of this accessible entry process.
- Adaptable Growth Across Regions with Localised Solutions: DePINs can effectively scale across diverse jurisdictions simultaneously. By utilising incentives, DePINs can grow their network easily, bypassing the traditional expansion process of a centralised entity. Additionally, DePINs can leverage their decentralised structure to offer tailored resource solutions across various geographic and demographic communities. This flexibility ensures that resources are universally accessible and highly relevant to specific local market needs.
- Hybrid Scaling with Centralised Provisioning: DePINs can employ a flexible scaling approach, incorporating both decentralised and centralised resource provisioning. For instance, Storj optimises scalability by centralising access and database management through its controlled satellites. This strategic hybrid model enables individual operators to secure the network with minimal infrastructure, resulting in a higher onboarding success rate, as evident in Storj's substantial provider base (~514,000 nodes) compared to Filecoin's (~3,800).
In essence, by lowering barriers and embracing a hybrid approach to scaling, DePIN fosters a more inclusive, globally responsive, and dynamically adaptive ecosystem for resource providers and project developers alike.
3. Governance and Security
The decentralised networks also allow for implementing better governance systems. Users who are invested in DePIN protocols on both the supply and demand side can have their voices heard by opening and voting on governance proposals. This allows decisions to be made in the best interests of all stakeholders and facilitates proposals that are synergistic across stakeholders. In contrast, decisions in centralised systems are often driven primarily by investors, who are more focused on their own financial returns.
Trustlessness, security, and liveness of resource provision are inherent in DePINs, and they come without the high costs of security and high-quality infrastructure required in a centralised system. A DePIN network can maintain service provision even if some nodes are down. In contrast, centralised service providers face single points of failure, which can lead to significant disruptions if there is a breakdown or compromise in their systems. In 2022, various cloud service outages occurred, such as Google Cloud's increased latency in January and Slack's three-hour outage in February, while AWS faced a two-hour outage in July 2023 due to a power failure. While uncommon, the likelihood of the majority of nodes in a DePIN network going down or being compromised independently is much lower, providing better liveness and security guarantees.
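A rough, hypothetical calculation illustrates the liveness argument: if each provider is independently down 1% of the time, the probability that an entire decentralised network is unavailable falls off exponentially with the number of nodes (the independence assumption is, of course, the key caveat).

```python
# Rough illustration of the liveness argument: with independent failures,
# the chance that every node in a network is down at once falls off
# exponentially with network size. The 1% figure is hypothetical.

single_provider_downtime = 0.01   # assume any one provider is down 1% of the time

for n_nodes in (1, 5, 20):
    p_all_down = single_provider_downtime ** n_nodes
    print(f"{n_nodes:>2} independent nodes -> P(all down) = {p_all_down:.0e}")

# 1 node   -> 1e-02 (a single point of failure)
# 5 nodes  -> 1e-10
# 20 nodes -> 1e-40, i.e. effectively never, if failures really are independent
```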
However, ensuring these benefits truly provide value to stakeholders necessitates proper measures. There are also some factors and obstacles to consider when building a DePIN, which are also what a crypto venture capital firm such as DWF Ventures looks for in a DePIN protocol.
Potential Hurdles and Factors to Consider for Success
While DePINs do have the potential to bring immense value to both demand and supply side users with the resources we have today, there are several factors that we look for that make up a successful DePIN project:
- Scaling performance capabilities to match centralised players.
- The ease of onboarding and adoption.
- Alignment of tokenomics and incentives.
1. Scaling Performance Capabilities to Match Centralised Players
With the economic benefits addressed, DePINs also have to ensure competitive performance compared to centralised players. The performance of a DePIN protocol is what drives demand side adoption, which in turn can encourage supply side adoption. There are three main ways the performance of a DePIN protocol can be managed:
- Hardware and software specifications.
- Addressable market.
- Location sensitivity and density.
Hardware and Software Specifications
Hardware and software specifications pertain to the requirements that supply side participants need to meet in order to provide resources or services at a satisfactory quality. DePINs can require specialised devices/programmes to ensure consistency in quality of service, or allow for compatibility with more commonly owned devices like mobile phones or laptops. Requiring a specialised device/program improves reliability and uptime for end users, but would act as an additional barrier to entry for service providers. In most cases, performance is prioritised, as any barriers to entry are usually worth the cost or effort for service providers if demand for the service is strong as a result.
Projects could also opt to become the exclusive distributor of their specialised device/program or collaborate with third parties to outsource production/development. Outsourcing could foster competition, potentially resulting in enhanced capabilities and lower costs for providers. However, there are risks of quality issues if production becomes too fragmented across different providers, or potential supply shortages if providers are not able to meet the project’s requirements.
For example, Helium started out by creating its own hotspots back in 2019 but only managed to onboard around 15k hotspots by the end of 2020. A proposal passed in early 2021 allowed third party manufacturers to come onboard as long as they met the requirements, which likely contributed to the exponential growth in hotspots from early 2021 to 2022. Helium now has over 28 manufacturers onboard, which helps decentralisation and offers users a wider range of options as well. One of the cheaper hotspots can be obtained from Sensecap for about $130, significantly cheaper than the original Helium hotspot at around $495.
![](/images/research/2023/2023-11-17-depin/2023-11-17-depin-6.webp)
Addressable Market
One aspect in which many DePINs lag behind their centralised counterparts is the full suite of features that these more established players offer. Centralised cloud service providers like AWS give their users a large ecosystem of tools for deploying their applications and developing their products, whereas many DePINs today are focused on a single service, like computing power or data storage. This shows how much room the DePIN space has to grow to match the range of capabilities of centralised solutions.
OORT is taking a step in that direction, providing both decentralised compute and storage facilities by building a network of service providers for different use cases vertically and horizontally. OORT does this by having 3 tiers of service providers:
- Archive Nodes: Data storage nodes from Filecoin, Storj, Crust, and Arweave.
- Edge Nodes: A network of OORT-issued devices with home PC capabilities for decentralised computing.
- Super Nodes: Public and private cloud service providers like Tencent Cloud, Alicloud, Seagate, etc. for higher end compute and storage requirements.
This means that OORT can offer a full suite of cloud computing services while retaining the benefits of decentralisation, as any need for intense or simple compute and big or light data storage can be fulfilled by one or more of the 3 tiers of service providers.
Location Sensitivity and Density
For some projects, achieving a sufficient level of user density within a specific geographic area is crucial to ensure the effectiveness and practicality of the service they offer, similar to co-location in traditional technology. It is particularly significant for services that rely heavily on location-based data and interactions, such as mapping services or ride-hailing platforms.
For mapping services like Hivemapper, having a high concentration of users in a particular region is essential for the platform to provide accurate and up-to-date information, which in turn impacts utilisation of the product. For example, most of Hivemapper’s coverage is currently centred around the US and a few European countries. With coverage fragmented in areas like East Asia, Hivemapper created bounties as part of the targeted mapping initiatives in MIP-2, reserving additional rewards for the areas listed in the proposal to encourage fuller coverage there. This is key to making Hivemapper comparable with centralised alternatives at all.
![Total mapped area in Hivemapper (2023)](/images/research/2023/2023-11-17-depin/2023-11-17-depin-7.webp)
Similarly for ride-hailing services, having a sufficient number of both drivers and riders in a specific area is crucial for ensuring an optimal experience for both. A higher density of drivers means shorter waiting times for riders and more ride options to choose from. An example of this is Drife, which has focused its efforts on onboarding drivers in Bengaluru and has onboarded over 10,000 drivers to date. This gives riders a better user experience, creating recurring demand for rides to match the supply of drivers. Thus, achieving sufficient density in a given geographical area is critical for increasing the performance of the service.
2. Ease of Onboarding and Adoption
The initial phase of a provider’s engagement with the network is the onboarding process. While hardware specifications play a role, the steps following the acquisition of the necessary hardware are important as well. Once the setup is completed, providers must monitor the hardware to ensure that the necessary conditions for earning rewards are met. Whether a project requires passive or active management from providers will affect the number of providers incentivised to join.
For example, setting up Helium hotspots is relatively straightforward for most users. For providers using Sensecap, an intuitive process of turning on the device and configuring Bluetooth is sufficient to start earning rewards. Although there is a one-time setup fee of $15, the process abstracts away the complexities of interacting with the blockchain, making it more appealing to a wider range of potential providers. Providers using the app can easily monitor the status of their hotspots in one interface, and keeping them operational allows rewards to be earned passively.
![Sensecap's interface for configuring a hotspot](/images/research/2023/2023-11-17-depin/2023-11-17-depin-8.webp)
In contrast, projects like Spexigon require users to take on an active role to earn rewards. After acquiring a drone, a user has to personally fly it and capture imagery from their location while ensuring that they comply with local regulations. Furthermore, since Spexigon is focused on capturing imagery across different geographical areas, a user’s earning potential may be limited by their surrounding environment. Thus, the number of users that can be onboarded to Spexigon is limited by a multitude of factors, ranging from the onboarding process and regulations to the need to actively use a drone to earn rewards.
In general, an easily navigable and swift onboarding procedure allows a project to reach a broader audience. Nonetheless, this is contingent on the project’s specific requirements and priorities. This factor is more important for services such as Helium that benefit from a higher quantity of providers, rather than from the curation of high-quality providers.
3. Alignment of Tokenomics and Incentives
The last important factor that DePIN protocols should plan out carefully is the utility and value flow enabled by token mechanics. A great product needs an equally good token model to ensure all stakeholders are incentivised in a way that allows the token to properly reflect the value of the project.
![](/images/research/2023/2023-11-17-depin/2023-11-17-depin-9.webp)
To begin, we can look to Chainlink for a tried and tested token utility model to determine a base token framework for DePINs. In the case of Chainlink’s token utility model, the stakeholders are similar to any decentralised resource network:
- Demand side users (those who spend tokens).
- Supply side users (receivers and stakers of tokens).
- Teams and investors (token holders).
Demand side users pay LINK token for Oracle services, while supply side users get LINK token in exchange for their service. This creates demand for tokens in a way that is directly related to demand for the offering.
Supply side users are also required to stake tokens as collateral to guarantee service quality, or face slashing penalties. This encourages higher quality service and creates demand for tokens that is directly related to the aggregated quality of service by all service providers, which itself is dependent on demand for the offering: higher demand encourages higher quality service from more service providers.
Finally, token demand benefits token price, which is what token holders are looking for. This creates more demand for the token from investors in the secondary market.
From this we can see that demand side users will care least about the token price, while the team and investors care the most. Supply side users are in the middle, given that they also have certain exposure to token prices via their staked tokens. Regardless, this framework creates a scenario where all the stakeholders play a role in supporting token price.
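As a minimal sketch of this base framework (with purely illustrative names and amounts, and an assumed 20% slash rate that is not Chainlink's actual parameter), the flow of fees to providers and the slashing of staked collateral could be modelled as follows:

```python
# Minimal sketch of the base framework above: demand-side users pay fees in
# the protocol token, supply-side providers stake collateral and can be
# slashed for poor service. Names, amounts, and the slash rate are illustrative.

from dataclasses import dataclass, field

@dataclass
class Provider:
    stake: float          # tokens locked as a service-quality guarantee
    earned: float = 0.0   # tokens earned from demand-side fees

@dataclass
class ResourceNetwork:
    slash_rate: float = 0.2
    providers: dict = field(default_factory=dict)

    def register(self, name: str, stake: float) -> None:
        self.providers[name] = Provider(stake=stake)

    def pay_for_service(self, provider: str, fee: float) -> None:
        # A demand-side user pays the fee in tokens; the provider earns it.
        self.providers[provider].earned += fee

    def slash(self, provider: str) -> float:
        # Penalise a provider that failed to meet the required service quality.
        p = self.providers[provider]
        penalty = p.stake * self.slash_rate
        p.stake -= penalty
        return penalty

net = ResourceNetwork()
net.register("oracle_1", stake=1_000)
net.pay_for_service("oracle_1", fee=50)   # demand creates token flow to supply
print(net.slash("oracle_1"))              # 200.0 tokens slashed for bad service
```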
Token emissions also have to be planned out. In most cases, token emissions exist as part of early incentive campaigns to attract users. With the 3 stakeholders in mind, it would make sense to spread token incentives across the 3 stakeholders based on how difficult the project thinks it is to attract those users to the network. Of course, with token emissions comes inflation, so projects also have to think of how to manage token emissions and burns. Projects also have to be careful not to rely too heavily on token emissions to incentivise demand side users, as these users should be first and foremost attracted by a superior product offering, which is where the ball starts rolling. Generally, token incentives should focus more on attracting supply side users to fulfil the demand for a superior offering.
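A simple, hypothetical illustration of this planning exercise: splitting a year-one emissions budget across the three stakeholder groups in proportion to how hard each is assumed to be to attract, then checking the resulting net inflation once burns are accounted for. All figures are made up for the example.

```python
# Hypothetical split of a year-one incentive budget across the three
# stakeholder groups, weighted by how hard each is assumed to be to attract,
# plus the resulting net inflation once burns are taken into account.

yearly_emissions = 10_000_000      # tokens emitted in year one
total_supply = 100_000_000         # fully diluted supply
attraction_weights = {             # assumed difficulty of attracting each group
    "supply_side": 0.60,
    "demand_side": 0.25,
    "team_investors": 0.15,
}

allocation = {group: yearly_emissions * w for group, w in attraction_weights.items()}
print(allocation)   # most incentives flow to supply-side providers

yearly_burns = 3_000_000           # tokens burned via fees / buybacks
net_inflation = (yearly_emissions - yearly_burns) / total_supply
print(f"Net inflation: {net_inflation:.1%}")   # 7.0% of total supply
```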
Some DePINs also allow demand side users to make payment in other ways. This will not affect token demand as long as payments received are used to buy tokens to pay supply side users. Effectively, this simply gives demand side users the flexibility to choose their payment channel.
For the protocol itself, revenue can come from taking a share of the payments from demand side to supply side users, or charging membership or licensing fees to demand side and supply side users respectively. These revenue streams can be used to cover operating costs, and any excess can be used to further increase token demand via revenue share or buyback and burn mechanisms. Projects can also require token holders to stake tokens or provide liquidity to benefit from revenue share to stabilise token prices or boost liquidity.
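The revenue routing described above could be sketched as follows; the 10% protocol cut, the operations budget, and the 50/50 split between staker revenue share and buyback-and-burn are hypothetical placeholders rather than any specific protocol's parameters.

```python
# Sketch of the revenue routing described above: a cut of demand-side payments
# covers operations, with any surplus split between revenue share for stakers
# and a buyback-and-burn. All percentages are hypothetical placeholders.

def route_revenue(gross_fees: float,
                  protocol_cut: float = 0.10,
                  ops_budget: float = 50_000,
                  staker_share: float = 0.5) -> dict:
    revenue = gross_fees * protocol_cut
    surplus = max(0.0, revenue - ops_budget)
    return {
        "operations": min(revenue, ops_budget),
        "staker_revenue_share": surplus * staker_share,
        "buyback_and_burn": surplus * (1 - staker_share),
    }

print(route_revenue(gross_fees=1_000_000))
# {'operations': 50000, 'staker_revenue_share': 25000.0, 'buyback_and_burn': 25000.0}
```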
Shaping the Next Era of DePINs
As a crypto venture capital firm, we at DWF Ventures believe that the success of a DePIN project lies in the incorporation of specific elements. These elements include tailored hardware specifications that align with the project’s resource demands. Additionally, a robust incentivisation model, coupled with well-structured tokenomics, is crucial for sustaining the project’s ecosystem. Prioritising the creation of an optimised user experience for both supply and demand-side participants is important as well. This ensures that the project is able to attract and retain users, ultimately contributing to the project’s overall success and longevity. These combined factors lay the foundation for a successful and sustainable DePIN initiative.
![Key elements of a successful DePIN project](/images/research/2023/2023-11-17-depin/2023-11-17-depin-10.webp)
Currently, we see that existing DePIN projects mostly have the first three elements well established, as most are predominantly geared towards incentivising supply side users. This is understandable, as supply side users are pivotal for initial network expansion. However, it is still important to dedicate substantial resources to developing a good product that genuinely garners demand, rather than just being a decentralised alternative to an existing service.
Nonetheless, the entire DePIN ecosystem is still relatively nascent. Despite having projects that have been established for years, we anticipate that there will be more innovation and potentially the rise of a game-changing protocol in this vertical.
If you are a crypto project actively building DePIN products that can meaningfully impact the market and are seeking a crypto venture capital partner, feel free to reach out to DWF Ventures.