Web 3.0 – where will it take us?

Maciej Zieliński

01 Mar 2022

Decentralization and token-based economics are concepts that have started to reach far beyond the blockchain industry. Read on to learn what Web 3.0 is and why the world’s biggest tech and venture capital companies are talking about it today.

Read about:

  • Web 2.0
  • Semantic web 
  • Decentralized web
  • AI and web 3.0
  • Change of user experience

Web 2.0 - How does the World Wide Web work today?

If you wonder which technology can claim over 3 billion users, here is the answer: the World Wide Web. Today it’s difficult to imagine the modern world without it, even for people who remember the time before its creation. This technology has changed and now defines how we share, create, and consume information. It’s present in every industry, shaping the way we work, learn, and play; for many, the internet has become the central point of their lifestyle.

Web 1.0 and web 2.0

Essentially, the terms web 1.0 and web 2.0 refer to periods in the web's evolution, as it moved through different formats and technologies.

Web 1.0, also known as the Static Web, was the first version of the World Wide Web, created in the 1990s. Back then, user interaction wasn't a thing, and searching for information was extremely inconvenient because there were no search engines.

Thanks to more advanced web technologies, such as JavaScript and CSS, web 2.0 made the internet far more interactive. From that moment on, social networks and interactive platforms have flourished.

Growth of web 2.0 was largely driven by three factors:

  • mobile technology
  • social networks
  • cloud solutions

Mobile technologies

The advent of smartphones and mobile internet access drastically increased both the number of web users and the time they spend online. Since then, we’ve been living in an always-connected state. Reaching into your pocket is all it takes to access the web.

Social networks

Meta isn’t the 11th most-valuable company for no reason. Before Facebook or Myspace, the internet was largely anonymous, with limited interaction between users. Social media platforms brought revolutionary possibilities: user-generated content, sharing, and commenting disrupted how information circulates.

What’s more, our internet persona became an extension of the real one. Thus, not only did social life partly move to the web, but we also started to trust each other there, with tools that to some extent let us verify each other's identity. Without that trust, the success of companies such as Airbnb or Uber would never have been possible.

Cloud solutions

This article was created, reviewed, and edited using Google Docs, part of the cloud suite provided by Google that most readers are probably familiar with.

Cloud providers redefined how we store and share data. It is the cloud that enables the creation and maintenance of most web pages and applications we know today. Companies were able to move from owning expensive infrastructure to renting data storage, tools, or even computing power from dedicated providers.

Disadvantages of Web 2.0

Web 2.0 definitely shapes how today's society functions, giving us possibilities we couldn’t even dream of before. Yet it's not free from disadvantages:

  • centralization
  • abundance of information
  • insufficient verification
  • monopolization
  • low personalization

With more and more issues to grapple with, one question has become inevitable: what comes next?


Semantic Web 

The semantic web is a concept formulated in 1999 by Tim Berners-Lee, the creator of the World Wide Web:

I have a dream for the Web [in which computers] become capable of analyzing all the data on the Web – the content, links, and transactions between people and computers. A "Semantic Web", which makes this possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy, and our daily lives will be handled by machines talking to machines. The "intelligent agents" people have touted for ages will finally materialize.

Back then, the vision of an intelligent internet that could understand its users and work without external governance was far from realistic. Yet today, with the new technologies we’ve developed, it may become reality sooner than we could have predicted. This is the moment to introduce the phenomenon of web 3.0.

The original concept of Web 3.0 was coined around 2014 by Gavin Wood, co-founder of Ethereum and creator of Polkadot, and refers to a "decentralized online ecosystem based on blockchain." The idea of a web that relies on scattered nodes instead of centralized servers quickly gained a significant number of supporters.

Key features of web 3.0


  • Semantic Web
  • Artificial Intelligence
  • Decentralization
  • 3D Graphics

Semantic web and web 3.0

In the semantic web, computers are able to analyze data with an understanding of its content, including text, transactions, and connections between users or events. In such systems, machines would be able to read our emotions, feelings, and intentions just by analyzing our input. Applying this would greatly increase data connectivity and, in consequence, provide a better experience to web users.


Artificial intelligence

Machine learning and artificial intelligence are key technologies for web 3.0. Web 2.0 already offers some semantic capabilities, but they are mostly human-driven and therefore prone to bias and manipulation.

Let’s take online reviews as an example. Today, any company can simply gather a large number of users and pay them to write positive reviews of its product or service. Implementing AI that can distinguish fake reviews from genuine ones would increase the reliability of data available online.

Essentially, AI and machine learning will not only enable computers to decode the meaning contained in data but also provide a more personalized experience to web 3.0 users. Online platforms will be able to tailor their appearance or content to an individual user. This will bring a revolutionary change to the e-commerce sector as targeted advertising becomes routine.


3D graphics 

According to some theories, with the introduction of web 3.0 the boundaries between the real and digital worlds will begin to fade. The constant development of graphics technology may even enable us to create entire 3D virtual worlds in web 3.0.

This concept is closely related to another idea that has recently gained significant popularity: the metaverse. 3D graphics in web 3.0 will revolutionize sectors such as gaming, e-commerce, healthcare, and real estate.


Decentralized web

The current web infrastructure is based on data stored in centralized locations - single servers. That can make it prone to manipulation or attacks. Furthermore, most databases are controlled by a limited number of companies such as Meta or Google. Web 3.0 aims to change that by introducing decentralized networks.

In web 3.0, data will be stored in multiple locations - nodes. Changes to the data will have to be validated by the network's nodes, and information will be exchanged over peer-to-peer networks. This will not only take data away from central authorities but also make it more resistant to manipulation.

Digital assets in web 3.0

Web 3.0 is expected to bring a totally new approach to digital assets. A token economy based on blockchain technology will become an even more common phenomenon.

Even today we can observe how blockchain technology is shaping the exchange of goods, investments, and ownership rights. Fungible and non-fungible tokens constantly find new applications that provide users with groundbreaking possibilities in industries such as gaming, real estate, and even healthcare.

On the internet of the future, ownership and control will become even more vital issues. Blockchain technologies, and NFTs more precisely, can bring significant improvements in this area. What if assets such as digital art or virtual land plots already carried data about their owners and creators? Data that would be impossible to manipulate, because it is stored and confirmed on distributed ledgers.

What will change for web pages with web 3.0?

Where will web 3.0 take us? According to many experts, we shouldn't treat web 3.0 as a totally new internet - it's just another stage of its evolution. Some of the solutions on which web 3.0 will be based already exist and function today. In many cases, it's simply a matter of scale.

Yet the new web will definitely make room for revolutionary business models. Personalized web pages or shops in 3D virtual spaces are just some examples of the new possibilities that web 3.0 will create.


Applying Game Theory in Token Design

Kajetan Olas

16 Apr 2024

Blockchain technology allows for aligning incentives among network participants by rewarding desired behaviors with tokens.
But there is more to it than simply fostering cooperation. Game theory allows for designing incentive machines that can't be turned off and resemble artificial life.

Emergent Optimization

Game theory provides a robust framework for analyzing strategic interactions with mathematical models, which is particularly useful in blockchain environments where multiple stakeholders interact within a set of predefined rules. By applying this framework to token systems, developers can design systems that influence the emergent behaviors of network participants. This ensures the stability and effectiveness of the ecosystem.

Bonding Curves

Bonding curves are a tool used in token design to manage the relationship between token price and supply in a predictable way. Essentially, a bonding curve is a mathematical curve that defines the price of a token as a function of its supply. The more tokens are bought, the higher the price climbs, and vice versa. This model incentivizes early adoption and can help stabilize a token’s economy over time.

For example, a bonding curve could be designed to slow down price increases after certain milestones are reached, thus preventing speculative bubbles and encouraging steadier, more organic growth.
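
To make this concrete, below is a minimal Python sketch of a polynomial bonding curve. The curve shape, the constants, and the price/buy_cost helpers are illustrative assumptions for this article, not a reference implementation of any particular protocol.

```python
# Minimal bonding-curve sketch. The polynomial form and the constants are
# arbitrary placeholders; real designs tune them to the project's goals.

def price(supply: float, k: float = 0.0001, exponent: float = 1.5) -> float:
    """Spot price of the next token given current supply: p(s) = k * s**exponent."""
    return k * supply ** exponent

def buy_cost(supply: float, amount: float, steps: int = 1000) -> float:
    """Approximate cost of buying `amount` tokens by numerically integrating the curve."""
    ds = amount / steps
    return sum(price(supply + i * ds) * ds for i in range(steps))

if __name__ == "__main__":
    print(price(1_000))          # spot price with 1,000 tokens in circulation
    print(buy_cost(1_000, 100))  # total cost of the next 100 tokens
```

Lowering the exponent (or switching to a flatter curve segment) after a supply milestone is one simple way to model the slower price growth described above.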

The Case of Bitcoin

Bitcoin’s design incorporates game theory, most notably through its proof-of-work (PoW) consensus mechanism. Its reward function optimizes for security (hashrate) by rewarding ever greater electricity usage. Therefore, optimizing for its legitimate goal of being secure also inadvertently optimizes for harming the natural environment. Another emergent outcome of PoW is the creation of mining pools, which increase centralization.

The Paperclip Maximizer and the dangers of blockchain economies

The Paperclip Maximizer is a thought experiment about an AI that is instructed to maximize paperclip production, cannot be switched off, and ends up consuming everything in pursuit of that single goal. What’s the connection between that AI and decentralized economies? Blockchain-based incentive systems also can’t be turned off. This means that if we design an incentive system that optimizes towards the wrong objective, we might be unable to change it. Bitcoin critics argue that the PoW consensus mechanism optimizes toward destroying planet Earth.

Layer 2 Solutions

Layer 2 solutions are built on the understanding that the security provided by the base chain - a core kernel of certainty - can be used as an anchor. This anchor then supports additional economic mechanisms that operate off the blockchain, extending the utility of public blockchains like Ethereum. These mechanisms include state channels, sidechains, and plasma, each offering a way to conduct transactions off-chain while still being able to fall back on the anchored security of the main chain if necessary.

Conceptual Example of State Channels

State channels allow participants to perform numerous transactions off-chain, with the blockchain serving as a backstop in case of disputes or malfeasance.

Consider two players, Alice and Bob, who want to play a game of tic-tac-toe with a wager in ETH. The naive approach would be to interact directly with a smart contract for every move, which would be slow and costly. Instead, they can use a state channel for their game.

  1. Opening the Channel: They start by deploying a "Judge" smart contract on Ethereum, which holds the 1 ETH wager. The contract knows the rules of the game and the identities of the players.
  2. Playing the Game: Alice and Bob play the game off-chain by signing each move as transactions, which are exchanged directly between them but not broadcast to the blockchain. Each transaction includes a nonce to ensure moves are kept in order.
  3. Closing the Channel: When the game ends, the final state (i.e., the sequence of moves) is sent to the Judge contract, which pays out the wager to the winner after confirming both parties agree on the outcome.

A threat stronger than the execution

If Bob tries to cheat by submitting an old state where he was winning, Alice can challenge this during a dispute period by submitting a newer signed state. The Judge contract can verify the authenticity and order of these states due to the nonces, ensuring the integrity of the game. Thus, the mere threat of execution (submitting the state to the blockchain and having the fraud exposed) secures the off-chain interactions.
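
The dispute step can be sketched in a few lines of Python. This is a toy model under explicit assumptions: HMAC tags over a shared secret stand in for the ECDSA signatures a real Judge contract would verify on-chain, and the judge function below only mirrors the nonce comparison described above.

```python
# Toy state-channel dispute logic. Assumption: HMAC "signatures" with a shared
# secret replace the on-chain ECDSA checks a real Judge contract would perform.
import hashlib
import hmac
from dataclasses import dataclass
from typing import Optional

@dataclass
class SignedState:
    nonce: int      # strictly increasing with every move
    board: str      # serialized game state
    tag: bytes      # "signature" over (nonce, board)

def sign(key: bytes, nonce: int, board: str) -> SignedState:
    tag = hmac.new(key, f"{nonce}:{board}".encode(), hashlib.sha256).digest()
    return SignedState(nonce, board, tag)

def verify(key: bytes, state: SignedState) -> bool:
    expected = hmac.new(key, f"{state.nonce}:{state.board}".encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, state.tag)

def judge(key: bytes, submitted: SignedState, challenge: Optional[SignedState]) -> SignedState:
    """Accept a challenge only if it is validly signed and strictly newer."""
    if challenge is not None and verify(key, challenge) and challenge.nonce > submitted.nonce:
        return challenge
    return submitted

key = b"channel-secret"
stale = sign(key, 5, "X wins")     # the old state Bob tries to settle with
latest = sign(key, 7, "O wins")    # the newer state Alice holds
assert judge(key, stale, latest) is latest   # the higher nonce prevails
```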

Game Theory in Practice

Understanding the application of game theory within blockchain and token ecosystems requires a structured approach to analyzing how stakeholders interact, defining possible actions they can take, and understanding the causal relationships within the system. This structured analysis helps in creating effective strategies that ensure the system operates as intended.

Stakeholder Analysis

Identifying Stakeholders

The first step in applying game theory effectively is identifying all relevant stakeholders within the ecosystem. This includes not only direct participants such as users, miners, and developers, but also external entities like regulators, potential attackers, and partner organizations. Understanding who the stakeholders are and what their interests and capabilities are is crucial for predicting how they might interact within the system.

Stakeholders in blockchain development for systems engineering

Assessing Incentives and Capabilities

Each stakeholder has different motivations and resources at their disposal. For instance, miners are motivated by block rewards and transaction fees, while users seek fast, secure, and cheap transactions. Clearly defining these incentives helps in predicting how changes to the system’s rules and parameters might influence their behaviors.

Defining Action Space

Possible Actions

The action space encompasses all possible decisions or strategies stakeholders can employ in response to the ecosystem's dynamics. For example, a miner might choose to increase computational power, a user might decide to hold or sell tokens, and a developer might propose changes to the protocol.

Source: Artonomous, GitHub

Constraints and Opportunities

Understanding the constraints (such as economic costs, technological limitations, and regulatory frameworks) and opportunities (such as new technological advancements or changes in market demand) within which these actions take place is vital. This helps in modeling potential strategies stakeholders might adopt.

Source: Artonomous, GitHub

Causal Relationships Diagram

Mapping Interactions

Creating a diagram that represents the causal relationships between different actions and outcomes within the ecosystem can illuminate how complex interactions unfold. This diagram helps in identifying which variables influence others and how they do so, making it easier to predict the outcomes of certain actions.

Source: Artonomous, GitHub

Analyzing Impact

By examining the causal relationships, developers and system designers can identify critical leverage points where small changes could have significant impacts. This analysis is crucial for enhancing system stability and ensuring its efficiency.

Feedback Loops

Understanding feedback loops within a blockchain ecosystem is critical as they can significantly amplify or mitigate the effects of changes within the system. These loops can reinforce or counteract trends, leading to rapid growth or decline.

Reinforcing Loops

Reinforcing loops are feedback mechanisms that amplify the effects of a trend or action. For example, increased adoption of a blockchain platform can lead to more developers creating applications on it, which in turn leads to further adoption. This positive feedback loop can drive rapid growth and success.

Death Spiral

Conversely, a death spiral is a type of reinforcing loop that leads to negative outcomes. An example might be rising transaction fees leading to decreased usage of the blockchain, which reduces the incentive for miners to secure the network, further degrading system performance and user adoption. Identifying potential death spirals early is crucial for maintaining the ecosystem's health.

"The Death Spiral: How Terra's Algorithmic Stablecoin Came Crashing Down", Forbes
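
To see how quickly such a loop compounds, here is a toy simulation of the fee/usage spiral sketched above; the functional forms and constants are made-up assumptions, not calibrated to any real network.

```python
# Toy death-spiral model: higher fees push users away, lower usage weakens
# security, and the weaker network pushes effective fees up again.
# All coefficients are illustrative placeholders.

def death_spiral(steps: int = 10, users: float = 1_000.0, fee: float = 1.0):
    history = []
    for _ in range(steps):
        users *= max(1.0 - 0.05 * fee, 0.0)   # higher fees -> fewer users
        security = max(users * 0.01, 1.0)     # fewer users -> less miner revenue
        fee *= 1.0 + 1.0 / security           # weaker network -> fees rise further
        history.append((round(users), round(fee, 2)))
    return history

for users, fee in death_spiral():
    print(f"users={users:>4}  fee={fee}")
```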

Conclusion

The fundamental advantage of token-based systems is the ability to reward desired behavior. To capitalize on that possibility, token engineers pay careful attention to optimization and to designing incentives for long-term growth.

FAQ

  1. What does game theory contribute to blockchain token design?
    • Game theory optimizes blockchain ecosystems by structuring incentives that reward desired behavior.
  2. How do bonding curves apply game theory to improve token economics?
    • Bonding curves set token pricing that adjusts with supply changes, strategically incentivizing early purchases and penalizing speculation.
  3. What benefits do Layer 2 solutions provide in the context of game theory?
    • Layer 2 solutions leverage game theory by creating systems where the threat of reporting fraudulent behavior ensures honest participation.

Token Engineering Process

Kajetan Olas

13 Apr 2024

Token Engineering is an emerging field that addresses the systematic design and engineering of blockchain-based tokens. It applies rigorous mathematical methods from the Complex Systems Engineering discipline to tokenomics design.

In this article, we will walk through the Token Engineering Process and break it down into three key stages: the Discovery Phase, the Design Phase, and the Deployment Phase.

Discovery Phase of Token Engineering Process

The first stage of the token engineering process is the Discovery Phase. It focuses on constructing high-level business plans, defining objectives, and identifying problems to be solved. That phase is also the time when token engineers first define key stakeholders in the project.

Defining the Problem

This may seem counterintuitive. Why would we start with the problem when designing tokenomics? Shouldn’t we start with more down-to-earth matters like token supply? The answer is no. Tokens are a medium for creating and exchanging value within a project’s ecosystem. Since crypto projects draw their value from solving problems that can’t be solved through TradFi mechanisms, their tokenomics should reflect that.

The industry standard, developed by McKinsey & Co. and adapted to token engineering purposes by Outlier Ventures, is structuring the problem through a logic tree that follows MECE.
MECE stands for Mutually Exclusive, Collectively Exhaustive: mutually exclusive means that problems in the tree should not overlap, and collectively exhaustive means that the tree should cover all relevant issues.

In practice, the “Problem” node should be replaced by a full problem statement worksheet; the same holds for some of the other boxes in the tree.
A commonly used tool for designing these kinds of diagrams is the Miro whiteboard.
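
As a rough sketch, such a logic tree can be represented as nested mappings and walked to list the leaf-level sub-problems. The branch labels below are hypothetical examples for an NFT marketplace, not content taken from any worksheet.

```python
# Hypothetical MECE-style problem tree. Labels are made-up examples; a real
# tree comes out of the problem statement worksheet mentioned above.
problem_tree = {
    "Creators capture too little value from secondary NFT sales": {
        "Royalties cannot be enforced off-platform": {},
        "Royalties are enforceable but routinely bypassed": {
            "Marketplaces compete by waiving royalty fees": {},
            "Traders route sales through wrapper contracts": {},
        },
    },
}

def leaves(tree: dict, path: tuple = ()):
    """Yield every leaf-level sub-problem as a 'root > ... > leaf' string."""
    for label, children in tree.items():
        if children:
            yield from leaves(children, path + (label,))
        else:
            yield " > ".join(path + (label,))

for leaf in leaves(problem_tree):
    print(leaf)
```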

Identifying Stakeholders and Value Flows in Token Engineering

This part is about identifying all relevant actors in the ecosystem and how value flows between them. To illustrate what we mean, let's consider the example of an NFT marketplace. In its case, the relevant actors might be sellers, buyers, NFT creators, and the marketplace owner. A possible value flow for a single transaction: the buyer spends tokens, the seller receives most of them, the marketplace owner takes some of them as fees, and the NFT creator receives some of them as royalties.
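
A minimal sketch of that value flow is shown below, assuming a hypothetical 2.5% marketplace fee and 5% creator royalty; both figures are placeholders, not recommendations.

```python
# Value-flow sketch for the NFT marketplace example. The 2.5% fee and 5%
# royalty are placeholder assumptions.

def settle_sale(price: float, fee_rate: float = 0.025, royalty_rate: float = 0.05) -> dict:
    fee = price * fee_rate            # flows to the marketplace owner
    royalty = price * royalty_rate    # flows to the NFT creator
    seller_proceeds = price - fee - royalty
    return {"seller": seller_proceeds, "marketplace": fee, "creator": royalty}

print(settle_sale(100.0))  # {'seller': 92.5, 'marketplace': 2.5, 'creator': 5.0}
```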

Incentive Mechanisms Canvas

The last part of what we consider to be the Discovery Phase is filling in the Incentive Mechanisms Canvas. After successfully identifying value flows in the previous stage, token engineers search for frictions to desired behaviors and point out undesired behaviors. For example, a friction to selling activity on an NFT marketplace might be the royalty fees enforced by the marketplace owner, since they reduce the value flowing to the seller.

source: https://www.canva.com/design/DAFDTNKsIJs/8Ky9EoJJI7p98qKLIu2XNw/view#7

Design Phase of Token Engineering Process

The second stage of the Token Engineering Process is the Design Phase in which you make use of high-level descriptions from the previous step to come up with a specific design of the project. This will include everything that can be usually found in crypto whitepapers (e.g. governance mechanisms, incentive mechanisms, token supply, etc). After finishing the design, token engineers should represent the whole value flow and transactional logic on detailed visual diagrams. These diagrams will be a basis for creating mathematical models in the Deployment Phase. 

Artonomous design diagram, source: Artonomous GitHub

Objective Function

Every crypto project has some objective. The objective can consist of many goals, such as decentralization or token price. The objective function is a mathematical function that assigns weights to the different factors influencing the main objective, in proportion to their importance. This function will be a reference for machine learning algorithms in the next steps: they will try to find quantitative parameters (e.g. network fees) that maximize its output.
Modified Metcalfe’s Law can serve as an inspiration during that step. It’s a framework for valuing crypto projects, but we believe that, after adjustments, it can also be used in this context.
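
A hedged sketch of such a function follows. The chosen factors (Nakamoto coefficient, active users, price volatility), the weights, and the Metcalfe-inspired term are illustrative assumptions; a real project would pick factors that match its own objective.

```python
# Weighted objective-function sketch. Factors, weights, and scaling are
# illustrative assumptions, not a prescribed formula.
import math

WEIGHTS = {"decentralization": 0.4, "network_value": 0.4, "price_stability": 0.2}

def objective(nakamoto_coefficient: float, active_users: float, price_volatility: float) -> float:
    # Each factor is mapped to a roughly comparable scale before weighting.
    decentralization = math.log1p(nakamoto_coefficient)
    network_value = math.log1p(active_users ** 2)      # Metcalfe-inspired: value ~ n^2
    price_stability = 1.0 / (1.0 + price_volatility)
    return (WEIGHTS["decentralization"] * decentralization
            + WEIGHTS["network_value"] * network_value
            + WEIGHTS["price_stability"] * price_stability)

# An optimizer would search protocol parameters (e.g. network fees) for the
# configuration that maximizes this score.
print(objective(nakamoto_coefficient=12, active_users=50_000, price_volatility=0.3))
```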

Deployment Phase of Token Engineering Process

The Deployment Phase is the final, but also the most demanding, step in the process. It involves the implementation of machine learning algorithms that test our assumptions and optimize quantitative parameters. Token engineering draws on Nassim Taleb's concept of antifragility and makes extensive use of feedback loops to build a system that gains from the shocks that arise.

Agent-based Modelling 

In agent-based modeling, we describe a set of behaviors and goals for each agent participating in the system (this is why the previous steps focused so much on describing stakeholders). Each agent is controlled by an autonomous AI and continuously optimizes its strategy. It learns from experience and can mimic the behavior of other agents if it finds them effective (reinforcement learning). This approach allows for mimicking real users, who adapt their strategies over time. An example of an adaptive agent is a cryptocurrency trader who changes their trading strategy in response to losing money.
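
The sketch below illustrates only the adaptive idea with a toy, rule-based trader that switches strategy after losing money; a real setup would use reinforcement-learning agents and a much richer market environment.

```python
# Toy agent-based sketch: rule-based traders that adapt after losses. This is a
# stand-in for the RL-driven agents described above, not an implementation of them.
import random

class TraderAgent:
    def __init__(self):
        self.strategy = random.choice(["hold", "momentum"])
        self.pnl = 0.0

    def step(self, price_change: float) -> None:
        # Toy rule: momentum trading amplifies both gains and losses.
        self.pnl += 2.0 * price_change if self.strategy == "momentum" else price_change
        if self.pnl < -5.0:            # adapt after sustained losses
            self.strategy = "hold"
            self.pnl = 0.0

random.seed(42)
agents = [TraderAgent() for _ in range(100)]
for _ in range(200):                   # 200 simulated time steps
    shock = random.gauss(0.0, 1.0)     # exogenous price change shared by all agents
    for agent in agents:
        agent.step(shock)

print(sum(a.strategy == "hold" for a in agents), "of 100 agents ended up holding")
```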

Monte Carlo Simulations

Token engineers use the Monte Carlo method to simulate the consequences of various possible interactions while taking into account the probability of their occurrence. By running a large number of simulations, it's possible to stress-test the project in multiple scenarios and identify emergent risks.
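
As an illustration, the sketch below Monte-Carlo-estimates a single hypothetical risk metric - whether a project treasury survives a year of random monthly shocks; the distribution and all parameters are assumptions, not calibrated values.

```python
# Monte Carlo sketch: estimate how often a treasury survives 12 months of
# random market shocks. All parameters are illustrative assumptions.
import random

def treasury_survives(start_balance: float = 1_000_000.0,
                      monthly_burn: float = 80_000.0,
                      months: int = 12) -> bool:
    balance = start_balance
    for _ in range(months):
        market_shock = random.gauss(1.0, 0.15)   # monthly treasury-value multiplier
        balance = balance * market_shock - monthly_burn
        if balance <= 0:
            return False
    return True

random.seed(0)
runs = 10_000
survival_rate = sum(treasury_survives() for _ in range(runs)) / runs
print(f"Estimated 12-month survival probability: {survival_rate:.1%}")
```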

Testnet Deployment

If possible, it's highly beneficial for projects to extend the testing phase even further by letting real users use the network. The idea is the same as in agent-based testing: continuous optimization based on the collected metrics. Furthermore, if the project is considering airdropping its tokens, giving them to early users is a great strategy. Even though part of the activity will be disingenuous and airdrop-oriented, such a strategy still works better than most.

Time Duration

The token engineering process may take from as little as 2 weeks to as much as 5 months. It depends on the project category (a Layer 1 protocol will require more time than a simple dApp) and on security requirements. For example, a bank issuing its own digital token will have a very low risk tolerance.

Required Skills for Token Engineering

Token engineering is a multidisciplinary field and requires a great amount of specialized knowledge. Key knowledge areas are:

  • Systems Engineering
  • Machine Learning
  • Market Research
  • Capital Markets
  • Current trends in Web3
  • Blockchain Engineering
  • Statistics

Summary

The token engineering process consists of three steps: the Discovery Phase, the Design Phase, and the Deployment Phase. It's utilized mostly by established blockchain projects and financial institutions like the International Monetary Fund. Even though it's a very resource-consuming process, we believe it's worth it. Projects that go through scrupulous design and testing before launch are much more likely to receive VC funding and to be among the 10% of crypto projects that survive the bear market. Going through the process also has a symbolic meaning - it shows that the project is long-term oriented.

If you're looking to create a robust tokenomics model and go through institutional-grade testing please reach out to contact@nextrope.com. Our team is ready to help you with the token engineering process and ensure your project’s resilience in the long term.

FAQ

What does the token engineering process look like?

  • The token engineering process is conducted in a methodical, three-step fashion: the Discovery Phase, the Design Phase, and the Deployment Phase. Each of these stages should be tailored to the specific needs of a project.

Is token engineering meant only for big projects?

  • We recommend that even small projects go through a simplified design and optimization process. This increases the community's trust and makes sure that the tokenomics doesn't have any obvious flaws.

How long does the token engineering process take?

  • It depends on the project and may range from 2 weeks to 5 months.