Rollups – The Ultimate Ethereum Scaling Solution
https://finematics.com/rollups-explained/ (Mon, 02 Aug 2021)

So what are rollups all about? What’s the difference between optimistic and ZK rollups? How is Arbitrum different from Optimism? And why are rollups considered to be the holy grail when it comes to scaling Ethereum? You’ll find answers to these questions in this article.

Intro

Ethereum scaling has been one of the most discussed topics in crypto. The scaling debate usually heats up during periods of high network activity such as the CryptoKitties craze in 2017, DeFi Summer of 2020 or the crypto bull market at the beginning of 2021. 

During these periods, the unparalleled demand for the Ethereum network resulted in extremely high gas fees making it expensive for everyday users to pay for their transactions. 

To tackle this problem, the search for the ultimate scaling solution has been one of the top priorities for multiple teams and the Ethereum community as a whole. 

In general, there are three main ways to scale Ethereum, or in fact most other blockchains: scaling the blockchain itself (layer 1 scaling), building on top of layer 1 (layer 2 scaling), and building alongside layer 1 (sidechains). 

When it comes to layer 1, Eth2 is the chosen solution for scaling the Ethereum blockchain. Eth2 refers to a set of interconnected changes such as the migration to Proof-of-Stake (PoS), merging the state of the Proof-of-Work (PoW) blockchain into the new PoS chain and sharding. 

Sharding, in particular, can dramatically increase the throughput of the Ethereum network, especially when combined with rollups. 

If you’d like to learn more about Eth2 you can check out this article here.   

When it comes to scaling outside of layer 1, multiple different scaling solutions have been tried with some mixed results. 

On the one hand, we have layer 2 solutions such as Channels that are fully secured by Ethereum but work well only for a specific set of applications. 

Sidechains, on the other hand, are usually EVM-compatible and can scale general-purpose applications. The main drawback is that they are less secure than layer 2 solutions: they don't rely on the security of Ethereum and instead have their own consensus models. 

Most rollups aim at achieving the best of these 2 worlds by creating a general-purpose scaling solution while still fully relying on the security of Ethereum.

This is the holy grail of scaling as it allows for deploying all of the existing smart contracts present on Ethereum to a rollup with little or no changes while not sacrificing security. 

No wonder rollups are probably the most anticipated scaling solution of them all. 

But what are rollups in the first place? 

Rollups 

A rollup is a type of scaling solution that works by executing transactions outside of Layer 1 but posting transaction data on Layer 1. This allows the rollup to scale the network and still derive its security from the Ethereum consensus. 

Moving computation off-chain allows for processing many more transactions in total, as only a fraction of the data of each rollup transaction has to fit into the Ethereum blocks. 

To achieve this, rollup transactions are executed on a separate chain that can even run a rollup-specific version of the EVM. 

The next step after executing transactions on a rollup is to batch them together and post them to the main Ethereum chain. 

The whole process essentially executes transactions, takes the data, compresses it and rolls it up to the main chain in a single batch, hence the name – a rollup. 
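This execute–compress–post flow can be sketched in a toy model (real rollups use purpose-built binary encodings rather than compressed JSON, and the function names here are illustrative):

```python
import json
import zlib

def build_batch(transactions):
    """Serialize a list of already-executed rollup transactions and
    compress them into a single blob destined for layer 1."""
    raw = json.dumps(transactions, separators=(",", ":")).encode()
    return zlib.compress(raw)

def post_to_l1(batch, l1_blocks):
    """Only the compressed blob - not every full transaction -
    has to fit into an Ethereum block."""
    l1_blocks.append(batch)

# Roll 50 simple transfers up into one compressed batch.
txs = [{"from": i, "to": i + 1, "value": 100} for i in range(50)]
l1_blocks = []
batch = build_batch(txs)
post_to_l1(batch, l1_blocks)
print(len(json.dumps(txs).encode()), "bytes of raw data ->", len(batch), "bytes posted")
```

The savings come from the repetitive structure of transaction data compressing well, and from the rollup omitting fields that layer 1 doesn't need to store.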

Although this looks like a potentially good solution, there is a natural question that comes next: 

“How does Ethereum know that the posted data is valid and wasn’t submitted by a bad actor trying to benefit themselves?” 

The exact answer depends on a specific rollup implementation, but in general, each rollup deploys a set of smart contracts on Layer 1 that are responsible for processing deposits and withdrawals and verifying proofs. 

Proofs are also where the main distinction between different types of rollups comes into play. 

Optimistic rollups use fraud proofs. In contrast, ZK rollups use validity proofs.

Let’s explore these two types of rollups further. 

Optimistic Vs ZK Rollups

Optimistic rollups post data to layer 1 and assume it’s correct hence the name “optimistic”. If the posted data is valid we are on the happy path and nothing else has to be done. The optimistic rollup benefits from not having to do any additional work in the optimistic scenario.

In case of an invalid transaction, the system has to be able to identify it, recover the correct state and penalize the party that submitted it. To achieve this, optimistic rollups implement a dispute resolution system that is able to verify fraud proofs, detect fraudulent transactions and disincentivize bad actors from submitting other invalid transactions or incorrect fraud proofs. 

In most of the optimistic rollup implementations, the party that is able to submit batches of transactions to layer 1 has to provide a bond, usually in the form of ETH. Any other network participant can submit a fraud proof if they spot an incorrect transaction. 

After a fraud proof is submitted, the system enters the dispute resolution mode. In this mode, the suspicious transaction is executed again this time on the main Ethereum chain. If the execution proves that the transaction was indeed fraudulent, the party that submitted this transaction is punished, usually by having their bonded ETH slashed. 

To prevent the bad actors from spamming the network with incorrect fraud proofs, the parties wishing to submit fraud proofs usually also have to provide a bond that can be subject to slashing.
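The bond-and-slash logic can be sketched in a simplified model (here a disputed claim is reduced to a single state value, whereas real dispute contracts re-execute whole transactions on layer 1):

```python
def resolve_dispute(claimed_state, replayed_state, sequencer_bond, challenger_bond):
    """Replay the disputed transaction on layer 1 and slash whichever
    party turns out to have been wrong."""
    if claimed_state != replayed_state:
        # Fraud confirmed: the batch submitter loses their bonded ETH.
        return {"fraud": True, "slashed": "sequencer", "amount": sequencer_bond}
    # The fraud proof itself was incorrect: the challenger loses their bond.
    return {"fraud": False, "slashed": "challenger", "amount": challenger_bond}

# The sequencer claimed a balance of 150, but replaying on layer 1 gives 100.
outcome = resolve_dispute(claimed_state=150, replayed_state=100,
                          sequencer_bond=10, challenger_bond=1)
```

Because both sides stand to lose their bond when wrong, honest behaviour is the only profitable strategy in the long run.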

In order to be able to execute a rollup transaction on layer 1, optimistic rollups have to implement a system that is able to replay a transaction with the exact state that was present when the transaction was originally executed on the rollup. This is one of the complicated parts of optimistic rollups and is usually achieved by creating a separate manager contract that replaces certain function calls with a state from the rollup. 

It’s worth noting that the system can work as expected and detect fraud even if there is only 1 honest party that monitors the state of the rollup and submits fraud proofs if needed. 

It’s also worth mentioning that because of the correct incentives within the rollup system, entering the dispute resolution process should be an exceptional situation and not something that happens all the time. 

When it comes to ZK rollups, there is no dispute resolution at all. This is possible by leveraging a clever piece of cryptography called Zero-Knowledge proofs hence the name ZK rollups. In this model, every batch posted to layer 1 includes a cryptographic proof called a ZK-SNARK. The proof can be quickly verified by the layer 1 contract when the transaction batch is submitted and invalid batches can be rejected straight away. 

Sounds simple right? Maybe on the surface. In practice to make it work, multiple researchers spent countless hours iterating on these clever pieces of cryptography and maths. 

There are a few other differences between optimistic and ZK rollups, so let’s go through them one by one. 

Due to the nature of the dispute resolution process, optimistic rollups have to give enough time to all the network participants to submit the fraud proofs before finalizing a transaction on layer 1. This period is usually quite long to make sure that even in the worst-case scenario, fraudulent transactions can still be disputed. 

This makes withdrawals from optimistic rollups quite slow, as users may have to wait as much as a week or two before they can withdraw their funds back to layer 1. 

Fortunately, there are a few projects working to improve this situation by providing fast “liquidity exits”. These projects offer almost instant withdrawals back to layer 1, another layer 2 or even a sidechain, and charge a small fee for the convenience. The Hop protocol and Connext are the projects to look at. 

ZK rollups don’t have the problem of long withdrawals as the funds are available for withdrawals as soon as the rollup batch, together with a validity proof, is submitted to layer 1. 

So far it looks like ZK rollups are just a better version of optimistic rollups, but they also come with a few drawbacks. 

Due to the complexity of the technology, it's much harder to create an EVM-compatible ZK rollup, which makes it more difficult to scale general-purpose applications without rewriting the application logic. That said, ZKSync is making significant progress in this area and might be able to launch an EVM-compatible ZK rollup quite soon. 

Optimistic rollups have an easier time with EVM compatibility. They still have to run their own version of the EVM with a few modifications, but 99% of contracts can be ported without making any changes. 

ZK rollups are also way more computation-heavy than optimistic rollups. This means that nodes that compute ZK proofs have to be high-spec machines, making it hard for other users to run them. 

When it comes to scaling improvements, both types of rollups should be able to scale Ethereum from around 15 to 45 transactions per second (depending on the transaction type) up to as many as 1000-4000 transactions per second.

It’s worth noting that it is possible to process even more transactions per second by offering more space for the rollup batches on layer 1. This is also why Eth2 can create a massive synergy with rollups as it increases the possible data availability space by creating multiple shards – each one of them able to store a significant amount of data. The combination of Eth2 and rollups could bring Ethereum transaction speed up to as many as 100k transactions per second. 

Now, let’s talk about all the different projects working on both optimistic and ZK rollups.

Optimistic Rollups 

Optimism and Arbitrum are currently the most popular options when it comes to optimistic rollups. 

Optimism has been partially rolled out to the Ethereum mainnet with a limited set of partners such as Synthetix or Uniswap to ensure that the technology works as expected before the full launch. 

Arbitrum already deployed its version to the mainnet and started onboarding different projects into its ecosystem. Instead of allowing only limited partners to be able to deploy their protocols first, they decided to give a window of time for all protocols that want to launch on their rollups. When this period of time is finished, they will open the flood gates to all the users in one go. 

Some of the most notable projects launching on Arbitrum are Uniswap, Sushi, Bancor, Augur, Chainlink, Aave and many more. 

Arbitrum has also recently announced its partnership with Reddit. They’ll be focusing on launching a separate rollup chain that will allow Reddit to scale their reward system. 

Optimism is partnering with MakerDAO to create the Optimism Dai Bridge and enable fast withdrawals of DAI and other tokens back to layer 1. 

Although both Arbitrum and Optimism aim at the same goal – building an EVM-compatible optimistic rollup solution – there are a few differences in their design.

Arbitrum has a different dispute resolution model. Instead of rerunning a whole transaction on layer 1 to verify a fraud proof, it uses an interactive multi-round model that narrows down the scope of the dispute so that potentially only a few instructions have to be executed on layer 1 to check whether a suspicious transaction is valid. 

This also resulted in a nice side effect where smart contracts deployed on Arbitrum can be larger than the maximum allowed contract size on Ethereum. 

Another major difference is the approach to handling transaction ordering and MEV.

Arbitrum will be initially running a sequencer that is responsible for ordering transactions, but they want to decentralize it in the long run.

Optimism prefers another approach where the ordering of transactions, and hence the MEV, can be auctioned off to other parties for a certain period of time.  

It’s also worth mentioning a few other projects working on optimistic rollups. Fuel, the OMG team with OMGX and Cartesi to name a few. Most of them try to also work on an EVM-compatible version of their rollups. 

ZK Rollups

Although it looks like the Ethereum community is mostly focusing on optimistic rollups, at least in the short run, let’s not forget that the projects working on ZK rollups are also progressing extremely quickly. 

With ZK rollups we have a few options available.

Loopring uses ZK rollup technology to scale its exchange and payment protocol. 

Hermez and ZKTube are working on scaling payments using ZK rollups, with Hermez also building an EVM-compatible ZK rollup. 

Aztec is focusing on bringing privacy features to their ZK rollup technology. 

StarkWare-based rollups are already extensively used by projects such as DeversiFi, Immutable X and dYdX. 

As we mentioned earlier, ZKSync is working on an EVM-compatible virtual machine that will be able to fully support arbitrary smart contracts written in Solidity. 

Summary

As we can see, there are a lot of things going on in both the optimistic and the ZK rollup camps and the competition between different rollups will be interesting to watch. 

Rollups should also have a big impact on DeFi. Users who were previously priced out of Ethereum by high transaction fees will be able to stay in the ecosystem the next time network activity is high. Rollups will also enable a new breed of applications that require cheaper transactions and faster confirmation times, all while being fully secured by the Ethereum consensus. It looks like rollups may trigger another high-growth period for DeFi. 

There are however a few challenges when it comes to rollups. 

Composability is one of them. In order to compose a transaction that uses multiple protocols, all of them would have to be deployed on the same rollup. 

Another challenge is fractured liquidity. For example, without the new money coming into the Ethereum ecosystem as a whole, the existing liquidity present on layer 1 in protocols such as Uniswap or Aave will be shared between layer 1 and multiple rollup implementations. Lower liquidity usually means higher slippage and worse trade execution. 

This also means that naturally there will be winners and losers. At the moment, the existing Ethereum ecosystem is not big enough to make use of all scaling solutions. This may and probably will change in the long run, but in the short run, we may see some of the rollups, and other scaling solutions, becoming ghost towns. 

In the future, we may also see users living entirely within one rollup ecosystem and not interacting with the main Ethereum chain and other scaling solutions for long periods of time. This could be particularly visible if we’re going to see more centralized exchanges enabling direct deposits and withdrawals to and from rollups. 

Nevertheless, rollups seem like the ultimate strategy for scaling Ethereum, and these challenges will most likely be mitigated in one way or another. It will be super interesting to watch rollups gain more and more user adoption. 

One question that comes up very often when discussing rollups is if they are a threat to sidechains. Personally, I think that sidechains will still have their place in the Ethereum ecosystem. This is because, although the cost of transactions on Layer 2 will be much lower than on Layer 1, it will most likely still be high enough to price out certain types of applications such as games and other high volume apps. 

This may change when Ethereum introduces sharding, but by then sidechains may create enough network effect to survive long term. It will be interesting to see how this plays out in the future. 

Also, the fees on rollups are higher than on sidechains because each rollup batch still has to pay for the Ethereum block space. 

It’s worth remembering that the Ethereum community puts a huge focus on rollups in the Ethereum scaling strategy – at least in the short to mid-term and potentially even longer. I recommend reading Vitalik Buterin’s post on a rollup-centric Ethereum roadmap.

So what do you think about rollups? What are your favourite rollup technologies? 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

Bank Run in DeFi – Iron Finance Fiasco Explained
https://finematics.com/bank-run-in-defi-iron-finance-explained/ (Thu, 24 Jun 2021)

So what was the first large scale bank run in DeFi all about? Why is it so hard to create a working algorithmic stablecoin? And what can we learn from the IronFinance fiasco? You’ll find answers to these questions in this article. 

Algorithmic Stablecoins 

IronFinance initially launched on Binance Smart Chain in March 2021 and aimed at creating an ecosystem for a partially collateralized algorithmic stablecoin. 

As we know, building algorithmic stablecoins is hard. Most projects either completely fail or end up in a no man’s land by struggling to maintain their peg to the US Dollar. Because of this, building an algorithmic stablecoin has become one of the holy grails in DeFi. 

Achieving it would clearly revolutionize the DeFi space as we know it today. 

The current ecosystem relies heavily on stablecoins that come with major trade-offs. They maintain their peg to the US Dollar at the cost of either centralization or capital inefficiency. 

For example, the custody of USDC or USDT is fully centralized. On the flip side, stablecoins like DAI or RAI require a lot of collateral which makes them capital inefficient.  

IronFinance tried to address these problems by creating a partially collateralized stablecoin – IRON. 

IronFinance 

Despite a few hiccups along the road, such as short periods of time when IRON unpegged from USD or when ValueDeFi exploits affected some of the IronFinance users, the protocol kept marching forward. 

In retrospect, recovering from these issues most likely built a false level of confidence in the protocol design as its users thought they were dealing with a “battle-tested” project.

In May 2021 IronFinance expanded to Polygon and started gaining more and more traction. 

Total value locked in the protocol quickly went from millions to billions of dollars, surpassing $2 billion before the final collapse. The value of TITAN – the protocol's native token on Polygon – went from $10 to $64 in just the last week leading up to the bank run. 

This parabolic growth was mostly driven by extremely high yield farming rewards and subsequent high demand for both the TITAN and the IRON tokens. Yield farmers were able to benefit from around 500% APR on stablecoin pairs: IRON/USDC and around 1700% APR on more volatile pairs like TITAN/MATIC.  

To add even more fuel to this parabolic growth, IronFinance was mentioned by a famous investor – Mark Cuban – in his blog post. This further legitimised the project and brought even more attention to it. 

On the 16th of June 2021, the protocol experienced a massive bank run that crashed the TITAN price to 0 and resulted in thousands of people experiencing major financial losses.

Before we start unfolding all of the events that led to the collapse of IronFinance, let’s try to understand how the protocol was built.

It’s worth noting that reviewing the design of projects, including the ones that failed, is important as it allows us to better understand what works and what doesn’t work in DeFi. It also makes it easier to assess new protocols that very often reuse a lot of elements of the already existing systems. 

Protocol Design 

The IronFinance protocol was designed around 3 types of tokens: 

  • its own partially collateralized stablecoin – IRON – that should maintain a soft peg to the US Dollar,
  • its own native token – TITAN on Polygon and STEEL on BSC, 
  • an established stablecoin used as collateral – USDC on Polygon and BUSD on BSC.   

The combination of USDC and TITAN on Polygon (or BUSD and STEEL on BSC) was supposed to allow the protocol to decrease the amount of stablecoin collateral over time, making IRON partially collateralized and leading to greater capital efficiency. 

Although the protocol used different tokens on Polygon and BSC, it worked in an analogous way on both platforms, so to simplify the rest of this article I'm going to skip the BSC tokens – BUSD and STEEL – in the explanation. 

In order to achieve price stability of the IRON token, the protocol introduced a mechanism for minting and redeeming IRON that relied on market incentives and arbitrageurs. 

Whenever the price of the IRON token was less than $1, anyone could purchase it on the open market and redeem it for approximately $1 worth of value paid in a mix of USDC and TITAN. 

Whenever the price of the IRON token was greater than $1, anyone could mint new IRON tokens for approximately $1 worth of USDC and TITAN and sell the freshly minted IRON tokens on the open market, driving the price of IRON back to $1. 

To understand the process of minting and redeeming better, we have to introduce the concept of Target Collateral Ratio (TCR) and Effective Collateral Ratio (ECR). 

Target Collateral Ratio is used by the minting function to determine the ratio between USDC and TITAN required to mint IRON. 

As an example, let’s say the TCR is at 75%. In this case, 75% of collateral needed to mint IRON would come from USDC and 25% would come from TITAN. 

The protocol started at 100% TCR and gradually lowered the TCR over time. 

TCR can increase or decrease depending on the IRON price. On one hand, if the time-weighted average price of IRON is greater than $1, TCR is lowered. On the other hand, if the time-weighted average price of IRON is less than $1, the TCR is increased. 

Effective Collateral Ratio is used by the redeeming mechanism to determine the ratio between USDC and TITAN when redeeming IRON. ECR is calculated as current USDC collateral divided by the total IRON supply. 

If TCR is lower than ECR, the protocol has excess collateral. On the flip side, if TCR is higher than ECR it means the protocol is undercollateralized. 

As an example, if the ECR is at 75%, each time IRON is redeemed the user would get 75% of their collateral back in USDC and 25% in TITAN. 

What is important is that every time someone mints IRON the TITAN portion of collateral is burned. If someone redeems IRON, new TITAN tokens are minted.
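The mint and redeem flows described above can be sketched as follows (function names and prices are illustrative; the real protocol priced TITAN via a time-weighted oracle):

```python
def mint_iron(dollar_value, tcr, titan_price):
    """Mint `dollar_value` worth of IRON: a TCR share of the collateral
    comes from USDC, the rest from TITAN (which is burned)."""
    usdc_in = dollar_value * tcr
    titan_burned = dollar_value * (1 - tcr) / titan_price
    return usdc_in, titan_burned

def redeem_iron(dollar_value, ecr, titan_price):
    """Redeem `dollar_value` worth of IRON: an ECR share comes back
    as USDC, the rest as freshly minted TITAN."""
    usdc_out = dollar_value * ecr
    titan_minted = dollar_value * (1 - ecr) / titan_price
    return usdc_out, titan_minted

# At a 75% ratio and a $50 TITAN price, $100 of IRON corresponds to
# $75 of USDC plus 0.5 TITAN ($25 worth).
usdc_in, titan_burned = mint_iron(100, tcr=0.75, titan_price=50)
usdc_out, titan_minted = redeem_iron(100, ecr=0.75, titan_price=50)
```

Note how the TITAN side of both functions divides by the TITAN price: the cheaper TITAN gets, the more of it has to be minted per redeemed IRON, which is exactly the mechanism that later turned against the protocol.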

As we can see, the whole mechanism, although a bit complicated, should work – at least in theory.  

Now, let’s see how the events leading to the collapse of IronFinance unfolded. 

Events Unfolding

Around 10 am UTC on 16th June 2021, the team behind the protocol noticed that a few larger liquidity providers, a.k.a. “whales”, started removing liquidity from IRON/USDC and then selling their TITAN for IRON. Instead of redeeming IRON, they sold it directly for USDC via liquidity pools. This caused the IRON price to unpeg from the US Dollar, which in turn spooked TITAN holders, who started selling their TITAN, causing the token price to drop from around $65 to $30 in approximately 2 hours. The TITAN price later came back to $52 and IRON fully recovered its peg. 

This event, although quite severe, wasn’t that unusual considering that the protocol had a history of native tokens sharply dropping in value and IRON unpegging for a short period of time. 

Later on the same day, a few whales started selling again. This time it was different. The market panicked and users started redeeming IRON and selling their TITAN en masse. Because of the extremely quick and sharp drop in the TITAN price, the time-weighted price oracle used for reporting TITAN prices started reporting stale prices that were still higher than the actual market price of TITAN. 

This created a negative feedback loop as the price oracle was used to determine the number of TITAN tokens that have to be printed while redeeming IRON. 

Because IRON was trading off-peg at under $1, users could buy IRON for, let's say, $0.90, redeem it for $0.75 in USDC plus $0.25 in TITAN, and sell the TITAN immediately. This created a death spiral that drove the TITAN price to pretty much 0: the lower the TITAN price fell, the more TITAN tokens had to be printed to account for the correct amount of redeemed capital. 
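Putting rough numbers on this arbitrage shows the spiral (illustrative prices; the real figures moved continuously):

```python
def redeem_arbitrage(iron_price, ecr, titan_oracle_price, titan_market_price):
    """Profit per 1 IRON from buying off-peg IRON, redeeming it, and
    immediately selling the freshly printed TITAN at the market price."""
    titan_minted = (1 - ecr) / titan_oracle_price  # the oracle sets how much TITAN is printed
    proceeds = ecr + titan_minted * titan_market_price
    return proceeds - iron_price, titan_minted

# IRON trades at $0.90 while the stale oracle still reports TITAN at $40,
# even though it actually trades at $30.
profit, minted = redeem_arbitrage(0.90, ecr=0.75, titan_oracle_price=40, titan_market_price=30)

# As the TITAN price collapses tenfold, each redemption prints ten times
# more TITAN, adding ever more sell pressure - the death spiral.
_, minted_later = redeem_arbitrage(0.90, ecr=0.75, titan_oracle_price=4, titan_market_price=3)
```

The trade stays profitable all the way down, so arbitrageurs keep redeeming, each redemption mints more TITAN than the last, and the extra supply pushes the price lower still.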

The TITAN price hitting almost 0 exposed another flaw in the protocol – users being unable to redeem their IRON tokens. This was later fixed by the team and users were able to recover around $0.75 worth of USDC collateral from their IRON tokens. 

Unfortunately, TITAN holders didn’t get away with “only” a 25% haircut and instead took heavy losses. This also included TITAN liquidity providers.

This is because when one token in a 50/50 liquidity pool goes to 0 the impermanent loss can reach pretty much 100%. Liquidity providers end up losing both tokens in the pool as the non-TITAN token is sold out for TITAN that keeps going lower and lower in value. 

This situation exposed a major flaw in the IronFinance mechanism that resulted in what we can call the first large scale bank run in DeFi.

Similarly to banks with fractional-reserve systems, where there are not enough funds to cover all depositors at any one time, the IronFinance protocol didn’t have enough collateral to cover all minted IRON. At least not when the TITAN token used as 25% of the overall collateral became worthless in a matter of minutes.   

The IronFinance fiasco also shows us why DeFi protocols shouldn’t fully rely on human coordination, especially when under certain circumstances incentives work against the protocol. In theory, if people just stopped selling TITAN for a short period of time, the system would recover as it had previously done in the past. In practice, most market participants are driven by making a profit and the arbitrage opportunity present in the protocol caused them to fully take advantage of this situation. This is also why all DeFi protocols should always account for the worst-case scenario. 

Lessons Learned 

As with most major protocol failures in DeFi, there are always some lessons to be learned.

In the case of IronFinance, there are a few important takeaways. 

First of all, we always have to consider what would happen to the protocol in the worst-case scenario. This usually involves one of the tokens used in the protocol sharply losing its value. 

What happens when the protocol stops expanding and starts contracting? What if the contraction is way quicker than expansion? 

Another important element of protocol design that always has to be fully understood is the usage of price oracles. Could they report stale prices or be manipulated by flash loan attacks? If so, which core protocol mechanisms rely on these oracles, and how would they behave if an oracle were compromised? 
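How a time-weighted oracle ends up stale during a crash can be illustrated with a toy moving average (real TWAP oracles weight observations by elapsed time, but the lagging behaviour is the same):

```python
def twap(observations):
    """Naive time-weighted average price over equally spaced observations."""
    return sum(observations) / len(observations)

# TITAN price observations during normal trading, then a sharp drop.
window = [65, 64, 63, 62, 60]
spot = 30                      # current market price after the crash
window = window[1:] + [spot]   # slide the observation window forward
oracle_price = twap(window)
print(f"oracle reports ${oracle_price:.2f} while the market trades at ${spot}")
```

The averaging that protects the oracle from flash loan manipulation is exactly what makes it lag during a genuine crash, and any mechanism wired to the oracle inherits that lag.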

The next lesson: providing liquidity in a pool where at least one asset can drop to 0 means that we can lose pretty much all of our capital, even if the second token doesn't lose any value. 

Another lesson: following celebrities and their investments can be risky. With great power comes great responsibility and, unfortunately, even a single mention of a certain protocol or token can cause people to invest in something they don't fully understand – don't be that person and always do your own due diligence. 

One good indicator of high-risk protocols is extremely high APR in yield farming. If something looks too good to be true there are usually some risks that have to be accounted for. 

Last but not least, building algorithmic stablecoins is hard. I hope one day we can see a fully functioning algorithmic stablecoin competing in size with USDT or USDC, but this will most likely take a bit of time and hundreds of failed attempts. If you want to become an early adopter of such a coin it’s great, but keep in mind that the numbers are not on your side. 

What’s Next

So what’s next when it comes to IronFinance and algorithmic stablecoins? 

At the moment, the team behind the protocol is planning on conducting an in-depth analysis of the situation, in order to understand the circumstances which led to such an outcome. 

It’s hard to say if the team behind IronFinance will decide to fix the shortcomings of the existing protocol and relaunch it. 

Historically, second versions of failed protocols usually don’t get nearly as much traction as their original versions. Yam Finance was a good example of such a protocol. 

After the collapse of IronFinance, there is still a lot of capital sitting on the sideline looking for other high-risk opportunities. It will be interesting to see where this capital goes next. 

So what do you think about the IronFinance fiasco? Are you optimistic about the future of algorithmic stablecoins? 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

Finematics is also participating in Round 10 of Gitcoin Grants. If you’d like to support us, check out our grant here.

Sushi – Most Underrated Protocol in DeFi? (BentoBox, Kashi, Miso Explained)
https://finematics.com/sushi-explained/ (Fri, 04 Jun 2021)

Intro

So why is Sushi believed to be one of the most underrated protocols in DeFi? What are some of its new features such as BentoBox, Kashi and Miso all about? And what is Sushi’s approach to launching on different blockchains and scaling solutions? You’ll find answers to these questions in this article. 

Let’s start with a bit of background. 

Sushi launched in August 2020 during DeFi Summer – the first period of major growth of DeFi. The project quickly gained a lot of traction mostly due to the nature of its launch. 

Sushi – back then known as SushiSwap – aimed at directly competing with Uniswap by forking it and encouraging liquidity providers to move their liquidity to a new platform in a process called a vampire attack. 

The full story behind Sushi, although super interesting, is out of the scope of this article. Fortunately enough, I’ve already written an article about it some time ago and you can read it here if you’re interested. 

Sushi 

Almost a year later and the rocky launch of Sushi seems like a distant past and the team behind the protocol has been working hard on delivering new, interesting features and building the Sushi ecosystem. 

Besides the main function of Sushi – a decentralized exchange for swapping assets, the protocol offers a growing range of other products: a liquidity bootstrapping feature for other projects – Onsen; a lending platform – Kashi; a launchpad for new protocols – Miso. More on these later in the article. 

The Sushi team has a very open approach when it comes to deploying the protocol to different chains and scaling solutions. 

Instead of trying to predict which environment will be the most dominant one and will capture the most value, they deploy the protocol to all popular and upcoming environments and let the market decide. 

Besides the Ethereum mainnet, Sushi has already been deployed to Polygon, xDai, BSC, Fantom and Moonbeam with an upcoming launch on Arbitrum – a layer 2 Ethereum scaling solution.

Another interesting move was the acquisition of the sushi.com domain that should give the project even more visibility. 

Now, let’s dive a bit deeper into each of the Sushi features one by one.

AMM 

Automated Market Maker or AMM is the main function of Sushi that allows users to swap their assets in a decentralized and permissionless way. 

Sushi’s AMM is a fork of Uniswap V2, so these two work in exactly the same way. If you need a recap on AMMs and liquidity pools here’s a popular article that I wrote some time ago.  

Currently, Sushi is the second-largest AMM on Ethereum with around 16% of the market share. Uniswap remains an undisputed leader capturing around 54% of the total AMM market. 

Sushi’s daily trading volume, which is one of the most important metrics when it comes to AMMs, has been steadily growing from around $250M at the end of 2020 to over $500M in 2021 with some days hitting well over $1B.

Another metric – total value locked in the protocol – has also been growing from around $1B at the end of 2020 to as high as $5.5B and currently sitting at around $3.5B after the recent market downturn. 

One major difference between Uniswap V2 and Sushi is that the latter has enabled a profit-sharing mechanism that benefits SUSHI token holders. Instead of the full 0.3% trading fee going to the liquidity providers, as in the case of Uniswap, Sushi enabled the fee switch, which lowers the trading fee for the LPs to 0.25% while distributing the remaining 0.05% to SUSHI token holders. 
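To make the fee split concrete, here's a minimal sketch of how a single trade's 0.3% fee is divided. This is a hypothetical helper for illustration, not Sushi's contract code:

```python
def split_sushi_fee(trade_amount: float) -> dict:
    """Illustrative split of Sushi's 0.30% swap fee:
    0.25% of the trade to LPs, 0.05% to xSUSHI stakers."""
    lp_fee = trade_amount * 0.0025
    xsushi_fee = trade_amount * 0.0005
    return {"lp": lp_fee, "xsushi": xsushi_fee, "total": lp_fee + xsushi_fee}

# A $10,000 swap pays $30 in fees: $25 to LPs, $5 to SUSHI stakers.
print(split_sushi_fee(10_000))  # {'lp': 25.0, 'xsushi': 5.0, 'total': 30.0}
```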

And this leads us straight to the SushiBar. 

SushiBar

In order to benefit from profit sharing, SUSHI holders have to stake their SUSHI tokens in the SushiBar smart contract and receive xSUSHI that can be later redeemed for their original SUSHI plus additional SUSHI tokens coming from the swap fees. 

For every swap on every chain going through Sushi, a 0.05% fee is collected and distributed as SUSHI, proportionally to each user's share of the SushiBar. 

The xSUSHI tokens are fully composable and maintain voting rights in the Sushi governance. xSUSHI tokens can also be added to the xSUSHI/ETH liquidity pool where users can benefit from stacked yield coming from xSUSHI itself plus the extra rewards coming from the pool. 

The yield on SushiBar depends on the trading volume going through the Sushi AMM and has recently been at around 10% APR with days as high as 40% APR. 
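The share-based accounting behind SushiBar can be sketched as follows. This is a toy model assuming fees have already been converted to SUSHI, not the actual contract:

```python
class SushiBarSketch:
    """Toy model of SushiBar-style share accounting: staking mints xSUSHI
    pro-rata; fee inflows raise the amount of SUSHI per xSUSHI."""
    def __init__(self):
        self.total_sushi = 0.0   # SUSHI held by the bar
        self.total_shares = 0.0  # xSUSHI supply

    def stake(self, sushi: float) -> float:
        if self.total_shares == 0:
            shares = sushi
        else:
            shares = sushi * self.total_shares / self.total_sushi
        self.total_sushi += sushi
        self.total_shares += shares
        return shares  # xSUSHI minted

    def add_fees(self, sushi: float) -> None:
        self.total_sushi += sushi  # swap fees accrue to all stakers

    def redeem(self, shares: float) -> float:
        sushi = shares * self.total_sushi / self.total_shares
        self.total_sushi -= sushi
        self.total_shares -= shares
        return sushi

bar = SushiBarSketch()
x = bar.stake(100)    # first staker receives 100 xSUSHI
bar.add_fees(10)      # fee revenue flows into the bar
print(bar.redeem(x))  # 110.0 – original stake plus the fee share
```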

Because of this profit-sharing mechanism, the SUSHI token is essentially one of the most productive assets in the DeFi space. In contrast to many other tokens driven mostly by speculation, SUSHI tokens should better represent the actual value of the Sushi protocol. 

Time for another Sushi feature also connected to the SUSHI token – Onsen. 

Yield Farming and Onsen

Onsen is a liquidity incentive system that accelerates new projects by providing extra rewards in the form of SUSHI tokens. 

Projects selected to be on Onsen are given a certain allocation of SUSHI tokens to incentivise liquidity provisioning for their own token. This means that the projects themselves don’t have to distribute their own token through liquidity mining and they can still benefit from incentivised liquidity. This is really useful for new projects that very often struggle to bootstrap liquidity, especially if they don’t want to initially distribute large amounts of their own tokens.

Onsen also benefits the overall Sushi ecosystem as the swap fees from the Onsen-enabled liquidity pools are distributed to the xSUSHI holders. 

Projects featured on Onsen are chosen based on their quality and the demand for their products. Some projects are featured only for a certain amount of time, while others can remain on the Onsen menu indefinitely, assuming the quality of the project and demand for their token remain high. 

On top of Onsen, Sushi offers permanent yield farming opportunities for popular and established tokens. These opportunities are also available on other layers, for example, Sushi has recently started a liquidity mining program on Polygon that offers high yields to liquidity providers. 

Time for yet another Sushi feature – BentoBox. 

BentoBox 

BentoBox is a special smart contract that acts as a vault for certain tokens. This vault is basically a pool of funds that can be used by Bento-enabled applications in the Sushi ecosystem. 

Users who deposit funds into one of the BentoBox vaults benefit from earning extra yield on their tokens. Vaults can generate yield in multiple ways, for example, by allowing other participants to take flash loans and pay a small fee that goes back to the users providing liquidity to the vaults or by lending out assets in the vaults. 

This structure is also very gas-efficient, as different applications operating on the same vault don't have to go through as many steps as they otherwise would.  

At the moment, the first and only available Bento-enabled application is the lending platform – Kashi, but the team is working on bringing more applications to BentoBox in the future.  

And this is a good segue into Kashi. 

Kashi 

Kashi which means “lending” in Japanese is Sushi’s first lending and margin trading solution powered by BentoBox. Kashi allows anyone to create customised and gas-efficient markets for lending and borrowing.

In contrast to other popular DeFi money markets such as Aave or Compound, Kashi isolates each of the markets. This means that users can create markets for more risky assets without having an impact on other markets. 

Having the ability to borrow an asset also opens up the possibility for shorting it. This is useful for speculators who believe that the asset will go down in value but also allows for hedging, which can be extremely handy, for instance, when yield farming risky assets. 

As an example, let’s say a new token is launched. 

Someone can create a money market for the new token on Kashi, which allows anyone to provide collateral in a chosen coin – let's say ETH – and borrow the new token. A short seller can borrow the new token and sell it immediately for ETH. If the price of the new token goes down relative to ETH, the short seller can buy it back at a lower price in the future and repay their loan, which is denominated in the new token. 
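The mechanics above can be expressed as a small sketch, with hypothetical prices and ignoring fees and interest:

```python
def short_pnl(tokens_borrowed: float, entry_price_eth: float,
              exit_price_eth: float) -> float:
    """PnL (in ETH) of shorting a token via a Kashi-style market.
    Prices are hypothetical; fees and borrow interest are ignored."""
    eth_received = tokens_borrowed * entry_price_eth  # sell borrowed tokens
    eth_to_repay = tokens_borrowed * exit_price_eth   # buy back to repay loan
    return eth_received - eth_to_repay

# Borrow 1,000 NEW at 0.01 ETH, buy back at 0.004 ETH: profit of 6 ETH.
print(short_pnl(1_000, 0.01, 0.004))  # 6.0
```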

The main caveat is that in order to create a money market for a new token there has to be a reliable price oracle available. Kashi allows the user to choose a price oracle at the time of creating a new market. At the moment, only price feeds available on Chainlink can be used, limiting the number of possible new markets that can be created. However, the Sushi team is working on adding their own TWAP price oracle that would expand the set of available price feeds. 

Adding a new risky asset to one of the existing shared money markets would threaten the solvency of the whole protocol. If such a coin were used as collateral and experienced a sharp drop in price, a lot of accounts could become undercollateralized, triggering a cascade of liquidations. On the other hand, if such a coin were borrowed and quickly multiplied in price, the borrowed assets would be worth more than the collateral, again leaving accounts undercollateralized. 

Miso

Last but not least, there's Miso – the final Sushi feature we're going to cover in this article.

Miso is a token launchpad platform. It facilitates launching new tokens on Sushi. 

Miso focuses on providing a good experience for both the project creators launching new tokens and for people interested in finding and supporting these projects. 

When it comes to project creators, Miso offers a set of smart contracts that makes the process of creating a new token easier. On top of that, it allows projects to attract a larger initial audience than they might have been capable of reaching on their own. 

When it comes to project supporters, they can benefit from the peace of mind that the token and the infrastructure around it were created using audited and battle-tested contracts. They can also easily discover new projects and participate in reliable token launches. 

Miso is clearly yet another important element of the overall Sushi ecosystem. 

Summary

With a steadily growing trading volume on its decentralized exchange, the profit-sharing mechanism for SUSHI holders, an increasing number of chains and scaling solutions to launch on and new features being added to the ecosystem, Sushi looks like one of the strongest DeFi projects. 

This is why a lot of people in the DeFi community believe that Sushi is underrated, especially when compared to other decentralized exchanges. It’s hard to say exactly why this is the case, but it might come from the fact that Sushi started as a fork of Uniswap and had a bit of a rocky launch. 

Nevertheless, Sushi is clearly one of the top DeFi protocols to keep an eye on and it will be interesting to watch new elements being added to BentoBox and the rest of the Sushi ecosystem, with the team pursuing new chains and scaling solutions. 

One of them is the previously mentioned Arbitrum – an Ethereum Layer 2 optimistic rollup-based scaling solution that looks like the next place where Sushi is about to launch. 

So what do you think about Sushi? 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

How Does Thorchain Work? DeFi Explained – Mon, 17 May 2021 (https://finematics.com/thorchain-explained/)

Intro 

So what is Thorchain all about? How does it work? And how does it make it possible to swap between native assets across different blockchains? You’ll find answers to these questions in this article. 

With billions of dollars in trading volume, decentralised exchanges have been gaining more and more traction. It’s not uncommon to see over $1B in daily trading volume on Uniswap alone.

Although protocols like Uniswap, Sushiswap or Curve are great when it comes to exchanging assets within the Ethereum ecosystem, they don’t support swaps between different blockchains. 

To work around this problem, a common approach is to represent external assets in the form of wrapped or synthetic tokens on Ethereum. The most popular asset on blockchains outside of Ethereum is, of course, Bitcoin. There are multiple ways of representing Bitcoin on Ethereum that allow it to be traded on decentralized exchanges – Wrapped Bitcoin, renBTC and sBTC, to name a few. 

Even though most of these approaches work fine, they usually make certain tradeoffs when it comes to either the custody or the security of the assets. If you want to learn more about it, check out this article here. 

What if there was a way of swapping native assets instead? For example, making a trade between Bitcoin on the Bitcoin blockchain and Ether on the Ethereum blockchain. 

And this is exactly where Thorchain comes into play. 

Thorchain is a decentralized liquidity protocol that allows for swapping native assets between different blockchains such as Bitcoin, Ethereum or Binance Smart Chain. 

When it comes to managing liquidity, Thorchain uses a liquidity pool model known from protocols like Uniswap or Bancor. 

In this model, liquidity providers lock 2 assets in a liquidity pool. This in turn provides liquidity for traders who want to swap between these 2 assets and pay a small fee that goes back to the liquidity providers.
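This liquidity pool model can be illustrated with the constant-product formula used by Uniswap V2-style AMMs. The reserve numbers below are invented for illustration:

```python
def cpmm_swap_out(dx: float, X: float, Y: float, fee: float = 0.003) -> float:
    """Constant-product (x*y = k) swap output, the model behind Uniswap V2.
    dx: amount of token X sent in; X, Y: pool reserves; fee: LP fee share."""
    dx_after_fee = dx * (1 - fee)
    return Y * dx_after_fee / (X + dx_after_fee)

# Swapping 1 ETH into a hypothetical 100 ETH / 200,000 USDC pool:
print(round(cpmm_swap_out(1, 100, 200_000), 2))  # 1974.32 – below spot (2,000) due to slippage and fees
```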

If you want to better understand how liquidity pools work you can check out this article here.

Thorchain is often explained as a cross-chain Uniswap. This is usually a good simplification for understanding the general idea behind Thorchain although there are some big differences between these two protocols that we’re going to explain later. 

Before we dive deeper into the mechanics of Thorchain, let’s have a quick look at how the project came into existence. 

Thorchain History

Thorchain started as a small project at the Binance hackathon in 2018.

The team behind Thorchain continued their research after the hackathon ended but decided to put the project on pause later that year while waiting for a few missing pieces of technology needed to create a fully functioning cross-chain decentralized exchange. 

These were mainly Tendermint & Cosmos SDK and a working implementation of TSS – Threshold Signature Scheme.  

Seeing the viability of the product, the team decided to raise a small seed round and worked on a proof-of-concept of a decentralised exchange built on top of the Thorchain protocol, called Instaswap, which was later demonstrated during the Cosmos hackathon in Berlin. 

After that, in July 2019, they announced their first go-to-market product – BEPSwap. Its main goal was to enable BEP2 asset swaps and it was limited to Binance Chain. 

Also in July 2019, the team decided to raise more funds through an Initial Dex Offering on Binance Dex. The IDO resulted in $1.5M raised that was sufficient to enable further development of the project.  

The team continued their work on the protocol which resulted in the limited mainnet release, called multi-chain chaos network or MCCN, in April 2021.

Interestingly, the Thorchain team decided to remain mostly anonymous, even to this day. 

Now, let’s see how Thorchain works under the hood. 

How Does It Work 

At the core of the Thorchain protocol is a network of nodes built with Tendermint and Cosmos SDK.

This approach allowed Thorchain to create a separate blockchain with its own consensus and network layer without having to build all of its elements from scratch. 

Thorchain leverages Tendermint's BFT model, which allows the network to keep reaching consensus as long as fewer than ⅓ of the nodes are faulty. 

The consensus mechanism is important as Thorchain nodes have to work together, for example, in order to record transactions coming from other blockchains.

To see how this works in practice, let’s go through a quick example. 

Let’s say a user wants to swap their Bitcoin on the Bitcoin network to Ether on the Ethereum network. 

The user sends a standard Bitcoin transaction to the Bitcoin vault – a Bitcoin address controlled by the Thorchain network. 

Thorchain nodes keep monitoring vault addresses in order to acknowledge new transactions. 

To achieve this, each Thorchain node, a.k.a. a THORNode, consists of a few major components, the most important ones being: the service running the Thorchain blockchain itself; a full node for each of the connected blockchains, for example, a Bitcoin or an Ethereum node; and the Bifrost. 

The Bifrost Protocol acts as a connective layer between the Thorchain network and other networks such as Bitcoin or Ethereum. One of its main responsibilities is to watch the vault addresses in order to find inbound transactions that are later converted into THORChain witness transactions. 

The witness transactions are initially recorded as pending – which is one of the states in the Thorchain state machine. After the majority of nodes agree on the state of the inbound transaction, the transaction is moved to the “finalised” state. 

At this point, the user’s Bitcoin deposit is recorded on the Thorchain blockchain. 

Time for the other part of the swap – sending Ether back to the user. 

Once a new inbound transaction is finalised, the Thorchain protocol initiates a swap. The swap transaction is recorded on the Thorchain blockchain and the Bifrost Protocol is used again – this time to initiate a withdrawal of ETH from the Ether outbound vault. 

This outbound transaction is translated from Thorchain’s internal representation into a valid transaction for the destination chain using the respective chain client – in our case the Ethereum Client – and broadcast to the respective network. 

At this point, the swap is completed and the user ends up with Ether in their Ethereum wallet. 

Although this sounds quite simple, there is quite a lot of detail to make it all possible. 

TSS

In order to sign transactions, the network has to be able to control vault addresses on each of the integrated blockchains. 

Of course, storing private keys on each of the nodes would be a huge security risk and this is also why Thorchain uses the previously mentioned Threshold Signature Scheme or TSS. 

TSS is a cryptographic primitive for distributed key generation and signing. You can think about it as a better version of multisig. Both of them focus on achieving the same goal – allowing multiple parties to come together and sign a transaction only when a certain, previously set threshold is reached. The main difference is that multisig is usually implemented on the application layer of the blockchain, for example, as a smart contract on Ethereum, whereas TSS support is always possible regardless of the blockchain as it relies on basic cryptographic elements. 
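The threshold rule at the heart of both multisig and TSS can be sketched in a few lines. Note that real TSS produces one joint signature via distributed key generation and never reconstructs a full private key; none of that cryptography is modeled here, only the t-of-n threshold logic:

```python
import math

def threshold_met(approvals: set, signers: list) -> bool:
    """Toy ⌈⅔·n⌉-of-n rule: a vault transaction is approved only when at
    least two-thirds of the registered signers have participated."""
    needed = math.ceil(len(signers) * 2 / 3)
    return len(approvals & set(signers)) >= needed

signers = [f"node{i}" for i in range(9)]
print(threshold_met({f"node{i}" for i in range(6)}, signers))  # True: 6 of 9 meets the ⅔ threshold
print(threshold_met({"node0", "node1"}, signers))              # False: 2 of 9 falls short
```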

This allows for making the whole process of signing transactions cheaper and more secure. 

Although TSS has a lot of benefits, it hasn’t yet been as battle-tested as other popular cryptographic elements such as ECDSA or certain hash functions. 

Vaults

Another interesting detail of Thorchain architecture is the way vaults operate. 

There are 2 types of vaults – “inbound” and “outbound”. 

Inbound vaults store most of the funds in the system. They are slower but more secure, as they require ⅔ of all TSS signers to sign a transaction, which can take up to 20 seconds.  

This would be quite limiting for the whole system, so Thorchain introduced smaller, less secure outbound vaults that are run by each of the THORNodes. These vaults are faster as they require only a single signature from the node they run on. The funds in each of these vaults are limited to 25% of the value of the node's bond. More on the bonding process a bit later in the article, but this basically creates incentives that prevent the node operator from stealing funds from the outbound vaults. These vaults are also constantly topped up by the system as their funds are used for outbound transactions. 
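The 25% cap can be written down as a one-line safety check; the bond figure below is illustrative:

```python
def outbound_vault_ok(vault_value: float, bond_value: float) -> bool:
    """A node's single-signature outbound vault may hold at most 25% of the
    value of its bond, so stealing the vault always costs the node more
    bond than it could gain."""
    return vault_value <= 0.25 * bond_value

# With a bond worth $18M, the outbound vault may hold at most $4.5M.
print(outbound_vault_ok(4_000_000, 18_000_000))  # True
print(outbound_vault_ok(5_000_000, 18_000_000))  # False – exceeds the cap
```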

PoS & Churning

As mentioned earlier, Thorchain uses Tendermint and Cosmos SDK. In this model, the Thorchain network operates as a Proof-Of-Stake (PoS) system where the nodes that want to be able to sign and verify transactions have to stake a certain amount of the RUNE tokens. 

In the Thorchain ecosystem, this process of staking RUNE tokens is also called bonding. 

At the time of writing this article, 1,000,000 RUNE tokens, worth around $18M, are required to run a fully functioning Thorchain node. 

In contrast to most variations of PoS systems, the delegation of tokens is not allowed. This helps with making sure all nodes in the network are treated equally and there are no node operators that capture the majority of tokens for a long period of time. 

In fact, all nodes on the Thorchain network are anonymous and only identifiable by their IP address and public key. There is no branding or marketing of nodes like in other systems that allow delegation. 

In order to avoid having the same nodes with the highest amount of RUNE tokens always signing transactions, Thorchain introduces the concept of churning. 

The network maintains one set of nodes that are active and able to sign transactions and another set of nodes that are on standby. 

Every 50,000 blocks, which is around every 3 days, the churning process kicks in and the oldest or the most unreliable nodes from the active set are replaced by the nodes from the standby set. 

Churning makes sure that new nodes that meet the staking criteria can eventually have their turn at signing transactions. Also, each time the set of validators changes, the Thorchain network moves funds to new vaults, ensuring that active nodes can still access funds. 

At the moment, there are 28 active nodes and 45 nodes in standby mode on the single-chain chaos network that supports BEPSwap and 11 active nodes and 9 nodes in standby mode on the recently released multi-chain chaos network. 

Currently, the multi-chain chaos network is in expansion mode which means that for every node that is churned out from the network, 2 nodes are churned in. 

The multi-chain network can grow to 99 nodes before hitting the Tendermint and TSS limits. 

Even when the network grows to 99 active nodes, it can still expand further by having the capability of sharded vaults.

It’s also important to note that even though a high amount of RUNE is required to run a fully functioning node, people can still run nodes without bonding RUNE. These nodes are able to validate transactions without the ability to sign transactions. 

RUNE Token 

This brings us to the last key element in the Thorchain architecture – The RUNE token. 

RUNE powers the Thorchain ecosystem and provides the economic incentives required to secure the network. 

All liquidity pools in the system consist of a native token and RUNE. For example, to swap from Bitcoin to Ether, the trade has to go through the BTC-RUNE and ETH-RUNE pools. In this model, each asset has to be paired with RUNE, which usually results in fewer pools than in a system that can create a pool out of any 2 assets, as in the case of Uniswap. 
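A BTC-to-ETH trade therefore routes through two RUNE pools. Here's a sketch using Thorchain's slip-based CLP output formula, out = x·X·Y/(x+X)², with invented pool depths:

```python
def clp_out(x: float, X: float, Y: float) -> float:
    """Thorchain CLP swap output (slip-based model): out = x*X*Y / (x+X)^2."""
    return x * X * Y / (x + X) ** 2

def btc_to_eth(x_btc: float, btc_pool: dict, eth_pool: dict) -> float:
    rune = clp_out(x_btc, btc_pool["BTC"], btc_pool["RUNE"])  # hop 1: BTC -> RUNE
    return clp_out(rune, eth_pool["RUNE"], eth_pool["ETH"])   # hop 2: RUNE -> ETH

btc_pool = {"BTC": 100.0, "RUNE": 300_000.0}    # hypothetical reserves
eth_pool = {"RUNE": 600_000.0, "ETH": 4_000.0}  # hypothetical reserves
print(round(btc_to_eth(1.0, btc_pool, eth_pool), 1))  # 19.4 ETH out for 1 BTC in
```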

Besides this, Thorchain nodes have to meet the staking criteria by bonding a particular amount of RUNE. This bond is then used to secure the system by underwriting the assets in the pools. If a node attempts to steal funds from the protocol, its bond is slashed by 1.5x the value of the assets it stole and the pools are made whole. Nodes that don't offer reliable service also put their bond at risk of being slashed. 
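The 1.5x slashing rule makes theft a guaranteed net loss for the node; a quick arithmetic sketch (the bond size is hypothetical):

```python
def slash_for_theft(bond: float, stolen_value: float) -> tuple:
    """Economic security sketch: a thieving node's bond is slashed by 1.5x
    the stolen value – 1x refills the pools, 0.5x is the extra penalty."""
    slashed = 1.5 * stolen_value
    return bond - slashed, stolen_value  # remaining bond, amount restored to pools

# Stealing $1M from the pools costs the node $1.5M of bond: a net loss of $0.5M.
print(slash_for_theft(18_000_000, 1_000_000))  # (16500000.0, 1000000)
```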

The Thorchain protocol also encourages the node operators to always bond the optimal amount of RUNE. This is achieved by a mechanism called the Incentive Pendulum. 

The Incentive Pendulum aims at keeping the system in the most optimal state: 67% of all the RUNE in the system bonded by nodes and 33% pooled in the liquidity pools. 

If there is too much capital in the liquidity pools, the network increases rewards for node operators and reduces rewards for liquidity providers. If there is too much capital bonded by the nodes, the system boosts rewards for liquidity providers and reduces rewards for node operators. 

In the optimal state, for every $1M worth of assets in the pools there would be $2M worth of RUNE bonded by the nodes. 
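A toy model of the pendulum, using an illustrative linear reward curve (the real incentive curve is more involved), shows how the split reacts to the bonded/pooled ratio:

```python
def node_reward_share(bonded_rune: float, pooled_rune: float) -> float:
    """Fraction of system rewards paid to node operators (the rest goes to
    liquidity providers) under a simplified linear pendulum: at the optimum
    of 67% bonded / 33% pooled the split is even; more pooling shifts
    rewards to nodes, more bonding shifts them to LPs."""
    pooled_share = pooled_rune / (bonded_rune + pooled_rune)
    return min(1.0, 0.5 * pooled_share / (1 / 3))

print(node_reward_share(2_000_000, 1_000_000))  # 0.5 – optimal state, even split
print(node_reward_share(1_000_000, 2_000_000))  # 1.0 – over-pooled, node rewards boosted
```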

On top of this, RUNE is used to pay transaction fees on the network, subsidise gas needed for sending outbound transactions to different networks and participate in the Thorchain governance where users can signal which chains and assets the network should add next. 

Uniswap 

As I mentioned earlier, there are some big differences between Thorchain and Uniswap or in fact any other decentralized exchange on Ethereum. Let’s have a look at some of them. 

First of all, Uniswap allows for swapping ERC-20 tokens only, so if we want to trade assets from other blockchains they have to be represented in the form of wrapped or synthetic tokens. Thorchain allows for swapping native assets without wrapping them. 

Swaps on Thorchain are charged both a fixed network fee and a dynamic slip-based fee. This means that swaps incurring more slippage are charged more in trading fees, which makes it harder for bots to extract value from swaps, as in a sandwich attack – a popular way of manipulating the price in a liquidity pool that results in users getting worse prices on their trades.  
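In Thorchain's slip-based model, the fee fraction equals the slip, x/(x+X), so the fee grows with trade size relative to pool depth:

```python
def slip_fee_fraction(x: float, X: float) -> float:
    """Slip-based fee fraction in Thorchain's CLP model: x / (x + X),
    where x is the input amount and X the input-side pool depth."""
    return x / (x + X)

# 1 BTC into a 100 BTC pool slips ~0.99%; 10 BTC into the same pool slips ~9.09%.
print(round(slip_fee_fraction(1, 100) * 100, 2))   # 0.99
print(round(slip_fee_fraction(10, 100) * 100, 2))  # 9.09
```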

When it comes to the speed of swaps, assets on Ethereum can be swapped within 1 Ethereum block, which happens every 13 seconds on average. On Thorchain, this is a bit more complicated, as the swap time depends on which networks we're swapping between. In the case of a Bitcoin to Ether swap, it'd take at least 1 block on the Bitcoin network – 10 minutes on average – plus the internal time for executing the swap on the Thorchain blockchain, plus the outbound Ethereum transaction – around 13 seconds. 

Interestingly, a swap from Ether to Bitcoin would be way faster as the Thorchain network would only have to wait for the Ethereum transaction before sending an outbound Bitcoin transaction that would result in the receiving wallet having the Bitcoin UTXO spendable and available straight after the transaction is broadcast. 

Also, Thorchain as a separate blockchain loses some of the benefits of decentralized applications on Ethereum. One of them is composability. For example, a Uniswap swap can be incorporated into more complicated contracts as a part of one transaction. This is not possible with Thorchain swaps. 

The Thorchain network is also, of course, not even remotely as decentralized as the Bitcoin or Ethereum networks; the system instead relies on strong economic incentives. 

This is not necessarily that bad, considering that Thorchain has a completely different use case than the Bitcoin network that secures an asset worth over $1T or the Ethereum network that secures billions of dollars locked in smart contracts. 

For its main use case – swapping assets between different blockchains, where most users won't store their assets on the Thorchain blockchain for a long period of time – it looks like it could be decentralized enough.

Summary 

After the long-awaited multi-chain chaos network release, the Thorchain team is focusing on growing the Thorchain ecosystem while making sure the system works as expected. 

Users have multiple choices when it comes to interacting with the Thorchain protocol. They can use one of the decentralized exchanges like Thorswap or Asgardex or wallets integrated with Thorchain like ShapeShift. 

We should see more applications and wallets integrating with Thorchain in the future. 

On top of this, it looks like we should see more chains and assets being onboarded, more Thorchain nodes joining the network and hopefully seeing more and more trading volume and total value locked in the liquidity pools. 

Eventually, the extra protective measures will be removed and the chaosnet will become the mainnet.

Thorchain clearly looks like an interesting protocol and a missing piece of the DeFi ecosystem that allows people to swap between native assets without using centralized exchanges. 

So what do you think about Thorchain? How big can it grow in the future? 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

Polygon PoS Chain – A Commit Chain And Not A Sidechain? – Thu, 29 Apr 2021 (https://finematics.com/polygon-commit-chain-explained/)

So what is a commit chain? How is it different from a sidechain? And what makes Polygon Commit Chain a commit chain rather than a sidechain? We’ll answer all of these questions in this article. 

Let’s start with understanding what exactly a sidechain is. 

Sidechain 

A sidechain, in essence, is a separate blockchain that can be used as one of the ways of scaling a Layer 1 blockchain such as Ethereum or Bitcoin. As the name suggests, a sidechain runs in parallel or “on the side” of the main chain. 

Sidechains have their own consensus mechanisms usually in the form of Proof-Of-Stake, Delegated-Proof-Of-Stake or Proof-Of-Authority. 

Sidechains allow users to send their tokens from the main chain and receive them on the sidechain. Once the funds are transferred to the sidechain, they can be used within the sidechain ecosystem. Similarly, users can withdraw their tokens from the sidechain back to the main chain. The whole process is called a 2-way peg or a 2-way bridge. One thing to note is that once user tokens are on the sidechain, they are completely reliant on the sidechain's consensus mechanism.

Initially, all scaling solutions such as sidechains, Plasma and rollups were classified as Layer 2 solutions as they are built on top of Layer 1. 

After a while, the Ethereum community started differentiating between scaling solutions fully secured by the Ethereum main chain – Layer 2 and other scaling options with their own consensus mechanisms – sidechains. At the moment, pretty much all scaling solutions are classified as either one or the other. 

When it comes to Polygon Commit Chain, it is worth differentiating it from a sidechain as it has a lot of extra features that rely on the security of the main Ethereum layer.

Let’s review them one by one. 

Permissionless Validators on Ethereum

Many sidechains use a consensus mechanism that limits the number of entities able to verify the chain. For example, in a Delegated-Proof-Of-Stake (DPoS) there are usually 21 validators who are chosen by the token holders and only these validators are able to validate the state of the blockchain. Similarly, in a Proof-Of-Authority (PoA) model the chain initiator chooses authorities to run the chain. This excludes most participants and creates a situation where only a selected few are responsible for making sure the transactions are validated correctly.  

In the Polygon PoS Chain, anyone can join the network and start validating the state of the blockchain. This is important as it allows any participant to become a validator and check by themselves that all transactions are processed correctly. 

Validators on Polygon PoS Chain have to stake their MATIC tokens and run a full node. 

MATIC tokens are staked on the Ethereum main chain, which is also where the set of all validators is maintained. If a validator starts acting in a malicious way, for example, by double signing or having significant downtime, their stake is slashed.  

This is also a good time to introduce 2 core components of the Polygon PoS Chain architecture – Heimdall Chain and Bor Chain.

Heimdall & Bor

Heimdall works in conjunction with the Stake Manager contract deployed on the Ethereum mainnet to coordinate validator selection and updating validators.

Since staking is actually done on the Ethereum smart contract, we don’t have to rely on validator honesty and instead inherit Ethereum chain security for this key part. Even if a majority of validators collude and start acting maliciously, the community can come together and redeploy the contracts on Ethereum to fork out, i.e. slash the malicious validators, and the chain can continue to operate as intended. 

Heimdall is also responsible for checkpointing – more on this later in the article. 

Bor is the block producer layer of the PoS Chain architecture that is responsible for aggregating transactions into blocks. 

Bor block producers are a subset of the validators that are periodically shuffled by the Heimdall validators. Block producers are selected to validate blocks only for a set number of blocks, also called “span”. After this time period, the selection process is triggered again. 

Let’s have a closer look at the process of selecting block producers:

  1. Let’s suppose we have 3 validators in the pool, and they are Alice, Bill and Clara.
  2. Alice staked 100 MATIC tokens whereas Bill and Clara staked 40 MATIC tokens each.
  3. Validators are given slots according to their stake. As Alice has 100 MATIC tokens staked and there are 20 tokens per slot (a parameter maintained by validator governance), Alice gets 5 slots in total. Similarly, Bill and Clara get 2 slots each.
  4. All the validators are given these slots [ A, A, A, A, A, B, B, C, C ]
  5. Using historical Ethereum block hashes as a seed, we shuffle this array.
  6. After shuffling the slots using the seed, we get this array: [ A, B, A, A, C, B, A, A, C ]
  7. Now, depending on the producer count (also maintained by validator governance), we pop validators from the top. For example, if we want to select 5 producers, we get the producer set [ A, B, A, A, C ].
  8. Hence the producer set for the next span is defined as [ A: 3, B: 1, C: 1 ].
  9. Using this producer set and Tendermint’s proposer selection algorithm, we choose a producer for every sprint on Bor.
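The selection steps above can be sketched in Python. This is a simplified model: the tokens-per-slot value, the seed derivation and the data shapes are illustrative, not Polygon’s actual implementation.

```python
import hashlib
import random

TOKENS_PER_SLOT = 20  # hypothetical governance parameter, chosen to match the example


def select_producers(stakes, seed_block_hash, producer_count):
    # 1. Allocate one slot per TOKENS_PER_SLOT staked.
    slots = []
    for validator, stake in stakes:
        slots += [validator] * (stake // TOKENS_PER_SLOT)
    # 2. Seed the shuffle with an Ethereum block hash so that every node
    #    derives exactly the same ordering.
    seed = int.from_bytes(hashlib.sha256(seed_block_hash).digest(), "big")
    random.Random(seed).shuffle(slots)
    # 3. Pop producer_count entries from the top; duplicates act as weight.
    weights = {}
    for validator in slots[:producer_count]:
        weights[validator] = weights.get(validator, 0) + 1
    return weights


stakes = [("Alice", 100), ("Bill", 40), ("Clara", 40)]
print(select_producers(stakes, b"some-ethereum-block-hash", 5))
```

The result is a weighted producer set (for instance { Alice: 3, Bill: 1, Clara: 1 }, depending on the seed), which Tendermint’s proposer selection then rotates through.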

This model allows anyone to participate in securing the network with any amount of MATIC tokens. It also doesn’t sacrifice transaction speed, as not all validators have to validate blocks all the time. 

Let’s go back to the other important function of Heimdall – checkpointing. 

Checkpointing 

Checkpoints are important as they provide finality on the Ethereum chain.

The Heimdall layer aggregates blocks produced by Bor into a single Merkle root and periodically publishes it to the Ethereum main chain. This published state is called a checkpoint, hence the whole process is known as checkpointing. 
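The aggregation step can be sketched as a generic Merkle tree. This uses SHA-256 and a naive leaf encoding purely for illustration; the real chain has its own hashing scheme and encoding.

```python
import hashlib


def sha(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()


def merkle_root(block_hashes):
    """Fold a list of Bor block hashes into a single 32-byte root."""
    assert block_hashes, "a checkpoint covers at least one block"
    level = list(block_hashes)
    while len(level) > 1:
        if len(level) % 2:  # odd number of nodes: duplicate the last one
            level.append(level[-1])
        level = [sha(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]


blocks = [sha(f"bor-block-{n}".encode()) for n in range(7)]
root = merkle_root(blocks)
print(root.hex())  # the single root that gets published to Ethereum
```

Publishing only this root keeps the Ethereum transaction small while still committing to every Bor block in the span.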

Checkpoint proposers are initially selected via Tendermint’s weighted round-robin algorithm. A further custom check is implemented based on the success of checkpoint submission. This allows Polygon PoS Chain to decouple from Tendermint proposer selection and provides it with abilities like selecting a proposer only when the checkpoint transaction on the Ethereum mainnet succeeds or submitting a checkpoint transaction for previous blocks if the checkpoint transaction failed. 

Submitting a checkpoint on Tendermint is a 2-phase commit process. A proposer, selected via the above-mentioned algorithm, sends a checkpoint with their address in the proposer field, and all other validators validate it.

The next proposer then sends an acknowledgement transaction to prove that the previous checkpoint transaction has succeeded on the Ethereum mainnet. Every validator set change is relayed on Heimdall by a module embedded in the validator node. This allows Heimdall to remain in sync with the Polygon contract state on the Ethereum mainchain at all times.

The Polygon PoS Chain contract deployed on the main chain is considered to be the ultimate source of truth, and therefore all validation is done via querying the Ethereum main chain contract.

Checkpoints also provide “proof of burn” in the withdrawal of assets. 

Speaking of withdrawals, let’s have a look at another important element of the PoS chain – the two-way Ethereum Bridge.

Two-way Ethereum Bridge

Typical two-way bridges rely on a small set of authorities who are often not even staked, nor part of the sidechain’s validator set – in practice, bridges are often operated, i.e. controlled, by several PoA signers. This is a significant security concern. 

Polygon provides 2 separate ways for moving assets between Ethereum and Polygon – Plasma Bridge and the PoS Bridge.

Plasma Bridge provides increased security guarantees due to the Plasma exit mechanism. However, there is a 7-day withdrawal period associated with all exits/withdrawals caused by certain restrictions in the Plasma architecture. 

The PoS Bridge doesn’t have this restriction and it is secured by a robust set of validators that we discussed earlier in this article. The state of these validators is maintained on the Ethereum mainnet and they are secured by all the funds staked in the system – around $500M at the time of writing this article. To the best of our knowledge, the PoS bridge is the only bridge secured by the whole validator set of a bridged chain; bridges are normally secured by a small set of PoA signers, as already mentioned earlier. 

As we can see, Polygon PoS Chain offers a lot of extra security measures anchored to the Ethereum main chain, and it is more than a mere sidechain. Perhaps, a commit chain is a better name for it. 

So what do you think about Polygon Commit Chain? Do you think it’s valuable to differentiate it from a sidechain?

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

How (Not) To Get Rekt – DeFi Hacks Explained – https://finematics.com/defi-hacks-explained/ – Sat, 17 Apr 2021 19:25:00 +0000

Intro


Every opportunity comes with risks, and DeFi is no exception. For every way you can make money in DeFi, it seems there are at least two ways you can lose it. Although these risks can’t be avoided entirely, with careful risk management and sensible judgement, you can at least decrease your chances of getting totally rekt.

So what are some of the most common ways people lose money in DeFi? What are the different types of hacks and exploits? And most importantly, how can you minimise your chances of being negatively affected by hacks in the future? You’ll find answers to these questions in this article. 

And one more thing before we start. This article is a collaboration between Finematics and rekt.news.  

Rekt News is an anonymous platform for whistleblowers and DeFi detectives to present their findings to the community. They analyse all the major hacks and exploits, and provide creative commentary on all things crypto and DeFi, with an aim to educate and entertain their readers. Their site rekt.news contains their own articles as well as an AI-generated news aggregator that gives coverage of all the most important recent events within crypto and DeFi in particular.

So, back to the topic of the article, how can I get rekt in DeFi?

We don’t have time to cover every single type of exploit, and of course, many types remain unknown, but there are a few techniques that occur regularly, and we’ll now look at a few examples.

The Rug Pull 

To “rug pull” has become a commonly used term across all of DeFi, and is now used to refer to many types of hacks and exploits, but it actually refers to a specific technique of suddenly removing the majority of liquidity from a liquidity pool.

The sudden loss of liquidity can create a death spiral for the token, as token holders try to sell as fast as possible in order to save their profits. 
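To see why the spiral is so violent, consider a toy constant-product (x·y = k) pool, the pricing model used by most DEXes. All numbers here are hypothetical; the point is that the same sell order barely moves a deep pool but craters a drained one.

```python
def price_impact_of_sell(sell_tokens, reserve_token, reserve_eth):
    """Relative spot-price drop after selling into an x*y=k pool."""
    price_before = reserve_eth / reserve_token
    k = reserve_token * reserve_eth
    new_token = reserve_token + sell_tokens        # tokens flow in
    new_eth = k / new_token                        # ETH flows out, k preserved
    price_after = new_eth / new_token
    return 1 - price_after / price_before


# The same 1,000-token sell, before and after the team pulls 99% of liquidity:
deep = price_impact_of_sell(1_000, 100_000, 1_000)   # healthy pool: ~2% drop
shallow = price_impact_of_sell(1_000, 1_000, 10)     # rugged pool: ~75% drop
print(f"{deep:.0%} vs {shallow:.0%}")
```

Once liquidity is pulled, every panicked seller pushes the price down far harder, which triggers more selling – the death spiral.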

A rug pull is usually the final move from a malicious team and is a common form of “exit scam”, where the team deletes all traces of the project’s social media as they try to escape with the funds.

As this type of attack is technically very simple, it is often the chosen technique for quick cash grabs by low-effort projects. However, this does not mean the profits are low – there have been several major rug pulls where users have lost many millions.

One such example is Meerkat Finance, which, after just one day of operation, was rugged for 13 million BUSD and about 73,000 BNB, totalling around $31m at the time. 

If a large liquidity pool is used within a project, then the project team should not have the ability to retrieve these assets. If they do, then you are placing your trust in the project team. 

Initially, Meerkat Finance did not have this ability, however, shortly before the attack, the Meerkat Finance Deployer “upgraded” 2 of their own vaults, giving themselves the ultimate backdoor into the vaults. 

How can we avoid getting rug pulled?

Check how the liquidity is locked – is there a timelock? Is there a multisig? 

Do your research into the project – find out who is backing it and what the purpose of the project is.

Is the team known? If they are, what can you find out about them? Proving identity online is becoming increasingly difficult, and scammers are turning to unusual methods to build the trust of others, as was the case with DeTrade Fund, who some suspected to have used deepfake technology to create a video of a fake CEO. This story was covered by rekt.news (Deepfake – A Scam so Surreal).

On the other hand, if you can’t find any information on who is behind the project, remember that an anonymous team isn’t necessarily a bad thing, as the founder of Bitcoin remains anonymous to this day. 

Economic exploit / Flash Loan

There was a period of time when it seemed every week brought a new DeFi hack, and the words “flash loan” were never far from the scene.

The association of flash loans with “hacks and exploits” led many in the community to believe that their impact was solely negative. 

However, it’s worth remembering that these transactions were already possible for whales with large accounts, and that flash loans in themselves are not a malicious tool – they simply give access to large amounts of funding on a very short time frame. This funding can then be used to take advantage of loopholes in code, or to manipulate pricing and profit from arbitrage.

Flash loans are uncollateralised, unsecured loans that must be paid back before the blockchain transaction ends; if not repaid, the smart contract reverses the transaction, so it’s as if the loan never happened in the first place.

Because the smart contract for the loan must be fulfilled in the same transaction that it is lent out, the borrower has to use other smart contracts to perform instant trades with the loaned funds before the transaction ends.
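The “repay in the same transaction or it never happened” behaviour can be modelled in a few lines of Python. This is a toy model of EVM-style reverts – the class names and the fee-free maths are illustrative, not any particular protocol’s API.

```python
class RevertTransaction(Exception):
    """Stands in for the EVM reverting every state change in the transaction."""


class LendingPool:
    def __init__(self, reserves):
        self.reserves = reserves

    def flash_loan(self, amount, borrower_logic):
        """Lend with no collateral; roll back unless repaid before we return."""
        snapshot = self.reserves
        self.reserves -= amount
        repayment = borrower_logic(amount)   # borrower's arbitrary strategy runs here
        self.reserves += repayment
        if self.reserves < snapshot:         # loan not fully repaid
            self.reserves = snapshot         # undo everything
            raise RevertTransaction("flash loan not repaid")


pool = LendingPool(reserves=1_000_000)
try:
    pool.flash_loan(500_000, lambda amount: 0)  # borrower tries to keep the funds
except RevertTransaction:
    pass
print(pool.reserves)  # 1000000 – as if the loan never happened
```

Because the rollback is guaranteed, the lender takes no credit risk, which is exactly why anyone can borrow millions with zero collateral.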

If you want to learn more about flash loans check out this article here.  

Most flash loan attacks involve the manipulation of the token price using a large amount of capital.

One example of a major flash loan attack would be Harvest Finance, who lost $33.8 million to an attacker in October of 2020.

fUSDT fell 13.7% and $FARM fell 67% over two hours as the hacker took out a $50m USDT flash loan, then used the Curve Finance Y pool to swap funds and stretch stablecoin prices out of proportion.

The following actions took place within a 7-minute period: 

  1. Take a $50m USDT flash loan
  2. Swap 11.4m USDC to USDT -> causing USDT price to go up
  3. Deposit 60.6m USDT into Vault
  4. Exchange 11.4m USDT to USDC -> USDT price goes down
  5. Withdraw 61.1m USDT from Vault -> resulting in 0.5m USDT profit
  6. Rinse and repeat 32 times. (without any prior testing)
  7. Convert to renBTC and exit to BTC and ETH via Tornado Cash (a service that allows for making anonymous transactions on Ethereum, therefore covering the attacker’s tracks)

The attacker was able to withdraw more USDT at step 4 because of the changed USDT price. As the price of USDT was lower during the time of the withdrawal, their shares represent more USDT from the Vault pool.
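A simplified model shows why step 5 returns more USDT than step 3 deposited. The share-price formula and the exact numbers are illustrative, not Harvest’s actual vault accounting: the key point is that shares are valued in dollars while the attacker can move the dollar price of USDT.

```python
def cycle_profit(deposit_usdt, usdt_price_at_deposit, usdt_price_at_withdraw):
    """Vault shares are priced in USD, so a manipulable USDT price changes
    both how many shares a deposit mints and how much USDT a share redeems."""
    nav_per_share = 1.0                                        # USD value of one share
    shares = deposit_usdt * usdt_price_at_deposit / nav_per_share   # mint (step 3)
    usdt_out = shares * nav_per_share / usdt_price_at_withdraw      # redeem (step 5)
    return usdt_out - deposit_usdt


# Deposit while USDT trades ~0.8% rich, withdraw after it is pushed back to par:
profit = cycle_profit(60_600_000, 1.008, 1.000)
print(round(profit))  # roughly 485k USDT per cycle – near the ~$500k above
```

Repeated 32 times, as in the attack, even a sub-1% edge per cycle compounds into tens of millions.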

Approximately 4 cycles can fit into a 10m gas limit, and although the profit on each cycle was less than 1%, ~$500k per repetition adds up quickly.

Flash loans are often used to manipulate prices, which allows for arbitrage where it would otherwise not be possible. To avoid flash loan price manipulation attacks, protocols should look to use reliable decentralised oracles.

Flash loans can also be used for other attack methods such as re-entrancy, front-running, or arbitrage.

Arbitrage

Arbitrage refers to taking advantage of price differences between different markets in order to generate a profit. These types of opportunities are especially common in immature markets such as DeFi and crypto. Arbitrage opportunities tend to decrease as liquidity increases and the market becomes more efficient.
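As a sketch, here is the classic two-pool arbitrage between constant-product AMMs, using Uniswap-v2-style maths. The pools and reserve numbers are made up for illustration.

```python
def amount_out(amount_in, reserve_in, reserve_out, fee=0.003):
    """Output of an x*y=k swap after a 0.3% LP fee."""
    amount_in *= 1 - fee
    return reserve_out * amount_in / (reserve_in + amount_in)


def arbitrage(eth_in, pool_a, pool_b):
    """Buy the token where it is cheap (pool A), sell where it is dear (pool B).
    Each pool is (eth_reserve, token_reserve)."""
    tokens = amount_out(eth_in, pool_a[0], pool_a[1])      # ETH  -> token in A
    eth_back = amount_out(tokens, pool_b[1], pool_b[0])    # token -> ETH in B
    return eth_back - eth_in


# The token is cheaper in pool A (more tokens per ETH) than in pool B:
profit = arbitrage(1.0, pool_a=(100, 210_000), pool_b=(100, 190_000))
print(f"{profit:.4f} ETH")  # positive only while the price gap beats the fees
```

Every such trade pushes the two prices together, which is why arbitrage opportunities shrink as liquidity deepens and markets mature.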

If a pool is manipulated (with flash loans for example) to allow room for arbitrage, then this may also be considered an exploit, as liquidity providers can end up losing their funds, as was the case with Saddle Finance. 

During their launch in January 2021, at least three major arbs took over 7.9 BTC ($275,735) from the early liquidity providers within 6 minutes, despite the project’s claim to “have solved the problem of slippage”:

4.01 BTC $139,961 Jan-19-2021 04:06:54 PM +UTC

0.79 BTC $27,573 Jan-19-2021 04:08:46 PM +UTC

3.11 BTC $108,548 Jan-19-2021 04:12:37 PM +UTC

Although this was only arbitrage, the users still lost out, as Saddle Finance was unable to protect them from the arbitrageurs, who were simply buying and selling, within the limitations of the code.  

This brings us to one of the common questions regarding losing funds in DeFi:

“Was it a hack, or was it an exploit?”

DeFi is still such a new concept, and the entire industry is a live experiment, testing new ideas as we build a new financial system. This means that loopholes are often found in live code, and when these loopholes can be used to withdraw funds without forcibly manipulating anything, then perhaps it’s best called “an exploit”. 

However, this labelling could be applied to all hacks, as they can only operate with the code that has been written. Whether we call them a hack or an exploit, the end result is the same. If loopholes exist, then eventually someone will take advantage of them, and there is little we can do to stop this. 

Even security audits do not guarantee safety.

Audits 

rekt.news also ranks each hack and exploit on its leaderboard, which shows not only how much was stolen from the protocol in dollar value at the time, but also who audited the protocol before the hack. 

If we look at the rekt leaderboard, we can see that the majority of hacked (or exploited) protocols actually had a security audit completed prior to the attack. This proves that an audit is not a guarantee of safety, and audit firms can also fail.

The leaderboard shows who audited the specific piece of code that was exploited. 

According to the leaderboard, the most notorious security companies are currently: Peckshield with 3 failed audits, Certik with 2, and Quantstamp also with 2.

Many of the most recent rekt.news articles covered audited protocols, showing that in the end, there is very little difference between an audited and unaudited protocol.

Users often make the mistake of believing that one security audit can cover an entire protocol forever. However, all DeFi protocols are full of moving parts, and even if a protocol is audited very thoroughly, a single small update can render the audit useless.

Summary 

So what do you think about hacks in DeFi? Have you ever been affected by any of them? 

Also, don’t forget to check out rekt.news for more content like this. 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

Polygon (Matic) – Ethereum’s Internet Of Blockchains – https://finematics.com/polygon-matic-explained/ – Sun, 14 Mar 2021 17:33:58 +0000

So what is Polygon, previously Matic, all about? How will it help with scaling Ethereum? Why does it claim to be Ethereum’s Internet of Blockchains? And why is it sometimes compared to Polkadot and Cosmos? You’ll find answers to these questions in this article. 

Now, let’s see how Polygon came into existence. 

Matic Network

Before the rebranding to Polygon, the project was known as Matic or Matic Network.

Matic was started in 2017 by 3 founders who were active participants in the cryptocurrency community in India and decided to band together and tackle Ethereum’s scaling problems. 

The team worked on 2 main solutions:

Plasma Chains – a layer 2 scaling solution based on Matic’s implementation of Plasma and a PoS Chain – a Proof-Of-Stake Ethereum sidechain. 

The token behind Matic Network – MATIC – was distributed through the Binance Launchpad’s initial exchange offering in April 2019 and the team was able to raise $5.6M. 

After over 2.5 years of work, the Matic Network mainnet went live in mid-2020 and quickly started attracting more and more attention. This was supercharged by the increasing gas fees on Ethereum, which showed an urgent need for robust scaling solutions.  

Polygon Rebranding

At the beginning of 2021, the Matic team decided to expand the scope of their project and rebranded themselves to Polygon. 

Polygon aims at creating a more generalised scaling solution. 

When it comes to scaling there are 2 main ways of doing it: Layer 2 scaling and sidechains. 

Layer 2 scaling relies on the security of the main layer – the Ethereum blockchain. Plasma, Optimistic rollups and ZK rollups are the most popular options. 

Sidechains rely on their own security models, usually by having a separate consensus mechanism. Matic PoS chain or xDai are good examples here.

If you’re interested in learning more about these different scaling solutions you can check out this article here

Instead of providing one or two scaling solutions, Polygon aims at creating an ecosystem that makes it easy to connect multiple different scaling solutions – everything from sidechains with different consensus mechanisms to Layer 2 options such as Plasma, Optimistic Rollups and ZK rollups. 

We can think about the existing Matic scaling solutions – the PoS and the Plasma chains basically becoming one of many scaling options available in the whole Polygon ecosystem. 

Polygon also provides a framework that makes it easy for new projects to quickly build their own highly-customisable scaling solution if they decide this is the path they want to choose. 

Now, let’s have a closer look at the technology behind Polygon to understand the project a bit better. 

Polygon Architecture

Polygon supports two major types of Ethereum-compatible blockchain networks: stand-alone networks and secured chains – networks that leverage a “security as a service” model. 

Stand-alone chains rely on their own security; for example, they can have their own consensus models such as Proof-Of-Stake or Delegated-Proof-Of-Stake. These kinds of networks are fully sovereign, which gives them the highest level of independence and flexibility, but makes it more difficult for them to establish a reliable security model – PoS, for example, requires a high number of reliable validators. This model is usually suitable for enterprise blockchains and already-established projects with strong communities. 

Secured chains utilise the “security as a service” model. This can be either provided directly by Ethereum, for example via fraud proofs used by Plasma, or by a pool of professional validators. These validators are run in the Polygon ecosystem and they can be shared by multiple projects – a concept similar to Polkadot’s shared security model. Secured chains offer the highest level of security but sacrifice independence and flexibility. This model is usually preferred by startups and security-focused projects. 

As we’ve probably noticed, the distinction between stand-alone chains and secured chains is even more generic than the usual split between sidechains and layer 2 solutions that we described earlier. This allows Polygon to accommodate pretty much all possible scaling solutions. 

Now, as we know what kind of solutions are supported by Polygon, let’s dive a bit deeper into the architecture. 

Polygon architecture consists of 4 abstract and composable layers. 

The Ethereum Layer. Polygon chains can use Ethereum as their base layer and leverage Ethereum’s high security. This layer is implemented as a set of smart contracts on Ethereum and can be used for things like finality and checkpointing, staking, dispute resolution and messaging between Ethereum and Polygon chains. 

This layer is optional as Polygon-based chains are not obligated to use it.

The next one is the Security Layer. This is another non-mandatory layer that can provide a “validators as a service” function. This function allows Polygon chains to make use of a set of validators that can periodically check the validity of any Polygon chain in return for a fee.  

This layer is usually implemented as a meta blockchain that runs in parallel to Ethereum and is responsible for things like validator management – registering/deregistering, rewards, shuffling and Polygon Chains validation. 

The Security Layer is fully abstract and can have multiple implementations with different characteristics. This layer can also be implemented directly on Ethereum and leverage Ethereum’s miners as validators. 

The next layer is the Polygon Networks Layer. This is the first mandatory layer in the Polygon architecture. This layer consists of sovereign blockchain networks where each network can maintain the following functions: transaction collation, local consensus and block production. 

Last, but not least is the Execution Layer. This layer is responsible for interpreting and executing transactions that are included in Polygon’s chains. It consists of the execution environment and execution logic sublayers. 

The main takeaway when it comes to Polygon’s architecture is that it is deliberately made to be generic and abstract. This allows other applications, that are looking to scale, to choose the best scaling solution that perfectly fits their needs. 

As we know, different applications may want to optimise for different things such as security, transaction speed, transaction cost or sovereignty and to excel in one of them usually means making a sacrifice somewhere else. 

As an example, a DeFi protocol that aims at storing billions of dollars locked in smart contracts probably wants to optimise for security and might be happy with sacrificing sovereignty. A protocol like this would most likely use the Ethereum Layer. 

Another project, let’s say an NFT marketplace, may want to optimise for ultra-low transaction costs and would be ok with making a sacrifice by lowering their security from extremely high to high enough. Such a project could skip the Ethereum Layer and rely on the Security Layer, with a set of shared validators. 

Or maybe a blockchain-based game wants to rely on its own consensus mechanism with ultra-fast block time. In this case, they can entirely skip both the Ethereum and the Security Layer and focus on the Polygon Network Layer. 

As we can see, Polygon can provide multiple options, and the teams behind different applications can pick the one that fits their use case. Polygon also aims to make it easy to migrate from one scaling solution to another – something that might be needed if the circumstances of a project change, or if a better scaling solution becomes available.

This architecture also allows for multiple different Polygon-based scaling solutions to communicate with each other. This is super important, as it prevents creating siloed systems. 

At the moment, the only scaling solutions available in the Polygon ecosystem are the Matic PoS Chain and the Matic Plasma Chains. The team is also actively working on adding multiple other options such as ZK rollups, optimistic rollups, enterprise chains and other sidechains.

Currently, projects launching on Polygon start with the Matic PoS and the Matic Plasma Chains.

Matic PoS Chain And Matic Plasma Chains

Matic Plasma Chains is an Ethereum Layer 2, predicate-based Plasma implementation. Plasma, in essence, is a framework for building scalable decentralized applications.

Plasma allows for offloading transactions from the main chain into child chains which enables fast and cheap transactions. One of the drawbacks of Plasma is a long waiting period for users who want to withdraw their funds from Layer 2. Plasma cannot be used to scale general-purpose smart contracts.

Matic PoS Chain is a permissionless sidechain and runs in parallel to the Ethereum chain. The chain is secured by the Proof-Of-Stake consensus mechanism with its own validators. 

Although Matic PoS Chain has its own consensus mechanism it also relies on Ethereum’s security when it comes to validator staking and checkpoints. 

Matic PoS chain is EVM-compatible, which allows Ethereum-based projects to effortlessly migrate their smart contracts. 

So far, Matic PoS Chain and Matic Plasma Chains were able to onboard over 80 applications, process over 5M transactions and secure over $200M of users’ funds.

Some of the projects that have either already migrated to Matic PoS Chain, launched directly on Matic, or are in the process of migrating include Quickswap – a fork of Uniswap, Sushiswap, Aavegotchi, Polymarket, Polkamarkets and Superfarm.

Besides that, infrastructure projects such as The Graph and Chainlink also decided to expand to Polygon. 

Recently, Polygon also announced a partnership with a big player in the video gaming industry – Atari.

Summary

So far, it looks like Matic’s rebranding to Polygon and the expansion of the scope of the project was a really good idea. 

This is mainly because it becomes clearer and clearer that there will be a spectrum of different scaling solutions available in the future. 

In the new architecture, Polygon can facilitate connections between these different scaling options. 

It also looks like there is a big focus on both optimistic rollups and ZK rollups which Polygon can also add to the multitude of available scaling solutions. 

The main risk here would be if one of these popular scaling technologies gets adopted on its own without integrating with Polygon. But even if this happens, there is a chance that this technology would eventually be integrated into Polygon anyway.  

It will be interesting to track Polygon’s progress when it comes to adding more scaling options and onboarding more and more projects. 

We’ll also have to see how easy it will be to communicate between different scaling solutions with different security guarantees. 

It’s also worth mentioning that Polygon’s whitepaper has direct comparisons to other layer 1 blockchains such as Polkadot, Cosmos or Avalanche that also focus on the interoperability between different blockchains. 

In contrast to these projects, Polygon focuses on the Ethereum ecosystem, with the Ethereum chain being the main hub that connects everything. 

This, of course, has a lot of advantages such as a strong community of users and developers, a well-known programming language – Solidity – and the most popular virtual machine in the cryptocurrency space  – EVM. 

Besides that, Ethereum has a long history of serving as a reliable base chain that handles trillions of dollars in economic activity – something that took time to develop and is hard for a completely new blockchain to replicate. 

The power of familiarity with the Ethereum stack cannot be overlooked as shown by the recent rise of Binance Smart Chain. 

So what do you think about Polygon? Will this be the ultimate way of scaling Ethereum? 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

Finematics is also now participating in Round 9 of Gitcoin Grants based on the quadratic funding mechanism where even the smallest contribution matters. If you’d like to show your extra support, check out this link here.

What Is Gas? Ethereum High Transaction Fees Explained – https://finematics.com/what-is-gas-ethereum-high-transaction-fees-explained/ – Sat, 20 Feb 2021 18:41:23 +0000

So what exactly is gas? Why are transaction fees so high at the moment? And what are some of the ways to make the transaction cost lower? You’ll find answers to these questions in this article. 

Let’s start with what gas actually is. 

What Is Gas 

Gas is a unit used for measuring the amount of computational effort required to perform specific actions on the Ethereum blockchain.

The name itself hasn’t been chosen by accident. Similarly to gasoline fueling a car and allowing it to drive, gas on the Ethereum network fuels transactions and allows them to perform different operations. 

Every operation on the Ethereum blockchain, or to be precise on the Ethereum Virtual Machine (EVM), has an associated gas cost. For example: adding 2 numbers costs 3 gas; getting the balance of an account – 400 gas; sending a transaction – 21,000 gas. 

Smart contracts usually consist of multiple operations that together can cost hundreds of thousands of gas.

What is interesting is that the gas cost by itself doesn’t tell us how much we have to pay for a particular transaction. To calculate the transaction fee we have to multiply the gas cost by gas price. 

The gas price is measured in gwei – a smaller unit than ether where 1 gwei equals 0.000000001 ETH. We can think about it as a major and a minor unit similarly to dollars and cents. 

As an example, let’s say we want to send a simple Ethereum transaction and the ETH price is at $1,800. Most popular Ethereum wallets such as Metamask estimate the necessary gas price and allow us to choose between fast, medium and slow confirmation speeds. Let’s assume that the wallet estimated a gas price of 100 gwei to give our transaction a chance of being confirmed within the next minute. 

We can now quickly calculate that we have to pay $3.78 for such a transaction. We multiply the gas cost for sending a transaction – 21,000 gas – and the gas price – 100 gwei. This is equal to 2,100,000 gwei which is 0.0021 ETH. At the ETH price of $1,800, this gives us $3.78. 
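The arithmetic above fits in a tiny helper. The gas cost, gas price and ETH price are the example’s figures, not live values.

```python
GWEI_IN_ETH = 10**-9  # 1 gwei = 0.000000001 ETH


def tx_fee_usd(gas_used, gas_price_gwei, eth_price_usd):
    """Fee = gas used x gas price, converted from gwei to ETH to dollars."""
    fee_in_eth = gas_used * gas_price_gwei * GWEI_IN_ETH
    return fee_in_eth * eth_price_usd


# A simple transfer (21,000 gas) at 100 gwei with ETH at $1,800:
print(round(tx_fee_usd(21_000, 100, 1_800), 2))  # 3.78
```

The same helper also shows why complex DeFi interactions hurt: swap `21_000` for a few hundred thousand gas and the fee multiplies accordingly.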

ETH Price And Gas

It’s worth mentioning that gas is an abstract unit that exists only inside the EVM – the user always pays for their transactions in ETH.

The main reason for having a separate unit for measuring computational effort is to decouple it from the price of ETH. 

This means that the increase in the ETH price should not change the cost of transactions. If the network activity stays the same and the price goes up we should see the gas price going down, so the final transaction cost measured in ETH stays the same in dollar value. 

That said, an increase in the ETH price is very often correlated with an increase in activity on the Ethereum network – something that does increase the cost of transactions. 

Now, let’s see how exactly an increase in network activity causes the transaction cost to go up. 

To start with – all transactions sent to the Ethereum network land in the mempool. This is a place where all pending transactions are waiting for the miners to pick them up and include them in the next Ethereum block. 

Miners are incentivised to pick up transactions with the highest gas price first as they are basically doing a fixed unit of work for a better price. 

Miners are also limited to how many transactions they can include in one single block. This is determined by the maximum gas limit per block. At the time of writing this article, this limit is set to 12.5M gas. 

As a quick example, let’s assume there are only simple ETH transactions in the mempool, each one costing 21,000 gas. A miner can include ~595 such transactions (12.5M/21K). If there are, let’s say, 1,000 pending transactions in the mempool, the miner would sort all of them by gas price and choose the 595 most profitable ones. 
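That miner behaviour is essentially a greedy sort-and-fill, which we can sketch directly (simplified: real clients also consider nonces, replacement rules and more):

```python
BLOCK_GAS_LIMIT = 12_500_000  # the per-block limit mentioned above


def build_block(mempool):
    """Take the highest-paying transactions first until the block is full."""
    block, gas_used = [], 0
    for tx in sorted(mempool, key=lambda t: t["gas_price_gwei"], reverse=True):
        if gas_used + tx["gas"] <= BLOCK_GAS_LIMIT:
            block.append(tx)
            gas_used += tx["gas"]
    return block


# 1,000 pending simple transfers bidding different gas prices:
mempool = [{"gas": 21_000, "gas_price_gwei": 20 + i % 200} for i in range(1_000)]
block = build_block(mempool)
print(len(block))  # 595 – only 12.5M // 21k simple transfers fit in one block
```

Because the lowest bidders are left behind, anyone in a hurry must outbid the rest of the mempool – which is exactly the auction dynamic driving fees up during congestion.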

The current fee model is based on a simple auction mechanism and the users who want to have their transaction picked up by miners first have to essentially outbid other people for the space in a block. This in turn drives the gas prices up, especially at times when a lot of users have urgent transactions that they want to confirm. 

Why Do We Need Gas?

To wrap up the gas explanation, it’s also important to understand why gas has to exist in the first place. The EVM, as a Turing-complete machine, allows for executing any arbitrary code. Although this is one of the main reasons that makes Ethereum so powerful, it also makes it vulnerable to the halting problem. The halting problem is the problem of determining, from a description of an arbitrary computer program and an input, whether the program will finish running or continue to run forever.

Without gas, a user could execute a program that never stops, either by making a mistake in their code or just by being malicious. To prevent this, Ethereum introduced a gas cost associated with each operation that prevents a program from running forever and bringing the whole network to a grinding halt. 

Besides the gas price, each transaction also has a gas limit that has to be equal to or higher than the anticipated amount of computation needed to successfully execute that transaction. 

Before executing each operation within a transaction, the EVM checks if there is enough gas left for that operation. If there is not, the whole transaction is reverted with an “out of gas” exception and all state changes are rolled back. The user still pays the transaction fee for the amount of work that has been done by the miner, even though the transaction failed. This is, again, to avoid attacks on the network. 

If the transaction consumes less gas than initially anticipated, the remaining gas is refunded to the sender in ETH. 
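The mechanics described above can be captured in a minimal sketch. The operation costs here are illustrative, but the revert and refund behaviour mirrors the description: on “out of gas” the sender pays for the full gas limit, while on success unused gas is refunded.

```python
# Minimal sketch of EVM-style gas accounting for one transaction.
# Operation costs are illustrative; the revert/refund behaviour follows
# the rules described in the text above.

def execute(ops, gas_limit, gas_price_gwei):
    """ops: list of (name, gas_cost) pairs. Returns (status, fee_in_gwei)."""
    gas_left = gas_limit
    for name, cost in ops:
        if cost > gas_left:
            # Out of gas: state changes are rolled back, but the sender
            # still pays for all the gas provided to the transaction.
            return "out of gas", gas_limit * gas_price_gwei
        gas_left -= cost
    # Success: unused gas is refunded; the sender pays only for gas used.
    return "success", (gas_limit - gas_left) * gas_price_gwei
```

For example, a transaction with a 21,000 gas limit that only uses 20,003 gas pays for 20,003; the same operations with a 100 gas limit revert immediately but still cost the full 100 gas.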

It’s also really important that all operations on Ethereum have the correct gas cost in relation to each other; otherwise, that could be another attack vector. One such attack took place in 2016 and resulted in a hard fork that repriced certain low-level operations. 

Now, as we know a bit more about gas, let’s have a look at the recent period of high transaction fees and a few solutions that can lower the transaction cost now and in the future. 

High Fees on Ethereum

With record volumes on decentralised exchanges, the highest total value locked on DeFi lending platforms, multiple yield farming opportunities available, and more and more NFTs being minted – the Ethereum network is as busy as ever. 

This popularity results in high demand for block space, which in turn results in high transaction costs. 

It’s not uncommon anymore to pay more than $10 for a simple ERC20 transfer or $50-100 for a Uniswap transaction. This, of course, is not ideal as it makes it really hard for smaller players to participate in the Ethereum ecosystem. 

Fortunately, there are multiple solutions either already available or being actively worked on. Let’s go through some of the most important ones. 

Layer 2 Scaling and Eth2

Layer 2 scaling is a collective term for solutions that help with increasing the capabilities of the main Ethereum chain – Layer 1 – by handling transactions off-chain. Besides improving transaction speed and transaction throughput, layer 2 solutions can greatly reduce the transaction fees. 

Loopring is a good example of a decentralized exchange built on Layer 2 that is getting more and more popular. The exchange has recently hit $200M in total value locked and over $10M in daily trading volume.

Another project – Matic – which was recently rebranded to Polygon, also hit over $200M in TVL on its Plasma+PoS chain. 

A more general-purpose solution – Optimism – that is based on optimistic rollups is also being rolled out. This is important as it will allow DeFi smart contracts to interact with each other in a similar way to how they interact on Layer 1. 

One of the missing pieces that can increase the adoption of Layer 2 solutions even further is direct onboarding to Layer 2. This could decrease the cost of transactions even further as users would be able to transfer their ETH directly from an exchange to a Layer 2 solution like Loopring. 

If you want to learn more about Layer 2 Scaling check out this article here

Besides Layer 2 scaling, another solution that can decrease the transaction cost in the long run is Eth2, which introduces sharding and Proof-of-Stake. You can learn more about these concepts here

EIP-1559

EIP-1559 is another solution for optimising the transaction cost. 

Although the proposal will not have a direct effect on lowering the transaction cost, it will allow for optimising the fee model by smoothing fee spikes and limiting the number of overpaid transactions. This will make transaction fees more predictable.

From the timeline perspective, it looks like EIP-1559 could be implemented in early 2021. 

Here is a separate article that explains EIP-1559 in depth. 

Optimising Gas Usage

Besides using Layer 2 scaling solutions and waiting for other improvements, there are a few other tricks that can help us with lowering our transaction cost on Layer 1. 

First of all, if we don’t have any urgent transactions, we can try to find times of the day when the gas prices are the lowest. 

Besides this, we should always double-check the gas cost estimated by our wallet with a separate reliable source such as https://ethgasstation.info/

Another trick, used by the 1inch exchange, allows for lowering transaction fees with CHI tokens. These tokens must be burned alongside the primary operation, which reduces the total amount of gas spent in a transaction.

This can be achieved by leveraging an EVM mechanism that refunds gas when storage space is freed. When CHI tokens are minted, dummy smart contracts on the Ethereum network are created. Burning CHI destroys these contracts and results in a gas refund. 
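A rough model of the economics: the EVM refund for freeing storage (e.g. destroying contracts) was capped at half of the gas used by the transaction before the London upgrade, so gas tokens could cut a fee by at most ~50%. The numbers below are illustrative, not actual CHI contract logic.

```python
# Rough model of how gas tokens like CHI reduce the effective fee.
# The key EVM rule modelled here: refunds were capped at half of the
# gas used by the transaction (before the London upgrade).

def effective_gas(gas_used: int, refund_from_burning: int) -> int:
    refund = min(refund_from_burning, gas_used // 2)  # refund cap
    return gas_used - refund

# An expensive transaction with a large refund available is billed for
# only half its gas; a small refund is applied in full.
```

So a 1,000,000-gas transaction with plenty of CHI to burn is billed for only 500,000 gas, while one burning enough CHI for a 100,000-gas refund is billed for 900,000.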

Other Chains 

So how about other chains besides Ethereum? 

There is no doubt that the recent period of high transaction fees on Ethereum resulted in a few other chains capturing a meaningful amount of users and volume.

At this point, it’s hard to say how much of this will be a short-term play versus a longer-term user acquisition. 

Having said this, we have to keep in mind that some of these chains are not fully decentralized and permissionless. This basically creates a fake DeFi ecosystem that may be fun to play with but is actually not that much different from using a centralized exchange. 

So what do you think about gas and high transaction fees? What is your favourite way of lowering them? 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

]]>
https://finematics.com/what-is-gas-ethereum-high-transaction-fees-explained/feed/ 0
Derivatives in DeFi Explained https://finematics.com/derivatives-in-defi-explained/?utm_source=rss&utm_medium=rss&utm_campaign=derivatives-in-defi-explained&utm_source=rss&utm_medium=rss&utm_campaign=derivatives-in-defi-explained https://finematics.com/derivatives-in-defi-explained/#respond Sat, 30 Jan 2021 20:53:49 +0000 https://finematics.com/?p=1233

So what are derivatives? Why are they important? And what are some of the most popular derivatives protocols in DeFi? You’ll find answers to these questions in this article.

Derivatives  

Derivatives are one of the key elements of any mature financial system. As the name suggests, derivatives derive their value from something else. This “something” is usually the price of another underlying financial asset such as a stock, a bond, a commodity, an interest rate, a currency or a cryptocurrency. Some of the most commonly used derivatives are forwards, futures, options and swaps. 

There are two main use cases for derivatives: hedging and speculation. Hedging allows for managing financial risks. To understand hedging a bit better let’s revisit one of the commonly used examples.

Imagine a farmer who primarily focuses on growing wheat. The wheat price can fluctuate throughout the year depending on current supply and demand. As the farmer plants wheat, they are committed to it for the entire growing season, which exposes them to a big risk if the price of wheat is low when harvest time comes. 

To mitigate this risk, the farmer will sell wheat futures contracts short for the amount that they expect to harvest. As harvest approaches, the farmer will close their position and incur a profit or a loss depending on the price of wheat. 

If the price of wheat is lower than initially anticipated the short position makes a profit that offsets the loss from selling the actual wheat. 

If the price of wheat is higher, the short position will be at a loss but the profit from selling the wheat offsets that loss. 

What is important to understand is that no matter what happens to the wheat price the farmer will end up with a predictable income. 
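A quick worked example makes this concrete. The prices and quantities below are made up for illustration; the point is that the short futures leg pins the farmer’s total income to the futures entry price regardless of where spot ends up.

```python
# Worked example of the farmer's hedge: a short futures position
# offsets moves in the wheat price, locking in a predictable income.

def hedged_income(spot_at_harvest: float, futures_entry_price: float,
                  tonnes: float) -> float:
    sale = spot_at_harvest * tonnes  # selling the actual wheat
    # P&L of a short futures position: profits when the price falls.
    futures_pnl = (futures_entry_price - spot_at_harvest) * tonnes
    return sale + futures_pnl

# Whatever the spot price does, income is pinned to the futures price:
low = hedged_income(150, 200, 100)   # wheat price fell to 150
high = hedged_income(250, 200, 100)  # wheat price rose to 250
```

Both scenarios yield exactly 200 × 100 = 20,000: the futures loss in the high-price case is offset by the better wheat sale, and vice versa.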

To stay in the agricultural world, yield farmers in decentralized finance can also use hedging to offset a potential loss that can occur if the price of one of the tokens used for yield farming loses value in relation to another token. This can happen, for example, while providing liquidity to an automated market maker like Uniswap and is known as impermanent loss.

Besides our agricultural examples, derivatives allow other crypto companies to hedge their exposure to different cryptocurrencies and run more predictable businesses.  

The other popular use case for derivatives is speculation. 

In a lot of financial instruments, including derivatives, speculation can represent a significant amount of traded volume. This is because derivatives offer easy exposure to particular assets that may otherwise be hard to access – for example, trading oil futures instead of actual barrels of oil. They can also provide easy access to leverage – a trader can purchase a call or a put option by providing only enough funds to cover the option premium and gain exposure to a significant amount of the underlying asset. 

Speculators are important market participants as they provide liquidity to the market and allow people, who actually need to buy a particular derivative to hedge their risk, to easily enter and exit the market. 

Derivatives have a long and interesting history. From clay tokens representing commodities traded by the Sumerians, through the use of “fair letters” to buy and sell agricultural commodities in Medieval Europe, to the establishment of the Chicago Board of Trade (CBOT) in 1848 – one of the world’s oldest futures and options exchanges.

When it comes to more modern times, derivatives have been one of the major forces that drive the whole financial industry forward since the 1970s. 

The total market size of all derivatives is estimated to be as high as $1 quadrillion which completely dwarfs any other market including the stock or bond markets and of course the tiny cryptocurrency market that has just recently touched the $1 trillion mark. 

Every growing market naturally develops its own derivatives market that can end up being an order of magnitude bigger than its underlying market.   

This is also why a lot of people in the decentralized finance space are extremely bullish on the potential of decentralized derivatives that, in contrast to traditional finance, can be created by anyone in a completely permissionless and open way. This in turn increases the rate of innovation that has been stagnating in traditional finance already for a while. 

Now, as we know a bit more about derivatives, let’s jump into some of the most important derivatives protocols in DeFi.

Synthetix 

Synthetix is usually the first protocol that comes to our minds when talking about derivatives in DeFi. 

Synthetix allows for creating synthetic assets that track the price of their underlying assets. The protocol currently supports synthetic fiat currencies, cryptocurrencies and commodities that can be traded on trading platforms such as Kwenta, DHedge or Paraswap. 

The Synthetix model is based on a debt pool. In order to issue a particular synthetic asset, a user has to provide collateral in the form of the SNX token. 

The protocol is highly overcollateralized – currently at 500%. This means that for each $500 of SNX locked in the system, only $100 worth of synthetic assets can be issued. This is mainly to absorb sharp price changes in synthetic assets, and the collateralization ratio will most likely be lowered in the future. 
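The 500% collateralization rule from the paragraph above is easy to express directly. This is a simplified sketch of the rule as described, not Synthetix’s actual contract logic:

```python
# Sketch of the overcollateralization rule: with a 500% collateralization
# ratio, $500 of staked SNX supports at most $100 of synthetic assets.

def max_issuable(snx_collateral_usd: float, c_ratio: float = 5.0) -> float:
    """Maximum dollar value of synths that can be issued."""
    return snx_collateral_usd / c_ratio

def is_safely_collateralized(snx_collateral_usd: float, debt_usd: float,
                             c_ratio: float = 5.0) -> bool:
    """True while the debt is backed by at least c_ratio times its value."""
    return debt_usd * c_ratio <= snx_collateral_usd
```

So $500 of locked SNX allows issuing $100 of synths; if the SNX collateral then drops to $400, the same $100 position falls below the target ratio and the staker has to burn synths or add collateral.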

Synthetix is also one of the first DeFi projects leading the effort of moving to layer 2 in order to lower the gas fees and make the protocol more scalable.

There is currently around $1.8B locked in the Synthetix protocol – the biggest number across all DeFi derivatives protocols by a wide margin. 

UMA

UMA is another protocol that enables the creation of synthetic assets. 

The main difference here is that UMA, instead of highly overcollateralizing the protocol, relies on liquidators, who are financially incentivised, to find improperly collateralized positions and liquidate them. 

UMA’s model allows for creating “priceless” derivatives. This is because the model doesn’t rely on price oracles – at least not in the optimistic scenario. This in turn allows for adding a very long tail of synthetic assets that otherwise wouldn’t have a reliable price feed and hence couldn’t be created on Synthetix. 

There is currently over $63M of total value locked in UMA’s smart contracts. 

Hegic

Hegic is a relatively new DeFi project that allows for trading options in a non-custodial and permissionless way. 

Users can buy put or call options on ETH and WBTC. They can also become liquidity providers and sell ETH call and put options.

Three months after the launch, Hegic had almost $100M in total value locked in the protocol, a total cumulative options trading volume of ~$168M and generated over $3.5M in fees. 

Interestingly, Hegic has been developed by a single anonymous developer which again shows the power of DeFi where, in contrast to traditional finance, even a single person or a small group of people can build a useful financial product. 

Opyn

Another DeFi project that allows for trading options is Opyn.

Opyn, launched in early 2020, started from offering ETH downside and upside protection which allowed users to hedge against ETH price movements, flash crashes, and volatility.

They’ve recently launched a V2 of the protocol that offers European, cash-settled options that auto-exercise upon expiry.  

There are two main option styles: European and American.

European options can only be exercised at the time of expiration whereas American options can be exercised at any time up to the expiration date. 

In contrast to Opyn, Hegic uses American style options. 

The Opyn protocol auto-exercises options that are in the money, so option holders don’t need to take any action at or before the expiration date.
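To illustrate the cash-settled, auto-exercising European style described above, here is a minimal sketch of the general mechanism – the payoff of a call at expiry and the no-action-needed settlement. This models the concept, not Opyn’s actual contracts, and the prices are made up:

```python
# Illustrative payoff of a cash-settled European call option at expiry.

def call_payoff_at_expiry(spot: float, strike: float) -> float:
    """In the money if spot > strike; otherwise the option expires worthless."""
    return max(spot - strike, 0)

def auto_exercise(spot: float, strike: float):
    """Auto-exercise at expiry: the holder receives the payoff (if any)
    without having to take any action."""
    payoff = call_payoff_at_expiry(spot, strike)
    return ("exercised", payoff) if payoff > 0 else ("expired worthless", 0)
```

An American-style option (as used by Hegic) would instead allow the holder to claim `max(spot - strike, 0)` at any moment before expiry, not just at the expiration date.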

Since its first release, the protocol had over $100M in traded volume.

Perp

Perpetual is yet another fairly new entrant into the decentralized derivatives space. 

As the name suggests Perpetual allows for trading perpetual contracts. A perpetual contract is a popular trading product in the cryptocurrency space used by well-known centralized platforms such as Bitmex, Binance and Bybit. It is a derivative financial contract with no expiration or settlement date, hence it can be held and traded for an indefinite amount of time.

Perpetual Protocol, at the moment, allows for trading ETH, BTC, YFI, DOT and SNX.

Trades are funded and settled in USDC – a popular stablecoin in the DeFi space. 

All trades on Perpetual Protocol are processed on the xDai chain – an Ethereum sidechain used as a scaling solution. This allows for incredibly low gas fees, which are currently subsidised by the protocol. 

This means that currently there are no gas fees while trading on Perpetual Protocol. Paying the gas fee is only required when depositing USDC onto the platform. 

The protocol has been live for only just over a month but it has already managed to achieve over $500M in volume and $500k in trading fees. 

dYdX

dYdX is a decentralized derivatives exchange that offers spot, margin and more recently – perpetuals trading.

The dYdX architecture combines non-custodial, on-chain settlement with a low-latency, off-chain matching engine with order books.

Besides that, the dYdX team has been building a new product for perpetual contracts on Layer 2, powered by StarkWare’s ZK rollups, that is due to launch in early 2021.

The total cumulative trade volume across all products on dYdX reached $2.5 billion in 2020, a 40x increase when compared to the previous year. 

dYdX has recently raised $10M in a Series B round led by Three Arrows Capital and DeFiance Capital.

BarnBridge

BarnBridge is a risk tokenizing protocol that allows for hedging yield sensitivity and price volatility. 

This can be achieved by accessing debt pools of other defi protocols, and transforming single pools into multiple assets with different risk/return characteristics. 

BarnBridge, at the moment, offers two products: 

Smart Yield Bonds: interest rate volatility risk mitigation using debt based derivatives

And Smart Alpha Bonds: market price exposure risk mitigation using tranched volatility derivatives. 

There is currently over $350M of total value locked in the protocol. 

BarnBridge is also running a liquidity mining program that distributes its token – BOND – to all users who stake stable coins, Uniswap BOND-USDC LP tokens or BOND tokens on their platform. 

Summary

As we mentioned earlier, the Derivatives Market in traditional finance is huge and it will be interesting to see how big it will become in decentralized finance. 

It is also amazing to see more and more projects launching derivatives protocols and being able to create new and exciting financial products in a permissionless and decentralized way. 

One more important thing – interacting with new DeFi protocols can be risky. So before using any of the protocols mentioned in this article always do your own due diligence as most of these projects are still in their beta or even alpha versions.

So what do you think about derivatives in DeFi? How big will they become in the future? Would you like to see a deep dive into one of the projects we mentioned in this article?

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

]]>
https://finematics.com/derivatives-in-defi-explained/feed/ 0
The Graph – Google Of Blockchains? https://finematics.com/the-graph-explained/?utm_source=rss&utm_medium=rss&utm_campaign=the-graph-explained&utm_source=rss&utm_medium=rss&utm_campaign=the-graph-explained https://finematics.com/the-graph-explained/#respond Wed, 13 Jan 2021 17:12:49 +0000 https://finematics.com/?p=1206

So what is The Graph Protocol all about? Why do some people call it the Google of Blockchains? And what is the use case for the GRT token? You’ll find answers to these questions in this article. 

Let’s start with what The Graph actually is. 

Introduction

The Graph is an indexing protocol for querying blockchain data that enables the creation of fully decentralized applications. 

The project was started in late 2017 by a trio of software engineers who were frustrated by the lack of tooling in the Ethereum ecosystem which made building decentralized applications hard. After a few years of work and a lot of iterations, The Graph went live in Dec 2020.

The Graph, as one of the infrastructure protocols, can be quite tricky to grasp, so before we jump into the details, let’s try to understand what indexing – the main concept behind The Graph – actually is.

Indexing

Indexing, in essence, allows for reducing the time required to find a particular piece of information. A real-life example is an index in a book. Instead of going through the whole book page by page to find a concept we’re looking for, we can find it much quicker in the index, which is sorted alphabetically and contains references to the actual pages in the book. 

Similarly, in computer science, database indexes are used to achieve the same goal – cutting the search time. Instead of scanning the whole database table multiple times to provide data to an SQL query – indexes can dramatically speed up queries by providing quick access to relevant rows in a table. 

When it comes to blockchains such as Ethereum, indexing is super important. To understand why this is the case, let’s see how a typical blockchain is built. 

A typical blockchain consists of blocks that contain transactions. Blocks are connected to their adjacent blocks and provide a linear immutable history of what happened on the blockchain to date. 

Because of this design, a naive approach to searching for a particular piece of data, such as a transaction, would be to start with Block 1 and look through all the transactions in that block. If the data is not found, we move on to Block 2 and continue our search. 

As you can imagine this process would be highly inefficient. This is also why every popular blockchain explorer, such as Etherscan, built their own service for reading all the data on the blockchain and storing it in a database in a way that allows for quick retrieval of data.
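The difference between the naive scan and an ingestion service’s index can be shown with a toy model – here “blocks” are just lists of transaction hashes, and the hash values are invented for illustration:

```python
# Toy comparison of the naive block-by-block scan with a prebuilt index.

chain = [
    ["0xaaa", "0xbbb"],  # Block 1
    ["0xccc"],           # Block 2
    ["0xddd", "0xeee"],  # Block 3
]

def naive_find(tx_hash):
    """Scan every block in order - O(total transactions) per lookup."""
    for block_number, block in enumerate(chain, start=1):
        if tx_hash in block:
            return block_number
    return None

# An ingestion service reads the chain once and builds an index; after
# that, every lookup is a single O(1) dictionary access.
index = {tx: n for n, block in enumerate(chain, start=1) for tx in block}
```

On three blocks the difference is invisible, but on millions of blocks the one-off cost of building the index pays for itself on the very first query.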

These kinds of services are very often called ingestion services as they basically consume all the data and transform it into a queryable format. 

Although this approach usually works fine, it requires trusting the company that provides the data – this is not ideal for building fully decentralized and permissionless applications. 

On top of that, all private crypto companies that don’t want to trust other APIs have to build their own ingestion service which creates a lot of redundant work. 

This is also why a decentralized query protocol for blockchains was needed and this is where The Graph comes into play. 

The Graph 

The Graph aims at becoming one of the main core infrastructure projects necessary for building fully decentralized applications. It focuses on decentralizing the query and API layer of decentralized web (Web3) by removing a tradeoff that dApp developers have to make today: whether to build an app that is performant or truly decentralized.

The protocol allows for querying different networks such as Ethereum or IPFS by using a query language – GraphQL. GraphQL allows for specifying which fields we’re interested in and what search criteria we would like to apply. 

Queryable data is organised in the form of subgraphs. One decentralized application can make use of one or multiple subgraphs. One subgraph can also consist of other subgraphs and provide a consolidated view of data that the application may be interested in. 

The Graph provides an explorer that makes it easy to find subgraphs of the most popular protocols such as Uniswap, Compound, Balancer or ENS. 

The Uniswap subgraph provides access to a lot of useful data, for example, the total volume across all trading pairs since the protocol launched, volume data per trading pair, and data about particular tokens or transactions. 
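This is the kind of GraphQL query a Consumer might send to a subgraph. The field names below follow the Uniswap V2 subgraph schema at the time of writing and may differ in newer versions; the snippet only builds the JSON request body that a GraphQL endpoint expects, without hitting the network:

```python
# Example query for the Uniswap subgraph: total protocol volume plus the
# three highest-volume trading pairs. Field names are based on the
# Uniswap V2 subgraph schema and may change between versions.
import json

QUERY = """
{
  uniswapFactories(first: 1) {
    totalVolumeUSD
  }
  pairs(first: 3, orderBy: volumeUSD, orderDirection: desc) {
    token0 { symbol }
    token1 { symbol }
    volumeUSD
  }
}
"""

def build_request_body(query: str) -> str:
    """GraphQL endpoints expect a JSON body with a 'query' field."""
    return json.dumps({"query": query})
```

The resulting body would be POSTed to the subgraph’s HTTP endpoint, and the response comes back as JSON shaped exactly like the query – one of the reasons GraphQL is a good fit for this use case.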

Now, let’s jump into the architecture of The Graph Protocol. 

The Graph Architecture

The easiest way to explain this is to focus on different network participants first. 

Let’s start with Indexers. 

Indexers are the node operators of The Graph. They can join the network by staking the GRT tokens and running a Graph node. Their main function is to index relevant subgraphs. Indexers earn rewards for indexing subgraphs and fees for serving queries on those subgraphs. They also set prices for their services. To keep prices in check each Indexer competes with other Indexers, on top of ensuring the highest quality of their data. This basically creates a marketplace for the services provided by Indexers. 

Consumers query Indexers and pay them for providing data from different subgraphs. Consumers can be either end-users, other web services or middleware.

Curators are other important network participants. They use their GRT tokens to signal what subgraphs are worth indexing. Curators can be either developers that want to make sure their subgraph is indexed by Indexers or end-users that find a particular subgraph valuable and worth indexing. Curators are financially incentivised as they receive rewards that are proportional to how popular a particular subgraph becomes. 

Delegators are yet another network participant. They stake their GRT on behalf of Indexers in order to earn a portion of Indexers’ rewards and fees. Delegators don’t have to run a Graph Node. 

Last but not least are Fishermen and Arbitrators. They become useful in case of a dispute that can happen, for example, when an Indexer provides incorrect data to the Consumer. 

Now, let’s see how the network participants cooperate in order to create a trustless and decentralized system. 

Let’s say a new decentralized exchange has launched and the team behind the project wants to give other applications easy access to the exchange’s historical volume and other data points.

To encourage Indexers to index the new subgraph, a Curator has to step in and signal that the new subgraph is worth indexing. 

Here we have two options. If the new exchange is a highly anticipated project with a lot of potential, an existing Curator would most likely step in and use their GRT tokens to signal the usefulness of the new subgraph. If the subgraph becomes popular, the Curator benefits financially from their signalling. If the new exchange is not highly anticipated, the developers behind the project can become Curators themselves and use their GRT to encourage Indexers. 

Once this happens, the Indexers can step in and start indexing the subgraph. This process can take a few hours or even a few days depending on how much data has to be indexed. 

Once indexing is completed, the Consumers can start querying the subgraph. Each query issued by the consumers requires payment in GRT that is handled by the query engine. The query engine also acts as a trading engine, making decisions such as which Indexers to do business with. 

To make this process smoother, The Graph uses payment channels between the Consumer and the Indexer. If the Indexer provides incorrect results a dispute process can be initiated. 

If you’d like to dive deeper into the architecture behind The Graph protocol, you can check this link here.

Now, time to discuss the GRT token. 

The GRT Token

GRT is a utility token that plays an important role in The Graph Network design. As we mentioned earlier, GRT is used by Curators to signal subgraphs that are worth indexing. On top of this, it’s staked by Indexers to keep their incentives in check. Besides that, people who own GRT tokens but don’t want to be Indexers and run a Graph node can become Delegators and earn a portion of the Indexers’ rewards. And also, Consumers pay for their queries in GRT.

The Graph had an initial supply of 10 billion GRT tokens, with new token issuance of 3% annually that is used for paying the indexing rewards.

There is also a token burning mechanism that is expected to start at ~1% of total protocol query fees. 
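Putting the two mechanisms above together gives a back-of-the-envelope supply projection. The 10B initial supply, 3% issuance and 1%-of-fees burn come from the paragraphs above; the query-fee volume is a made-up assumption for illustration:

```python
# Back-of-the-envelope GRT supply projection: 10B initial supply,
# ~3% annual issuance for indexing rewards, and burning of ~1% of
# query fees. Query-fee volume is an assumed, illustrative input.

def project_supply(initial=10_000_000_000, issuance=0.03,
                   annual_query_fees_grt=0.0, burn_rate=0.01, years=1):
    supply = initial
    for _ in range(years):
        supply += supply * issuance                  # indexing rewards
        supply -= annual_query_fees_grt * burn_rate  # fee burn
    return supply

# With negligible query fees, supply simply grows ~3% per year:
after_one_year = project_supply(years=1)
```

This also shows why the burn matters: issuance is deflation-neutral only once query-fee volume grows large enough that the burned amount offsets the new tokens minted for Indexers.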

The Graph protocol had a huge interest from VCs, with plenty of big names including Coinbase Ventures participating in their initial offering. 

Future

The Graph core team aims at decentralizing the protocol further by launching on-chain governance – The Graph Council – in the future.

The protocol, currently deployed to the Ethereum mainnet, only supports indexing Ethereum, but multi-blockchain support is one of the areas for further research. 

The Graph is already used by other popular projects such as Uniswap, Synthetix, Decentraland and Aragon. 

It looks like The Graph could be one of the missing puzzles in the effort of increasing the decentralization of dApps. 

Some people went as far as calling The Graph the Google Of Blockchains, pointing at similarities between indexing websites by Google and indexing blockchains and decentralized applications by The Graph. 

If this analogy is correct, and The Graph indeed becomes a go-to protocol for indexing web3, it has a lot of potential to grow. 

So what do you think about The Graph? Will it become a core piece of infrastructure in the decentralized world? 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

]]>
https://finematics.com/the-graph-explained/feed/ 0