Defi – Finematics (https://finematics.com) – decentralized finance education

Rollups – The Ultimate Ethereum Scaling Solution
Published Mon, 02 Aug 2021

So what are rollups all about? What’s the difference between optimistic and ZK rollups? How is Arbitrum different from Optimism? And why are rollups considered to be the holy grail when it comes to scaling Ethereum? You’ll find answers to these questions in this article.

Intro

Ethereum scaling has been one of the most discussed topics in crypto. The scaling debate usually heats up during periods of high network activity such as the CryptoKitties craze in 2017, DeFi Summer of 2020 or the crypto bull market at the beginning of 2021. 

During these periods, the unparalleled demand for the Ethereum network resulted in extremely high gas fees making it expensive for everyday users to pay for their transactions. 

To tackle this problem, the search for the ultimate scaling solution has been one of the top priorities for multiple teams and the Ethereum community as a whole. 

In general, there are three main ways to scale Ethereum (and, in fact, most other blockchains): scaling the blockchain itself – layer 1 scaling; building on top of layer 1 – layer 2 scaling; and building alongside layer 1 – sidechains. 

When it comes to layer 1, Eth2 is the chosen solution for scaling the Ethereum blockchain. Eth2 refers to a set of interconnected changes such as the migration to Proof-of-Stake (PoS), merging the state of the Proof-of-Work (PoW) blockchain into the new PoS chain and sharding. 

Sharding, in particular, can dramatically increase the throughput of the Ethereum network, especially when combined with rollups. 

If you’d like to learn more about Eth2 you can check out this article here.   

When it comes to scaling outside of layer 1, multiple different scaling solutions have been tried with some mixed results. 

On the one hand, we have layer 2 solutions such as Channels that are fully secured by Ethereum but work well only for a specific set of applications. 

Sidechains, on the other hand, are usually EVM-compatible and can scale general-purpose applications. Their main drawback is that they are less secure than layer 2 solutions: instead of relying on the security of Ethereum, they have their own consensus models. 

Most rollups aim to achieve the best of both worlds by creating a general-purpose scaling solution that still fully relies on the security of Ethereum.

This is the holy grail of scaling as it allows for deploying all of the existing smart contracts present on Ethereum to a rollup with little or no changes while not sacrificing security. 

No wonder rollups are probably the most anticipated scaling solution of them all. 

But what are rollups in the first place? 

Rollups 

A rollup is a type of scaling solution that works by executing transactions outside of Layer 1 but posting transaction data on Layer 1. This allows the rollup to scale the network and still derive its security from the Ethereum consensus. 

Moving computation off-chain allows many more transactions to be processed in total, as only some of the data of each rollup transaction has to fit into Ethereum blocks. 

To achieve this, rollup transactions are executed on a separate chain that can even run a rollup-specific version of the EVM. 

The next step after executing transactions on a rollup is to batch them together and post them to the main Ethereum chain. 

The whole process essentially executes transactions, takes the data, compresses it and rolls it up to the main chain in a single batch, hence the name – a rollup. 
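The batching flow described above can be sketched in a few lines of Python. This is purely illustrative – the function names, the JSON encoding and the use of zlib are assumptions for the sketch, not any real rollup's data format.

```python
# Toy sketch of the rollup flow: execute transactions off-chain, then
# compress the transaction data and commit to the resulting state.
import hashlib
import json
import zlib

def execute_off_chain(transactions, state):
    """Apply simple balance transfers off-chain (the rollup's own execution)."""
    for tx in transactions:
        state[tx["from"]] -= tx["amount"]
        state[tx["to"]] = state.get(tx["to"], 0) + tx["amount"]
    return state

def build_batch(transactions, post_state):
    """Compress the transaction data and commit to the post-batch state."""
    calldata = zlib.compress(json.dumps(transactions).encode())
    state_root = hashlib.sha256(
        json.dumps(post_state, sort_keys=True).encode()
    ).hexdigest()
    return {"calldata": calldata, "state_root": state_root}

txs = [{"from": "alice", "to": "bob", "amount": 5}] * 100
state = execute_off_chain(txs, {"alice": 1000, "bob": 0})
batch = build_batch(txs, state)
# Similar transactions compress well -- that's a big part of the scaling win.
print(len(batch["calldata"]), "bytes posted for", len(txs), "transactions")
```

Only the compressed calldata and the state commitment land on layer 1; the heavy execution stays off-chain.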

Although this looks like a potentially good solution, there is a natural question that comes next: 

“How does Ethereum know that the posted data is valid and wasn’t submitted by a bad actor trying to benefit themselves?” 

The exact answer depends on a specific rollup implementation, but in general, each rollup deploys a set of smart contracts on Layer 1 that are responsible for processing deposits and withdrawals and verifying proofs. 

Proofs are also where the main distinction between different types of rollups comes into play. 

Optimistic rollups use fraud proofs. In contrast, ZK rollups use validity proofs.

Let’s explore these two types of rollups further. 

Optimistic Vs ZK Rollups

Optimistic rollups post data to layer 1 and assume it’s correct, hence the name “optimistic”. If the posted data is valid, we are on the happy path and nothing else has to be done. The optimistic rollup benefits from not having to do any additional work in this scenario.

In the case of an invalid transaction, the system has to be able to identify it, recover the correct state and penalize the party that submitted the transaction. To achieve this, optimistic rollups implement a dispute resolution system that can verify fraud proofs, detect fraudulent transactions and disincentivize bad actors from submitting further invalid transactions or incorrect fraud proofs. 

In most of the optimistic rollup implementations, the party that is able to submit batches of transactions to layer 1 has to provide a bond, usually in the form of ETH. Any other network participant can submit a fraud proof if they spot an incorrect transaction. 

After a fraud proof is submitted, the system enters the dispute resolution mode. In this mode, the suspicious transaction is executed again this time on the main Ethereum chain. If the execution proves that the transaction was indeed fraudulent, the party that submitted this transaction is punished, usually by having their bonded ETH slashed. 

To prevent the bad actors from spamming the network with incorrect fraud proofs, the parties wishing to submit fraud proofs usually also have to provide a bond that can be subject to slashing.
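The bonding scheme described above can be sketched as follows. All names and numbers are hypothetical; real implementations differ in detail, but the core incentive is the same: whichever side the layer 1 re-execution proves wrong loses its bond.

```python
# Minimal sketch of optimistic-rollup dispute resolution: the batch
# submitter and the challenger both post a bond; L1 re-executes the
# disputed transaction and slashes the losing party.
def resolve_dispute(claimed_state_root, replay_on_l1, submitter, challenger, bonds):
    """Re-execute the disputed transaction on L1 and slash whoever was wrong."""
    actual_root = replay_on_l1()
    if actual_root != claimed_state_root:
        bonds[submitter] = 0   # fraud confirmed: slash the submitter's bond
        return "fraud proven"
    bonds[challenger] = 0      # false alarm: slash the challenger's bond
    return "challenge rejected"

bonds = {"sequencer": 10, "watcher": 1}  # bonds denominated in ETH (made up)
# The sequencer claimed root "0xbad", but honest re-execution yields "0xgood".
outcome = resolve_dispute("0xbad", lambda: "0xgood", "sequencer", "watcher", bonds)
print(outcome, bonds)
```

Note how the challenger's bond makes spamming bogus fraud proofs costly, which is exactly the disincentive the paragraph above describes.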

In order to be able to execute a rollup transaction on layer 1, optimistic rollups have to implement a system that is able to replay a transaction with the exact state that was present when the transaction was originally executed on the rollup. This is one of the complicated parts of optimistic rollups and is usually achieved by creating a separate manager contract that replaces certain function calls with a state from the rollup. 

It’s worth noting that the system can work as expected and detect fraud even if there is only one honest party that monitors the state of the rollup and submits fraud proofs when needed. 

It’s also worth mentioning that because of the correct incentives within the rollup system, entering the dispute resolution process should be an exceptional situation and not something that happens all the time. 

When it comes to ZK rollups, there is no dispute resolution at all. This is possible thanks to a clever piece of cryptography called Zero-Knowledge proofs, hence the name ZK rollups. In this model, every batch posted to layer 1 includes a cryptographic proof called a ZK-SNARK. The proof can be quickly verified by the layer 1 contract when the transaction batch is submitted, and invalid batches can be rejected straight away. 

Sounds simple, right? Maybe on the surface. In practice, to make it work, multiple researchers spent countless hours iterating on these clever pieces of cryptography and maths. 

There are a few other differences between optimistic and ZK rollups, so let’s go through them one by one. 

Due to the nature of the dispute resolution process, optimistic rollups have to give enough time to all the network participants to submit the fraud proofs before finalizing a transaction on layer 1. This period is usually quite long to make sure that even in the worst-case scenario, fraudulent transactions can still be disputed. 

This causes the withdrawals from optimistic rollups to be quite long as the users have to wait even as much as a week or two to be able to withdraw their funds back to layer 1. 

Fortunately, there are a few projects working to improve this situation by providing fast “liquidity exits”. These projects offer almost instant withdrawals back to layer 1, another layer 2 or even a sidechain, and charge a small fee for the convenience. The Hop protocol and Connext are two projects to look at. 

ZK rollups don’t have the problem of long withdrawals as the funds are available for withdrawals as soon as the rollup batch, together with a validity proof, is submitted to layer 1. 

So far it looks like ZK rollups are just a better version of optimistic rollups, but they also come with a few drawbacks. 

Due to the complexity of the technology, it’s much harder to create an EVM-compatible ZK rollup, which makes it more difficult to scale general-purpose applications without rewriting the application logic. That said, ZKSync is making significant progress in this area and might be able to launch an EVM-compatible ZK rollup quite soon. 

Optimistic rollups have an easier time with EVM compatibility. They still have to run their own version of the EVM with a few modifications, but 99% of contracts can be ported without making any changes. 

ZK rollups are also way more computation-heavy than optimistic rollups. This means that nodes that compute ZK proofs have to be high-spec machines, making it hard for other users to run them. 

When it comes to scaling improvements, both types of rollups should be able to scale Ethereum from around 15 to 45 transactions per second (depending on the transaction type) up to as many as 1000-4000 transactions per second.

It’s worth noting that it is possible to process even more transactions per second by offering more space for the rollup batches on layer 1. This is also why Eth2 can create a massive synergy with rollups as it increases the possible data availability space by creating multiple shards – each one of them able to store a significant amount of data. The combination of Eth2 and rollups could bring Ethereum transaction speed up to as many as 100k transactions per second. 
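The throughput figures above can be reproduced with some back-of-the-envelope arithmetic. The constants below (bytes per compressed transaction, batch space per block, shard count and data per shard) are rough assumptions for illustration, not protocol parameters.

```python
# Rough throughput arithmetic behind the TPS numbers quoted above.
BLOCK_TIME = 13          # seconds per Ethereum block (roughly, pre-merge)
BYTES_PER_TX = 12        # compressed rollup transaction (optimistic estimate)
BATCH_SPACE = 120_000    # bytes of calldata a rollup might use per block (assumed)

tps_single_chain = BATCH_SPACE / BYTES_PER_TX / BLOCK_TIME
print(f"~{tps_single_chain:.0f} TPS with today's data availability")

SHARDS = 64              # planned Eth2 data shards
DATA_PER_SHARD = 250_000 # bytes per shard per slot (rough assumption)
SLOT_TIME = 12           # seconds per slot

tps_sharded = SHARDS * DATA_PER_SHARD / BYTES_PER_TX / SLOT_TIME
print(f"~{tps_sharded:.0f} TPS with sharded data availability")
```

With these assumptions the sharded figure lands in the ~100k TPS ballpark mentioned above; tweaking the per-transaction size moves the result proportionally.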

Now, let’s talk about all the different projects working on both optimistic and ZK rollups.

Optimistic Rollups 

Optimism and Arbitrum are currently the most popular options when it comes to optimistic rollups. 

Optimism has been partially rolled out to the Ethereum mainnet with a limited set of partners such as Synthetix or Uniswap to ensure that the technology works as expected before the full launch. 

Arbitrum already deployed its version to the mainnet and started onboarding different projects into its ecosystem. Instead of allowing only limited partners to be able to deploy their protocols first, they decided to give a window of time for all protocols that want to launch on their rollups. When this period of time is finished, they will open the flood gates to all the users in one go. 

Some of the most notable projects launching on Arbitrum are Uniswap, Sushi, Bancor, Augur, Chainlink, Aave and many more. 

Arbitrum has also recently announced its partnership with Reddit. They’ll be focusing on launching a separate rollup chain that will allow Reddit to scale their reward system. 

Optimism is partnering with MakerDAO to create the Optimism Dai Bridge and enable fast withdrawals of DAI and other tokens back to layer 1. 

Although both Arbitrum and Optimism try to achieve the same goal – building an EVM-compatible optimistic rollup solution – there are a few differences in their design.

Arbitrum has a different dispute resolution model. Instead of rerunning a whole transaction on layer 1 to verify a fraud proof, it uses an interactive multi-round model that narrows down the scope of the dispute so that potentially only a few instructions have to be executed on layer 1 to check whether a suspicious transaction is valid. 

This also resulted in a nice side effect where smart contracts deployed on Arbitrum can be larger than the maximum allowed contract size on Ethereum. 
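The core of the multi-round idea can be sketched as a binary search over the execution trace: the two parties repeatedly bisect the trace until they disagree on a single step, and only that step needs to be checked on-chain. This is purely illustrative; Arbitrum's real challenge protocol is far more involved.

```python
# Toy bisection over an execution trace: find the first step where the
# honest trace and the fraudulent claim diverge, so L1 only has to
# re-execute that single step instead of the whole transaction.
def bisect_dispute(honest_trace, claimed_trace):
    """Binary-search for the first execution step where the traces diverge."""
    lo, hi = 0, len(honest_trace) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if honest_trace[mid] == claimed_trace[mid]:
            lo = mid + 1   # agreement so far; the divergence is later
        else:
            hi = mid       # divergence is at mid or earlier
    return lo  # the single step L1 actually has to execute and check

honest = [0, 1, 2, 3, 4, 5, 6, 7]
claimed = [0, 1, 2, 3, 9, 9, 9, 9]   # fraud introduced at step 4
print("disputed step:", bisect_dispute(honest, claimed))
```

For a trace of n steps, the dispute converges in O(log n) rounds, which is what makes executing only "a few instructions" on layer 1 possible.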

Another major difference is the approach to handling transaction ordering and MEV.

Arbitrum will be initially running a sequencer that is responsible for ordering transactions, but they want to decentralize it in the long run.

Optimism prefers another approach where the ordering of transactions, and hence the MEV, can be auctioned off to other parties for a certain period of time.  

It’s also worth mentioning a few other projects working on optimistic rollups – Fuel, the OMG team with OMGX, and Cartesi, to name a few. Most of them are also working on EVM-compatible versions of their rollups. 

ZK Rollups

Although it looks like the Ethereum community is mostly focusing on optimistic rollups, at least in the short run, let’s not forget that the projects working on ZK rollups are also progressing extremely quickly. 

With ZK rollups we have a few options available.

Loopring uses ZK rollup technology to scale its exchange and payment protocol. 

Hermez and ZKTube are working on scaling payments using ZK rollups, with Hermez also building an EVM-compatible ZK rollup. 

Aztec is focusing on bringing privacy features to their ZK rollup technology. 

StarkWare-based rollups are already extensively used by projects such as DeversiFi, Immutable X and dYdX. 

As we mentioned earlier, ZKSync is working on an EVM-compatible virtual machine that will be able to fully support arbitrary smart contracts written in Solidity. 

Summary

As we can see, there are a lot of things going on in both the optimistic and the ZK rollup camps and the competition between different rollups will be interesting to watch. 

Rollups should also have a big impact on DeFi. Users who were previously not able to transact on Ethereum due to high transaction fees will be able to stay in the ecosystem the next time the network activity is high. They will also enable a new breed of applications that require cheaper transactions and faster confirmation time. All of this while being fully secured by the Ethereum consensus. It looks like rollups may trigger another high growth period for DeFi. 

There are however a few challenges when it comes to rollups. 

Composability is one of them. In order to compose a transaction that uses multiple protocols, all of them would have to be deployed on the same rollup. 

Another challenge is fragmented liquidity. For example, without new money coming into the Ethereum ecosystem as a whole, the existing layer 1 liquidity in protocols such as Uniswap or Aave will be split between layer 1 and multiple rollup implementations. Lower liquidity usually means higher slippage and worse trade execution. 

This also means that naturally there will be winners and losers. At the moment, the existing Ethereum ecosystem is not big enough to make use of all scaling solutions. This may and probably will change in the long run, but in the short run, we may see some of the rollups, and other scaling solutions, becoming ghost towns. 

In the future, we may also see users living entirely within one rollup ecosystem and not interacting with the main Ethereum chain and other scaling solutions for long periods of time. This could be particularly visible if we’re going to see more centralized exchanges enabling direct deposits and withdrawals to and from rollups. 

Nevertheless, rollups seem like the ultimate strategy for scaling Ethereum, and these challenges will most likely be mitigated in one way or another. It will be interesting to see how rollups gain more and more user adoption. 

One question that comes up very often when discussing rollups is if they are a threat to sidechains. Personally, I think that sidechains will still have their place in the Ethereum ecosystem. This is because, although the cost of transactions on Layer 2 will be much lower than on Layer 1, it will most likely still be high enough to price out certain types of applications such as games and other high volume apps. 

This may change when Ethereum introduces sharding, but by then sidechains may create enough network effect to survive long term. It will be interesting to see how this plays out in the future. 

Also, the fees on rollups are higher than on sidechains because each rollup batch still has to pay for the Ethereum block space. 

It’s worth remembering that the Ethereum community puts a huge focus on rollups in the Ethereum scaling strategy – at least in the short to mid-term and potentially even longer. I recommend reading Vitalik Buterin’s post on a rollup-centric Ethereum roadmap.

So what do you think about rollups? What are your favourite rollup technologies? 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

Bank Run in DeFi – Iron Finance Fiasco Explained
Published Thu, 24 Jun 2021

So what was the first large scale bank run in DeFi all about? Why is it so hard to create a working algorithmic stablecoin? And what can we learn from the IronFinance fiasco? You’ll find answers to these questions in this article. 

Algorithmic Stablecoins 

IronFinance initially launched on Binance Smart Chain in March 2021 and aimed at creating an ecosystem for a partially collateralized algorithmic stablecoin. 

As we know, building algorithmic stablecoins is hard. Most projects either completely fail or end up in a no man’s land by struggling to maintain their peg to the US Dollar. Because of this, building an algorithmic stablecoin has become one of the holy grails in DeFi. 

Achieving it would clearly revolutionize the DeFi space as we know it today. 

The current ecosystem relies heavily on stablecoins that come with major trade-offs. They maintain their peg to the US Dollar at the cost of either centralization or capital inefficiency. 

For example, the custody of USDC or USDT is fully centralized. On the flip side, stablecoins like DAI or RAI require a lot of collateral which makes them capital inefficient.  

IronFinance tried to address these problems by creating a partially collateralized stablecoin – IRON. 

IronFinance 

Despite a few hiccups along the road, such as short periods of time when IRON unpegged from USD or when ValueDeFi exploits affected some of the IronFinance users, the protocol kept marching forward. 

In retrospect, recovering from these issues most likely built a false level of confidence in the protocol design as its users thought they were dealing with a “battle-tested” project.

In May 2021 IronFinance expanded to Polygon and started gaining more and more traction. 

Total value locked in the protocol quickly went from millions to billions of dollars, surpassing $2 billion before the final collapse. The value of TITAN – the protocol’s native token on Polygon – went from $10 to $64 in just the week leading up to the bank run. 

This parabolic growth was mostly driven by extremely high yield farming rewards and the subsequent high demand for both the TITAN and the IRON tokens. Yield farmers were able to earn around 500% APR on stablecoin pairs such as IRON/USDC and around 1700% APR on more volatile pairs like TITAN/MATIC.  

To add even more fuel to this parabolic growth, IronFinance was mentioned by a famous investor – Mark Cuban – in his blog post. This further legitimised the project and brought even more attention to it. 

On the 16th of June 2021, the protocol experienced a massive bank run that crashed the TITAN price to 0 and resulted in thousands of people experiencing major financial losses.

Before we start unfolding all of the events that led to the collapse of IronFinance, let’s try to understand how the protocol was built.

It’s worth noting that reviewing the design of projects, including the ones that failed, is important as it allows us to better understand what works and what doesn’t work in DeFi. It also makes it easier to assess new protocols that very often reuse a lot of elements of the already existing systems. 

Protocol Design 

The IronFinance protocol was designed around 3 types of tokens: 

  • IRON – its own partially collateralized stablecoin, designed to maintain a soft peg to the US Dollar,
  • its own token – TITAN on Polygon and STEEL on BSC,
  • an established stablecoin used as collateral – USDC on Polygon and BUSD on BSC.

The combination of USDC and TITAN on Polygon (or BUSD and STEEL on BSC) was supposed to allow the protocol to decrease the amount of stablecoin collateral over time, making IRON only partially collateralized and, in turn, more capital efficient. 

Although the protocol used different tokens on Polygon and BSC, it worked analogously on both platforms, so to simplify the rest of this article I’m going to skip the BSC tokens, BUSD and STEEL, in the explanation. 

In order to achieve price stability of the IRON token, the protocol introduced a mechanism for minting and redeeming IRON that relied on market incentives and arbitrageurs. 

Whenever the price of the IRON token was less than $1, anyone could purchase it on the open market and redeem it for approximately $1 worth of value paid in a mix of USDC and TITAN. 

Whenever the price of the IRON token was greater than $1, anyone could mint new IRON tokens for approximately $1 worth of USDC and TITAN and sell the freshly minted IRON tokens on the open market, driving the price of IRON back to $1. 

To understand the process of minting and redeeming better, we have to introduce the concept of Target Collateral Ratio (TCR) and Effective Collateral Ratio (ECR). 

Target Collateral Ratio is used by the minting function to determine the ratio between USDC and TITAN required to mint IRON. 

As an example, let’s say the TCR is at 75%. In this case, 75% of collateral needed to mint IRON would come from USDC and 25% would come from TITAN. 

The protocol started at 100% TCR and gradually lowered the TCR over time. 

TCR can increase or decrease depending on the IRON price. On one hand, if the time-weighted average price of IRON is greater than $1, TCR is lowered. On the other hand, if the time-weighted average price of IRON is less than $1, the TCR is increased. 

Effective Collateral Ratio is used by the redeeming mechanism to determine the ratio between USDC and TITAN when redeeming IRON. ECR is calculated as current USDC collateral divided by the total IRON supply. 

If TCR is lower than ECR, the protocol has excess collateral. On the flip side, if TCR is higher than ECR it means the protocol is undercollateralized. 

As an example, if the ECR is at 75%, each time IRON is redeemed the user would get 75% of their collateral back in USDC and 25% in TITAN. 

What is important is that every time someone mints IRON the TITAN portion of collateral is burned. If someone redeems IRON, new TITAN tokens are minted.
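The mint/redeem mechanics described above can be sketched with some made-up numbers. Prices are in USD; the real protocol also charged fees and used time-weighted oracle prices, both omitted here.

```python
# Sketch of IRON minting and redeeming with a collateral ratio.
def mint_iron(usd_value, tcr, titan_price):
    """Mint IRON: a `tcr` share of the value in USDC, the rest in TITAN (burned)."""
    usdc_in = usd_value * tcr
    titan_burned = usd_value * (1 - tcr) / titan_price
    return usdc_in, titan_burned

def redeem_iron(usd_value, ecr, titan_price):
    """Redeem IRON: an `ecr` share in USDC, the rest in freshly minted TITAN."""
    usdc_out = usd_value * ecr
    titan_minted = usd_value * (1 - ecr) / titan_price
    return usdc_out, titan_minted

# Mint $100 of IRON at a 75% TCR with TITAN trading at $50.
usdc_in, titan_burned = mint_iron(100, tcr=0.75, titan_price=50)
# Redeem $100 of IRON at a 75% ECR at the same TITAN price.
usdc_out, titan_minted = redeem_iron(100, ecr=0.75, titan_price=50)
print(usdc_in, titan_burned)   # USDC locked and TITAN burned on mint
print(usdc_out, titan_minted)  # USDC released and TITAN minted on redeem
```

Notice that the TITAN amounts depend on the reported TITAN price – which is exactly where the stale oracle problem described later comes in.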

As we can see, the whole mechanism, although a bit complicated, should work – at least in theory.  

Now, let’s see how the events leading to the collapse of IronFinance unfolded. 

Events Unfolding

Around 10 am UTC on 16th June 2021, the team behind the protocol noticed that a few larger liquidity providers, a.k.a. “whales”, started removing liquidity from IRON/USDC and then swapping their TITAN for IRON. Instead of redeeming the IRON, they sold it directly for USDC via liquidity pools. This caused the IRON price to unpeg from the US Dollar, which in turn spooked TITAN holders, who started selling their TITAN, causing the token price to drop from around $65 to $30 in approximately 2 hours. The TITAN price later came back to $52 and IRON fully recovered its peg. 

This event, although quite severe, wasn’t that unusual considering that the protocol had a history of native tokens sharply dropping in value and IRON unpegging for a short period of time. 

Later on the same day, a few whales started selling again. This time it was different. The market panicked and users started redeeming IRON and selling their TITAN in masses. Because of the extremely quick and sharp drop in the TITAN price, the time-weighted price oracle used for reporting TITAN prices started reporting stale prices that were still higher than the actual market price of TITAN. 

This created a negative feedback loop, as the price oracle was used to determine the number of TITAN tokens that had to be printed when redeeming IRON. 

Because IRON was trading off-peg at under $1, users could buy IRON for, let’s say, $0.90, redeem it for $0.75 in USDC and $0.25 in TITAN, and sell the TITAN immediately. This created a death spiral that drove the TITAN price to pretty much 0: the lower the TITAN price went, the more TITAN tokens had to be printed to account for the correct amount of redeemed capital. 
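The death spiral can be illustrated with a toy simulation. Every constant here is made up, and the price-impact and oracle-lag models are deliberately crude – the point is only the feedback loop: a lagging oracle price above market makes redemptions print TITAN, and dumping that TITAN pushes the market price further down.

```python
# Toy simulation of the TITAN death spiral: redemptions priced at a stale
# oracle price mint TITAN, which is dumped, dragging the market price down.
def simulate_death_spiral(market_price, oracle_price, supply, rounds,
                          redeemed_usd=1_000_000, ecr=0.75):
    for _ in range(rounds):
        # TITAN owed per redemption is computed at the (stale) oracle price.
        titan_minted = redeemed_usd * (1 - ecr) / oracle_price
        supply += titan_minted
        # Crude price-impact model: dumping the fresh TITAN knocks the
        # market price down in proportion to the supply increase.
        market_price *= 1 / (1 + titan_minted / supply)
        # The oracle lags: it only moves part of the way toward market.
        oracle_price = 0.5 * oracle_price + 0.5 * market_price
    return market_price

final = simulate_death_spiral(market_price=30.0, oracle_price=60.0,
                              supply=1_000_000, rounds=20)
print(f"TITAN market price after 20 rounds: ${final:.4f}")
```

Each round the oracle chases the market down while redemptions keep minting ever more TITAN per dollar, so the price only moves one way.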

The TITAN price hitting almost 0 exposed another flaw in the protocol – users being unable to redeem their IRON tokens. This was later fixed by the team and users were able to recover around $0.75 worth of USDC collateral from their IRON tokens. 

Unfortunately, TITAN holders didn’t get away with “only” a 25% haircut and instead took heavy losses. This also included TITAN liquidity providers.

This is because when one token in a 50/50 liquidity pool goes to 0 the impermanent loss can reach pretty much 100%. Liquidity providers end up losing both tokens in the pool as the non-TITAN token is sold out for TITAN that keeps going lower and lower in value. 
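The constant-product math behind this can be shown directly. Assuming a Uniswap-style x·y = k pool with no fees (a simplification), arbitrageurs rebalance the pool to any new price, and as TITAN tends to zero the pool ends up holding almost nothing but TITAN.

```python
# Why LPs lose both tokens when one side goes to ~0 in an x*y=k pool:
# arbitrage buys the good token out of the pool with the collapsing one.
def pool_after_price_drop(titan_reserve, usdc_reserve, new_titan_price):
    """Rebalance a constant-product pool to a new TITAN price (fees ignored)."""
    k = titan_reserve * usdc_reserve
    # At equilibrium: usdc / titan = price  and  titan * usdc = k,
    # so titan = sqrt(k / price) and usdc = sqrt(k * price).
    new_titan = (k / new_titan_price) ** 0.5
    new_usdc = (k * new_titan_price) ** 0.5
    return new_titan, new_usdc

# Pool starts balanced at TITAN = $50: 10,000 TITAN vs 500,000 USDC.
titan, usdc = pool_after_price_drop(10_000, 500_000, new_titan_price=0.0001)
print(f"{titan:,.0f} TITAN and {usdc:,.2f} USDC left in the pool")
```

At a TITAN price of $0.0001 the pool holds millions of near-worthless TITAN and only a few hundred dollars of USDC – the LP's loss approaches 100% even though USDC itself never moved.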

This situation exposed a major flaw in the IronFinance mechanism that resulted in what we can call the first large scale bank run in DeFi.

Similarly to banks with fractional-reserve systems, where there are not enough funds to cover all depositors at any one time, the IronFinance protocol didn’t have enough collateral to cover all minted IRON. At least not when the TITAN token used as 25% of the overall collateral became worthless in a matter of minutes.   

The IronFinance fiasco also shows us why DeFi protocols shouldn’t fully rely on human coordination, especially when under certain circumstances incentives work against the protocol. In theory, if people just stopped selling TITAN for a short period of time, the system would recover as it had previously done in the past. In practice, most market participants are driven by making a profit and the arbitrage opportunity present in the protocol caused them to fully take advantage of this situation. This is also why all DeFi protocols should always account for the worst-case scenario. 

Lessons Learned 

As with most major protocol failures in DeFi, there are always some lessons to be learned.

In the case of IronFinance, there are a few important takeaways. 

First of all, we always have to consider what would happen to the protocol in the worst-case scenario. This usually involves one of the tokens used in the protocol sharply losing its value. 

What happens when the protocol stops expanding and starts contracting? What if the contraction is way quicker than expansion? 

Another important element of protocol design that always has to be fully understood is the usage of price oracles. Could they report stale prices or be manipulated by flash loan attacks? If so, which core protocol mechanisms rely on these oracles, and how would they behave if an oracle were compromised? 

The next lesson: providing liquidity in a pool where at least one asset can drop to 0 means that we can lose pretty much all of our capital, even if the second token doesn’t lose any value. 

Another lesson, following celebrities and their investments might be risky. With great power comes great responsibility and unfortunately, even a single mention of a certain protocol or a token can cause people to invest in something they don’t fully understand – don’t be that person and always make sure you do your own due diligence. 

One good indicator of high-risk protocols is extremely high APR in yield farming. If something looks too good to be true there are usually some risks that have to be accounted for. 

Last but not least, building algorithmic stablecoins is hard. I hope one day we can see a fully functioning algorithmic stablecoin competing in size with USDT or USDC, but this will most likely take a bit of time and hundreds of failed attempts. If you want to become an early adopter of such a coin it’s great, but keep in mind that the numbers are not on your side. 

What’s Next

So what’s next when it comes to IronFinance and algorithmic stablecoins? 

At the moment, the team behind the protocol is planning on conducting an in-depth analysis of the situation, in order to understand the circumstances which led to such an outcome. 

It’s hard to say if the team behind IronFinance will decide to fix the shortcomings of the existing protocol and relaunch it. 

Historically, second versions of failed protocols usually don’t get nearly as much traction as their original versions. Yam Finance was a good example of such a protocol. 

After the collapse of IronFinance, there is still a lot of capital sitting on the sideline looking for other high-risk opportunities. It will be interesting to see where this capital goes next. 

So what do you think about the IronFinance fiasco? Are you optimistic about the future of algorithmic stablecoins? 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

Finematics is also participating in Round 10 of Gitcoin Grants. If you’d like to support us, check out our grant here.

Sushi – Most Underrated Protocol in DeFi? (BentoBox, Kashi, Miso Explained)
Published Fri, 04 Jun 2021

Intro

So why is Sushi believed to be one of the most underrated protocols in DeFi? What are some of its new features such as BentoBox, Kashi and Miso all about? And what is Sushi’s approach to launching on different blockchains and scaling solutions? You’ll find answers to these questions in this article. 

Let’s start with a bit of background. 

Sushi launched in August 2020 during DeFi Summer – the first period of major growth of DeFi. The project quickly gained a lot of traction mostly due to the nature of its launch. 

Sushi – back then known as SushiSwap – aimed at directly competing with Uniswap by forking it and encouraging liquidity providers to move their liquidity to a new platform in a process called a vampire attack. 

The full story behind Sushi, although super interesting, is out of the scope of this article. Fortunately enough, I’ve already written an article about it some time ago and you can read it here if you’re interested. 

Sushi 

Almost a year later and the rocky launch of Sushi seems like a distant past and the team behind the protocol has been working hard on delivering new, interesting features and building the Sushi ecosystem. 

Besides Sushi’s main function – a decentralized exchange for swapping assets – the protocol offers a growing range of other products: a liquidity bootstrapping feature for other projects – Onsen; a lending platform – Kashi; and a launchpad for new protocols – Miso. More on these later in the article. 

The Sushi team has a very open approach when it comes to deploying the protocol to different chains and scaling solutions. 

Instead of trying to predict which environment will be the most dominant one and will capture the most value, they deploy the protocol to all popular and upcoming environments and let the market decide. 

Besides the Ethereum mainnet, Sushi has already been deployed to Polygon, xDai, BSC, Fantom and Moonbeam with an upcoming launch on Arbitrum – a layer 2 Ethereum scaling solution.

Another interesting move was the acquisition of the sushi.com domain that should give the project even more visibility. 

Now, let’s dive a bit deeper into each of the Sushi features one by one.

AMM 

Automated Market Maker or AMM is the main function of Sushi that allows users to swap their assets in a decentralized and permissionless way. 

Sushi’s AMM is a fork of Uniswap V2, so these two work in exactly the same way. If you need a recap on AMMs and liquidity pools here’s a popular article that I wrote some time ago.  

Currently, Sushi is the second-largest AMM on Ethereum with around 16% of the market share. Uniswap remains the undisputed leader, capturing around 54% of the total AMM market. 

Sushi’s daily trading volume, which is one of the most important metrics when it comes to AMMs, has been steadily growing from around $250M at the end of 2020 to over $500M in 2021 with some days hitting well over $1B.

Another metric – total value locked in the protocol – has also been growing from around $1B at the end of 2020 to as high as $5.5B and currently sitting at around $3.5B after the recent market downturn. 

One major difference between Uniswap V2 and Sushi is that the latter has enabled a profit-sharing mechanism which benefits SUSHI token holders. Instead of the entire 0.3% trading fee going to the liquidity providers as in Uniswap, Sushi enabled the fee switch, which lowers the LPs’ share to 0.25% while distributing the remaining 0.05% to SUSHI token holders. 
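In code, the split looks like this (a minimal sketch with a hypothetical trade size; the percentages are the ones quoted above):

```python
# Illustrative sketch of Sushi's fee switch: a 0.30% total trading fee,
# split into 0.25% for liquidity providers and 0.05% for SUSHI stakers.
def split_swap_fee(trade_amount):
    lp_fee = trade_amount * 0.0025      # 0.25% to liquidity providers
    staker_fee = trade_amount * 0.0005  # 0.05% to SUSHI stakers (xSUSHI)
    return {"lp": lp_fee, "stakers": staker_fee}

fees = split_swap_fee(10_000)  # a hypothetical $10,000 swap
print(fees["lp"], fees["stakers"])  # 25.0 5.0
```

So on a $10,000 swap, LPs collectively earn $25 while xSUSHI stakers collectively earn $5.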

And this leads us straight to the SushiBar. 

SushiBar

In order to benefit from profit sharing, SUSHI holders have to stake their SUSHI tokens in the SushiBar smart contract and receive xSUSHI that can be later redeemed for their original SUSHI plus additional SUSHI tokens coming from the swap fees. 

For every swap, on every chain going through Sushi, 0.05% of the swap value is distributed as SUSHI, proportionally to each user’s share of the SushiBar. 
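A minimal model of this share-based accounting (not the actual SushiBar contract, and with made-up numbers) could look like this:

```python
# Toy SushiBar: depositors mint xSUSHI shares; SUSHI accrued from the
# 0.05% fee share is added to the bar, raising the SUSHI value of
# every existing xSUSHI share.
class SushiBar:
    def __init__(self):
        self.sushi_balance = 0.0  # SUSHI held by the bar
        self.total_shares = 0.0   # xSUSHI supply
        self.shares = {}

    def enter(self, user, amount):
        # First deposit mints 1:1; later deposits mint pro rata.
        if self.total_shares == 0:
            minted = amount
        else:
            minted = amount * self.total_shares / self.sushi_balance
        self.shares[user] = self.shares.get(user, 0.0) + minted
        self.total_shares += minted
        self.sushi_balance += amount

    def add_fees(self, amount):
        # Fee SUSHI is added without minting new shares.
        self.sushi_balance += amount

    def redeemable(self, user):
        return self.shares[user] / self.total_shares * self.sushi_balance

bar = SushiBar()
bar.enter("alice", 100)   # alice holds 25% of the shares
bar.enter("bob", 300)
bar.add_fees(40)          # SUSHI accrued from swap fees
print(bar.redeemable("alice"))  # 110.0 -- her 100 SUSHI plus 25% of the fees
```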

The xSUSHI tokens are fully composable and maintain voting rights in the Sushi governance. xSUSHI tokens can also be added to the xSUSHI/ETH liquidity pool where users can benefit from stacked yield coming from xSUSHI itself plus the extra rewards coming from the pool. 

The yield on SushiBar depends on the trading volume going through the Sushi AMM and has recently been at around 10% APR with days as high as 40% APR. 

Because of this profit-sharing mechanism, the SUSHI token is essentially one of the most productive assets in the DeFi space. In contrast to many other tokens driven mostly by speculation, SUSHI tokens should better represent the actual value of the Sushi protocol. 

Time for another Sushi feature also connected to the SUSHI token – Onsen. 

Yield Farming and Onsen

Onsen is a liquidity incentive system that accelerates new projects by providing extra rewards in the form of SUSHI tokens. 

Projects selected to be on Onsen are given a certain allocation of SUSHI tokens to incentivise liquidity provisioning for their own token. This means that the projects themselves don’t have to distribute their own token through liquidity mining and they can still benefit from incentivised liquidity. This is really useful for new projects that very often struggle to bootstrap liquidity, especially if they don’t want to initially distribute large amounts of their own tokens.

Onsen also benefits the overall Sushi ecosystem as the swap fees from the Onsen-enabled liquidity pools are distributed to the xSUSHI holders. 

Projects featured on Onsen are chosen based on their quality and the demand for their products. Some projects are featured only for a certain amount of time, while others can remain on the Onsen menu indefinitely, assuming the quality of the project and demand for their token remain high. 

On top of Onsen, Sushi offers permanent yield farming opportunities for popular and established tokens. These opportunities are also available on other layers, for example, Sushi has recently started a liquidity mining program on Polygon that offers high yields to liquidity providers. 

Time for yet another Sushi feature – BentoBox. 

BentoBox 

BentoBox is a special smart contract that acts as a vault for certain tokens. This vault is basically a pool of funds that can be used by Bento-enabled applications in the Sushi ecosystem. 

Users who deposit funds into one of the BentoBox vaults benefit from earning extra yield on their tokens. Vaults can generate yield in multiple ways, for example, by allowing other participants to take flash loans and pay a small fee that goes back to the users providing liquidity to the vaults or by lending out assets in the vaults. 

This structure is also very gas-efficient, as different applications operating on the same vault don’t have to go through as many steps as they otherwise would.  

At the moment, the first and only available Bento-enabled application is the lending platform – Kashi, but the team is working on bringing more applications to BentoBox in the future.  

And this is a good segue into Kashi. 

Kashi 

Kashi, which means “lending” in Japanese, is Sushi’s first lending and margin trading solution powered by BentoBox. Kashi allows anyone to create customised and gas-efficient markets for lending and borrowing.

In contrast to other popular DeFi money markets such as Aave or Compound, Kashi isolates each of the markets. This means that users can create markets for more risky assets without having an impact on other markets. 

Having the ability to borrow an asset also opens up the possibility for shorting it. This is useful for speculators who believe that the asset will go down in value but also allows for hedging, which can be extremely handy, for instance, when yield farming risky assets. 

As an example, let’s say a new token is launched. 

Someone can create a money market for the new token on Kashi which allows anyone to provide collateral in a chosen coin, let’s say ETH, and borrow the new token. The short seller can now borrow the new token and sell it immediately for ETH. If the price of the new token goes down in relation to ETH, the short seller can buy the token back at a lower price in the future and repay their loan, which is denominated in the new token. 
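With hypothetical numbers, the profit-and-loss of such a short is straightforward:

```python
# Hypothetical short via a Kashi-style market: borrow the new token,
# sell it for ETH, buy it back cheaper later and repay the
# token-denominated loan. All numbers are made up.
borrowed_tokens = 1_000   # new tokens borrowed against ETH collateral
entry_price = 0.5         # ETH per token when sold
exit_price = 0.3          # ETH per token when bought back

eth_received = borrowed_tokens * entry_price   # sell immediately
eth_to_repay = borrowed_tokens * exit_price    # cost of buying back
profit_eth = eth_received - eth_to_repay
print(profit_eth)  # 200.0 ETH profit (ignoring fees and interest)
```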

The main caveat is that in order to create a money market for a new token there has to be a reliable price oracle available. Kashi allows the user to choose a price oracle at the time of creating a new market. At the moment, only price feeds available on Chainlink can be used, limiting the number of possible new markets that can be created. However, the Sushi team is working on adding their own TWAP price oracle that would expand the set of available price feeds. 

In a shared money market, adding a new risky asset would threaten the solvency of the whole protocol. This is because if such a coin was used as collateral and experienced a sharp drop in price, a lot of accounts could become undercollateralized, triggering a cascade of liquidations. On the other hand, if such a coin was borrowed and quickly multiplied in price, the borrowed assets would be worth more than the collateral, again making the accounts undercollateralized. 

Miso

The last Sushi feature we’re going to cover in this article is Miso.

Miso is a token launchpad platform. It facilitates launching new tokens on Sushi. 

Miso focuses on providing a good experience for both the project creators launching new tokens and for people interested in finding and supporting these projects. 

When it comes to project creators, Miso offers a set of smart contracts that makes the process of creating a new token easier. On top of that, it allows projects to attract a larger initial audience than they might have been capable of reaching on their own. 

When it comes to project supporters, they can benefit from the peace of mind that the token and the infrastructure around it were created using audited and battle-tested contracts. They can also easily discover new projects and participate in reliable token launches. 

Miso is clearly yet another important element of the overall Sushi ecosystem. 

Summary

With a steadily growing trading volume on its decentralized exchange, the profit-sharing mechanism for SUSHI holders, an increasing number of chains and scaling solutions to launch on and new features being added to the ecosystem, Sushi looks like one of the strongest DeFi projects. 

This is why a lot of people in the DeFi community believe that Sushi is underrated, especially when compared to other decentralized exchanges. It’s hard to say exactly why this is the case, but it might come from the fact that Sushi started as a fork of Uniswap and had a bit of a rocky launch. 

Nevertheless, Sushi is clearly one of the top DeFi protocols to keep an eye on and it will be interesting to watch new elements being added to BentoBox and the rest of the Sushi ecosystem, with the team pursuing new chains and scaling solutions. 

One of them is the previously mentioned Arbitrum – an Ethereum Layer 2 optimistic rollup-based scaling solution that looks like the next place where Sushi is about to launch. 

So what do you think about Sushi? 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

]]>
https://finematics.com/sushi-explained/feed/ 0
How Does Thorchain Work? DeFi Explained https://finematics.com/thorchain-explained/?utm_source=rss&utm_medium=rss&utm_campaign=thorchain-explained&utm_source=rss&utm_medium=rss&utm_campaign=thorchain-explained https://finematics.com/thorchain-explained/#respond Mon, 17 May 2021 14:54:30 +0000 https://finematics.com/?p=1345

Intro 

So what is Thorchain all about? How does it work? And how does it make it possible to swap between native assets across different blockchains? You’ll find answers to these questions in this article. 

With billions of dollars in trading volume, decentralised exchanges have been gaining more and more traction. It’s not uncommon to see over $1B in daily trading volume on Uniswap alone.

Although protocols like Uniswap, Sushiswap or Curve are great when it comes to exchanging assets within the Ethereum ecosystem, they don’t support swaps between different blockchains. 

To address this problem, a common approach is to represent external assets in the form of wrapped or synthetic tokens on Ethereum. The most popular asset on other blockchains is, of course, Bitcoin, and there are multiple ways of representing it on Ethereum that allow it to be traded on decentralized exchanges – Wrapped Bitcoin, renBTC and sBTC, to name a few. 

Even though most of these approaches work fine, they usually make certain tradeoffs when it comes to either the custody or the security of the assets. If you want to learn more about it, check out this article here. 

What if there was a way of swapping native assets instead? For example, making a trade between Bitcoin on the Bitcoin blockchain and Ether on the Ethereum blockchain. 

And this is exactly where Thorchain comes into play. 

Thorchain is a decentralized liquidity protocol that allows for swapping native assets between different blockchains such as Bitcoin, Ethereum or Binance Smart Chain. 

When it comes to managing liquidity, Thorchain uses a liquidity pool model known from protocols like Uniswap or Bancor. 

In this model, liquidity providers lock 2 assets in a liquidity pool. This in turn provides liquidity for traders who want to swap between these 2 assets and pay a small fee that goes back to the liquidity providers.

If you want to better understand how liquidity pools work you can check out this article here.

Thorchain is often explained as a cross-chain Uniswap. This is usually a good simplification for understanding the general idea behind Thorchain although there are some big differences between these two protocols that we’re going to explain later. 

Before we dive deeper into the mechanics of Thorchain, let’s have a quick look at how the project came into existence. 

Thorchain History

Thorchain started as a small project at the Binance hackathon in 2018.

The team behind Thorchain continued their research after the hackathon ended but decided to put the project on pause later that year while they waited for a few missing pieces of technology needed to create a fully functioning cross-chain decentralized exchange. 

These were mainly Tendermint & Cosmos SDK and a working implementation of TSS – Threshold Signature Scheme.  

Seeing the viability of the product, the team decided to raise a small seed round and worked on a proof-of-concept of a decentralised exchange built on top of the Thorchain protocol, called Instaswap, which was later demonstrated during the Cosmos hackathon in Berlin. 

After that, they announced their first go-to-market product – BEPSwap – in July 2019. Its main goal was to enable swaps between BEP2 assets and it was limited to Binance Chain. 

Also in July 2019, the team decided to raise more funds through an Initial Dex Offering on Binance Dex. The IDO resulted in $1.5M raised that was sufficient to enable further development of the project.  

The team continued their work on the protocol which resulted in the limited mainnet release, called multi-chain chaos network or MCCN, in April 2021.

Interestingly, the Thorchain team decided to remain mostly anonymous, even to this day. 

Now, let’s see how Thorchain works under the hood. 

How Does It Work 

At the core of the Thorchain protocol is a network of nodes built with Tendermint and Cosmos SDK.

This approach allowed Thorchain to create a separate blockchain with its own consensus and network layer without having to build all of its elements from scratch. 

Thorchain leverages Tendermint’s BFT consensus model, which allows the network to keep reaching consensus even if up to ⅓ of all the nodes start failing. 

The consensus mechanism is important as Thorchain nodes have to work together, for example, in order to record transactions coming from other blockchains.

To see how this works in practice, let’s go through a quick example. 

Let’s say a user wants to swap their Bitcoin on the Bitcoin network to Ether on the Ethereum network. 

The user sends a standard Bitcoin transaction to the Bitcoin vault – a Bitcoin address controlled by the Thorchain network. 

Thorchain nodes keep monitoring vault addresses in order to acknowledge new transactions. 

To achieve this, each Thorchain node a.k.a THORNode consists of a few major components. The most important ones being: the service running the Thorchain blockchain itself; a full node for each of the connected blockchains, for example, a Bitcoin or an Ethereum node; and the Bifrost. 

The Bifrost Protocol acts as a connective layer between the Thorchain network and other networks such as Bitcoin or Ethereum. One of its main responsibilities is to watch the vault addresses in order to find inbound transactions that are later converted into THORChain witness transactions. 

The witness transactions are initially recorded as pending – which is one of the states in the Thorchain state machine. After the majority of nodes agree on the state of the inbound transaction, the transaction is moved to the “finalised” state. 

At this point, the user’s Bitcoin deposit is recorded on the Thorchain blockchain. 

Time for the other part of the swap – sending Ether back to the user. 

Once a new inbound transaction is finalised, the Thorchain protocol initiates a swap. The swap transaction is recorded on the Thorchain blockchain and the Bifrost Protocol is used again – this time to initiate a withdrawal of ETH from the Ether outbound vault. 

This outbound transaction is translated from Thorchain’s internal representation into a valid transaction for the destination chain using the respective chain client – in our case the Ethereum Client – and broadcast to the respective network. 

At this point, the swap is completed and the user ends up with Ether in their Ethereum wallet. 

Although this sounds quite simple, there is quite a lot of detail to make it all possible. 

TSS

In order to sign transactions, the network has to be able to control vault addresses on each of the integrated blockchains. 

Of course, storing private keys on each of the nodes would be a huge security risk and this is also why Thorchain uses the previously mentioned Threshold Signature Scheme or TSS. 

TSS is a cryptographic primitive for distributed key generation and signing. You can think about it as a better version of multisig. Both of them focus on achieving the same goal – allowing multiple parties to come together and sign a transaction only when a certain, previously set threshold is reached. The main difference is that multisig is usually implemented on the application layer of the blockchain, for example, as a smart contract on Ethereum, whereas TSS support is always possible regardless of the blockchain as it relies on basic cryptographic elements. 

This allows for making the whole process of signing transactions cheaper and more secure. 

Although TSS has a lot of benefits, it hasn’t yet been as battle-tested as other popular cryptographic elements such as ECDSA or certain hash functions. 
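Real TSS involves distributed key generation and produces a single aggregate signature, which is far beyond a few lines of code; the sketch below only illustrates the shared t-of-n threshold rule that both multisig and TSS enforce (the names are made up):

```python
# NOT real cryptography -- just the t-of-n rule: a transaction is only
# considered signed once at least `threshold` known signers approve it.
def quorum_reached(approvals, signers, threshold):
    valid = set(approvals) & set(signers)  # ignore unknown parties
    return len(valid) >= threshold

signers = ["node-1", "node-2", "node-3"]
print(quorum_reached({"node-1", "node-3"}, signers, 2))    # True
print(quorum_reached({"node-1", "intruder"}, signers, 2))  # False
```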

Vaults

Another interesting detail of Thorchain architecture is the way vaults operate. 

There are 2 types of vaults – “inbound” and “outbound”. 

Inbound vaults store most of the funds in the system. They are slower but more secure, as they require ⅔ of all TSS signers to sign a transaction, which can take up to 20 seconds.  

This would be quite limiting for the whole system, so Thorchain introduced smaller, less secure outbound vaults that are run by each of the THORNodes. These vaults are faster as they require only a single signature from the node they run on. The funds in each outbound vault are limited to 25% of the value of the node’s bond. More on the bonding process a bit later in the article, but this cap creates incentives that prevent the node operator from stealing funds from the outbound vault. These vaults are also constantly topped up by the system as their funds are spent on outbound transactions. 
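The 25% cap can be expressed as a simple invariant (a sketch using the roughly $18M bond figure mentioned later in the article; the helper names are mine):

```python
# Funds in a node's outbound vault are capped at 25% of its bond value,
# so stealing them would always cost the node more than it gains.
def max_outbound_value(bond_value):
    return 0.25 * bond_value

def topup_needed(current_value, bond_value):
    # How much the system should top the vault up to reach the cap.
    return max(0.0, max_outbound_value(bond_value) - current_value)

print(max_outbound_value(18_000_000))       # 4500000.0
print(topup_needed(1_000_000, 18_000_000))  # 3500000.0
```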

PoS & Churning

As mentioned earlier, Thorchain uses Tendermint and Cosmos SDK. In this model, the Thorchain network operates as a Proof-Of-Stake (PoS) system where the nodes that want to be able to sign and verify transactions have to stake a certain amount of the RUNE tokens. 

In the Thorchain ecosystem, this process of staking RUNE tokens is also called bonding. 

At the moment of writing this article, 1,000,000 RUNE tokens worth around $18M are required to run a fully functioning Thorchain node. 

In contrast to most variations of PoS systems, the delegation of tokens is not allowed. This helps with making sure all nodes in the network are treated equally and there are no node operators that capture the majority of tokens for a long period of time. 

In fact, all nodes on the Thorchain network are anonymous and only identifiable by their IP address and public key. There is no branding or marketing of nodes like in other systems that allow delegation. 

In order to avoid having the same nodes with the highest amount of RUNE tokens always signing transactions, Thorchain introduces the concept of churning. 

The network maintains one set of nodes that are active and able to sign transactions and another set of nodes that are on standby. 

Every 50,000 blocks, which is around every 3 days, the churning process kicks in and the oldest or the most unreliable nodes from the active set are replaced by the nodes from the standby set. 
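As a back-of-the-envelope check, 50,000 blocks every ~3 days implies a block time of roughly 5 seconds (an inferred figure, not stated in the article):

```python
# Inferring the approximate Thorchain block time from the churn cycle.
blocks_per_churn = 50_000
days_per_churn = 3
block_time_s = days_per_churn * 24 * 3600 / blocks_per_churn
print(round(block_time_s, 1))  # 5.2 seconds per block
```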

Churning makes sure that new nodes that meet the staking criteria can eventually have their turn at signing transactions. Also, each time the set of validators changes, the Thorchain network moves funds to new vaults, ensuring that the active nodes can still access them. 

At the moment, there are 28 active nodes and 45 nodes in standby mode on the single-chain chaos network that supports BEPSwap and 11 active nodes and 9 nodes in standby mode on the recently released multi-chain chaos network. 

Currently, the multi-chain chaos network is in expansion mode which means that for every node that is churned out from the network, 2 nodes are churned in. 

The multi-chain network can grow to 99 nodes before hitting the Tendermint and TSS limits. 

Even when the network grows to 99 active nodes, it can still expand further by having the capability of sharded vaults.

It’s also important to note that even though a high amount of RUNE is required to run a fully functioning node, people can still run nodes without bonding RUNE. These nodes can validate transactions but are not able to sign them. 

RUNE Token 

This brings us to the last key element in the Thorchain architecture – The RUNE token. 

RUNE powers the Thorchain ecosystem and provides the economic incentives required to secure the network. 

All liquidity pools in the system consist of a native token and RUNE. For example, to swap from Bitcoin to Ether, the trade has to go through the BTC-RUNE and ETH-RUNE pools. In this model, each asset has to be paired with RUNE, which usually results in fewer pools than in a system like Uniswap that can create a pool out of any 2 assets. 
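The pool-count difference is easy to quantify: n non-RUNE assets need only n RUNE-paired pools, whereas supporting every direct pair would need n·(n−1)/2 pools:

```python
# Pairing every asset with RUNE keeps the number of pools linear in the
# number of assets, instead of quadratic for arbitrary pairs.
def rune_pools(n_assets):
    return n_assets

def all_pairs(n_assets):
    return n_assets * (n_assets - 1) // 2

for n in (5, 20, 100):
    print(n, rune_pools(n), all_pairs(n))
# 5 assets: 5 vs 10 pools; 20: 20 vs 190; 100: 100 vs 4950
```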

Besides this, Thorchain nodes have to meet the staking criteria by bonding a particular amount of RUNE. This bond is then used to secure the system by underwriting the assets in the pools. If a node attempts to steal funds from the protocol, its bond is slashed by 1.5x the value of the stolen assets and the pools are made whole. Also, if a node doesn’t offer reliable service, it puts its bond at risk of being slashed. 

The Thorchain protocol also encourages the node operators to always bond the optimal amount of RUNE. This is achieved by a mechanism called – The Incentive Pendulum. 

The Incentive Pendulum aims at keeping the system in its optimal state – when 67% of all the RUNE in the system is bonded and 33% is pooled. 

If there is too much capital in the liquidity pools, the network increases rewards for node operators and reduces rewards for liquidity providers. If there is too much capital bonded by the nodes, the system boosts rewards for liquidity providers and reduces rewards for node operators. 

In the optimal state, for every $1M worth of assets in the pools there would be $2M worth of RUNE bonded by the nodes. 
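One way to sketch the Incentive Pendulum is the formula from Thorchain’s documentation, where the pools’ share of rewards is (bonded − pooled) / (bonded + pooled) and node operators receive the rest – treat the exact formula here as an assumption:

```python
# Incentive Pendulum sketch: the more capital is pooled relative to
# bonded, the smaller the liquidity providers' share of rewards.
def reward_split(bonded_rune, pooled_rune):
    pool_share = max(0.0, (bonded_rune - pooled_rune) / (bonded_rune + pooled_rune))
    return pool_share, 1.0 - pool_share

# Optimal state: 67% bonded / 33% pooled -> roughly 1/3 to LPs, 2/3 to nodes.
lp_share, node_share = reward_split(67, 33)
print(round(lp_share, 2), round(node_share, 2))  # 0.34 0.66

# Equal bonded and pooled capital -> LP rewards drop to zero,
# pushing capital back towards bonding.
print(reward_split(50, 50)[0])  # 0.0
```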

On top of this, RUNE is used to pay transaction fees on the network, subsidise gas needed for sending outbound transactions to different networks and participate in the Thorchain governance where users can signal which chains and assets the network should add next. 

Uniswap 

As I mentioned earlier, there are some big differences between Thorchain and Uniswap or in fact any other decentralized exchange on Ethereum. Let’s have a look at some of them. 

First of all, Uniswap allows for swapping ERC-20 tokens only, so if we want to trade assets from other blockchains they have to be represented in the form of wrapped or synthetic tokens. Thorchain allows for swapping native assets without wrapping them. 

Swaps on Thorchain are charged both a fixed network fee and a dynamic slip-based fee. This means that swaps incurring more slippage are charged higher trading fees. It also makes it harder for bots to extract value from swaps, as in the case of a sandwich attack – a popular way of manipulating the price in a liquidity pool that results in users getting worse prices on their trades.  
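To illustrate how a slip-based fee differs from a flat fee, here is a sketch assuming the continuous liquidity pool formula from Thorchain’s whitepaper, output = x·X·Y / (x+X)², where x is the input amount and X, Y are the pool depths (treat the formula as an assumption on my part):

```python
# The slip-based fee is the gap between the zero-fee constant-product
# output and the CLP output; it grows with the slip the trade causes.
def clp_swap_output(x, X, Y):
    return x * X * Y / (x + X) ** 2

def slip_fee(x, X, Y):
    constant_product_output = x * Y / (x + X)  # zero-fee reference
    return constant_product_output - clp_swap_output(x, X, Y)

X, Y = 1_000_000, 1_000_000  # hypothetical pool depths
print(slip_fee(1_000, X, Y))    # ~1: a small trade pays almost nothing
print(slip_fee(100_000, X, Y))  # ~8264: 10% of pool depth pays heavily
```

The same trade size therefore costs disproportionately more as it grows, which is what blunts sandwich-style attacks.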

When it comes to the speed of swaps, assets on Ethereum can be swapped within 1 Ethereum block, which happens every 13 seconds on average. On Thorchain, this is a bit more complicated as the swap time depends on which networks we’re swapping between. In the case of a Bitcoin to Ether swap, it’d take at least 1 block on the Bitcoin network – 10 minutes on average – plus the internal time for executing a swap on the Thorchain blockchain, plus the outbound Ethereum transaction – around 13 seconds. 

Interestingly, a swap from Ether to Bitcoin would be way faster as the Thorchain network would only have to wait for the Ethereum transaction before sending an outbound Bitcoin transaction that would result in the receiving wallet having the Bitcoin UTXO spendable and available straight after the transaction is broadcast. 

Also, Thorchain as a separate blockchain loses some of the benefits of decentralized applications on Ethereum. One of them is composability. For example, a Uniswap swap can be incorporated into more complicated contracts as a part of one transaction. This is not possible with Thorchain swaps. 

The Thorchain network is also, of course, not even remotely as decentralized as the Bitcoin or Ethereum networks; instead, the system relies on strong economic incentives. 

This is not necessarily that bad, considering that Thorchain has a completely different use case than the Bitcoin network that secures an asset worth over $1T or the Ethereum network that secures billions of dollars locked in smart contracts. 

For its main use case – swapping assets between different blockchains, where most users won’t store their assets on the Thorchain blockchain for a very long period of time – it looks like it could be decentralized enough.

Summary 

After the long-awaited multi-chain chaos network release, the Thorchain team is focusing on growing the Thorchain ecosystem while making sure the system works as expected. 

Users have multiple choices when it comes to interacting with the Thorchain protocol. They can use one of the decentralized exchanges like Thorswap or Asgardex or wallets integrated with Thorchain like ShapeShift. 

We should see more applications and wallets integrating with Thorchain in the future. 

On top of this, it looks like we should see more chains and assets being onboarded, more Thorchain nodes joining the network and hopefully seeing more and more trading volume and total value locked in the liquidity pools. 

Eventually, the extra protective measures will be removed and the chaosnet will become the mainnet.

Thorchain clearly looks like an interesting protocol and a missing piece of the DeFi ecosystem that would allow people to swap between native assets without using centralized exchanges. 

So what do you think about Thorchain? How big can it grow in the future? 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

]]>
https://finematics.com/thorchain-explained/feed/ 0
Polygon PoS Chain – A Commit Chain And Not A Sidechain? https://finematics.com/polygon-commit-chain-explained/?utm_source=rss&utm_medium=rss&utm_campaign=polygon-commit-chain-explained&utm_source=rss&utm_medium=rss&utm_campaign=polygon-commit-chain-explained https://finematics.com/polygon-commit-chain-explained/#respond Thu, 29 Apr 2021 19:32:17 +0000 https://finematics.com/?p=1335

So what is a commit chain? How is it different from a sidechain? And what makes Polygon Commit Chain a commit chain rather than a sidechain? We’ll answer all of these questions in this article. 

Let’s start with understanding what exactly a sidechain is. 

Sidechain 

A sidechain, in essence, is a separate blockchain that can be used as one of the ways of scaling a Layer 1 blockchain such as Ethereum or Bitcoin. As the name suggests, a sidechain runs in parallel or “on the side” of the main chain. 

Sidechains have their own consensus mechanisms usually in the form of Proof-Of-Stake, Delegated-Proof-Of-Stake or Proof-Of-Authority. 

Sidechains allow users to send their tokens from the main chain and receive them on the sidechain. Once the funds are transferred to the sidechain, they can be used within the sidechain ecosystem. Similarly, users can withdraw their tokens from the sidechain back to the main chain. The whole process is called a 2-way peg or a 2-way bridge. One thing to note is that once the user’s tokens are on the sidechain, they are completely reliant on the consensus mechanism of that sidechain.

Initially, all scaling solutions such as sidechains, Plasma and rollups were classified as Layer 2 solutions as they are built on top of Layer 1. 

After a while, the Ethereum community started differentiating between scaling solutions fully secured by the Ethereum main chain – Layer 2 and other scaling options with their own consensus mechanisms – sidechains. At the moment, pretty much all scaling solutions are classified as either one or the other. 

When it comes to Polygon Commit Chain, it is worth differentiating it from a sidechain as it has a lot of extra features that rely on the security of the main Ethereum layer.

Let’s review them one by one. 

Permissionless Validators on Ethereum

Many sidechains use a consensus mechanism that limits the number of entities able to verify the chain. For example, in a Delegated-Proof-Of-Stake (DPoS) there are usually 21 validators who are chosen by the token holders and only these validators are able to validate the state of the blockchain. Similarly, in a Proof-Of-Authority (PoA) model the chain initiator chooses authorities to run the chain. This excludes most participants and creates a situation where only a selected few are responsible for making sure the transactions are validated correctly.  

In Polygon PoS Chain anyone can join the network and start validating the state of the blockchain. This is important as it allows any participants to become validators and check by themselves that all transactions are processed correctly. 

Validators on Polygon PoS Chain have to stake their MATIC tokens and run a full node. 

MATIC tokens are staked on the Ethereum main chain. This is also where the set of all validators is maintained. If a validator starts acting in a malicious way, for example, by double signing or having a significant downtime their stake is slashed.  

This is also a good time to introduce 2 core components of the Polygon PoS Chain architecture – Heimdall Chain and Bor Chain.

Heimdall & Bor

Heimdall works in conjunction with the Stake Manager contract deployed on the Ethereum mainnet to coordinate validator selection and updating validators.

Since staking is actually done on the Ethereum smart contract, we don’t have to rely on validator honesty and instead inherit Ethereum chain security for this key part. Even if a majority of validators collude and start acting maliciously, the community can come together and redeploy the contracts on Ethereum to fork out, i.e. slash the malicious validators, and the chain can continue to operate as intended. 

Heimdall is also responsible for checkpointing – more on this later in the article. 

Bor is the block producer layer of the PoS Chain architecture that is responsible for aggregating transactions into blocks. 

Bor block producers are a subset of the validators that is periodically shuffled by the Heimdall validators. Block producers are selected to validate blocks only for a set number of blocks, also called a “span”. After this period, the selection process is triggered again. 

Let’s have a closer look at the process of selecting block producers: 

  1. Let’s suppose we have 3 validators in the pool, and they are Alice, Bill and Clara.
  2. Alice staked 100 MATIC tokens whereas Bill and Clara staked 40 MATIC tokens each.
  3. Validators are given slots according to their stake. With 20 tokens per slot (a parameter maintained by validator governance), Alice’s 100 staked MATIC tokens give her 5 slots in total, while Bill and Clara get 2 slots each.
  4. All the validators are given these slots [ A, A, A, A, A, B, B, C, C ]
  5. Using the historical Ethereum blocks as a seed we shuffle this array.
  6. After shuffling the slots using the seed we get this array [ A, B, A, A, C, B, A, A, C]
  7. Now, depending on the producer count (maintained by validator governance), we pop validators from the top. For example, if we want to select 5 producers, we get the producer set [ A, B, A, A, C ]
  8. Hence the producer set for the next span is defined as [ A: 3, B:1, C:1 ].
  9. Using this validator set and Tendermint’s proposer selection algorithm we choose a producer for every sprint on Bor.
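The steps above can be sketched in a few lines of Python (an illustrative model, not Polygon’s actual implementation: names such as `select_producers` are invented, and the real seed is derived from historical Ethereum block hashes):

```python
import random

def select_producers(stakes, tokens_per_slot, producer_count, seed):
    # 1. Give each validator one slot per `tokens_per_slot` staked.
    slots = []
    for validator, stake in stakes.items():
        slots.extend([validator] * (stake // tokens_per_slot))
    # 2. Shuffle the slot array deterministically using the seed.
    random.Random(seed).shuffle(slots)
    # 3. Pop `producer_count` slots from the top...
    chosen = slots[:producer_count]
    # 4. ...and count slots per producer to form the producer set.
    return {v: chosen.count(v) for v in set(chosen)}

stakes = {"Alice": 100, "Bill": 40, "Clara": 40}
print(select_producers(stakes, tokens_per_slot=20, producer_count=5, seed=2021))
```

On average, a validator’s share of the producer set – and hence of block production – stays proportional to its stake, while only a subset of validators produces blocks in any given span.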

This model allows anyone to participate in securing the network with any amount of MATIC tokens. It also doesn’t sacrifice transaction speed, as not all validators have to validate blocks all the time. 

Let’s go back to the other important function of Heimdall – checkpointing. 

Checkpointing 

Checkpoints are important as they provide finality on the Ethereum chain.

Heimdall layer allows for aggregating blocks produced by Bor into a single Merkle root and periodically publishing it to the Ethereum main chain. This published state is also called a checkpoint hence the whole process is known as checkpointing. 
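The aggregation step can be pictured with a toy Merkle tree (a simplified sketch: the real checkpoint uses Keccak-256 and the exact leaf encoding defined by Polygon’s contracts):

```python
import hashlib

def merkle_root(leaves):
    """Fold a list of byte-string leaves into a single 32-byte root."""
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:          # duplicate the last node on odd levels
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0].hex()

# A checkpoint commits a whole span of Bor block hashes as one root,
# so publishing it to Ethereum costs the same regardless of span size.
bor_block_hashes = [f"bor-block-{n}".encode() for n in range(256)]
checkpoint = merkle_root(bor_block_hashes)
print(len(checkpoint))  # 64 hex characters, i.e. 32 bytes on-chain
```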

Checkpoint proposers are initially selected via Tendermint’s weighted round-robin algorithm. A further custom check is implemented based on the success of checkpoint submission. This allows the Polygon PoS Chain to decouple from Tendermint proposer selection and gives it abilities such as selecting a proposer only when the checkpoint transaction on the Ethereum mainnet succeeds, or resubmitting a checkpoint for previous blocks if the transaction failed. 

Submitting a checkpoint on Tendermint is a 2-phase commit process. A proposer, selected via the above-mentioned algorithm, sends a checkpoint with their address in the proposer field, and all other validators validate it.

The next proposer then sends an acknowledgement transaction to prove that the previous checkpoint transaction has succeeded on the Ethereum mainnet. Every validator set change is relayed by a bridge module on Heimdall, which is embedded in the validator node. This allows Heimdall to remain in sync with the Polygon contract state on the Ethereum main chain at all times.

The Polygon PoS Chain contract deployed on the main chain is considered to be the ultimate source of truth, and therefore all validation is done via querying the Ethereum main chain contract.

Checkpoints also provide “proof of burn” in the withdrawal of assets. 

Speaking about withdrawals, let’s have a look at another important element of the PoS chain – the two-way Ethereum Bridge.

Two-way Ethereum Bridge

Typical two-way bridges rely on a small set of authorities who are often not even staked, nor part of the sidechain’s validator set – basically, bridges are often operated, i.e. controlled, by a handful of PoA signers. This is a significant security concern. 

Polygon provides 2 separate ways for moving assets between Ethereum and Polygon – Plasma Bridge and the PoS Bridge.

Plasma Bridge provides increased security guarantees due to the Plasma exit mechanism. However, there is a 7-day withdrawal period associated with all exits/withdrawals caused by certain restrictions in the Plasma architecture. 

The PoS Bridge doesn’t have this restriction and it is secured by a robust set of validators that we discussed earlier in this article. The state of these validators is maintained on the Ethereum mainnet and they are secured by all the funds staked in the system – around $500M at the time of writing this article. To the best of our knowledge, the PoS bridge is the only bridge secured by the whole validator set of a bridged chain; bridges are normally secured by a small set of PoA signers, as already mentioned earlier. 

As we can see, the Polygon PoS Chain offers a lot of extra security measures based on the Ethereum main chain, and it is not a mere sidechain. Perhaps “commit chain” is a better name for it. 

So what do you think about Polygon Commit Chain? Do you think it’s valuable to differentiate it from a sidechain?

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

]]>
https://finematics.com/polygon-commit-chain-explained/feed/ 0
How (Not) To Get Rekt – DeFi Hacks Explained https://finematics.com/defi-hacks-explained/?utm_source=rss&utm_medium=rss&utm_campaign=defi-hacks-explained&utm_source=rss&utm_medium=rss&utm_campaign=defi-hacks-explained https://finematics.com/defi-hacks-explained/#respond Sat, 17 Apr 2021 19:25:00 +0000 https://finematics.com/?p=1325

Intro


Every opportunity comes with risks, and DeFi is no exception. For every way you can make money in DeFi, it seems there are at least two ways you can lose it. Although these risks can’t be avoided entirely, with careful risk management and sensible judgement, you can at least decrease your chances of getting totally rekt.

So what are some of the most common ways people lose money in DeFi? What are the different types of hacks and exploits? And most importantly, how can you minimise your chances of being negatively affected by hacks in the future? You’ll find answers to these questions in this article. 

And one more thing before we start. This article is a collaboration between Finematics and rekt.news.  

Rekt News is an anonymous platform for whistleblowers and DeFi detectives to present their findings to the community. They analyse all the major hacks and exploits, and provide creative commentary on all things crypto and DeFi, with an aim to educate and entertain their readers. Their site rekt.news contains their own articles as well as an AI-generated news aggregator that gives coverage of all the most important recent events within crypto and DeFi in particular.

So, back to the topic of the article, how can I get rekt in DeFi?

We don’t have time to cover every single type of exploit, and of course, many types remain unknown, but a few techniques occur regularly, and we’ll now look at some examples.

The Rug Pull 

To “rug pull” has become a commonly used term across all of DeFi and is now applied to many types of hacks and exploits, but it actually refers to a specific technique: suddenly removing the majority of liquidity from a liquidity pool.

The sudden loss of liquidity can create a death spiral for the token, as token holders try to sell as fast as possible in order to save their profits. 

A rug pull is usually the final move of a malicious team and is a common form of “exit scam”, where the project deletes all traces of its social media as the team tries to escape with the funds.

As this type of attack is technically very simple, it is often the chosen technique for quick cash grabs by low-effort projects. This does not mean the profits are low, however – there have been several major rug pulls where users have lost many millions.

One such example is Meerkat Finance, which, after just one day of operation, was rugged for 13 million BUSD and about 73,000 BNB, totalling around $31m at the time. 

If a large liquidity pool is used within a project, then the project team should not have the ability to retrieve these assets. If they do, then you are placing your trust in the project team. 

Initially, Meerkat Finance did not have this ability, however, shortly before the attack, the Meerkat Finance Deployer “upgraded” 2 of their own vaults, giving themselves the ultimate backdoor into the vaults. 

How can we avoid getting rug pulled?

Check how the liquidity is locked. Is there a timelock? Is there a multisig? 

Do your research into the project: find out who is backing it and what its purpose is.

Is the team known? If they are, what can you find out about them? Proving identity online is becoming increasingly difficult, and scammers are turning to unusual methods to build the trust of others, as was the case with DeTrade Fund, who some suspected to have used deepfake technology to create a video of a fake CEO. This story was covered by rekt.news (Deepfake – A Scam so Surreal).

On the other hand, if you can’t find any information on who is behind the project, remember that an anonymous team isn’t necessarily a bad thing, as the founder of Bitcoin remains anonymous to this day. 

Economic exploit / Flash Loan

There was a period of time when it seemed every week brought a new DeFi hack, and the words “flash loan” were never far from the scene.

The association of flash loans with “hacks and exploits” led many in the community to believe that their impact was solely negative. 

However, it’s worth remembering that these transactions were already possible for whales with large accounts, and that flash loans in themselves are not a malicious tool – they simply give access to large amounts of funding on a very short time frame. This funding can then be used to take advantage of loopholes in code, or to manipulate pricing and profit from arbitrage.

Flash loans are uncollateralised, unsecured loans that must be paid back before the blockchain transaction ends; if the loan is not repaid, the smart contract reverts the transaction, so it’s as if the loan never happened in the first place.

Because the smart contract for the loan must be fulfilled in the same transaction that it is lent out, the borrower has to use other smart contracts to perform instant trades with the loaned funds before the transaction ends.
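This all-or-nothing property can be modelled in a few lines (a toy sketch with invented names such as `FlashLender`; real flash loan providers like Aave enforce repayment by reverting the whole EVM transaction):

```python
class RepaymentError(Exception):
    pass

class FlashLender:
    def __init__(self, liquidity):
        self.liquidity = liquidity

    def flash_loan(self, amount, fee, borrower_callback):
        snapshot = self.liquidity          # remember the pre-loan state
        self.liquidity -= amount           # hand the funds to the borrower
        try:
            repaid = borrower_callback(amount)
            if repaid < amount + fee:
                raise RepaymentError("loan not repaid in full")
            self.liquidity += repaid       # principal + fee back in the pool
        except RepaymentError:
            self.liquidity = snapshot      # "revert": as if nothing happened
            raise

lender = FlashLender(liquidity=1_000_000)
try:
    # A borrower that repays nothing: the whole operation is rolled back.
    lender.flash_loan(500_000, fee=450, borrower_callback=lambda amt: 0)
except RepaymentError:
    pass
print(lender.liquidity)  # 1000000
```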

If you want to learn more about flash loans check out this article here.  

Most flash loan attacks involve the manipulation of the token price using a large amount of capital.

One example of a major flash loan attack would be Harvest Finance, who lost $33.8 million to an attacker in October of 2020.

fUSDT fell 13.7% and $FARM fell 67% over two hours as the hacker took out a $50m USDT flash loan, then used the Curve Finance Y pool to swap funds and stretch stablecoin prices out of proportion.

The following actions took place within a 7-minute time period:

  1. Take a $50m USDT flash loan
  2. Swap 11.4m USDC to USDT -> causing USDT price to go up
  3. Deposit 60.6m USDT into Vault
  4. Exchange 11.4m USDT to USDC -> USDT price goes down
  5. Withdraw 61.1m USDT from Vault -> resulting in 0.5m USDT profit
  6. Rinse and repeat 32 times. (without any prior testing)
  7. Convert to renBTC and exit to BTC and ETH via Tornado Cash (a service that allows for making anonymous transactions on Ethereum, therefore covering the attacker’s tracks)

The attacker was able to withdraw more USDT at step 5 because of the changed USDT price. As the price of USDT was lower at the time of the withdrawal, their shares represented more USDT from the vault pool.

Approximately 4 cycles can fit into a 10m gas limit, and although the profit on each cycle was less than 1%, ~$500k per repetition adds up quickly.
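The per-cycle arithmetic from the steps above checks out (an illustrative calculation using the rounded figures; the headline $33.8m loss also includes other stablecoin vaults hit in the same attack):

```python
flash_loan = 50_000_000   # USDT borrowed per cycle
deposited  = 60_600_000   # USDT deposited into the vault at a skewed price
withdrawn  = 61_100_000   # USDT withdrawn after pushing the price back down
cycles     = 32

profit_per_cycle = withdrawn - deposited
print(profit_per_cycle)                       # 500000
print(round(profit_per_cycle / deposited, 4)) # roughly 0.0083, i.e. under 1%
print(profit_per_cycle * cycles)              # 16000000 from the USDT vault alone
```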

Flash loans are often used to manipulate prices, which allows for arbitrage where it would otherwise not be possible. To avoid flash loan price manipulation attacks, protocols should look to use reliable decentralised oracles.

Flash loans can also be used in other attack methods such as re-entrancy and front-running, or simply for arbitrage.

Arbitrage

Arbitrage refers to taking advantage of price differences between different markets in order to generate a profit. These types of opportunities are especially common in immature markets such as DeFi and crypto. Arbitrage opportunities tend to decrease as liquidity increases and the market becomes more efficient.
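As a back-of-the-envelope example (hypothetical prices and a 0.30% fee per leg; a real opportunity also has to clear slippage and gas costs):

```python
def arb_profit(capital, buy_price, sell_price, fee_rate=0.003):
    """Profit from buying an asset on a cheaper venue and selling it
    on a dearer one, paying a trading fee on each leg."""
    tokens_bought = capital * (1 - fee_rate) / buy_price
    proceeds = tokens_bought * sell_price * (1 - fee_rate)
    return proceeds - capital

# ETH at $1,750 on one venue, $1,768 on another (invented numbers)
print(round(arb_profit(10_000, 1750.0, 1768.0), 2))  # about 42: a ~1% gap barely clears two 0.3% fees
```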

If a pool is manipulated (with flash loans for example) to allow room for arbitrage, then this may also be considered an exploit, as liquidity providers can end up losing their funds, as was the case with Saddle Finance. 

During their launch in January 2021, at least three major arbs took over 7.9 BTC ($275,735) from the early liquidity providers within 6 minutes, despite the project’s claim to “have solved the problem of slippage”.

4.01 BTC $139,961 Jan-19-2021 04:06:54 PM +UTC

0.79 BTC $27,573 Jan-19-2021 04:08:46 PM +UTC

3.11 BTC $108,548 Jan-19-2021 04:12:37 PM +UTC

Although this was only arbitrage, the users still lost out, as Saddle Finance was unable to protect them from the arbitrageurs, who were simply buying and selling, within the limitations of the code.  

This brings us to one of the common questions regarding losing funds in DeFi:

“Was it a hack, or was it an exploit?”

DeFi is still such a new concept, and the entire industry is a live experiment, testing new ideas as we build a new financial system. This means that loopholes are often found in live code, and when these loopholes can be used to withdraw funds without forcibly manipulating anything, then perhaps it’s best called “an exploit”. 

However, this labelling could be applied to all hacks, as they can only operate with the code that has been written. Whether we call them a hack or an exploit, the end result is the same. If loopholes exist, then eventually someone will take advantage of them, and there is little we can do to stop this. 

Even security audits do not guarantee safety.

Audits 

rekt.news also ranks each hack and exploit on the rekt.news leaderboard, which shows not only how much was stolen from the protocol in dollar value at that time, but also who audited the protocol before the hack. 

If we look at the rekt leaderboard, we can see that the majority of hacked (or exploited) protocols actually had a security audit completed prior to the attack. This proves that an audit is not a guarantee of safety, and audit firms can also fail.

The leaderboard shows who audited the specific piece of code that was exploited. 

According to the leaderboard, the most notorious security companies are currently: Peckshield with 3 failed audits, and Certik and Quantstamp with 2 each.

Many of the most recent rekt.news articles covered audited protocols, showing that in the end, there is very little difference between an audited and unaudited protocol.

Users often make the mistake of believing that one security audit can cover an entire protocol forever. However, all DeFi protocols are full of moving parts, and even if a protocol is audited very thoroughly, a single small update can render the audit useless.

Summary 

So what do you think about hacks in DeFi? Have you ever been affected by any of them? 

Also, don’t forget to check out rekt.news for more content like this. 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

]]>
https://finematics.com/defi-hacks-explained/feed/ 0
The Truth About DeFi https://finematics.com/the-truth-about-defi/?utm_source=rss&utm_medium=rss&utm_campaign=the-truth-about-defi&utm_source=rss&utm_medium=rss&utm_campaign=the-truth-about-defi https://finematics.com/the-truth-about-defi/#respond Mon, 05 Apr 2021 14:48:14 +0000 https://finematics.com/?p=1312

Intro

So what is the current state of decentralized finance? Where are we with different scaling solutions on Ethereum? How about DeFi on other chains? And what is the most likely scenario when it comes to the future of DeFi? You’ll find answers to these questions in this article. 

And one more thing before we start, Finematics has recently hit 100,000 subscribers on Youtube! This is pretty amazing and I’d like to say a big thank you to all of you watching my videos and reading my articles. Also a special shout out to all of our Patrons who actively support this channel. If you’d like to become one of them and join our DeFi community you can check it out here.  

DeFi – State of Affairs 

Decentralized Finance, together with the whole cryptocurrency space, started 2021 with a bang. 

In just a few months the total value locked in DeFi grew from around $15B to an astonishing $45B. 

The volume on decentralized exchanges has also been at an all-time high with over $50B traded each month. 

On top of this, a lot of DeFi tokens have seen a significant increase in value which attracted even more people to this still very new space. 

We’ve also seen a lot of development. 

New projects popping up pretty much every day. 

Already existing protocols launching their new versions. 

Other well-known projects migrating or announcing their migration to different scaling solutions.  

We also had some big news. For example, Visa announced they will start settling transactions in USDC on Ethereum. This is amazing not only for Ethereum but also for the whole DeFi space in general. 

Despite all of this development, it seems like DeFi is at a crossroads. 

On one hand, we have DeFi protocols on Ethereum exploring multiple different scaling solutions to alleviate high transaction fees. On the other hand, we have other chains trying to attract both the new and the already existing DeFi projects. 

Scaling DeFi 

When it comes to scaling DeFi on Ethereum, we have quite a few options that are either already available or not too far from a full launch.

These options belong to one of the 2 categories: Layer 2 scaling and sidechains. 

Layer 2 scaling relies on the security of the main layer – the Ethereum blockchain. 

Sidechains rely on their own security models, usually by having a separate consensus mechanism. They can also have additional security guarantees that leverage the main layer, but despite this, they are usually considered less secure than Layer 2 solutions. 

One of the most discussed Layer 2 options, when it comes to DeFi, are rollups. 

Rollups provide scaling by executing transactions outside of Layer 1 but posting transaction data on Layer 1 which allows rollups to be secured by the main Ethereum chain. 

There are 2 types of rollups: optimistic rollups and ZK rollups. 

Optimistic rollups run an EVM-compatible virtual machine which allows for executing the same smart contracts as on Ethereum. Optimism and Arbitrum are currently the most popular options. 

Optimism has been partially rolled out to the Ethereum mainnet with a limited set of partners to ensure that the technology works as expected.

Synthetix has already migrated its staking module to Optimism which allows for minting sUSD and receiving staking rewards in a fast, cheap and secure way. 

Another big partner that has already announced its launch on Optimism is Uniswap with its long-awaited Uniswap V3.

Arbitrum, on the other hand, seems to be even closer to being fully launched on the Ethereum mainnet. They partnered with a few major DeFi projects like Augur and Bancor. 

ZK rollups, although faster and more efficient than optimistic rollups, do not provide an easy way for the existing smart contracts to migrate to Layer 2 – at least not just yet. 

With ZK rollups we have a few options available, mainly StarkWare and ZKSync. 

StarkWare-based rollups are already extensively used by projects such as DeversiFi, Immutable X and dYdX. 

ZKSync is working on an EVM-compatible virtual machine that will be able to fully support any arbitrary smart contracts written in Solidity. They are targeting August for their mainnet release.

Besides Layer 2 options, we have sidechains like the Matic PoS chain or the xDai chain. 

We’ve recently seen a lot of projects launching on both of these chains. This includes Sushiswap, Polkamarkets and Aave on the Matic PoS chain and Perpetual Protocol, RealT and Gnosis on xDai. Both chains are also integrated with Chainlink oracles. 

Matic, after its recent rebranding to Polygon, also aims at expanding its available scaling solutions by adding things like ZK and optimistic rollups on top of the already available ones: the PoS chain and the Plasma chains. 

Although Layer 2 scaling and sidechains can alleviate high transaction fees and increase both the throughput and speed of transactions they come with their own challenges. 

The biggest one, that can negatively affect DeFi, is the lack of smart contract composability between different scaling solutions.

Composability is clearly one of the most important characteristics of DeFi. When it comes to the Ethereum main chain, a single transaction can interact with multiple different DeFi protocols. For example, a smart contract can borrow funds on Aave, swap these borrowed coins to other ones on Uniswap and provide the swapped coins to a yield farming aggregator – all of this in one single Ethereum transaction. 

Although composability is still possible within one scaling solution, it would break if even one of these protocols is not available on this particular scaling solution. 

Continuing with our previous example, if Aave is only additionally available on Matic PoS and Uniswap is only available on Optimism, we wouldn’t be able to compose one transaction that calls both the Aave’s and Uniswap’s smart contracts in any way outside of the Ethereum main chain.

The next issue is the interoperability between different scaling solutions.

What if we borrowed funds on Aave on the Matic PoS Chain but later want to swap them using Uniswap on Optimism? At the moment, we’d have to withdraw them to the Ethereum main chain before being able to use them on Optimism. This, of course, adds a lot of friction, as some withdrawals can take a long time to be fully settled, especially when it comes to optimistic rollups. 

Having multiple scaling solutions also provides us with some challenges when it comes to the existing liquidity in DeFi. Instead of having a lot of liquidity available on the Ethereum main chain in a few major protocols, we’ll see the existing liquidity being split across the Ethereum main chain, multiple implementations of rollups and different sidechains. 

Fortunately, most of these challenges are solvable. 

We’ll probably see a lot of bridges between different scaling solutions which should help with reducing friction. On top of this, the other approach is to create a whole ecosystem that is interoperable by default. This is the approach that Polygon decided to go with. 

Also, it looks like most liquidity outside of Layer 1 will concentrate around a few most popular scaling options with the biggest number of high-quality DeFi protocols available. 

Despite the existing and future challenges, Ethereum is clearly the undisputed leader when it comes to DeFi. It offers credible neutrality, decentralization and a strong community, driven not only by the recent rise in the price of ETH but even more by the goal of building the future of finance and web3. 

When it comes to DeFi on other chains there are also a few options available. 

BSC And DeFi On Other Chains

Let’s start with Binance Smart Chain – yes, I know, BSC is not always even considered DeFi, but rather CeDeFi; you can find more about it in this article.

Nevertheless, BSC attracted a lot of users and trading volume in a very short period of time. 

BSC, as a fork of Ethereum, allows for deploying the same smart contracts as the ones already available on Ethereum. 1inch and Alpha Homora are some of the projects that decided to expand their reach and launch on BSC in parallel to Ethereum.  

Besides Binance Smart Chain, there are a lot of other chains that come with their own security models and different levels of decentralization. 

Many also put a lot of effort into building their own DeFi ecosystem, including Solana with its decentralized exchange – Serum, Avalanche with Pangolin and even Bitcoin where DeFi can be built on top of sidechains. 

The challenges with all of these blockchains are similar to the challenges of different scaling solutions on Ethereum, mainly the lack of composability and interoperability. 

Fortunately enough, there are also a few other projects that focus predominantly on these problems. 

Cosmos aims at creating “The Internet Of Blockchains” by leveraging its inter-blockchain communication protocol. 

Polkadot, on top of its parachains technology that allows for creating sovereign blockchains, also aims to make bridging to the external blockchains easier.

Thorchain focuses on cross-chain liquidity and decentralized exchange of assets between different blockchains.

With all of these options within the Ethereum ecosystem and outside of it, most people feel lost trying to answer the question: what will DeFi look like in the future? 

The Truth 

Although it seems like DeFi is at a crossroads, in practice we’re just experiencing an ongoing Cambrian Explosion of this nascent space. 

The truth is that, at this point, no one can be certain what the future of DeFi will look like.

What we can pretty much assume though is that every single potential solution will be explored. Some of them will survive and thrive while others will become irrelevant over time. 

DeFi on Ethereum with different Layer 2 options and sidechains; DeFi and CeDeFi on other chains; interoperability between blockchains; or maybe something completely new – it’s almost guaranteed that all of this will be tried, and after some time we’re going to end up with the most adopted and, hopefully, most decentralized working solution.

And even though some people may not like it, this is the beauty of DeFi and other open ecosystems like it. 

The exact details might still be quite vague, but the future of the whole DeFi space is brighter than ever. DeFi is here to stay. We may not be 100% sure of its final form, but it is almost certainly the future of finance. 

One thing that indicates which projects and ideas are here to stay, more than anything else, is the next, most likely inevitable, bear market. During this time, most bad ideas and projects without strong communities die off, and the remaining users concentrate around the best, long-lasting protocols. 

When it comes to individual DeFi projects, some of them will thrive and capture value across multiple scaling solutions or even across multiple blockchains. Others won’t be able to do it and new protocols will be able to steal their market share.

Because of this ever-changing space, it’s really important to keep learning and expanding our knowledge about DeFi, crypto, finance and technology in general. 

Fortunately, Finematics is here for you and will always deliver fresh and relevant content to keep you up to speed!

So what do you think about the future of DeFi? What will it look like in its final form? Or maybe there is no final form at all and the whole space will keep evolving?

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

]]>
https://finematics.com/the-truth-about-defi/feed/ 0
Uniswap V3 – New Era Of AMMs? https://finematics.com/uniswap-v3-explained/?utm_source=rss&utm_medium=rss&utm_campaign=uniswap-v3-explained&utm_source=rss&utm_medium=rss&utm_campaign=uniswap-v3-explained https://finematics.com/uniswap-v3-explained/#respond Tue, 23 Mar 2021 23:29:18 +0000 https://finematics.com/?p=1295

Intro

So what is the long-awaited Uniswap V3 all about? How is it different from V2? Will this be a game-changer when it comes to the Automated Market Makers space? And will it launch directly on Layer 2? You’ll find answers to these questions in this article. 

Uniswap 

Although Uniswap, as one of the core DeFi projects, doesn’t need much of an introduction, let’s quickly go through a few major points before we jump into V3. 

Uniswap, in essence, is a protocol for decentralized and permissionless exchange of tokens on the Ethereum blockchain. 

The initial version of Uniswap was launched in November 2018 and slowly started building user interest. 

In May 2020, at the beginning of DeFi Summer, Uniswap launched a second version of the protocol called Uniswap V2.

The main feature was the addition of ERC20/ERC20 liquidity pools on top of ERC20-ETH pools present in V1. 

In the second half of 2020, Uniswap V2 went through a period of parabolic growth and quickly became the most popular application on Ethereum. It also became pretty much a standard for Automated Market Makers (AMMs) making it one of the most forked projects in the whole DeFi space.

Less than a year since its launch, V2 has facilitated over $135B in trading volume – an astonishing number that is comparable with top centralized cryptocurrency exchanges. 

You can learn more about the full story behind Uniswap V1 and V2 in this article here.

Also, the concepts of liquidity pools and automated market makers are worth understanding. If you need a quick recap, here is an article.

V3

Just before releasing V2, the team behind Uniswap had already started working on a new version of the protocol, details of which were announced just now – at the end of March 2021. The team decided to launch Uniswap V3 on both the Ethereum mainnet and Optimism – an Ethereum Layer 2 scaling solution – targeting early May for the release.  

This was clearly one of the most anticipated announcements in DeFi’s history and it looks like V3 can completely revolutionise the AMM space.

So what are the main changes? 

Uniswap V3 focuses on maximising capital efficiency when compared to V2. This not only allows LPs to earn a higher return on their capital but also dramatically improves trade execution that can now be comparable or even surpass the quality of both centralized exchanges and stablecoin-focused AMMs. 

On top of this, because of better capital efficiency, LPs can create overall portfolios that significantly increase exposure to preferred assets and reduce their downside risk. They can also add single assets as liquidity to a price range that is above or below the current market price which basically creates a fee-earning limit order that executes along a smooth curve. 

This is all possible by introducing a new concept of concentrated liquidity – more on this in a second.

Besides this, V3 introduces multiple fee tiers and improves Uniswap Oracles.

Now, let’s go through some of the Uniswap V3 features one by one to understand them a bit better. 

Concentrated Liquidity 

Concentrated liquidity is the main concept behind V3. 

When LPs provide liquidity to a V2 pool, liquidity is distributed evenly along the price curve. Although this allows for handling all price ranges between 0 and ∞, it makes the capital quite inefficient. This is because most assets usually trade within certain price ranges. This is especially visible in pools with stable assets that trade within a very narrow range. As an example, Uniswap DAI/USDC pool only uses around 0.5% of capital for trading between $0.99 and $1.01 – a price range where the vast majority of trading volume goes through. This is also the volume that makes the majority of trading fees for the LPs. 

This means that in this particular example, 99.5% of the remaining capital is pretty much never used.

In V3, LPs can choose a custom price range when providing liquidity. This allows for concentrating capital within ranges where most of the trading activity occurs. 

To achieve this, V3 creates individualised price curves for each of the liquidity providers. 

Before V3, the only way to allow LPs to have individual curves was to create a separate pool per curve. Such pools, if not aggregated together, resulted in high gas costs whenever a trade had to be routed across multiple pools. 

What is important is that users trade against combined liquidity that is available at a certain price point. This combined liquidity comes from all the price curves that overlap at this specific price point. 

LPs earn trading fees that are directly proportional to their liquidity contribution in a given range. 

Capital Efficiency 

Concentrating liquidity offers much better capital efficiency for liquidity providers. 

To understand it better, let’s go through a quick example. 

Alice and Bob both decide to provide liquidity in the ETH/DAI pool on Uniswap V3. They each have $10,000 and the current price of ETH is $1,750. 

Alice splits her entire capital between ETH and DAI and deploys it across the entire price range (similar to V2). She deposits 5,000 DAI and 2.85 ETH. 

Bob, instead of using his entire capital, decides to concentrate his liquidity and provides capital within the price range from $1,500 to $2,500. He deposits 600 DAI and 0.37 ETH – a total of $1,200 – and keeps the remaining $8,800 for other purposes. 

What is interesting is that as long as the ETH/DAI price stays within the 1,500 to 2,500 range, they both earn the same amount of trading fees. This means that Bob is able to provide only 12% of Alice’s capital and still makes the same returns – making his capital 8.34 times more efficient than Alice’s capital. 
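We can sanity-check this with the concentrated-liquidity formulas from the Uniswap V3 whitepaper. The token amounts above are rounded, so the sketch below lands at roughly $1,200 rather than exactly (illustrative only):

```python
import math

def full_range_liquidity(capital_usd, price):
    # Alice's V2-style position: capital split 50/50 between DAI and ETH
    dai = capital_usd / 2
    eth = dai / price
    return math.sqrt(dai * eth)  # L = sqrt(x * y)

def capital_for_liquidity(L, price, p_low, p_high):
    # Token amounts needed to supply liquidity L only within [p_low, p_high]
    eth = L * (1 / math.sqrt(price) - 1 / math.sqrt(p_high))
    dai = L * (math.sqrt(price) - math.sqrt(p_low))
    return dai + eth * price  # total value in DAI terms

L_alice = full_range_liquidity(10_000, 1_750)
bob_capital = capital_for_liquidity(L_alice, 1_750, 1_500, 2_500)
print(round(bob_capital))              # ~1188, i.e. roughly $1,200
print(round(10_000 / bob_capital, 1))  # ~8.4x capital efficiency
```
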

On top of that, Bob is putting less of his overall capital at risk. In the quite unlikely scenario of ETH going to $0, both Bob’s and Alice’s entire liquidity would move into ETH. Although they would both lose their capital, Bob has put a much smaller amount at risk. 

LPs in more stable pools will most likely provide liquidity in particularly narrow ranges. If the $25M currently held in the Uniswap v2 DAI/USDC pool were instead concentrated between 0.99 – 1.01 price range in v3, it would provide the same depth as $5B in Uniswap v2 as long as prices stayed within that range. 

When V3 launches, the maximum capital efficiency will be at 4000x when compared to V2. This will be achievable when providing liquidity within a single 0.1% price range. On top of that, the V3 pool factory will be able to support ranges as granular as 0.02% – which translates to a maximum of 20,000x capital efficiency relative to V2. 
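These headline numbers can be reproduced with a simple formula: with the price sitting at the geometric middle of the range, a range position is 1 / (1 − (p_low/p_high)^(1/4)) times more capital-efficient than a full-range one. A quick sketch (an approximation, not the exact on-chain calculation):

```python
def max_efficiency(range_width):
    # Capital efficiency of a position spanning a price range of
    # `range_width` (e.g. 0.001 = 0.1%) vs a full-range, V2-style position
    ratio = 1 / (1 + range_width)    # p_low / p_high
    return 1 / (1 - ratio ** 0.25)

print(round(max_efficiency(0.001)))   # ~4000x for a 0.1% range
print(round(max_efficiency(0.0002)))  # ~20000x for a 0.02% range
```
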

Active Liquidity 

V3 also introduces the concept of active liquidity. If the price of assets trading in a specific liquidity pool moves outside of the LP’s price range, the LP’s liquidity is effectively removed from the pool and stops earning fees. When this happens, the LP’s liquidity shifts completely towards one of the assets and they end up holding only one of them. At this point, the LP can either wait until the market price moves back into their specified price range or they may decide to update their range to account for current prices. 
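The shift into a single asset can be seen from the same whitepaper formulas: clamping the market price to the position's range tells us what the LP holds at any given price. A hedged sketch using made-up numbers:

```python
import math

def position_assets(L, price, p_low, p_high):
    # Token amounts held by a position with liquidity L in [p_low, p_high];
    # outside the range the position sits entirely in one asset
    p = min(max(price, p_low), p_high)  # clamp the price to the range
    eth = L * (1 / math.sqrt(p) - 1 / math.sqrt(p_high))
    dai = L * (math.sqrt(p) - math.sqrt(p_low))
    return eth, dai

# A position in the 1,500-2,500 ETH/DAI range at different market prices:
print(position_assets(120, 1_000, 1_500, 2_500))  # all ETH, zero DAI
print(position_assets(120, 1_750, 1_500, 2_500))  # a mix of both
print(position_assets(120, 3_000, 1_500, 2_500))  # all DAI, zero ETH
```
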

Although it is entirely possible for a specific price range to end up with no liquidity at all, in practice this would create an enormous opportunity for liquidity providers to step in, provide liquidity to that range and start collecting all the trading fees. From a game-theory point of view, we should see a reasonable distribution of capital, with some LPs focusing on narrow price ranges, others focusing on less likely but more profitable ranges, and yet others choosing to update their price range whenever the price moves out of their previous one. 

Range Limit Orders

Range limit orders are the next feature enabled by concentrated liquidity. 

This allows LPs to provide a single token as liquidity in a custom price range above or below the current market price. When the market price enters into the specified range, one asset is sold for another along a smooth curve – all while still earning swap fees in the process. 

This feature, when used together with a narrow range, allows for achieving a similar goal to a standard limit order that can be set at a specific price. 

For example, let’s assume that DAI/USDC trades below 1.001. An LP can decide to deposit their DAI into a narrow range between 1.001 and 1.002. Once DAI/USDC trades above 1.002, the LP’s whole liquidity is converted into USDC. At this point, the LP has to withdraw their liquidity to avoid it automatically converting back into DAI once DAI/USDC drops back below 1.002. 

Multiple Positions

LPs can also decide to provide liquidity in multiple price ranges that may or may not overlap.

For example, an LP can provide liquidity to the following price ranges in the ETH/DAI pool:
- $2,000 between $1,500 and $2,500
- $1,000 between $2,000 and $3,000
- $500 between $3,500 and $5,000

Being able to enter multiple LP positions within different price ranges allows for approximating pretty much any price curve or even an order book. This also enables creating more sophisticated market-making strategies. 

Non-Fungible Liquidity 

As each LP can basically create their own price curve, the liquidity positions are no longer fungible and cannot be represented by well-known ERC20 LP tokens. 

Instead, provided liquidity is tracked by non-fungible ERC721 tokens. Despite this, it looks like LP positions that fall within the same price range will be able to be represented by ERC20 tokens either via peripheral contracts or through other partner protocols. 

On top of this, trading fees are no longer automatically reinvested back into the liquidity pool on LPs’ behalf. Instead, peripheral contracts can be created to offer such functionality. 

Flexible Fees

The next new feature is the flexibility when it comes to trading fees. Instead of offering the standard 0.3% trading fee known from Uniswap V2, V3 initially offers 3 separate fee tiers – 0.05%, 0.3% and 1%. This allows LPs to choose the pools according to the risk they are willing to take. The team behind Uniswap expects the 0.05% fee to be predominantly used for pools with similar assets such as different stable coins, 0.3% for other standard pairs like ETH/DAI and 1% for more exotic pairs. 

Similarly to V2, V3 can also enable a protocol fee switch where part of the trading fee would be redirected from LPs to UNI token holders. Instead of a fixed percentage like in V2, V3 allows the protocol fee to be set between 10% and 25% of LP fees on a per-pool basis. The switch will be off at launch, although it can be turned on at any time by Uniswap governance. 

Advanced Oracles

Last but not least is a significant improvement to the TWAP oracles introduced by Uniswap V2. V3 makes it possible to calculate any recent TWAP within the past ~9 days in a single on-chain call.

On top of this, the cost of keeping oracles up to date has been reduced by around 50% when compared to V2. 
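V3 oracles accumulate the logarithm of price (the tick) rather than the price itself, so a TWAP is just the difference of two stored accumulator values divided by the elapsed time, then exponentiated. A rough sketch with hypothetical observation values:

```python
def twap_price(tick_cum_start, tick_cum_end, elapsed_seconds):
    # Each tick is a 0.01% price step, so price = 1.0001 ** tick;
    # averaging ticks over time yields a geometric-mean TWAP
    avg_tick = (tick_cum_end - tick_cum_start) / elapsed_seconds
    return 1.0001 ** avg_tick

# Hypothetical cumulative-tick observations taken one hour apart,
# corresponding to an average tick of 75,000:
print(round(twap_price(0, 270_000_000, 3_600)))
```
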

These are pretty much all the main features behind Uniswap V3. 

What is interesting is that all of these features haven’t caused an increase in gas costs. Quite the opposite – the most common operation, a simple swap, will be around 30% cheaper than its V2 equivalent. 

Summary

It looks like Uniswap V3 can be a game-changer when it comes to AMMs. It basically combines the benefits of a standard AMM with the benefits of a stable-asset AMM – all of this while making capital way more efficient. This makes V3 a super flexible protocol able to accommodate a whole range of different assets.  

It will be interesting to see how V3 could affect other AMMs, especially the ones that V2 couldn’t previously compete with – for example, stablecoin AMMs like Curve. 

Crucially, V3 will also launch in parallel on Optimism. 

Optimism is an optimistic rollup-based Layer 2 scaling solution that enables fast and cheap transactions without sacrificing Layer 1’s security. At the moment, Optimism is partially rolled out and has started integrating with a few selected partners like Synthetix. 

Uniswap on Layer 2 should be able to attract even more users who might have been priced out by high gas fees on Layer 1.

Exchanges enabling withdrawals to Optimism would be another big step towards the quick adoption of V3 on Layer 2.

On top of the V3 launch, the imminent full launch of Optimism will clearly be another highly anticipated event.  

Besides this, the migration from V2 to V3 will be fully voluntary. In the case of the V1 to V2 migration, it took just over 2 weeks for V2 to surpass V1’s liquidity. It will also be interesting to see if Uniswap’s governance decides to further encourage LPs by voting in some kind of incentives only present in V3 – maybe another liquidity mining program? 

With the super high capital efficiency of V3, even if the existing liquidity is split between V2, V3 and V3 on Optimism, it should still be way more than enough to facilitate trading with low slippage across all of these 3 protocols. 

One challenge of V3 is that providing liquidity may become a bit harder, especially for less sophisticated users. Choosing the wrong price range may magnify the chances of being affected by impermanent loss, and it will be interesting to see the development of third-party services that could help with choosing optimal strategies for allocating liquidity. 

So what do you think about Uniswap V3? Will this be a game-changer in the AMM space? Will Uniswap on Optimism bring even more users to DeFi?

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

Finematics is also now participating in Round 9 of Gitcoin Grants based on the quadratic funding mechanism where even the smallest contribution matters. If you’d like to show your extra support, check out this link here.

]]>
Binance Smart Chain and CeDeFi Explained https://finematics.com/binance-smart-chain-and-cedefi-explained/ Thu, 04 Mar 2021 00:13:43 +0000

So what is Binance Smart Chain? How is it different from Ethereum? And what is CeDeFi all about? You’ll find answers to these questions in this article. 

First, let’s see how Binance Smart Chain came into existence. 

Binance Chain 

In April 2018, Binance – one of the biggest cryptocurrency exchanges – decided to launch their own blockchain: Binance Chain. 

The main idea behind Binance Chain was to create a high-speed blockchain able to support large transaction throughput.

To achieve this, the team behind Binance Chain chose the Tendermint consensus model with instant finality and instead of supporting multiple applications, decided to focus on its primary app – Binance DEX.

With DeFi on Ethereum flourishing and Binance DEX not getting as much traction as expected, Binance very quickly realised that the main feature missing from Binance Chain was the ability to run smart contracts and to allow other teams to deploy their own applications. 

At this point, Binance made an interesting decision. Instead of trying to add smart contract capabilities to Binance Chain and sacrificing its performance, they decided to launch another chain in parallel to Binance Chain and this is where Binance Smart Chain comes into play. 

Binance Smart Chain 

Binance Smart Chain launched in September 2020 and in contrast with Binance Chain, was fully programmable and supported smart contracts out of the box. 

If you’d like to better understand what smart contracts are and why they are so important, you can check this article here. 

Creating a completely new smart contract platform from scratch requires years of work and research. Instead of doing that, Binance decided to leverage users’ and developers’ familiarity with Ethereum and forked Ethereum’s go client – geth. 

Of course, forking Ethereum without making any changes wouldn’t make much sense, so Binance decided to optimise the new chain for low fees and higher transaction throughput by sacrificing decentralization and censorship-resistance properties of the network. 

This was achieved by replacing Ethereum’s Proof-of-Work consensus model with the Proof-of-Staked-Authority model and tweaking a few other parameters such as the block time and the gas limit per block. 

Before we jump into the details of Binance Smart Chain, let’s see why some properties of the network had to be sacrificed in the first place. We can understand this better by revisiting the famous Scalability Trilemma. 

Scalability Trilemma 

The Scalability Trilemma is a useful model, introduced by Vitalik Buterin, that helps with visualising what trade-offs have to be made when it comes to different blockchain architectures. 

Each blockchain has 3 core properties – security, scalability and decentralization – that cannot all be fully achieved at the same time. So in order to significantly improve one of these properties, the other ones have to be sacrificed.

Sharding is an attempt at solving this challenge at the base layer by splitting a blockchain into multiple smaller chains – “shards”. Sharding is one of the scaling approaches chosen by Ethereum and it’s one of the elements of the Eth2 upgrade. 

Unfortunately, sharding by itself cannot fully solve the trilemma and even sharded blockchains wouldn’t be able to process hundreds of thousands or even millions of transactions per second without sacrificing decentralization and security.  

This is also why the Ethereum community decided to use Layer 2 solutions that can dramatically increase the scalability of a blockchain without sacrificing other properties. 

It shouldn’t come as a big surprise that there were a lot of other projects popping up that, despite the Scalability Trilemma, decided to scale up by sacrificing the other 2 properties. One of the most notable examples was EOS.

This is also the approach that Binance Smart Chain decided to go with. 

Architecture 

Binance Smart Chain, instead of using a Proof-of-Work (PoW) or a Proof-of-Stake (PoS) consensus mechanism, uses a Proof-Of-Staked-Authority (PoSA) model. 

In this model, all transactions are validated by a set of nodes called validators. A validator can be either active or inactive. The number of active validators is limited to 21 and only active validators are eligible to validate transactions. 

Active validators are determined by ranking all validators based on the amount of BNB staked with them. The top 21 validators with the highest amount of staked BNB become active and take turns validating blocks. This set is recalculated once per day and stored separately on Binance Chain.

Besides staking BNB tokens themselves, validators can also encourage BNB holders to delegate their BNB tokens to them in order to receive a share of the validator’s transaction fees. 

On this note, all transaction fees on Binance Smart Chain are paid in BNB which is the native token of the chain, in a similar way to how ETH is native to the Ethereum blockchain. 

In contrast to Ethereum and Bitcoin, there are no block subsidy rewards on Binance Smart Chain. This means that the validators only receive the transaction fees paid in BNB and there is no other fixed reward per block. 

Although the PoSA consensus model allows for achieving a short block time and lower fees, it does so at a cost of decentralization and security of the network. 

First of all, a user cannot just start validating the state of the blockchain in the way they can in Bitcoin or Ethereum. 

On top of this, even if a user could join the network in a permissionless way and start validating transactions, they wouldn’t be able to do it for very long on consumer-grade hardware, as the state of Binance Smart Chain grows at a much higher rate than Ethereum’s.

Now, let’s see how the PoSA-based model allowed the Binance Smart Chain team to change the block time and the block gas limit.

The block time was reduced from around 13s on Ethereum to around 3s on Binance Smart Chain. This allows for higher transaction throughput and faster confirmation time, at a cost of having to store more data. 

If implemented on Ethereum, such a short block time would also increase the number of orphaned blocks, as there would not be enough time to propagate valid blocks across the network from multiple different geographic locations.

When it comes to Binance Smart Chain, however, this is not a problem as validators just take turns validating blocks. 

Block gas limit is another important parameter that we discussed in our article about the gas fees. This parameter basically decides how many transactions can fit into one single block. On Ethereum, miners have to come to a consensus and decide what value they want to set it to. 

Increasing the block gas limit, similarly to reducing the block time, increases the amount of data produced by the blockchain which makes it harder for individual users to run their own nodes. 

Again, this is not a problem on Binance Smart Chain as the 21 validators can just run their nodes on institutional-grade hardware when the state of the blockchain grows beyond what can be handled by consumer-grade hardware. 

At the time of writing this article, the gas limit per block is set to 12.5M gas on Ethereum and 30M on Binance Smart Chain. 

By knowing both the block time and the gas limit per block we can quickly calculate that the amount of data on Binance Smart Chain increases roughly at a 10-times faster rate than the state on the Ethereum blockchain. 

Currently, with an average block size of 40,000 bytes, Binance Smart Chain grows by around 1.15 GB per day which is around 420 GB per year. After a couple of years, this of course eliminates most of the consumer-grade hardware. 
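The arithmetic behind these growth figures is straightforward and can be checked in a few lines (a rough estimate using only the parameters quoted above):

```python
# Throughput in gas per second, using the block times and gas limits above
eth_gas_per_sec = 12_500_000 / 13  # ~0.96M gas/s on Ethereum
bsc_gas_per_sec = 30_000_000 / 3   # 10M gas/s on Binance Smart Chain
print(round(bsc_gas_per_sec / eth_gas_per_sec, 1))  # 10.4 - roughly 10x

# Chain growth, assuming an average block size of 40,000 bytes
blocks_per_day = 86_400 / 3  # one block every 3 seconds
gb_per_day = blocks_per_day * 40_000 / 1e9
print(round(gb_per_day, 2), round(gb_per_day * 365))  # 1.15 GB/day, 420 GB/year
```
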

Now as we understand a bit more about the Binance Smart Chain architecture, let’s see what CeDeFi is all about. 

CeDeFi

As we know, DeFi stands for decentralized finance. CeFi is the opposite of DeFi and as we can probably guess stands for centralized finance. CeDeFi is a term coined by the CEO of Binance that basically describes a mixed solution between centralized and decentralized finance which Binance Smart Chain is a good example of. 

So what are the benefits of such a solution? 

CeDeFi allows users to get a feel for using DeFi without paying high transaction fees. Low fees encourage users to play with multiple different DeFi protocols such as decentralized exchanges, lending protocols, liquidity aggregators, yield farming tools and others.

On top of this, CeDeFi makes users familiar with common DeFi tools like Metamask and block explorers. 

It also allows new teams to deploy their smart contracts for a fraction of the cost compared to what they would have to pay on the Ethereum blockchain. This way they can easily test and get feedback on their projects. Testing within an ecosystem with actual economic incentives usually works much better than just testing on a testnet. 

Binance Smart Chain and CeDeFi have recently started gaining a lot of popularity. This is mainly driven by the high transaction cost on Ethereum that priced out some of the users.

As we know, Binance Smart Chain is a fork of Ethereum and therefore allows for running exactly the same smart contracts as the ones on Ethereum. 

This allowed the network to quickly bootstrap its ecosystem by essentially either reusing or forking all popular Ethereum services and applications.

Users can connect to Binance Smart Chain based dApps by switching their network in Metamask. They can look up their transactions on bscscan.com, which is pretty much a copy of etherscan.io. They can trade on PancakeSwap – a fork of Uniswap. They can lend and borrow on Venus – a fork of Compound – and yield farm via Autofarm – a protocol that resembles Yearn Finance. 

Binance Smart Chain, similarly to Ethereum, also allows for creating new tokens using their BEP-20 standard – Ethereum’s ERC-20 counterpart.

Some Ethereum-based projects also quickly saw the opportunity for expanding their reach to Binance Smart Chain, at a minimal cost. 1Inch – a liquidity aggregator – has recently decided to also launch on Binance Smart Chain.

Summary

It’s clearly visible that Binance Smart Chain was able to gain quite a lot of traction and attract a decent number of users and trading volume in a very short amount of time. 

A decision to fork Ethereum and allow users and developers to interact with DeFi tools and protocols they are already familiar with was quite clever. 

The timing was also extremely good. The popularity of Ethereum, combined with most Ethereum scaling solutions still being in progress and a roaring bull market, resulted in high transaction fees that priced out smaller users and forced them to find a different option if they still wanted to participate in DeFi.

On top of this, Binance was able to leverage its position as one of the top cryptocurrency exchanges and make it easy for its millions of users to easily withdraw BNB and other tokens directly to Binance Smart Chain. 

The main question to ask here is whether this is short-term growth caused only by high transaction fees on Ethereum or longer-term user acquisition. 

At this point, it’s hard to say, but two main things pointing at the former are Ethereum’s layer 2 scaling solutions and the Eth2 scaling roadmap. 

Both of these can dramatically reduce the transaction fees on Ethereum without sacrificing other properties like security and decentralization. 

We can already get a feel for it with Matic (a.k.a. Polygon) and Loopring attracting more and more users and trading volume. This trend should only accelerate with other layer 2 solutions getting more traction and new ones like Optimism fully launching in a matter of weeks. 

With millions of new users entering the cryptocurrency space, it’s also extremely important to make sure they are aware of the differences between DeFi and CeDeFi and are able to make their own decisions. 

At the end of the day, we have to ask ourselves the question. What’s the main point of using a blockchain if it’s not fully decentralized and permissionless? Auditability? Maybe, but is this really the main value proposition of the whole cryptocurrency space?

It will clearly be interesting to see how DeFi and CeDeFi play out. 

So what do you think about Binance Smart Chain? Does CeDeFi have a future? 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

]]>
What Is Gas? Ethereum High Transaction Fees Explained https://finematics.com/what-is-gas-ethereum-high-transaction-fees-explained/ Sat, 20 Feb 2021 18:41:23 +0000

So what exactly is gas? Why are transaction fees so high at the moment? And what are some of the ways to make the transaction cost lower? You’ll find answers to these questions in this article. 

Let’s start with what gas actually is. 

What Is Gas 

Gas is a unit used for measuring the amount of computational effort required to perform specific actions on the Ethereum blockchain.

The name itself hasn’t been chosen by accident. Similarly to gasoline fueling a car and allowing it to drive, gas on the Ethereum network fuels transactions and allows them to perform different operations. 

Every operation on the Ethereum blockchain, or to be precise on the Ethereum Virtual Machine (EVM), has an associated gas cost. For example: adding 2 numbers costs 3 gas; getting the balance of an account – 400 gas; sending a transaction – 21,000 gas. 

Smart contracts usually consist of multiple operations that together can cost even hundreds of thousands of gas.

What is interesting is that the gas cost by itself doesn’t tell us how much we have to pay for a particular transaction. To calculate the transaction fee we have to multiply the gas cost by gas price. 

The gas price is measured in gwei – a smaller unit than ether where 1 gwei equals 0.000000001 ETH. We can think about it as a major and a minor unit similarly to dollars and cents. 

As an example, let’s say we want to send a simple Ethereum transaction and the ETH price is at $1,800. Most popular Ethereum wallets such as Metamask estimate the necessary gas price and allow us to choose between fast, medium and slow transaction confirmation speeds. Let’s assume that the wallet estimates the gas price at 100 gwei if we want a chance of having our transaction confirmed within the next minute. 

We can now quickly calculate that we have to pay $3.78 for such a transaction. We multiply the gas cost for sending a transaction – 21,000 gas – and the gas price – 100 gwei. This is equal to 2,100,000 gwei which is 0.0021 ETH. At the ETH price of $1,800, this gives us $3.78. 
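The calculation above can be expressed as a small helper (a sketch for illustration):

```python
GWEI_PER_ETH = 1_000_000_000  # 1 gwei = 0.000000001 ETH

def tx_fee_usd(gas_used, gas_price_gwei, eth_price_usd):
    # fee = gas cost * gas price, converted from gwei to ETH to dollars
    fee_eth = gas_used * gas_price_gwei / GWEI_PER_ETH
    return fee_eth * eth_price_usd

# A simple transfer (21,000 gas) at 100 gwei with ETH at $1,800:
print(round(tx_fee_usd(21_000, 100, 1_800), 2))  # 3.78
```
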

ETH Price And Gas

It’s worth mentioning that gas is an abstract unit that exists only inside the EVM and the user always pays for their transactions in ETH.

The main reason for having a separate unit for measuring computational effort is to decouple it from the price of ETH. 

This means that the increase in the ETH price should not change the cost of transactions. If the network activity stays the same and the price goes up we should see the gas price going down, so the final transaction cost measured in ETH stays the same in dollar value. 

Saying this, a price increase of ETH is very often correlated with an increase in the activity on the Ethereum network – something that indeed increases the cost of transactions. 

Now, let’s see how exactly an increase in network activity causes the transaction cost to go up. 

To start with – all transactions sent to the Ethereum network land in the mempool. This is a place where all pending transactions are waiting for the miners to pick them up and include them in the next Ethereum block. 

Miners are incentivised to pick up transactions with the highest gas price first as they are basically doing a fixed unit of work for a better price. 

Miners are also limited to how many transactions they can include in one single block. This is determined by the maximum gas limit per block. At the time of writing this article, this limit is set to 12.5M gas. 

As a quick example, let’s assume there are only simple ETH transactions in the mempool, each one costing 21,000 gas. A miner can include ~595 such transactions (12.5M/21K). If there are, let’s say, 1,000 pending transactions in the mempool, the miner would sort them all by gas price and choose the 595 most profitable ones. 

The current fee model is based on a simple auction mechanism and the users who want to have their transaction picked up by miners first have to essentially outbid other people for the space in a block. This in turn drives the gas prices up, especially at times when a lot of users have urgent transactions that they want to confirm. 
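The auction described above boils down to a greedy selection, which we can sketch as follows (a simplification of real miner behaviour, ignoring nonces and other block-packing subtleties):

```python
BLOCK_GAS_LIMIT = 12_500_000

def fill_block(mempool):
    # Greedy selection: highest gas price first, until the block is full
    block, gas_used = [], 0
    for tx in sorted(mempool, key=lambda t: t["gas_price"], reverse=True):
        if gas_used + tx["gas"] <= BLOCK_GAS_LIMIT:
            block.append(tx)
            gas_used += tx["gas"]
    return block

# 1,000 pending simple transfers bidding different gas prices (in gwei):
mempool = [{"gas": 21_000, "gas_price": 50 + i % 100} for i in range(1_000)]
print(len(fill_block(mempool)))  # 595 - only the best-paying ones fit
```
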

Why Do We Need Gas

To wrap up the gas explanation, it’s also important to understand why gas has to exist in the first place. EVM as a Turing-complete machine allows for executing any arbitrary code. Although this is one of the main reasons that makes Ethereum so powerful it also makes it vulnerable to the halting problem. The halting problem is the problem of determining, from a description of an arbitrary computer program and an input, whether the program will finish running, or continue to run forever.

Without gas, a user could execute a program that never stops, either by making a mistake in their code or just by being malicious. To prevent this, Ethereum introduced a gas cost associated with each operation that prevents a program from running forever and bringing the whole network to a grinding halt. 

Besides the gas price, each transaction also has a gas limit that has to be equal to or higher than the anticipated amount of computation needed to successfully execute that particular transaction. 

The EVM, before executing each operation within a transaction, checks if there is enough gas left for that operation. If there is not enough gas left, the whole transaction is reverted with an “out of gas” exception and all state changes are rolled back. The user still pays the transaction fee for the amount of work that has been done by the miner, even if the transaction fails. This is, again, to avoid attacks on the network. 

If the transaction consumes less gas than initially anticipated the remaining amount of gas is converted to ETH and refunded to the sender. 
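The metering, the “out of gas” revert and the refund can be sketched with a toy interpreter loop (purely illustrative – real EVM gas accounting is far more involved):

```python
class OutOfGas(Exception):
    pass

def execute(operations, gas_limit):
    # Deduct the cost of each operation; abort when the limit is exhausted
    gas_left = gas_limit
    for op, cost in operations:
        if cost > gas_left:
            raise OutOfGas(f"{op} needs {cost} gas, only {gas_left} left")
        gas_left -= cost
    return gas_left  # unused gas, refunded to the sender as ETH

ops = [("base_tx", 21_000), ("ADD", 3), ("BALANCE", 400)]
print(execute(ops, 25_000))  # 3597 gas left over and refunded
```
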

It’s also really important that all operations on Ethereum have the correct gas cost in relation to each other; otherwise, that could be another attack vector. One such attack took place in 2016 and resulted in a hard fork that repriced certain low-level operations. 

Now, as we know a bit more about gas, let’s have a look at the recent period of high transaction fees and a few solutions that can lower the transaction cost now and in the future. 

High Fees on Ethereum

With record volumes on decentralised exchanges, the highest total value locked on DeFi lending platforms, multiple yield farming opportunities available, and more and more NFTs being minted – the Ethereum network is as busy as ever. 

This popularity results in high demand for block space which in turn results in high transaction cost. 

It’s not uncommon anymore to pay more than $10 for a simple ERC20 transfer or $50-100 for a Uniswap transaction. This, of course, is not ideal as it makes it really hard for smaller players to participate in the Ethereum ecosystem. 

Fortunately, there are multiple solutions either already available or being actively worked on. Let’s go through some of the most important ones. 

Layer 2 Scaling and Eth2

Layer 2 scaling is a collective term for solutions that help with increasing the capabilities of the main Ethereum chain – Layer 1 – by handling transactions off-chain. Besides improving transaction speed and transaction throughput, layer 2 solutions can greatly reduce the transaction fees. 

Loopring is a good example of a decentralized exchange built on Layer 2 that is getting more and more popular. The exchange has recently hit $200M in total value locked and over $10M in daily trading volume.

Another project – Matic – that was recently rebranded to Polygon also hit over $200M in TVL on their Plasma+POS chain. 

A more general-purpose solution – Optimism – that is based on optimistic rollups is also being rolled out. This is important as it will allow DeFi smart contracts to interact with each other in a similar way to how they interact on Layer 1. 

One of the missing pieces that can increase the adoption of Layer 2 solutions even further is direct onboarding to Layer 2. This could decrease the cost of transactions even further as users would be able to transfer their ETH directly from an exchange to a Layer 2 solution like Loopring. 

If you want to learn more about Layer 2 Scaling check out this article here

Besides Layer 2 scaling, another solution that can decrease the transaction cost in the long run is Eth2, which introduces sharding and Proof-of-Stake. You can learn more about these concepts here

EIP-1559

EIP-1559 is another solution for optimising the transaction cost. 

Although the proposal will not have a direct effect on lowering the transaction cost, it will allow for optimising the fee model by smoothing fee spikes and limiting the number of overpaid transactions. This will make transaction fees more predictable.

From the timeline perspective, it looks like EIP-1559 could be implemented later in 2021. 

Here is a separate article that explains EIP-1559 in depth. 

Optimising Gas Usage

Besides using Layer 2 scaling solutions and waiting for other improvements, there are a few other tricks that can help us with lowering our transaction cost on Layer 1. 

First of all, if we don’t have any urgent transactions, we can try to find times of the day when the gas prices are the lowest. 

Besides this, we should always double-check the gas price estimated by our wallet against a separate reliable source such as https://ethgasstation.info/

Another trick, used by 1Inch exchange, allows for lowering transaction fees with CHI tokens. These tokens must be burned alongside the primary operation, which allows for reducing the total amount of gas spent in a transaction.

This can be achieved by leveraging an EVM mechanism that refunds gas when storage space is freed. When CHI tokens are minted, dummy smart contracts on the Ethereum network are created. Burning CHI destroys these contracts and results in a gas refund. 

Other Chains 

So how about other chains besides Ethereum? 

There is no doubt that the recent period of high transaction fees on Ethereum resulted in a few other chains capturing a meaningful amount of users and volume.

At this point, it’s hard to say how much of this will be a short-term play versus a longer-term user acquisition. 

Saying this, we have to keep in mind that some of these chains are not fully decentralized and permissionless. This basically creates a fake DeFi ecosystem that may be fun to play with but is actually not that much different from using a centralized exchange. 

So what do you think about gas and high transaction fees? What is your favourite way of lowering it? 

If you enjoyed reading this article you can also check out Finematics on Youtube and Twitter.

]]>