WFE explains why tokenization has not “taken off” yet

“The fundamentals of tokenization and the infrastructures these assets trade on need to be better understood. Regulation in this area should reflect that tokenization is a natural evolution in the financial industry, rather than a drastic break from the norm. Its usage is suitable in particular environments and for particular assets, but in these cases, market participants can reap great benefits.”

The World Federation of Exchanges (WFE) has published a paper offering a reality check on tokenization: the industry should neither curtail further development nor run blindly towards it.

The paper, “Demystifying Tokenisation: Embracing the Future”, was published as the European Commission discusses the tokenization of assets with stakeholders.

Yay: fractional ownership, liquidity, trust

Tokenized traditional assets should be viewed as nothing more than a modernized, innovative iteration of traditional finance, the WFE argues, pointing to benefits that could make tokenization a natural next step for financial markets:

  • Fractional ownership, allowing multiple investors to own a share of an asset and thereby lowering the capital required for individuals to invest in high-value assets;
  • Enhanced liquidity, arising from fractionalization, which opens up investments that may otherwise have been out of reach for many; and
  • Enhanced trust, which, together with the above, could lead to greater financial inclusion, diversification and, ultimately, economic growth.

Nay: 24/7 trading, disintermediated models, instantaneous settlement

The trade association of publicly regulated stock, futures, and options exchanges, as well as central counterparties, also reminded us that some of the supposed benefits are exaggerated or, frankly, do not exist at all:

  • Continuous 24/7 trading, if truly needed, can be achieved without tokenization;
  • Disintermediated models face conflicts of interest; and
  • Instantaneous settlement in tokenized trading may have unpredictable timing, affecting market liquidity and trading costs, especially if assets and funding need to be blocked prior to execution.

“Tokenization is a natural evolution in the financial industry”

James Auliffe, Manager, Regulatory Affairs at the WFE, said: “The fundamentals of tokenization and the infrastructures these assets trade on need to be better understood. Regulation in this area should reflect that tokenization is a natural evolution in the financial industry, rather than a drastic break from the norm. Its usage is suitable in particular environments and for particular assets, but in these cases, market participants can reap great benefits.”

The WFE argued that tokenization should be seen as a creative, modern version of traditional finance that offers new possibilities to investors and market players, although current limitations mean it won’t be right for every type of asset. The WFE explained why tokenization has not yet “taken off” in traditional markets:

  • DLT has limitations, particularly in high-volume trading environments: the technology is currently not fast enough to execute and settle all the trades running through a highly active exchange at any given moment. There are also other constraints, such as the storage demands of maintaining a distributed ledger.
  • The proliferation of different DLT implementations has produced a fragmented infrastructure, with tokenized assets managed on different blockchains that are not interoperable. Financial institutions would have to build connections to each platform, incurring significant operational costs and challenges, so efficiency gains would be only marginal in certain markets, particularly those that are already liquid.
  • Implementing DLT involves significant upfront costs. Moving to the new technology and building the relevant infrastructure is a capital-intensive investment, and these costs would be felt across the market, from infrastructure providers to market participants and end users. Even then, there may not be sufficient demand if customers lack the necessary infrastructure or capital to invest.
  • The lack of regulatory certainty persists, although it is improving thanks to the efforts of regulators and industry. Most jurisdictions’ bodies of law do not yet account for tokenized assets, leaving firms apprehensive. Anything that is, in effect, a financial instrument should be treated in the same way, regardless of whether it is tokenized.
  • The lack of adoption further inhibits tokenization: without widespread use there are no network effects, and there is little value for firms and exchanges in updating their technology stacks to incorporate tokenized assets.



Financefeeds.com