Singapore MAS tokenization standards require overhaul to realize innovation potential
The Monetary Authority of Singapore’s (MAS) Project Guardian has been making waves in recent months for its progressive approach to digital assets.
For instance, MAS recently launched the world’s first live repurchase transaction using a digitally native bond on a public blockchain. This and other innovations mark a significant milestone in its approach to digital assets.
However, while optimistic about MAS’s developments, Ralf Kubli, a board member of the Casper Association, the organization responsible for overseeing Casper Network, highlights that this innovation isn’t without its challenges.
Kubli believes a critical yet often overlooked aspect of the tokenization process is the issue of standardization.
In an interview with CryptoSlate, he explained that current practices in asset tokenization primarily focus on digitizing the asset itself but neglect to incorporate the associated liabilities and cash flows into this digital transformation. This results in the creation of asset-backed tokens appended to blockchains, typically accompanied by a simple PDF outlining terms and conditions.
Kubli believes this approach, while seemingly efficient, still necessitates manual intervention for cash flow calculations, potentially leading to errors and discrepancies. He points out that this lack of transparency and verifiability in cash flows closely resembles the issues that precipitated the 2008 banking crisis. Further, Kubli argues that the key to averting a similar economic catastrophe is ensuring that cash flows are digitized, tokenized in a machine-executable format, and, crucially, standardized.
In the forthcoming interview, Ralf Kubli delves deeper into these challenges and explores the potential pathways to a more secure and efficient future in asset tokenization.
You’ve highlighted the lack of standardization in asset tokenization practices as a significant issue. Could you elaborate on the risks and challenges this presents, especially in the context of the Monetary Authority of Singapore’s recent initiative?
The recent announcement of the Project Guardian initiative from the Monetary Authority of Singapore is a great step toward showcasing the benefits that tokenization can engender. However, these tokenized assets still aren’t utilizing any standards that will make them both safe and interoperable across the entire financial ecosystem. The current projects do not define the payment obligations, meaning the cash flows of the financial instrument, in a machine-readable and machine-executable term sheet. Failing to do so means we still have the same risks that have plagued the financial industry for years.
As for challenges, it may take some time to get everyone to adopt the same standards, but if projects such as the one from MAS want to truly make progress, they need to do so.
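For illustration, a machine-readable term sheet is simply structured data whose fields a program can act on directly, rather than a PDF a human has to interpret. The sketch below shows what such a term sheet could look like for a fixed-rate bond; the field names and values are hypothetical and are not drawn from MAS, Project Guardian, or any particular standard.

```python
# A hypothetical machine-readable term sheet for a fixed-rate bond.
# Unlike a PDF, every payment obligation is a field that software can act on.
term_sheet = {
    "contract_id": "BOND-2025-001",     # illustrative identifier
    "contract_type": "fixed_rate_bond",
    "currency": "SGD",
    "notional": 10_000_000,
    "coupon_rate": 0.035,               # 3.5% per annum
    "coupon_frequency_months": 6,       # semi-annual coupons
    "issue_date": "2025-01-15",
    "maturity_date": "2030-01-15",
    "day_count_convention": "30E/360",
}
```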
You mentioned that tokenization platforms often overlook liabilities and cash flows. How critical is it to include these elements in the tokenization process, and what would be the ideal approach to achieve this?
As it stands, most tokenized assets don’t include algorithmic descriptions of their liabilities or cash flows. They simply tokenize a PDF version of a contract, meaning that humans still have to read, interpret, and process them manually and find the corresponding documents detailing the financial contract. This completely undermines the point of tokenization and doesn’t meaningfully move the financial industry forward.
Embedding cash flow logic in the smart contracts that represent these assets turns them into “Smart Financial Contracts” that are machine-readable, executable, and auditable. With these, we can truly enjoy the benefits that tokenization brings, allowing for much faster, more efficient, and more transparent finance.
Ultimately, the inclusion of cash flows and payment obligations in Smart Financial Contracts resolves the reconciliation problem both inside and between financial firms while allowing for systemic risk management.
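As a rough sketch of the idea (not any particular platform’s implementation), embedding the cash-flow logic means the token itself can enumerate every future payment obligation deterministically, so counterparties and auditors derive the same schedule without manual reconciliation. The class, field names, and simplified date handling below are assumptions made for illustration only.

```python
from dataclasses import dataclass
from datetime import date

def add_months(d: date, months: int) -> date:
    """Shift a date forward by whole months (simplified: day capped at 28)."""
    total = d.year * 12 + (d.month - 1) + months
    return date(total // 12, total % 12 + 1, min(d.day, 28))

@dataclass
class FixedRateBond:
    """Illustrative 'smart financial contract': terms plus executable cash-flow logic."""
    notional: float
    coupon_rate: float           # annual rate, e.g. 0.035 for 3.5%
    issue_date: date
    maturity_date: date
    payments_per_year: int = 2   # semi-annual coupons

    def cash_flows(self):
        """Enumerate every payment obligation as (due_date, amount) pairs."""
        flows = []
        coupon = self.notional * self.coupon_rate / self.payments_per_year
        step = 12 // self.payments_per_year
        due = add_months(self.issue_date, step)
        while due < self.maturity_date:
            flows.append((due, coupon))
            due = add_months(due, step)
        # final coupon plus redemption of the principal at maturity
        flows.append((self.maturity_date, coupon + self.notional))
        return flows

bond = FixedRateBond(10_000_000, 0.035, date(2025, 1, 15), date(2030, 1, 15))
for due, amount in bond.cash_flows():
    print(due.isoformat(), f"{amount:,.2f}")
```

Because the schedule is derived from the contract terms rather than from a document, an issuer, an investor, and an auditor running this logic independently arrive at the identical set of obligations.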
Drawing parallels to the 2008 banking crisis, you’ve suggested that a lack of transparency in cash flows can be hazardous. How can blockchain and tokenization technologies be leveraged to prevent such economic risks in the future?
By automating finance via tokenization, every company’s balance sheet can be audited almost in real time. Because the financial assets on these firms’ balance sheets are defined in a forward-looking way, static and dynamic “what if?” simulations can be conducted at any given time.
Firms will be able to see exactly where they stand in terms of liquidity and can easily model how they would fare under any conceivable economic conditions. This should effectively reduce the risk of events like the ones that led to the 2008 crisis, as well as more recent volatility and contagion that we have seen.
Understanding the current state of each financial contract on any firm’s balance sheet in an algorithmic and standardized form will also reduce the regulatory burden, allowing for effective and progressive regulation and systemic risk analyses across many firms.
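As a toy illustration of what forward-looking, algorithmically defined contracts enable, the sketch below projects annual net interest cash flows for a small book of contracts under a baseline and a stressed benchmark rate. The book, field names, and shock size are all invented for the example; they do not reflect any real balance sheet or regulatory scenario.

```python
from collections import defaultdict

# A toy balance sheet: positive notionals are assets (cash in),
# negative notionals are liabilities (cash out). Floating-rate positions
# reference a hypothetical benchmark rate.
book = [
    {"kind": "fixed",    "notional":  50_000_000, "rate": 0.04,   "years": 3},
    {"kind": "floating", "notional": -80_000_000, "spread": 0.01, "years": 3},
    {"kind": "fixed",    "notional":  30_000_000, "rate": 0.05,   "years": 2},
]

def project_flows(book, benchmark_rate):
    """Project annual net interest cash flows for each future year."""
    flows = defaultdict(float)
    for contract in book:
        if contract["kind"] == "fixed":
            rate = contract["rate"]
        else:
            rate = benchmark_rate + contract["spread"]
        for year in range(1, contract["years"] + 1):
            flows[year] += contract["notional"] * rate
    return dict(flows)

baseline = project_flows(book, benchmark_rate=0.03)  # "what if rates stay at 3%?"
stressed = project_flows(book, benchmark_rate=0.06)  # "what if rates jump to 6%?"

for year in sorted(baseline):
    print(f"year {year}: baseline {baseline[year]:>13,.0f}   "
          f"stressed {stressed[year]:>13,.0f}")
```

The same contracts, evaluated under different assumptions, immediately reveal how the firm’s liquidity position would change; no manual re-reading of term sheets is required.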
Do you view the Monetary Authority of Singapore’s move as a step towards addressing these tokenization challenges globally, or is it more of a localized effort? How can other regulatory bodies learn from this?
Many initiatives by the MAS are developed in collaboration with several regulators; therefore, whatever happens in Singapore with large international financial firms is of a global nature.
In your opinion, what does the future hold for the regulation of tokenized assets? How important is international cooperation in standardizing these practices?
Tokenized financial assets will revolutionize the way financial systems operate. You can think of it as upgrading the plumbing of capital markets. Tokenization is already happening with cash and cash equivalents on a large scale (deposit tokens, money market funds, T-Bills, etc.). For fund tokenization, many large players are investing heavily (the likes of Fidelity, Franklin Templeton, and KKR).
For debt, structured instruments, and derivatives, algorithmic definitions of the cash flows of the underlying financial instrument are a pre-condition for the successful adoption of infrastructure for tokenized financial assets.
A bond or a mortgage remains a bond or mortgage when it is tokenized. Therefore, the regulators should be happy to have DLT-enabled financial infrastructure, where it is much easier to track which party holds which obligation.
Without the cash flows inside the tokens representing debt, structured instruments, or derivatives, these tokens will remain dumb and will not provide the necessary efficiency in price discovery and post-trade automation.
What are some potential solutions or innovations you foresee that could address the standardization issue in asset tokenization?
A comprehensive set of open banking standards that algorithmically define how financial contracts interact. Combining tokenization with clearly defined standards can bring a new level of efficiency, transparency, and legitimacy to finance and businesses. Fortunately, standards already exist that can address these concerns, specifically those outlined by the Algorithmic Contract Types Universal Standards (ACTUS) Research Foundation. Implementing a structure like this is what tokenization needs if it is to be truly adopted.
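To make that concrete, ACTUS defines a taxonomy of standardized contract types, such as PAM (principal at maturity), together with a shared data dictionary of contract attributes, so any compliant engine can derive the same cash-flow events from the same terms. The snippet below only gestures at the idea: the attribute and event names loosely echo the ACTUS vocabulary, but the spelling, semantics, and schedule logic here are simplified assumptions and should not be read as the official specification.

```python
from datetime import date

# A PAM-style (principal-at-maturity) loan described with standardized
# attribute names. Names loosely follow the ACTUS data dictionary; treat
# the exact spelling and semantics as illustrative, not normative.
pam_contract = {
    "contractType": "PAM",
    "contractID": "LOAN-0001",
    "currency": "SGD",
    "notionalPrincipal": 1_000_000,
    "nominalInterestRate": 0.05,
    "initialExchangeDate": date(2025, 1, 1),
    "maturityDate": date(2028, 1, 1),
    "cycleOfInterestPayment": "P1Y",   # annual interest payments
}

def simplified_pam_events(c):
    """Heavily reduced event schedule: initial exchange, annual interest, redemption."""
    events = [("IED", c["initialExchangeDate"], -c["notionalPrincipal"])]
    interest = c["notionalPrincipal"] * c["nominalInterestRate"]
    for year in range(c["initialExchangeDate"].year + 1, c["maturityDate"].year + 1):
        events.append(("IP", date(year, 1, 1), interest))
    events.append(("MD", c["maturityDate"], c["notionalPrincipal"]))
    return events

for event_type, when, amount in simplified_pam_events(pam_contract):
    print(event_type, when.isoformat(), f"{amount:,.2f}")
```

Because every contract of a given type shares the same attributes and event semantics, two independent parties can process the same contract and reconcile to the same result, which is the standardization Kubli is pointing to.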
Do you believe the issues you’ve identified with tokenization are specific to stablecoins or indicative of a broader trend in the financial system?
The truth is that using stablecoins for payments brings little innovation to finance. Innovations in payment rails have been mistaken for innovations in finance; finance is the exchange of cash over time, while payments are the exchange of cash today.
DeFi currently consists primarily of over-collateralized lending, which will keep it a niche form of finance, since very few over-collateralized loans exist in the real world. DeFi loans need to be so heavily collateralized because DeFi is incapable of calculating the cash flows or liabilities of a loan without human intervention.
As I’ve said, to innovate and attract institutions, liabilities and cash flows must be tokenized, machine-executable, and, perhaps most importantly, standardized. With sound financial logic underpinning the blockchain-based tokenization we see today, DeFi can grow beyond its niche status into the revolutionary technology it aims to become.
What advice would you give to innovators and regulators in the blockchain space to address these challenges effectively?
For innovators, don’t just build another payment rail – that only creates another channel that needs to be independently audited. Instead, utilize smart financial contracts that can be audited via automation. This is the true innovation.
As for regulators, understand that embracing tokenization that follows agreed-upon standards will genuinely make your jobs much easier. All of these instruments and rails will be transparent and enforced by code. This means it won’t even be possible for companies to do things like overvalue positions or move liabilities, and if they somehow did, it would be completely visible.
Finally, what is your vision for the future of blockchain and tokenization in creating a more efficient, transparent, and stable financial ecosystem?
This is the first time in 60 years, since the introduction of computers in banks, that we can address and solve the main problems plaguing the banking and financial systems. By implementing open-source, algorithmic financial contracts, the financial world of tomorrow will work much more efficiently, and balance sheets will be reconcilable within minutes or hours, with reduced or eliminated instances of fraud.
Done correctly, blockchain can truly offer the reliability required to improve firm-wide risk management and make systemic risk management possible again. I think this is happening; it will just take a little longer to get everyone on board.