
Tokenization Engine

engine.bankex.com (Launching Q4 2018)

Asset tokenization has become a very promising field during the last few years, especially in finance and capital markets. We at BANKEX believe that the digitalization and tokenization of real-world assets is bound to become part of everyday practice. That being said, we understand that there are many technical and legal issues to be solved first, and that transaction cost and speed on Ethereum also need to be revisited. However, we leave the discussion of these issues out of this paper.

Tokenization, utility and security tokens

Tokenization is the process of converting rights to a real-world asset, such as a cash flow, real estate ownership, or a copyright, into tokens. There are two types of tokens: utility tokens and security tokens. Both can be applied to the same real-world asset; the difference between them lies in how they are used to derive a profit.

Let's take a vending machine as an example: if you hold utility tokens, you can use them to purchase the goods it sells. If you hold security tokens, you are a shareholder of the vending machine and receive a share of the profit it makes. A similar example is a real estate rental service: as a tenant, you can buy utility tokens and pay rent with them, while as an investor, you can purchase security tokens that entitle you to a percentage of the profit, proportional to the size of your token holding.
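
To make the distinction concrete, here is a minimal Python sketch of the vending machine example; the class and field names are purely illustrative and are not part of the BANKEX engine. Utility tokens are spent to consume the service, while security tokens entitle their holder to a proportional share of the profit.

    class VendingMachine:
        def __init__(self, price_in_utility_tokens):
            self.price = price_in_utility_tokens
            self.profit = 0  # accumulated profit, later shared among investors

        def buy_item(self, utility_balance):
            # Utility tokens: spent to purchase goods from the machine.
            if utility_balance < self.price:
                raise ValueError("not enough utility tokens")
            self.profit += self.price
            return utility_balance - self.price

        def dividend(self, security_tokens_held, total_security_tokens):
            # Security tokens: entitle the holder to a share of the profit.
            return self.profit * security_tokens_held / total_security_tokens

    machine = VendingMachine(price_in_utility_tokens=2)
    remaining = machine.buy_item(utility_balance=10)       # tenant/customer side
    payout = machine.dividend(security_tokens_held=50,
                              total_security_tokens=1000)  # investor side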

Initial Smart Asset Offering

An Initial Smart Asset Offering (ISAO) is the process that results in the issuance of tokens representing shares in a real-world asset. Unlike an ICO, which is aimed at raising investment for the development of products that do not exist yet, the point of an ISAO is to trade rights to an existing cash flow or to something that already has value in the real world.

To meet this requirement, the tokenization must maintain a strong connection to the real world with respect to the properties of the asset; this includes KYC (Know Your Customer) procedures, compliance with AML (Anti-Money Laundering) laws, confirmation and transfer of ownership, and an independent valuation of the asset.
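
As a rough illustration of how these requirements gate the process, the following Python sketch treats them as preconditions that must all hold before an ISAO can be launched; the field names are our own assumptions, not an actual BANKEX schema.

    from dataclasses import dataclass

    @dataclass
    class RealWorldChecks:
        kyc_passed: bool              # owner identity verified (KYC)
        aml_cleared: bool             # no Anti-Money-Laundering flags
        ownership_confirmed: bool     # title and transfer documents verified
        independent_valuation: float  # appraised value of the asset, 0 if missing

    def ready_for_tokenization(checks):
        # Tokenization may proceed only once every real-world check has passed.
        return (checks.kyc_passed
                and checks.aml_cleared
                and checks.ownership_confirmed
                and checks.independent_valuation > 0)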

All of this makes for a complicated process that we will illustrate in a simplified way with the diagram below, explaining the parties involved and the procedures that the process includes. As the diagram shows, there are three main roles in the tokenization process:

The Asset Owner is the owner of an asset in the physical world, e.g., a real estate owner. In order to tokenize their asset, they need to pass the KYC procedure and have the asset verified, receiving a certificate which is essentially a digital snapshot of the asset on the blockchain, confirming its existence in the real world, its value, and the fact that the Asset Owner does indeed own it.
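
A hypothetical sketch of such a certificate is shown below; the exact fields and hashing scheme are assumptions made for illustration, the point being that the snapshot can be serialized deterministically and its digest anchored on the blockchain.

    import hashlib
    import json
    from dataclasses import dataclass, asdict

    @dataclass(frozen=True)
    class AssetCertificate:
        asset_id: str           # e.g. a cadastral or registry identifier
        owner: str              # the verified Asset Owner
        appraised_value: float  # independent valuation in a reference currency
        issued_at: str          # timestamp of the verification

        def fingerprint(self):
            # Deterministic digest of the snapshot; recording this hash on-chain
            # proves the certificate's contents as of the verification date.
            payload = json.dumps(asdict(self), sort_keys=True).encode()
            return hashlib.sha256(payload).hexdigest()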

The Originator is where the end user, i.e., the Asset Owner, comes to have the rights to their asset tokenized and broadcast to the blockchain. The Originator is a service that carries out all the procedures required for tokenization, and it is responsible for compiling the Asset Profile. The Asset Profile is then run through verification; upon successful completion, the ISAO process can be launched and ERC-20 tokens issued.
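
The issued token behaves like any ERC-20 token. The real token is a Solidity smart contract on Ethereum; the simplified, in-memory Python sketch below only mimics the part of its interface relevant to the ISAO distribution, and the names are illustrative.

    class IsaoToken:
        # Simplified stand-in for the ERC-20 token issued at the end of the ISAO.
        def __init__(self, total_supply, issuer):
            self.total_supply = total_supply
            self.balances = {issuer: total_supply}  # issuer holds everything at first

        def transfer(self, sender, recipient, amount):
            if self.balances.get(sender, 0) < amount:
                raise ValueError("insufficient balance")
            self.balances[sender] -= amount
            self.balances[recipient] = self.balances.get(recipient, 0) + amount

        def balance_of(self, holder):
            return self.balances.get(holder, 0)

    token = IsaoToken(total_supply=1_000_000, issuer="originator")
    token.transfer("originator", "investor-1", 50_000)  # one step of the distribution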

After the ISAO process, the tokens are distributed among the token holders, and the Asset Owner receives the funds collected in ETH, paying the Originator a percentage or a fixed fee according to the agreement initially signed between them. Depending on the kind of token issued, holders can receive dividends, use utility tokens to pay for a service, or trade the tokens on a public or private marketplace.
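
As a worked example of this settlement (the percentage and the amounts are made up for illustration and are not actual BANKEX terms):

    raised_eth = 500.0           # funds collected from token holders, in ETH
    originator_fee_pct = 0.05    # or a fixed payment, per the signed agreement

    originator_fee = raised_eth * originator_fee_pct        # 25.0 ETH
    asset_owner_proceeds = raised_eth - originator_fee      # 475.0 ETH

    # For a security token, later dividends are proportional to the holding:
    profit_to_distribute = 40.0                             # ETH earned by the asset
    holder_tokens, total_tokens = 50_000, 1_000_000
    holder_dividend = profit_to_distribute * holder_tokens / total_tokens  # 2.0 ETH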

The building blocks described above are not meant to be particularly sophisticated or innovative: the main challenge is to make them fully secure and efficient in terms of gas consumption. We are going to provide well-tested smart contracts, supplied with documentation and sample applications, so that they are ready to be used for building any kind of tokenization service.