Tokenization is not yet an industrial technology
European banks issue more than 2 million structured products per quarter. Most tokenization platforms are still celebrating their first hundred. Everyone says tokenization will revolutionize finance — but if you compare it to how banks already issue structured products, one thing becomes clear: tokenization is still far from industrial scale.
What banks already do at scale
Large banks have built fully industrial issuance platforms over decades. Behind every certificate or autocall sits a pipeline that most people never see.
A Swiss private bank wants to issue 500 capital-protected notes on a basket of clean energy equities. Using a major structuring bank's platform, the notes can be structured, approved, ISIN-assigned, priced, and live on SIX within a single business day — without any bespoke engineering.
What tokenization looks like today
Most tokenization projects are built around a single-asset model: one real-world asset, one token, one smart contract deployed for that specific deal. The model is technically interesting, but it is far from operational scale.
**Traditional issuance platforms:**

- Template-driven product creation
- Hundreds of thousands of products live simultaneously
- Automated lifecycle management
- Integrated with trading infrastructure
- Standardized product descriptions (term sheets, PRIIP KIDs)

**Tokenization platforms today:**

- One asset → one smart contract
- Manual deployment per issuance
- Lifecycle managed externally or manually
- Limited trading venue integration
- No standardized on-chain product schemas
A tokenization platform tokenizes a €10M commercial real estate loan. Engineers write and deploy a bespoke smart contract for that single asset. When the loan is partially repaid, someone updates the contract manually. Coupon payments are tracked off-chain. This is not a pipeline — it is a craft operation.
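The contrast between craft and pipeline can be sketched in a few lines of Python. This is a hedged illustration of the template model behind industrial issuance, not any platform's actual API; all class and field names are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch: the template model behind industrial issuance.
# The payoff logic is engineered and audited once; each new product is
# just a parameter set, so marginal engineering cost approaches zero.

@dataclass(frozen=True)
class AutocallTemplate:
    payoff_logic: str = "autocall-memory-coupon"  # reused across all issuances

    def issue(self, underlying: list[str], barrier: float, maturity: str) -> dict:
        """Create a new product: no engineering, only parameterization."""
        return {"template": self.payoff_logic, "underlying": underlying,
                "barrier": barrier, "maturity": maturity}

template = AutocallTemplate()

# 500 capital-protected notes become 500 parameter sets, not 500 projects.
# Five illustrative notes with staggered maturities:
notes = [template.issue(["NESN.SW", "ROG.SW"], barrier=0.60,
                        maturity=f"202{y}-06-15")
         for y in range(5, 10)]
print(len(notes))
```

In the single-asset model described above, each of those notes would instead be a bespoke smart contract with its own deployment, audit, and manual lifecycle handling.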
The economic gap: what things actually cost
The tokenization industry rarely publishes cost-per-issuance data, which makes comparison difficult. But order-of-magnitude estimates from practitioners tell a clear story.
| Cost driver | Traditional (templated) | Tokenization (today) |
|---|---|---|
| Smart contract / product engineering | ~€0 marginal (template reused) | €20k–€100k+ (bespoke per deal) |
| Legal & structuring opinion | ~€2k–5k (standardized docs) | €20k–50k (novel legal territory) |
| ISIN / product registration | Automated (via NNA integration) | Manual or absent |
| Audit & security review | Covered (by platform certification) | €10k–30k (per smart contract) |
| Time to issuance | Hours to days | Weeks to months |
A €5M private credit note tokenized today might cost €80k–€150k in engineering, legal, and audit fees before a single investor sees it. The same structured note issued through a major bank's certificate platform costs €5k–€15k all-in, with a time-to-market measured in hours. The economics flip only when platforms achieve template-driven issuance where marginal cost per product approaches zero — as it does in traditional issuance. That point does not yet exist at meaningful scale.
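A back-of-the-envelope calculation makes the breakeven point concrete. The figures are illustrative assumptions drawn from the order-of-magnitude estimates above: a one-off platform build cost of €5M and a €2k marginal cost per templated product are invented for the sketch, not industry data.

```python
# Rough breakeven sketch using the order-of-magnitude figures above.
# All numbers are illustrative assumptions, not industry benchmarks.

TRADITIONAL_ALL_IN = 10_000   # midpoint of the €5k-15k traditional cost

def tokenized_cost_per_product(platform_build: float, n_products: int,
                               marginal: float = 2_000.0) -> float:
    """Average cost per product once issuance is template-driven:
    a fixed platform build amortized over n products, plus marginal cost."""
    return platform_build / n_products + marginal

# How many issuances before a template-driven tokenization platform's
# average cost drops to the traditional certificate platform's level?
PLATFORM_BUILD = 5_000_000    # assumed one-off industrialization cost
n = 1
while tokenized_cost_per_product(PLATFORM_BUILD, n) > TRADITIONAL_ALL_IN:
    n += 1
print(n)  # -> 625 issuances under these assumptions
```

Under these assumptions the economics flip only after several hundred templated issuances, which is exactly the volume today's bespoke, one-contract-per-asset model cannot reach.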
The regulatory wall: what traditional platforms handle automatically
Much of the operational cost in traditional structured product issuance is regulatory compliance — and most of it is automated. Tokenization platforms largely bypass this machinery, which limits who can buy, where products can be sold, and what reporting is required.
A tokenized product that lacks an ISIN cannot be held in most custodian systems, cannot be reported under MiFIR, and cannot be distributed to retail investors under MiFID. This limits most tokenized assets to a narrow universe of institutional or professional investors — a fundamental distribution constraint that has nothing to do with the quality of the underlying technology. The EU DLT Pilot Regime, live since March 2023, creates a genuine opening — but uptake has been slow and asset caps of €6bn per operator limit its industrial relevance for now.
The missing piece: self-described financial products
One of the most underappreciated promises of tokenization is that smart contracts can make financial products genuinely self-describing. Every term, every payoff rule, every lifecycle event could be encoded in the token itself — not in a PDF term sheet sitting in a document management system.
Here is what a minimal on-chain product schema for an autocall note might look like:
"productType": "AutocallNote",
"version": "1.2", // schema version for interoperability
"underlying": {
"type": "basket",
"components": ["NESN.SW", "ROG.SW", "NOVN.SW"],
"oracleAddress": "0xA3f...91c" // on-chain price feed
},
"maturity": "2027-06-15",
"currency": "CHF",
"autocall": {
"observationDates": ["2025-06-15", "2026-06-15"],
"triggerLevel": 1.00, // 100% of initial fixing
"redemptionPremium": 0.08 // 8% p.a. if called
},
"barrier": {
"type": "european",
"level": 0.60, // 60% capital protection
"observationType": "closing"
},
"lifecycle": {
"couponLogic": "memory", // missed coupons accumulate
"corporateActions": "adjusted",
"settlementAgent": "contract" // self-settling
},
"regulatory": {
"isin": "CH1234567890",
"kidHash": "0xb7e...44a", // PRIIP KID content hash
"targetMarket": "professional"
}
}
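To illustrate why such a schema matters, here is a minimal sketch of how a third-party risk engine might consume it: given a worst-of performance observed via the oracle, it decides whether the note is called, without knowing anything about the issuing platform. The field names follow the example schema above; the evaluation logic is a deliberately simplified assumption that ignores memory coupons, day counts, and corporate actions.

```python
# Minimal sketch: evaluating the autocall trigger from the product schema.
# Simplified assumption: annual observations, no memory coupons, no
# corporate-action adjustments.

product = {
    "autocall": {
        "observationDates": ["2025-06-15", "2026-06-15"],
        "triggerLevel": 1.00,
        "redemptionPremium": 0.08,
    },
    "barrier": {"type": "european", "level": 0.60},
}

def check_autocall(product: dict, date: str, worst_of_performance: float):
    """Return the early-redemption amount per unit notional if the note
    is called on `date`, else None. Performance is worst-of vs initial fixing."""
    ac = product["autocall"]
    if date not in ac["observationDates"]:
        return None
    if worst_of_performance >= ac["triggerLevel"]:
        years = 1 + ac["observationDates"].index(date)  # annual schedule assumed
        return 1.0 + ac["redemptionPremium"] * years
    return None

print(check_autocall(product, "2026-06-15", 1.05))  # called: 100% + 2 x 8%
print(check_autocall(product, "2026-06-15", 0.95))  # None: below trigger
```

Any engine that understands the schema can run this logic; that is the interoperability the schema is meant to buy.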
With a standardized schema, a tokenized autocall issued on SDX could be priced by a risk engine on a completely different platform, displayed correctly in a wealth management portal, and settled automatically at maturity — without any bilateral integration work. Today, each of those connections requires bespoke data mapping. The closest existing standards are ISDA's CDM for derivatives and ANNA's work on ISIN harmonization. Neither was designed for on-chain execution. A purpose-built financial product schema standard — governed by a credible industry body — remains an open problem.
The real missing piece: a secondary market that barely exists
Primary issuance infrastructure is only half the equation — and arguably the easier half. The secondary market for tokenized assets is nascent, fragmented, and in most cases functionally illiquid. A tokenized product that cannot be sold is not an asset. It is a locked position with a smart contract attached.
**Traditional secondary market:**

- Market makers quote continuously on SIX, Euronext
- Bid-ask spreads typically 0.1–0.5% for vanilla products
- Settled via established CSDs (Euroclear, SIX SIS) at T+2
- Visible in Bloomberg, Refinitiv, and all major data feeds
- Bookable in any custodian via ISIN

**Tokenized secondary market:**

- A handful of venues (SDX, Boerse Stuttgart Digital) with thin order books
- Spreads often 2–5%+ or no continuous quote at all
- Settlement fragmented across chains and custodians
- Not visible in standard financial data infrastructure
- Not bookable in most custodian systems without manual workarounds
Many tokenized assets are marketed as more liquid than their traditional equivalents — fractional ownership, 24/7 transferability, programmable settlement. But transferability is not liquidity. A token can be technically transferable on-chain while having zero willing buyers at any given moment. The infrastructure that creates genuine liquidity — continuous quoting, market making, data visibility, custodial integration — is largely absent.
Why the secondary market has not developed
An investor buys a tokenized real estate token representing a fractional interest in a commercial property in Zurich. The token is technically transferable 24/7. But when the investor needs liquidity six months later, there is no exchange quoting it, no market maker providing a bid, and no mechanism to find a willing buyer other than bilateral negotiation. The investor is in a worse position than if they had bought a traditional real estate fund with quarterly redemption windows — at least those come with a defined exit mechanism.
What would actually unlock secondary liquidity
Secondary market liquidity requires scale, and scale requires secondary market liquidity. Tokenization cannot buy its way out of this loop with technology alone — it requires regulatory intervention, institutional anchor commitments, or both.
The demand side: who actually buys tokenized products
Issuance infrastructure and secondary markets are only meaningful if buyers can access and hold what is issued. The demand side has three distinct segments, each with different barriers.
Tokenization's most compelling pitch to issuers is democratization — reaching a broader investor base at lower cost. But today's tokenized products are predominantly accessible only to professional or institutional investors, due to the regulatory and infrastructure gaps above. The technology promises to widen distribution; the current reality narrows it.
The counter-argument: is industrial-scale tokenization even the right goal?
There is a serious version of the argument that this entire analysis misses the point.
**This article's framing:**

> Tokenization should be judged against the industrial scale of traditional structured product issuance. Until it can match 2 million products per quarter with automated lifecycle management and liquid secondary markets, it has not delivered on its promise.

**The counter:**

> Tokenization is not competing with structured products. It is opening up asset classes — private credit, real estate, infrastructure, fund units — where traditional issuance infrastructure does not exist and where even hundreds of tokenized assets represent genuine market creation, not failure.
The counter-argument has real force. A tokenized private credit facility for an SME, or fractional real estate accessible to a retail investor, does not have a traditional structured product equivalent. The comparison to certificate issuance may be measuring the wrong thing.
At the same time, the counter-argument lets the tokenization industry off a hook it should be held to. If the technology genuinely makes issuance cheaper and easier, it should eventually reach industrial scale in every asset class — not just the ones where traditional infrastructure is absent. The fact that most activity concentrates in a narrow band of assets where there is no incumbent to compete with may reflect a technology that is not yet ready for competitive markets, rather than a deliberate strategic choice.
Both things can be true: tokenization is creating genuine value in underserved asset classes, and it is not yet capable of competing with traditional infrastructure in established ones. The question is whether the former eventually leads to the latter, or whether the two ecosystems remain permanently parallel.
The road ahead
The gap between today's craft operations and industrial-scale tokenization is not primarily a technology gap. The smart contract primitives exist. The gap is in standardization, secondary market infrastructure, regulatory integration, and distribution, all of which require industry coordination rather than individual innovation. That is a slower, harder problem. And it is the one that actually needs solving.
*Cost estimates are order-of-magnitude approximations intended to illustrate relative scale, not precise industry benchmarks.*