AI Agents & Tokenized Assets: The Next Market Stack
How tokenization makes assets programmable and AI agents become the programs that operate on them. A look at the emerging infrastructure.
Over the last two years, tokenized treasuries have gone from a curiosity to a multibillion-dollar market. BlackRock's BUIDL fund passed $1 billion in assets in early 2025 and is now accepted as collateral on major trading venues. Franklin Templeton's on-chain government money fund has hundreds of millions under management and runs on public blockchains. Asset managers are no longer talking about experiments; they are shipping products.
At the same time, autonomous AI agents have moved from toy demos like AutoGPT to production systems that can plan, call APIs, and operate for long periods with little human input. Large hedge funds and asset managers are rolling out AI research assistants and workflow agents across their organizations.
Put these two trends together and you get a simple picture: tokenization makes assets programmable, and AI agents are the programs. The interesting question is not whether that happens, but how.
---
1. What tokenization changes in practice
Tokenization is not magic. At a minimum it does three concrete things:
1. Standardizes ownership and transfer
A token becomes the canonical representation of a claim on some real-world asset: treasuries, money market funds, private credit, GPU time, and so on. BUIDL's token represents shares in a short-term dollar liquidity fund; Franklin's BENJI token represents shares in a registered US government money fund.
2. Moves settlement and recordkeeping on-chain
Transfers, collateralization, and some corporate actions happen through smart contracts. BUIDL can now be posted as collateral on derivatives venues while still earning yield, something that would be operationally painful in legacy plumbing.
3. Exposes machine-readable state
Balances, transfers, and sometimes yield and pricing information are visible to code. That code can be simple scripts or fully autonomous agents.
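To make this concrete, here is a minimal sketch of reading token state with web3.py. The RPC endpoint, the addresses, and the trimmed ABI are placeholders for illustration, not references to any real deployment:

```python
# Minimal sketch: reading tokenized-fund state from a public chain.
# RPC URL, token address, and holder address below are placeholders.
from web3 import Web3

# Trimmed ERC-20 ABI: just the two read-only functions used here.
ERC20_ABI = [
    {"name": "totalSupply", "type": "function", "stateMutability": "view",
     "inputs": [], "outputs": [{"name": "", "type": "uint256"}]},
    {"name": "balanceOf", "type": "function", "stateMutability": "view",
     "inputs": [{"name": "owner", "type": "address"}],
     "outputs": [{"name": "", "type": "uint256"}]},
]

w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))  # placeholder RPC
token = w3.eth.contract(
    address=Web3.to_checksum_address("0x0000000000000000000000000000000000000000"),
    abi=ERC20_ABI,
)

supply = token.functions.totalSupply().call()
balance = token.functions.balanceOf(
    Web3.to_checksum_address("0x0000000000000000000000000000000000000000")
).call()
print(f"total supply: {supply}, holder balance: {balance}")
```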
Tokenization on its own mostly improves plumbing. The step change comes when software can reason over thousands of tokenized positions and act on them in real time.
---
2. What AI agents are actually good at
Autonomous agents are essentially loops, sketched in code after this list:
1. Observe the world (APIs, blockchains, news).
2. Form a plan.
3. Call tools (trade, rebalance, send messages, update logs).
4. Repeat until some goal is met.
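A skeletal version of that loop in Python, with the observe, plan, and act stages stubbed out as toy rules rather than a real planner:

```python
# Skeletal agent loop. The observe/plan/act bodies are toy stand-ins;
# in practice they would call data APIs, an LLM or rule engine, and venues.
from dataclasses import dataclass, field

@dataclass
class Agent:
    goal_yield: float           # stop once cash yield meets this level
    max_steps: int = 10         # hard cap so the loop always terminates
    log: list = field(default_factory=list)

    def observe(self) -> dict:
        # In practice: query APIs, read chain state, pull news.
        return {"yield": 0.048}

    def plan(self, state: dict) -> str:
        # In practice: an LLM or rule engine turns state into an action.
        return "hold" if state["yield"] >= self.goal_yield else "rotate"

    def act(self, action: str) -> None:
        # In practice: trade, rebalance, send messages, update logs.
        self.log.append(action)

    def run(self) -> list:
        for _ in range(self.max_steps):
            state = self.observe()
            self.act(self.plan(state))
            if state["yield"] >= self.goal_yield:  # goal met, stop
                break
        return self.log

print(Agent(goal_yield=0.045).run())
```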
Open source projects like AutoGPT showed the basic pattern, even if early versions were brittle and expensive. Newer commercial agents are explicitly designed to execute complex multi-step tasks without supervision.
In finance today, the most credible uses are:
- ingesting and summarizing large volumes of data
- monitoring positions and risk limits
- generating and explaining trade ideas
- automating routine workflows rather than making final investment decisions
Once assets themselves live in programmable containers (tokens), these same patterns extend from "analyze and suggest" to "analyze and act".
---
3. AI agents on top of tokenized assets
A rough stack for AI-driven tokenized markets looks like this:
3.1 Data layer
Agents need a clean description of each asset:
- what the token represents (fund units, treasuries, GPUs, receivables)
- what cash flows look like (daily yield, revenue share, rent)
- what the legal and regulatory constraints are (KYC, accreditation, region)
- where the token lives on-chain (addresses, supported networks)
Today this information is scattered across PDFs, marketing pages, and block explorers. Some registries are starting to normalize this for treasuries and money market funds, but coverage is still thin.
Without a reliable data layer, agents will hallucinate eligibility, misread risk, and misprice collateral.
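One way to picture the fix is a single machine-readable record per asset. The schema below is a hypothetical sketch; the field names and values are assumptions, since no such standard exists today:

```python
# Hypothetical machine-readable record for one tokenized asset.
# Field names and values are illustrative assumptions, not a standard.
from dataclasses import dataclass

@dataclass(frozen=True)
class TokenizedAsset:
    symbol: str                  # e.g. "BUIDL"
    underlying: str              # what the token represents
    cash_flow: str               # "daily_yield", "revenue_share", "rent", ...
    jurisdictions: frozenset     # where holders may be located
    investor_types: frozenset    # "qualified", "institutional", ...
    networks: dict               # chain name -> contract address

buidl = TokenizedAsset(
    symbol="BUIDL",
    underlying="shares in a short-term dollar liquidity fund",
    cash_flow="daily_yield",
    jurisdictions=frozenset({"US"}),          # assumption for illustration
    investor_types=frozenset({"qualified"}),  # assumption for illustration
    networks={"ethereum": "0x..."},           # placeholder address
)
```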
3.2 Execution layer
Once an agent understands assets, it can:
- route orders between tokenized funds based on yield, duration, and risk
- manage collateral, for example posting BUIDL to trading venues while monitoring loan-to-value ratios and margin calls
- rebalance between tokenized funds like BUIDL, Franklin's on-chain fund, and newer products as yields and liquidity change
- trigger workflow actions such as instructing custodians or recording movements in off-chain systems
Most of this is boring middle-office work today. AI agents plus tokenization turn it into software.
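A toy version of the routing rule, with made-up yields and a deliberately simplified eligibility check:

```python
# Toy yield router: pick the highest-yielding fund the holder may hold.
# Yields, jurisdictions, and investor types here are made up.
def pick_fund(funds, jurisdiction, investor_type, min_yield):
    eligible = [
        f for f in funds
        if jurisdiction in f["jurisdictions"]
        and investor_type in f["investor_types"]
        and f["net_yield"] >= min_yield
    ]
    return max(eligible, key=lambda f: f["net_yield"], default=None)

funds = [
    {"symbol": "BUIDL", "net_yield": 0.048,
     "jurisdictions": {"US"}, "investor_types": {"qualified"}},
    {"symbol": "BENJI", "net_yield": 0.047,
     "jurisdictions": {"US"}, "investor_types": {"retail", "qualified"}},
]
best = pick_fund(funds, "US", "qualified", min_yield=0.04)
print(best["symbol"] if best else "no eligible fund")
```

The point is not the routing itself; it is that yield and eligibility live in the same data structure, so the decision is cheap to audit.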
3.3 Compliance layer
The hard constraint is regulation. Tokens are often restricted to certain jurisdictions or investor types. Franklin's fund, for example, is a regulated 1940 Act fund with specific eligibility requirements, and many tokenized treasuries are limited to qualified or institutional investors.
Agents need machine-readable rules for:
- who is allowed to hold or trade a given token
- how concentration and liquidity limits apply
- what happens during corporate actions or redemptions
In highly regulated environments, this layer will probably be more "AI assisted" than fully autonomous for a long time.
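One plausible shape for those rules is plain data plus explicit reasons, so every block or approval can be explained to an auditor. The fields and limits below are assumptions; many tokens enforce similar checks on-chain via transfer allowlists:

```python
# Sketch: machine-readable transfer rules that return human-readable reasons.
# Rule fields and limit values are illustrative assumptions.
def check_transfer(rules, holder, amount):
    reasons = []
    if holder["jurisdiction"] not in rules["allowed_jurisdictions"]:
        reasons.append(f"jurisdiction {holder['jurisdiction']} not allowed")
    if holder["investor_type"] not in rules["allowed_investor_types"]:
        reasons.append(f"investor type {holder['investor_type']} not allowed")
    if holder["position"] + amount > rules["max_position"]:
        reasons.append("concentration limit exceeded")
    return len(reasons) == 0, reasons

rules = {
    "allowed_jurisdictions": {"US"},
    "allowed_investor_types": {"qualified", "institutional"},
    "max_position": 10_000_000,  # illustrative concentration limit
}
holder = {"jurisdiction": "SG", "investor_type": "retail", "position": 0}
ok, why = check_transfer(rules, holder, 50_000)
print(ok, why)  # False, with two reasons an auditor can read
```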
---
4. AI funding AI: tokenized compute as a case study
One of the clearest early intersections of AI and tokenization is AI infrastructure itself.
Several projects now tokenize GPU capacity and data center revenue:
- GAIB's $30 million deal with Siam.AI tokenizes GPU infrastructure for an Asian sovereign AI cloud.
- Compute Labs and NexGen Cloud have launched vaults that fractionalize high-end NVIDIA GPUs, with expected yields in the double digits.
- Protocols like USD.AI turn stablecoin liquidity into loans secured by tokenized GPUs, bridging DeFi capital into AI data centers.
Here the loop is tight:
1. Tokenization makes GPU capacity investable.
2. DeFi and RWA investors provide capital against these tokens.
3. Data centers deploy GPUs to train more AI models.
4. Those models in turn help manage portfolios and optimize the data centers themselves.
AI is not just trading tokens. It is helping allocate capital to the infrastructure that powers AI.
---
5. What breaks if we are lazy
There are real failure modes.
- Garbage data: if asset registries misstate eligibility or cash flows, every agent built on them is wrong at scale.
- Opaque models: when an agent moves collateral, someone must be able to explain why, to a risk committee and to a regulator.
- Regulatory mismatches: code that ignores jurisdiction and investor-type rules turns automation into violations at machine speed.
- Concentration and feedback loops: many agents reading the same data and chasing the same yields can synchronize, amplifying redemptions and margin spirals.
Tokenization plus AI is not automatically more stable than the current system. It is just more programmable.
---
6. Likely path over the next five years
Given current adoption, a plausible trajectory looks like this:
1. Phase 1: Human in the loop
Asset managers use AI assistants to summarize tokenized products, monitor positions, and prepare trade tickets that humans still execute.
2. Phase 2: Semi-autonomous execution
For limited, well-specified strategies (for example, keeping cash in tokenized treasuries above a yield threshold while respecting eligibility rules), agents execute directly with human monitoring and strict limits; a guard-rail sketch follows this list.
3. Phase 3: Full stack integration
Tokenized funds, collateral platforms, and custodians expose standardized APIs and event streams. AI agents manage liquidity, collateral, and even some corporate actions end to end, while regulators focus on model governance and data quality.
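For Phase 2 in particular, the useful design pattern is that the agent proposes and hard limits dispose. A minimal guard-rail sketch, with illustrative cap values and a hypothetical escalation hook:

```python
# Phase 2 guard rails: anything over the caps is escalated, never executed.
# Cap values and the execute/escalate hooks are illustrative assumptions.
class GuardedExecutor:
    def __init__(self, max_trade_usd=1_000_000, max_daily_usd=5_000_000):
        self.max_trade_usd = max_trade_usd
        self.max_daily_usd = max_daily_usd
        self.spent_today = 0.0

    def submit(self, trade, execute, escalate):
        too_big = trade["usd"] > self.max_trade_usd
        over_budget = self.spent_today + trade["usd"] > self.max_daily_usd
        if too_big or over_budget:
            escalate(trade)        # route to a human desk instead
            return False
        execute(trade)
        self.spent_today += trade["usd"]
        return True

guard = GuardedExecutor()
guard.submit({"symbol": "BUIDL", "usd": 2_500_000},
             execute=lambda t: print("executed", t),
             escalate=lambda t: print("escalated to human desk", t))
```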
BlackRock CEO Larry Fink has compared tokenization today to the internet in 1996, arguing that it could grow at the speed of the web era. If that is even roughly right, the real leverage will accrue to the systems that help machines understand and act on these new assets safely.
Tokenization makes financial assets legible to code. AI agents are the code that will read, reason, and execute on that legible layer. The window to define how that stack works in practice is open now, not in ten years.
© RWA Kernel