Tokenization is increasingly being framed by U.S. regulators as a natural evolution of financial market infrastructure rather than a disruptive break from existing systems. That message was reinforced on February 9, 2026, when a commissioner of the U.S. Securities and Exchange Commission outlined how blockchain-based systems could modernize securities markets without weakening investor protections.
Speaking at an industry forum focused on asset management, derivatives, and tokenized markets, Mark T. Uyeda described tokenization as part of a broader historical progression shaped by technology. He emphasized that financial markets have repeatedly adapted to new record-keeping systems, from paper certificates to electronic databases, and that distributed ledger technology may represent the next phase of that process.
Uyeda explained that tokenization involves migrating securities records from traditional centralized databases to blockchain-based systems that represent ownership and contractual rights directly on-chain. Under this model, digital tokens encode the rights and obligations associated with securities while maintaining a verifiable record of ownership and transaction history. He stressed that this approach does not change the legal nature of the instruments themselves.
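For illustration only, the core idea of a token that carries a verifiable ownership and transfer history can be sketched as a hash-linked record. This is a simplified toy model, not any real on-chain standard or anything Uyeda proposed; all names and fields here are hypothetical.

```python
import hashlib
import json
from dataclasses import dataclass, field

@dataclass
class TokenizedSecurity:
    """Toy model: a security record whose transfer history is
    hash-linked, so tampering with any past entry is detectable."""
    isin: str                 # identifier of the underlying instrument
    owner: str                # current holder of record
    history: list = field(default_factory=list)

    def transfer(self, new_owner: str) -> None:
        # Each entry is hashed together with the previous entry's hash,
        # chaining the records so history cannot be silently rewritten.
        entry = {"from": self.owner, "to": new_owner}
        prev = self.history[-1]["hash"] if self.history else "genesis"
        payload = json.dumps(entry, sort_keys=True) + prev
        entry["hash"] = hashlib.sha256(payload.encode()).hexdigest()
        self.history.append(entry)
        self.owner = new_owner

    def verify(self) -> bool:
        # Recompute every hash along the chain; any altered entry
        # breaks the link and verification fails.
        prev = "genesis"
        for e in self.history:
            body = {k: v for k, v in e.items() if k != "hash"}
            payload = json.dumps(body, sort_keys=True) + prev
            if hashlib.sha256(payload.encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

The sketch captures only the record-integrity property Uyeda highlighted; real tokenized systems distribute this ledger across many nodes and add consensus, permissioning, and the legal rights attached to each token.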
According to Uyeda, properly designed tokenized systems could improve security, transparency, and data integrity across issuance, trading, and post-trade processes. He highlighted the potential for clearer ownership records and improved visibility into shareholder positions, areas that have long presented challenges for market participants and issuers. Faster settlement cycles were also cited as a key benefit, with the potential to reduce friction, counterparty risk, and operational complexity.
Crucially, Uyeda made clear that tokenization does not create an exemption from existing securities laws. Tokenized instruments, he noted, remain subject to the same regulatory obligations as their traditional counterparts. Investor protection, disclosure requirements, and market integrity rules would continue to apply regardless of the underlying technology used to record or transfer ownership.
The regulatory approach described by Uyeda focuses on outcomes rather than specific technologies. He outlined a framework that favors technology-neutral rules supported by engagement tools such as public consultations, staff guidance, roundtable discussions, and limited exemptive relief where appropriate. This method, he suggested, allows regulators to study innovation in real market conditions without relying on enforcement actions to define policy direction.
Uyeda also underscored the importance of measured experimentation. Rather than rapid or wholesale shifts, he argued that incremental adoption and transparency are essential to ensuring that tokenization strengthens, rather than destabilizes, market infrastructure. He linked this cautious approach to the SEC’s statutory mandate to maintain fair, orderly, and efficient markets.
As interest in tokenized securities continues to grow among financial institutions, issuers, and technology providers, the remarks signal that regulators are open to modernization efforts that align with existing legal frameworks. Tokenization, as presented, is not a shortcut around regulation but a potential tool for improving how markets function in a digital environment.