Traders will soon be able to bet on computer chip prices as AI drives costs skyward

CME Group sign above the former Chicago Board of Trade (CBOT) trading pit in Chicago, Illinois, USA on Thursday, November 13, 2025.
Christopher Dilts | Bloomberg | Getty Images
A new futures market for semiconductors will allow investors to hedge their AI investments by betting on the increasingly expensive price of computing power.
Contracts on CME Group’s new “computing futures market” will be based on Silicon Data’s graphics processing unit (GPU) price indices, the companies said Tuesday in a statement announcing the joint venture, which is still pending regulatory review.
The new market will allow investors to lock in the price of computing capacity based on GPU benchmarking, which can be used to hedge against rising GPU lease rates and other operational costs across sprawling AI infrastructure.
“GPU markets… have historically lacked standardized reference pricing,” said Carmen Li, CEO of Silicon Data, in the release. “The launch of computing futures is an important step toward providing AI developers, cloud providers and investors with more reliable tools for valuation, hedging and long-term planning.”
Futures markets are traditionally associated with basic commodities such as foodstuffs, metals, and petroleum products, but they have also emerged around key inputs in fast-growing technology sectors.
During the broadband boom of the late 1990s, Enron’s broadband services division sought to sell unused capacity on its fiber-optic network before the company’s collapse.
For semiconductors, Silicon Data sells customers access to proprietary price indexes similar to the consumer price index or the personal consumption expenditures price index. Its products include a standardized GPU price index, a RAM index, and estimates of GPU rental prices.
Wall Street doesn’t see demand for GPUs or more traditional central processing units (CPUs) slowing down anytime soon.
“Agentic AI requires entirely new CPU server racks that sit alongside GPU infrastructure to power the work of all these agents,” wrote analyst Shawn Kim of Morgan Stanley in a report on Monday.
“Future AI systems will resemble distributed systems consisting of racks of GPUs for dense model computation… [and] agentic CPU racks for orchestration, data processing, and tool execution,” Kim said.
Memory chip prices rose in the first quarter as artificial intelligence drove up demand for chips. While hyperscalers have increased capital spending across the board, executives have also expressed concern about a memory bottleneck that is driving up input costs.
Memory chip makers are forecasting huge profit margins this year and next as prices rise rapidly.