Google reveals algorithms to address AI memory challenges; memory and storage stocks drop
Google (GOOG, GOOGL) revealed a set of new algorithms today designed to reduce the amount of memory needed to run large language models and vector search engines.
Shares of major memory and storage suppliers declined in early trading on Wednesday. Micron Technology (MU) was down 4%, Western Digital (WDC) slid 4.4%, Seagate Technology (STX) declined 5.6%, and Sandisk (SNDK) sank 6.5%. Separately, Sandisk said today it has entered into a private placement subscription agreement to make an equity investment in semiconductor firm Nanya Technology.
The algorithms introduced by Google include TurboQuant, Quantized Johnson-Lindenstrauss, and PolarQuant. TurboQuant is a compression algorithm that optimally addresses the challenge of memory overhead in vector quantization.
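Neither the article nor the announcement details TurboQuant's internals, but the memory problem it targets can be illustrated with the simplest form of quantization: storing a float32 embedding as int8 codes plus a single scale factor. This is a generic sketch of the idea, not TurboQuant itself:

```python
import numpy as np

def quantize_int8(v):
    """Compress a float32 vector to int8 codes plus one scale factor."""
    scale = max(float(np.abs(v).max()), 1e-12) / 127.0
    codes = np.round(v / scale).astype(np.int8)
    return codes, scale

def dequantize(codes, scale):
    """Reconstruct an approximation of the original vector."""
    return codes.astype(np.float32) * scale

rng = np.random.default_rng(0)
v = rng.standard_normal(1024).astype(np.float32)

codes, scale = quantize_int8(v)
v_hat = dequantize(codes, scale)

print(codes.nbytes, v.nbytes)  # 1024 vs 4096 bytes: a 4x memory reduction
```

The trade-off is accuracy: rounding introduces an error of at most half a quantization step per coordinate, which is the overhead schemes like TurboQuant aim to manage more cleverly.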
The second, Quantized Johnson-Lindenstrauss, builds on the Johnson-Lindenstrauss Transform, a mathematical technique that shrinks complex, high-dimensional data while preserving the essential distances and relationships between data points. Google's variant creates a high-speed shorthand that requires zero memory overhead.
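The classic Johnson-Lindenstrauss idea the article describes can be sketched in a few lines: project 1000-dimensional points down to 128 dimensions with a random matrix, and pairwise distances survive approximately. This is the standard random-projection technique, not Google's quantized variant:

```python
import numpy as np

rng = np.random.default_rng(42)
d, k = 1000, 128

# Random Gaussian projection matrix, scaled so distances are preserved
# in expectation (the core of the Johnson-Lindenstrauss lemma).
P = rng.standard_normal((k, d)) / np.sqrt(k)

x = rng.standard_normal(d)
y = rng.standard_normal(d)

orig = np.linalg.norm(x - y)        # distance in 1000 dimensions
proj = np.linalg.norm(P @ x - P @ y)  # distance after projecting to 128

print(proj / orig)  # ratio close to 1: distance approximately preserved
```

Each projected point needs roughly 8x less storage here (128 vs. 1000 coordinates), while nearest-neighbor relationships remain approximately intact.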
Finally, PolarQuant addresses the memory overhead problem by using polar coordinates. This allows LLMs to skip the data normalization step because it maps data onto a fixed, predictable “circular” grid where the boundaries are already known.
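The polar-coordinate idea can be illustrated with a toy quantizer that splits a vector into 2-D pairs and snaps each pair's angle onto a fixed 256-bin grid whose boundaries (-π to π) are known in advance, so no per-vector normalization statistics are required. This is an assumed illustration; PolarQuant's actual construction is not described in the article:

```python
import numpy as np

BINS = 256  # fixed "circular" grid: angle boundaries are known up front

def to_polar_codes(v):
    """Convert 2-D pairs to (radius, quantized-angle) form."""
    pairs = v.reshape(-1, 2)
    r = np.hypot(pairs[:, 0], pairs[:, 1])
    theta = np.arctan2(pairs[:, 1], pairs[:, 0])  # always in [-pi, pi]
    codes = np.round((theta + np.pi) / (2 * np.pi) * (BINS - 1)).astype(np.uint8)
    return r, codes

def from_polar_codes(r, codes):
    """Reconstruct the vector from radii and angle codes."""
    theta = codes.astype(np.float32) / (BINS - 1) * 2 * np.pi - np.pi
    return np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1).ravel()

rng = np.random.default_rng(0)
v = rng.standard_normal(8).astype(np.float32)

r, codes = to_polar_codes(v)
v_hat = from_polar_codes(r, codes)

print(np.abs(v - v_hat).max())  # small: angle grid is coarse, radii are exact
```

Because every angle lands in the same fixed range regardless of the data, the grid can be baked in ahead of time, which is the property that lets a model skip data-dependent normalization.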
"As AI becomes more integrated into all products, from LLMs to semantic search, this work in fundamental vector quantization will be more critical than ever," said Google research scientist Amir Zandieh and Vahab Mirrokni, VP and Google Fellow, in a blog post.
Google plans to present TurboQuant at the International Conference on Learning Representations in Rio de Janeiro in April.