
Google reveals algorithms to address AI memory challenges; memory and storage stocks drop
Google (GOOG, GOOGL) revealed a set of new algorithms today designed to reduce the amount of memory needed to run large language models and vector search engines.
Shares of major memory and storage suppliers declined during early market action on Wednesday. Micron Technology (MU) was down 4%, Western Digital (WDC) slid 4.4%, Seagate Technology (STX) declined 5.6%, and Sandisk (SNDK) sank 6.5%. Separately, Sandisk revealed today it has entered into a private placement subscription agreement to make an equity investment in semiconductor firm Nanya Technology.
The algorithms introduced by Google include TurboQuant, Quantized Johnson-Lindenstrauss, and PolarQuant. TurboQuant is a compression algorithm aimed at reducing the memory overhead of vector quantization.
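Google's post does not include implementation details for TurboQuant, but the memory saving that vector quantization targets can be illustrated with a minimal, hypothetical sketch: each 4-byte float32 coordinate is replaced by a 1-byte integer code plus one shared scale factor per vector.

```python
# Illustrative sketch only -- not TurboQuant itself, whose method Google
# has not detailed here. Shows the basic idea of scalar quantization:
# store each coordinate as an int8 code (1 byte) instead of a float32
# (4 bytes), roughly a 4x memory reduction per vector.

def quantize(vector):
    """Map each coordinate to an integer in [-127, 127] plus one shared scale."""
    scale = max(abs(x) for x in vector) / 127 or 1.0
    codes = [round(x / scale) for x in vector]
    return codes, scale

def dequantize(codes, scale):
    """Approximately reconstruct the original coordinates."""
    return [c * scale for c in codes]

v = [0.12, -3.4, 2.75, 0.0]
codes, scale = quantize(v)
approx = dequantize(codes, scale)
# Each reconstructed value lands within one quantization step of the original.
assert all(abs(a - b) <= scale for a, b in zip(v, approx))
```

The trade-off, which schemes like TurboQuant are designed to manage, is that coarser codes save more memory but introduce larger reconstruction error.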
A mathematical technique known as the Johnson-Lindenstrauss Transform shrinks complex, high-dimensional data while preserving the essential distances and relationships between data points. Google's Quantized Johnson-Lindenstrauss builds on this transform, creating a high-speed shorthand that requires zero memory overhead.
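How the "zero memory overhead" property can arise is worth a sketch. One common way to apply a JL-style random projection without storing the projection matrix is to regenerate it on the fly from a fixed seed; whether Quantized Johnson-Lindenstrauss works this way is an assumption here, not something the article states.

```python
# Hypothetical sketch of a JL-style random projection. The projection
# matrix is never stored: it is regenerated deterministically from a seed,
# so the only persistent state is the seed itself.
import math
import random

def jl_project(vector, out_dim, seed=0):
    """Project `vector` down to `out_dim` dimensions using a seeded
    random Gaussian matrix, generated row by row as needed."""
    rng = random.Random(seed)
    scale = 1.0 / math.sqrt(out_dim)
    return [
        scale * sum(rng.gauss(0, 1) * x for x in vector)
        for _ in range(out_dim)
    ]

# Same seed -> same projection, so two machines can agree on the
# shorthand without ever exchanging or storing a matrix.
a = jl_project([1.0, 2.0, 3.0, 4.0], out_dim=2, seed=42)
b = jl_project([1.0, 2.0, 3.0, 4.0], out_dim=2, seed=42)
assert a == b
```

The Johnson-Lindenstrauss lemma guarantees that, for a large enough output dimension, pairwise distances between projected points are preserved up to a small multiplicative error with high probability.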
Finally, PolarQuant addresses the memory overhead problem by using polar coordinates. This allows LLMs to skip the data normalization step because it maps data onto a fixed, predictable “circular” grid where the boundaries are already known.
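The "fixed, predictable grid" idea can be made concrete with a small, hypothetical example (this is not PolarQuant's published algorithm): an angle always falls in [-pi, pi), so its code range is known in advance and no per-dataset normalization pass is needed to find the bounds.

```python
# Hypothetical sketch of angle quantization in polar coordinates -- not
# PolarQuant itself. Because every angle lies in the fixed range
# [-pi, pi), the quantization grid is known ahead of time, so no data
# normalization step is required.
import math

def polar_quantize(x, y, angle_bits=8):
    """Represent a 2-D point as an exact radius plus a quantized angle code."""
    bins = 1 << angle_bits
    r = math.hypot(x, y)
    theta = math.atan2(y, x)  # always in [-pi, pi]
    code = int((theta + math.pi) / (2 * math.pi) * bins) % bins
    return r, code

def polar_dequantize(r, code, angle_bits=8):
    """Reconstruct the point from the bin center of the angle code."""
    bins = 1 << angle_bits
    theta = (code + 0.5) * 2 * math.pi / bins - math.pi
    return r * math.cos(theta), r * math.sin(theta)

r, code = polar_quantize(1.0, 1.0)
x, y = polar_dequantize(r, code)
# With 8 bits for the angle, the reconstruction error is small.
assert abs(x - 1.0) < 0.05 and abs(y - 1.0) < 0.05
```

The angular error per point is bounded by half a bin width, and that bound holds for any input without inspecting the data first, which is the property the article attributes to PolarQuant's circular grid.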
"As AI becomes more integrated into all products, from LLMs to semantic search, this work in fundamental vector quantization will be more critical than ever," wrote Google research scientist Amir Zandieh and Vahab Mirrokni, a VP and Google Fellow, in a blog post.
Google plans to present TurboQuant at the International Conference on Learning Representations in Rio de Janeiro in April.