Concerns Raised About Accuracy of Google's TurboQuant Paper

The author, a researcher at ETH Zurich, claims that Google's TurboQuant paper, accepted at ICLR 2026, has serious problems in how it describes and compares against the prior RaBitQ vector quantization method, which the author developed during their PhD.

Why it matters

This issue highlights concerns about the accuracy of academic publications, especially when high-profile work fails to properly credit and compare against prior research.

Key Points

  1. TurboQuant and RaBitQ share a core methodological similarity, applying a random rotation (a Johnson-Lindenstrauss-style transform) before quantization, but TurboQuant does not properly acknowledge this
  2. The TurboQuant authors were aware of the details of RaBitQ before publication, but chose not to correct inaccuracies in their paper
  3. RaBitQ was the first work to combine random rotations with vector quantization and prove optimal theoretical guarantees, yet TurboQuant does not properly credit this
  4. Despite reviewer feedback, the TurboQuant authors declined to add a fuller discussion and comparison of RaBitQ in the final paper

Details

The author, Jianyang Gao, is a postdoctoral researcher at ETH Zurich and the first author of the RaBitQ line of work, published in 2024. RaBitQ proposed a high-dimensional vector quantization method and proved that it achieves asymptotically optimal error bounds. One of its key ideas is to apply a random rotation, similar to a Johnson-Lindenstrauss transform, to the input vector before quantization.

The author claims that TurboQuant, a paper from Google Research accepted to ICLR 2026, has serious problems in how it describes and compares against RaBitQ. Specifically, he argues that TurboQuant systematically avoids acknowledging the methodological similarity between the two approaches, even though the TurboQuant authors had a detailed understanding of RaBitQ beforehand.

The author argues that this omission is problematic because RaBitQ was the first work to combine random rotations with vector quantization and prove optimal theoretical guarantees in the same problem setting. Despite reviewer feedback and the author's direct requests, the TurboQuant authors declined to add a fuller discussion and comparison of RaBitQ in the final paper.
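To make the shared idea concrete, here is a minimal sketch of "random rotation before 1-bit quantization". This is an illustrative toy, closer in spirit to sign-based hashing than to either paper's actual algorithm or estimator; all names and the cosine-recovery formula are assumptions for illustration, not taken from RaBitQ or TurboQuant.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 128

# Random orthogonal rotation (QR of a Gaussian matrix). Like a
# Johnson-Lindenstrauss-style transform, it spreads a vector's mass
# evenly across coordinates with high probability, which is what makes
# crude per-coordinate quantization work well afterwards.
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))

def quantize_1bit(x: np.ndarray) -> np.ndarray:
    """Rotate, then keep only the sign of each coordinate (1 bit/dim)."""
    return np.sign(Q @ x)

def approx_cosine(x: np.ndarray, y: np.ndarray) -> float:
    """Toy estimator: each rotated coordinate acts like a random
    hyperplane test, so the fraction of agreeing sign bits relates to
    the angle between x and y (as in sign-based hashing)."""
    agree = np.mean(quantize_1bit(x) == quantize_1bit(y))
    return float(np.cos(np.pi * (1.0 - agree)))

x = rng.standard_normal(d)
y = x + 0.1 * rng.standard_normal(d)  # a nearby vector
true_cos = x @ y / (np.linalg.norm(x) * np.linalg.norm(y))
est_cos = approx_cosine(x, y)
```

With 1 bit per dimension, `est_cos` lands close to `true_cos` for nearby vectors; the theoretical contribution the author attributes to RaBitQ is proving that a rotation-plus-quantization scheme of this general shape attains asymptotically optimal error bounds, which this toy does not attempt to show.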
