Tether launches on-device local medical AI: 1.7B small model surpasses models 16 times larger, eliminating reliance on the cloud

According to Beating's monitoring, the AI research team of USDT issuer Tether today announced the QVAC MedPsy series of medical language models: on-device medical AI designed for low-compute terminals such as smartphones and wearables, able to run without relying on cloud servers. Through an efficient architecture, the models deliver performance well beyond their size. The 1.7B-parameter version averaged 62.62 across seven closed medical benchmarks, exceeding Google's MedGemma-4B by 11.42 points, and beat MedGemma-27B, whose parameter count is nearly 16 times larger, in realistic clinical scenarios such as HealthBench Hard. The 4B-parameter version scored higher still at 70.54, surpassing the larger models outright while cutting inference token consumption by up to 3.2 times. The models are released in quantized GGUF format (about 1.2 GB for the 1.7B version), making them suitable for mobile and edge deployment.
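The ~1.2 GB figure for the 1.7B model can be sanity-checked with simple arithmetic. The announcement does not state which GGUF quantization level is used, so the bits-per-weight values below are assumptions drawn from common llama.cpp quantization schemes:

```python
# Back-of-envelope check of the reported ~1.2 GB GGUF size for the 1.7B model.
# The exact quantization scheme is not stated, so bits-per-parameter is assumed.

def estimate_gguf_size_gb(n_params: float, bits_per_param: float) -> float:
    """Rough GGUF file size in decimal gigabytes: params * bits / 8 bytes."""
    return n_params * bits_per_param / 8 / 1e9

# Approximate effective bits per weight (including quantization scales/metadata)
# for two common llama.cpp quantization levels:
for label, bpp in [("Q4_K_M (~4.8 bpw)", 4.8), ("Q5_K_M (~5.7 bpw)", 5.7)]:
    print(f"1.7B at {label}: {estimate_gguf_size_gb(1.7e9, bpp):.2f} GB")
```

At roughly 5.7 bits per weight the estimate comes out near 1.21 GB, which is consistent with the reported ~1.2 GB file size; a 4-bit quantization would land closer to 1.0 GB.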

This release challenges the traditional assumption that bigger models mean better performance. The gains come from staged medical post-training (supervised learning on clinical reasoning data plus reinforcement learning), which improves efficiency enough to enable genuine on-device privacy protection and low-latency inference. Tether CEO Paolo Ardoino said this allows medical AI to process sensitive data locally, in hospitals and on the device itself, without transmitting it to the cloud, thereby reducing costs, latency, and privacy risks. The company expects the approach to reshape medical AI infrastructure and promote localized deployment worldwide, especially in underdeveloped regions.
