Microsoft Open-Sources New Version of Phi-4: Inference Efficiency Up 10x, Runs on Laptops
Jin10 reported on July 10 that, this morning, Microsoft open-sourced the latest member of the Phi-4 family, Phi-4-mini-flash-reasoning, on its official website. The mini-flash version continues the Phi-4 family's trait of small parameter counts with strong performance. It is designed for scenarios constrained by compute, memory, and latency, can run on a single GPU, and is suitable for edge devices such as laptops and tablets. Compared with the previous version, mini-flash adopts SambaY, an innovative architecture developed in-house at Microsoft, boosting inference efficiency by up to 10x and cutting average latency by 2-3x, a significant improvement in overall inference performance.
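For readers who want to try a small reasoning model of this kind on a single GPU or a laptop, the sketch below shows one common way to load and query it with the Hugging Face transformers library. The repository id "microsoft/Phi-4-mini-flash-reasoning", the chat-style prompt, and the generation settings are assumptions for illustration only; consult Microsoft's official model card for the exact identifier and recommended usage.

```python
# Minimal sketch: loading and prompting a small reasoning model locally
# with Hugging Face transformers. The repo id below is an assumption based
# on the family naming; verify it against Microsoft's model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-4-mini-flash-reasoning"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # place weights on the single available GPU, or CPU
)

# Chat-style prompt; a reasoning model typically answers step by step.
messages = [{"role": "user", "content": "Solve step by step: what is 37 * 43?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    outputs = model.generate(inputs, max_new_tokens=256)

# Print only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

On an edge device without a discrete GPU, the same sketch falls back to CPU via device_map="auto"; smaller context lengths and lower max_new_tokens keep memory use and latency manageable.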