Generative AI: NVIDIA releases Nemotron 3 Super, a 120B-parameter open-source hybrid Mamba-Attention MoE model that delivers 5x higher throughput for agentic AI. By ai-intensify, March 11, 2026.
AI News: Kyutai releases Hibiki-Zero, a 3B-parameter simultaneous speech-to-speech translation model trained with GRPO reinforcement learning without any word-level aligned data. February 13, 2026.
Generative AI: Liquid AI releases LFM2.5-1.2b-thinking, a 1.2B-parameter reasoning model that fits under 1GB on-device. January 21, 2026.
Generative AI: Microsoft Research releases OptiMind, a 20B-parameter model that transforms natural language into solver-ready optimization models. January 20, 2026.
Future Tech: LLMs have a lot of parameters. But what is a parameter? January 7, 2026.