Technology Innovation Institute (TII), Abu Dhabi, has released Falcon-H1R-7B, a 7B-parameter reasoning-specialized model that matches or exceeds many 14B to 47B reasoning models in math, code, and …
-
AI Tools
NVIDIA AI Releases Nemotron 3: A Hybrid Mamba-Transformer MoE Stack for Long-Context Agentic AI
NVIDIA has released the Nemotron 3 family of open models as part of a full stack for agentic AI, including model weights, datasets, and reinforcement learning tools. The family has …
-
AI News
Google Introduces T5Gemma 2: Encoder-Decoder Model with Multimodal Input via SigLIP and 128K Context
Google has published T5Gemma 2, an open family of custom encoder-decoder transformer checkpoints built from Gemma 3 pre-trained weights arranged in an encoder-decoder layout, then further pre-trained with the UL2 objective. It is released as pre-trained checkpoints only. The …
-
Generative AI
Zhipu AI Releases GLM-4.6V: A 128K-Context Vision-Language Model with Native Tool Calling
Zhipu AI has open-sourced the GLM-4.6V series, a pair of vision-language models that treat images, videos, and tools as first-class inputs to agents, not as afterthoughts on …
-
Generative AI
DeepSeek Researchers Introduce DeepSeek-V3.2 and DeepSeek-V3.2-Special for Long-Context Reasoning and Agentic Workloads
How do you get GPT-5-level reasoning on real long-context, tool-use workloads without paying the quadratic attention and GPU costs that typically make such systems impractical? DeepSeek Research introduces DeepSeek-V3.2 and …