Gemma 3 Collection All versions of Google's multimodal Gemma 3 models, including QAT, in 1B, 4B, 12B, and 27B sizes. Available in GGUF, dynamic 4-bit, and 16-bit formats. • 55 items • Updated 3 days ago • 96
Kimi Linear: An Expressive, Efficient Attention Architecture Paper • 2510.26692 • Published Oct 30 • 114
Article Accelerating Qwen3-8B Agent on Intel® Core™ Ultra with Depth-Pruned Draft Models • Sep 29 • 21
Article Reachy Mini - The Open-Source Robot for Today's and Tomorrow's AI Builders • Jul 9 • 721
Brainstorm Adapter Models - Augmented/Expanded Reasoning Collection Adapters by DavidAU: splits apart the reasoning center(s) and multiplies them 3x, 4x, 8x, 10x, 20x, 40x+. Creativity+ / Logic+ / Detail+ / Prose+ ... • 202 items • Updated about 10 hours ago • 23