Kimi Linear: An Expressive, Efficient Attention Architecture • Paper 2510.26692 • Published Oct 30, 2025
hxa07D RWKV-Transformer Hybrid series • Collection • The hxa07D family of hybrid models combines improved RWKV recurrent architectures with Transformer-based attention, designed for efficient long-context… • 6 items • Updated Jan 7