Alibaba's Qwen3.5-397B Achieves #3 Position in Open Weights Model Rankings
Alibaba has released Qwen3.5-397B-A17B, an open-weight mixture-of-experts (MoE) model that ranks #3 among open models on the Artificial Analysis Intelligence Index, competing directly with frontier closed-source alternatives. With 397B total parameters and, per its A17B designation, roughly 17B activated per token, the release gives organizations capable of hosting large-scale local inference infrastructure a high-performance option.
Strong benchmark performance combined with fully open weights makes the model attractive to enterprises seeking alternatives to closed-source APIs. The MoE architecture is particularly relevant for local deployment because it enables efficient inference: a router selects only a small subset of expert parameters for each token, reducing per-token compute compared with a dense model of similar nominal size, even though all parameters must still be held in memory.
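As a rough illustration of why selective activation cuts per-token compute, the sketch below shows top-k expert routing in PyTorch. The layer sizes, expert count, and top_k value are illustrative placeholders, not Qwen3.5's actual configuration.

```python
# Minimal top-k mixture-of-experts routing sketch (illustrative dimensions only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model=64, d_ff=256, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)   # scores each expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):                                # x: (tokens, d_model)
        weights = F.softmax(self.router(x), dim=-1)      # routing probabilities
        top_w, top_idx = weights.topk(self.top_k, dim=-1)
        top_w = top_w / top_w.sum(dim=-1, keepdim=True)  # renormalize over chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = top_idx[:, slot] == e             # tokens routed to expert e
                if mask.any():
                    out[mask] += top_w[mask, slot:slot + 1] * expert(x[mask])
        return out

layer = TopKMoE()
tokens = torch.randn(4, 64)
print(layer(tokens).shape)  # each token ran through only 2 of the 8 experts
```

Only the routed experts execute for a given token, which is the property that lets a 397B-parameter MoE serve tokens at the cost of a much smaller dense model.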
This release is significant for the local LLM ecosystem because it demonstrates continued momentum in open-source model development at the cutting edge of performance. Organizations with the infrastructure to run 397B models can now self-host a genuinely competitive alternative, reducing dependency on external API providers and gaining full control over data and inference behavior.
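For teams evaluating self-hosting, a minimal sketch using the Hugging Face transformers API is shown below. The repository id is an assumption based on the release name in the post, and a model of this size would still require multi-GPU hardware since all expert weights must be resident even though only ~17B parameters are active per token.

```python
# Hedged self-hosting sketch; the repo id below is assumed, not verified.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen3.5-397B-A17B"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # shard layers across available GPUs
)

messages = [{"role": "user", "content": "Why host an open-weight model locally?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```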
Source: r/LocalLLaMA · Relevance: 8/10