
Content tagged "mixture of experts (MoE) architecture"

Tencent Open-Sources Hunyuan-A13B: Dynamic Reasoning + MoE Architecture, 87.3% Accuracy on AIME, Surpassing OpenAI o1
AI妹 · 1 month ago

Tencent has recently announced the open-source release of its new language model "**Hunyuan-A13B**"…

Alibaba's Qwen3-Coder Becomes the World's Most Popular Open-Source AI Coding Model
AI妹 · 1 month ago

On July 24, the world's largest AI open-source community, Hugging Face, released the latest large…