A 30.5B-parameter mixture-of-experts (MoE) language model from the Qwen team, tuned for broad instruction following, reasoning, multilingual tasks, and agentic tool use.<metadata> gpu: A100 | collections: ["HF_Transformers"] </metadata>
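Since the model is described as an instruct-tuned model with agentic tool use (and is listed under the HF_Transformers collection), a request to it would typically be structured as a chat-style messages list plus an OpenAI-style tool schema. Below is a minimal sketch of that payload shape; the model ID `Qwen/Qwen3-30B-A3B-Instruct-2507` and the `get_weather` tool are assumptions for illustration, not details confirmed by this page.

```python
# Sketch of a chat request to a Qwen3-style MoE instruct model.
# MODEL_ID is an assumed repo ID; get_weather is a hypothetical tool.

MODEL_ID = "Qwen/Qwen3-30B-A3B-Instruct-2507"  # assumption, not from this page

# Conversation turns: a system prompt plus one user request.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What's the weather in Paris?"},
]

# One hypothetical tool the model may call during agentic use,
# in the common OpenAI-style function-calling format.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool name
            "description": "Return current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"}
                },
                "required": ["city"],
            },
        },
    }
]

def build_request(model_id, messages, tools):
    """Assemble a request payload in the shape most chat APIs expect."""
    return {"model": model_id, "messages": messages, "tools": tools}

request = build_request(MODEL_ID, messages, tools)
```

The same `messages`/`tools` structures are what chat-templating utilities (for example, a tokenizer's chat template in Transformers) consume when rendering the prompt for the model.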
Stars: 0 · Forks: 2 · Watchers: 0 · Open Issues: 0
Commits: 7