Found 70 repositories (showing 30)
meituan-longcat
A flagship 560-billion-parameter open-source MoE model that advances Native Formal Reasoning in Lean4 for Mathematics Formalization and Proving through agentic tool-integrated reasoning.
moe-serifu-circle
Anime-themed personal assistant and goal-oriented intelligent agent
zhaozijie2022
Official implementation of the paper "Learning and Planning Multi-Agent Tasks via a MoE-based World Model"
guyulongcs
Awesome papers on Large Language Models (LLMs), focusing on state-of-the-art LLM methods such as algorithms, systems, SFT, RL, multi-modal LLMs, MoE, quantization, and applications (RAG, agents, coding).
Recursive Multi-Agent Mixture-of-Experts (R-MoE) for Autonomous Clinical Diagnostics
boluo2077
🎭 MoE Agent: Replace Multi-Agent with single-prompt AI architecture | RAG + SQL + Tool calling without routing | Zero context loss, 10x cost reduction | LLM Prompt engineering for AGI | Production-ready template | Solve ambiguous queries & cross-domain reasoning
AaronCWacker
MoE Mode with Mixture of Experts to Support Health Care Scenarios in Multi Agent Systems
AryasKeeper
Enterprise-grade AI agent infrastructure with ReAct loops, multi-agent coordination, memory management, and secure tool execution. Features zero-trust security, intelligent MoE routing, comprehensive observability, and production-ready deployment capabilities.
wuulong
An ADK agent for the Environmental Protection Administration (環保署) open data API
MoebiusSolutions
Standardized install scripts for cac-agent
keithofaptos
MoE-Diffusion Multimodal World Models for Cognitive Multi-Agent Systems, with H7 gated coherence.
pemartins1970
B.E.N. (Bridge Engine for Native CFML) — A CFML-MCP Native System. MCP+MoE+LLM agent and an API key management and rotation system.
TSNTheSilentNinja
A Mod for DDLC in which a special agent is transported into the body of the main character in order to save the multiverse from an unknown evil. With his expertise, he will find out what is happening in the multiverse and put an end to it once and for all! ...Well, at least that's what he'll try to do. I mean he DID get sent to the most dangerous Earth imaginable, where the source of the problem is located. This Earth happens to be full of incredibly cute girls! Will he be able to save the girls from the lurking evil, or will he fall to the evil that terrorizes this Earth? This Mod was made to be played after the normal game which can be found at https://ddlc.moe/
heichaowo
A Go-based daemon for automated BGP peering on DN42. Manages WireGuard tunnels, BIRD routing configuration, and real-time metrics—all orchestrated by a central Control Plane.
ianshank
No description available
qnftk020
MoE (Mixture of Experts) multi-agent app prototype generator powered by Gemini CLI + Claude Code CLI
ry-ops
Autonomous multi-agent AI platform for GitHub repository lifecycle management. MoE routing, self-healing workers, LLM mesh gateway, RAG-enhanced context, enterprise governance. [Archived]
Livingstone99
Production-ready Flutter package for building intelligent agent-based applications. Features single agents with tools, Mixture of Experts (MOE) architecture, 4 LLM providers (OpenAI, Claude, DeepSeek, Gemini), intelligent routing, response synthesis, and pluggable architecture for custom providers. MIT Licensed.
tayyaba034
An AI-powered data science assistant built on a Mixture-of-Experts (MOE) framework. The chatbot interface triggers specialized agents for data collection, preprocessing, analysis, feature engineering, model training, evaluation, and optimization. Implemented using OpenAI SDK and Xero MCP Server, with Google AI Studio API integration.
mf2023
A high-performance multimodal Mixture-of-Experts (MoE) model featuring the Yv Architecture, supporting text, image, audio, video, document, and agent understanding. PiscesL1 (PiscesLx series, Dunimd Team) is designed for research and practical applications, capable of running on a single RTX 4090 GPU with scalable architecture up to 1T parameters.
moegodot
No description available
YogeshSai27
No description available
Sonal56
No description available
inferless
30.5B MoE language model from the Qwen team, tuned for broad instruction following, reasoning, multilingual tasks, and agentic tool use.
prasadgola
Agent Harness for MoE models
samson623
No description available
omnistrateg-ux
MOEX Trading Agent with ML prediction and margin risk management
deepthibiotune-hash
Moems Agent built with LangChain
renanvicente
Agent
pratikbhande
No description available