Found 183 repositories (showing 30)
jasonacox
Setup and run a local LLM and Chatbot using consumer grade hardware.
zozoheir
Develop, evaluate and monitor LLM applications at scale
weiserlab
Bringing Language Models to the Most Resource Constrained Devices
Ashfaqbs
A collection of tiny LLMs with use cases
Mikyx-1
Build a tiny LLM from scratch.
brianestadimas
No description available
xi029
Implements a small LLM, including LoRA fine-tuning, web deployment, a MoE architecture inspired by DeepSeek, latent attention, and simple RAG, built primarily to learn the end-to-end LLM workflow.
CoraCote
Setup and run a local LLM and Chatbot using consumer grade hardware.
shaleigang
Train a tiny language model using pure C++ from scratch without any third-party libraries. CUDA is supported.
YikunHan42
No description available
rodgersmag
TinyLLM is a research project focused on developing and training compact, specialized language models using publicly available datasets from platforms like Hugging Face. The project aims to explore diverse architectures and build expertise toward creating a high-performance coding model rivaling the capabilities of Claude Code.
BhashkarFulara369
Tiny devices, big ideas, from bits to brilliance
overloadedHenry
A repository for LLM Learners.
mgruszkiewicz
Simple GUI for the OpenAI API for brick/feature mobile phones, compatible with J2ME
Wings236
A small LLM from scratch
elcruzo
LLM inference engine
inareshmatta
TinyLLM with Attention Residue Architecture - PyTorch implementation of Attention Residuals (AttnRes) — arXiv:2603.15031 by Kimi Team. Replaces fixed residual connections with learned depth-wise attention. Full AttnRes + Block AttnRes built from scratch with side-by-side training comparison.
theyashwanthsai
This repo has all the code (well organized) i wrote while reading the book "Build a Large Language Model from Scratch"
vikramlingam
TinyLLM project is a lightweight, privacy-focused AI inference system designed to run locally on standard laptops without a dedicated GPU. It utilizes Microsoft's ONNX Runtime (`onnxruntime-genai`) and the `Phi-3.5-mini-instruct` model optimized for CPU performance.
Rehanasharmin
The tiniest LLM ever created
Iro96
An LLM with 10 million parameters, designed for research purposes.
ydah
A minimal LLM implemented in pure Ruby
newfacade
easy to understand
sreekarvamsi
Edge LLM for In-Vehicle Deployment - Optimized small language model for automotive conversational AI
Akash-source-web
No description available
Pranav082001
Code Repository for Uds-Pretraining LLM Software project
Hetens
No description available
zcxGGmu
A simple inference engine written in Rust
aniketnighot
Character-level Transformer LLM built from scratch in PyTorch.
No description available