Found 81 repositories (showing 30)
jaymody
An unnecessarily tiny implementation of GPT-2 in NumPy.
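(The core of such a NumPy GPT-2 is only a handful of operations. As a minimal sketch of the style, not picoGPT's exact code, single-head causal self-attention looks roughly like this:

    import numpy as np

    def softmax(x):
        # numerically stable softmax over the last axis
        e = np.exp(x - x.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    def causal_self_attention(q, k, v):
        # q, k, v: (seq_len, head_dim); -1e10 masks out future positions
        mask = (1 - np.tri(q.shape[0])) * -1e10
        scores = q @ k.T / np.sqrt(q.shape[-1]) + mask
        return softmax(scores) @ v
)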
Kuberwastaken
GPT in a QR Code; the most atomic way to train and run inference on a GPT in pure, dependency-free JS/Python.
francoisfleuret
Minimal GPT (~350 lines with a simple task to test it)
jaymody
Like picoGPT but for BERT.
bigeagle
No description available
enricozb
picoGPT in Rust
NPN
J port of "An unnecessarily tiny implementation of GPT-2 in NumPy."
ehijano
No description available
AkmOleksandr
No description available
alight659
A pico-level implementation of a Generative Pre-trained Transformer (GPT) from scratch.
banjobyster
Try it out here. On first use you need to download the model weights; once downloaded, they can be loaded from local storage on subsequent visits.
nielsuit227
A fun little play with transformers and text generation
QuwsarOhi
A simple, small GPT model for educational purposes that can be trained on a CPU
tobiasosborne
From-scratch GPT in pure Julia. Every gradient by hand. No AD, no GPU, no magic.
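("Every gradient by hand" here means deriving the backward pass analytically instead of relying on autodiff. A minimal sketch of the idea, written in NumPy rather than Julia for consistency with the snippet above, with illustrative names:

    import numpy as np

    # Forward: y = x @ W; loss = mean((y - t)**2)
    # Backward, derived by hand (no autodiff):
    #   dL/dy = 2 * (y - t) / y.size
    #   dL/dW = x.T @ dL/dy
    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 3))   # batch of inputs
    W = rng.normal(size=(3, 2))   # weights
    t = rng.normal(size=(4, 2))   # targets

    y = x @ W
    loss = ((y - t) ** 2).mean()

    dy = 2 * (y - t) / y.size     # hand-derived gradient w.r.t. y
    dW = x.T @ dy                 # hand-derived gradient w.r.t. W
    W -= 0.1 * dW                 # one SGD step
)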
VeryFatBoy
No description available
ptarau
Small-scale GPT-like full transformer stack, runnable on any PC
Evan704
Smaller than nano! :)
EthanCornell
A hands-on fork of NanoGPT with FlashAttention-2 CUDA kernels, INT8/INT4 GPTQ quantization, paged KV-cache reuse, and continuous batching, turning a tiny Shakespeare model into a full-speed GPU LLM inference demo.
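(Paged KV-cache reuse builds on a simple base idea: during autoregressive decoding, keys and values for past tokens are computed once and cached rather than recomputed at every step. A minimal unpaged, single-head sketch in NumPy; the function and names are illustrative, not this fork's API:

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    def decode_step(q, new_k, new_v, k_cache, v_cache):
        # Append this token's key/value to the cache, then attend over
        # all cached positions; past K/V are never recomputed.
        k_cache = np.concatenate([k_cache, new_k[None]], axis=0)
        v_cache = np.concatenate([v_cache, new_v[None]], axis=0)
        scores = q[None] @ k_cache.T / np.sqrt(q.shape[-1])  # (1, t)
        return (softmax(scores) @ v_cache)[0], k_cache, v_cache

Starting from empty (0, head_dim) caches, each call reuses all previously cached keys and values.)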
persquare
Rewrite of nanoGPT
Zjm1900
60 lines of code for a simple GPT
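(Generation in these tiny GPTs usually reduces to one greedy loop; a sketch, where forward is a stand-in for any GPT-style forward pass returning per-position logits as a NumPy array:

    def generate(forward, prompt_ids, n_tokens):
        # Greedy decoding: run the model, append the argmax token, repeat.
        ids = list(prompt_ids)
        for _ in range(n_tokens):
            logits = forward(ids)  # shape: (len(ids), vocab_size)
            ids.append(int(logits[-1].argmax()))
        return ids
)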
bbpp222006
No description available
lukasmoellerch
A tiny implementation of BERT in NumPy, inspired by jaymody/picoGPT
mundahl
PicoGPT and others
nazimboudeffa
No description available
Masony817
No description available
arnavg115
GPT-2 in NumPy. A worse version of the original picoGPT.
chenlujiu
A tiny LLM for learning the training process
KumaloWilson
GPT in a QR Code; the most atomic way to train and run inference on a GPT in pure, dependency-free JS/Python.
truebest
No description available
Killua7362
No description available