Found 86 repositories (showing 30)
jaymody
An unnecessarily tiny implementation of GPT-2 in NumPy.
bigeagle
No description available
code-cp
No description available
rkinas
A fun, compressed implementation of Karpathy's GPT-2
NPN
J port of "An unnecessarily tiny implementation of GPT-2 in NumPy."
Alan-Shin
No description available
ehijano
No description available
curry-bean
No description available
QuwsarOhi
A simple, small GPT model for educational purposes that can be trained on a CPU
banjobyster
Try out here. Initially, you need to download the model weights. Once downloaded, the weights can be loaded from local storage on next visits.
AkmOleksandr
No description available
nielsuit227
A fun little play with transformers and text generation
alight659
A pico-level implementation of a Generative Pre-Trained Transformer (GPT) from scratch.
tobiasosborne
From-scratch GPT in pure Julia. Every gradient by hand. No AD, no GPU, no magic.
Satyam1Gupta
pico-GPT: a pico version of a large language model like GPT
persquare
Rewrite of nanoGPT
Zjm1900
60 lines of code for a simple GPT
EthanCornell
A hands-on fork of NanoGPT with FlashAttention-2 CUDA kernels, INT8/INT4 GPTQ quantization, paged KV-cache reuse, and continuous batching, turning a tiny Shakespeare model into a full-speed GPU LLM inference demo.
Evan704
Smaller than nano! :)
Michae-zHOU
Pico-GPT is a minimal PyTorch-based GPT implementation built for learning and experimentation. It features a full transformer, training tools, and a simple chatbot UI.
lineick
Mini GPT-from-scratch implementation (from Andrej Karpathy's Zero to Hero YouTube series)
ChidambaraRaju
A GPT-style decoder-only Small Language Model (~49M parameters) built from scratch using PyTorch. This project implements a clean, scalable, and research-grade training pipeline for pretraining a transformer language model from first principles.
harsh15163
A small scale version of GPT to test out various ideas related to its analog/digital implementation
torphix
A pico-sized, GPT-based chatbot, first trained unsupervised on a large text corpus, then fine-tuned on a conversational dataset in a supervised manner
Michaelockz
A reproduction of the GPT-2 model using PyTorch.
rsnemmen
Minimal chatbot from scratch
Ragnarok540
No description available
theiyobosa
A small experiment in training a language model from scratch on a single laptop.
nnethercott
A smaller implementation of Karpathy's nanoGPT with modern features
taras-sereda
No description available