Found 4 repositories (showing 4)
Bensmail-anis
We build a mini Generative Pre-trained Transformer (GPT) inspired by the "Attention Is All You Need" paper (11 M parameters).
brooksideas
Experimental project building a language model from scratch, inspired by Generative Pre-trained Transformers (GPT).
Pranavh-2004
Exploring transformers by building a GPT model from scratch using nanoGPT, inspired by Andrej Karpathy’s tutorial.
harshu0117
A PyTorch implementation of the Transformer architecture, inspired by the seminal paper "Attention Is All You Need" and OpenAI's GPT models. This repository contains code for building, training, and evaluating transformer-based language models from scratch.
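All of these repositories center on the same core operation from "Attention Is All You Need": scaled dot-product attention, softmax(QKᵀ/√d_k)·V. As a rough orientation to what they implement, here is a minimal stdlib-only sketch of that formula (the function names and toy inputs are illustrative, not taken from any of the listed repos, which use PyTorch with batching, masking, and multiple heads):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are lists of vectors (lists of floats). Returns one output
    vector per query. Illustrative sketch: no batching, masking, or heads.
    """
    d_k = len(K[0])
    outputs = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        # Weighted average of the value vectors.
        outputs.append([sum(w * v[j] for w, v in zip(weights, V))
                        for j in range(len(V[0]))])
    return outputs

# Toy example: 2 queries attending over 3 key/value pairs, d_k = 2.
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
result = attention(Q, K, V)
print(result)
```

Because the attention weights for each query sum to 1, each output row is a convex combination of the value vectors; the repos above wrap this operation in learned projection matrices and stack it into full transformer blocks.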