Found 14 repositories (showing 14)
advitrocks9
Word2vec from scratch in pure NumPy. Skip-Gram with negative sampling, trained on text8.
Word2Vec Skip-Gram with Negative Sampling implemented from scratch in pure NumPy — no ML frameworks.
ammar-cpu
Word2Vec (skip-gram + negative sampling) implemented from scratch in pure NumPy.
amirosla
Word2Vec (skip-gram + negative sampling) implemented from scratch in pure NumPy
antonirozbicki
Word2vec (skip-gram + negative sampling) implemented from scratch in pure NumPy
ahmedyousry27
Word2Vec Skip-gram with Negative Sampling implemented from scratch in pure NumPy.
amirosla
Word2Vec Skip-Gram with Negative Sampling implemented from scratch in pure NumPy
MKorp7
Skip-gram word2vec with negative sampling implemented from scratch in pure NumPy
veneviusxx
Implementation of the Word2Vec algorithm (Skip-gram with Negative Sampling) from scratch using pure NumPy.
Word2Vec Skip-Gram with Negative Sampling implemented from scratch in pure NumPy — no ML frameworks.
chelaifa
Word2Vec (Skip-gram with Negative Sampling) implemented from scratch in pure NumPy — no ML frameworks
ognjenvujovic04
Word2Vec implemented from scratch in NumPy. Train word embeddings on any text corpus with pure Python + NumPy. Features custom negative sampling, learning rate decay, similarity/analogy evaluation, and model checkpointing.
LinkBOTW17
A pure NumPy implementation of Word2Vec (Skip-gram with Negative Sampling). Built from scratch to demonstrate the mathematical foundations of word embeddings, backpropagation, and custom optimization.
arshiyashaik-24
Pure NumPy implementation of Word2Vec (Skip-gram with Negative Sampling). Includes full training loop from scratch: forward pass, loss computation, manual gradient derivation, and SGD updates without ML frameworks. Built to demonstrate deep understanding of embeddings, optimization, and representation learning.
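Every repository above implements the same technique: Skip-Gram with Negative Sampling (SGNS) in pure NumPy. As a reference point for comparing them, here is a minimal generic sketch of that training loop — forward pass, manual gradients, and SGD updates, as the arshiyashaik-24 listing describes. It is not taken from any of the listed repos; the function name and defaults are illustrative, and negatives are drawn uniformly rather than from the unigram^0.75 distribution real implementations use.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_sgns(pairs, vocab_size, dim=16, k=5, lr=0.1, epochs=200, seed=0):
    """Skip-Gram with Negative Sampling over (center, context) index pairs."""
    rng = np.random.default_rng(seed)
    W_in = rng.normal(0.0, 0.1, (vocab_size, dim))   # center-word vectors
    W_out = rng.normal(0.0, 0.1, (vocab_size, dim))  # context-word vectors
    for _ in range(epochs):
        for center, context in pairs:
            # Draw k negatives; uniform sampling is a simplification --
            # word2vec proper samples from the unigram^0.75 distribution.
            neg = rng.integers(0, vocab_size, size=k)
            neg = neg[neg != context]  # never use the true context as a negative
            v = W_in[center]
            u_pos, u_neg = W_out[context], W_out[neg]
            g_pos = sigmoid(v @ u_pos) - 1.0  # d/ds of -log sigma(s), s = u_pos . v
            g_neg = sigmoid(u_neg @ v)        # d/ds of -log sigma(-s) per negative
            # SGD updates; subtract.at accumulates repeated negative indices
            W_out[context] -= lr * g_pos * v
            np.subtract.at(W_out, neg, lr * np.outer(g_neg, v))
            W_in[center] -= lr * (g_pos * u_pos + g_neg @ u_neg)
    return W_in, W_out
```

On a toy pair list such as `[(0, 2), (1, 2), (3, 4)]`, training drives the score `sigmoid(W_in[0] @ W_out[2])` toward 1 for observed pairs and suppresses it for words only ever seen as negatives.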
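The ognjenvujovic04 listing also mentions similarity/analogy evaluation. A sketch of what such an evaluation typically looks like follows — cosine nearest neighbours plus the classic vector-offset analogy method. The helper names are hypothetical, not drawn from that repository.

```python
import numpy as np

def most_similar(word, vocab, W, topn=3):
    """Rank vocabulary words by cosine similarity to `word`'s embedding row."""
    Wn = W / np.linalg.norm(W, axis=1, keepdims=True)  # unit-normalize rows
    sims = Wn @ Wn[vocab.index(word)]
    order = np.argsort(-sims)
    return [(vocab[i], float(sims[i])) for i in order if vocab[i] != word][:topn]

def analogy(a, b, c, vocab, W):
    """Solve a : b :: c : ? via the vector offset b - a + c, excluding inputs."""
    Wn = W / np.linalg.norm(W, axis=1, keepdims=True)
    q = Wn[vocab.index(b)] - Wn[vocab.index(a)] + Wn[vocab.index(c)]
    sims = Wn @ (q / np.linalg.norm(q))
    for i in np.argsort(-sims):
        if vocab[i] not in (a, b, c):
            return vocab[i]
```

With hand-built vectors where a "royalty" and a "gender" direction are separable, `analogy("king", "queen", "man", ...)` recovers `"woman"`, which is the standard sanity check these repos run after training.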