Found 46 repositories (showing 30)
chonieny
No description available
codingyesiamalway
No description available
PriyankaGoenka
No description available
EmmaRYoung
Implement VGG-16 network architecture
itsYaldaKarimi
No description available
YonatanEldan
Machine-Learning-HW5
DoidoYo
Machine Learning HW5
cmc17300
No description available
amgenene
No description available
127rifat
No description available
Khyre-Hill
No description available
Cassidy-Smith12
No description available
SimonGeorge8
No description available
DaanyaalBawla
q1andq2
lepingwang928
No description available
edalrami
Practice with regularization and neural networks
Ekta1489
No description available
tjace
No description available
ZackLa
No description available
vidhaidiamond
Machine Learning HW-5
Donovan-Rasamoelison
No description available
krinya
5th homework for ML class
Endy55
No description available
PraneethTeja15
This project implements scaled dot-product attention in NumPy and a simplified Transformer encoder block in PyTorch, including multi-head self-attention, feed-forward layers, residual connections, and layer normalization. The code verifies correct dimensions for typical input batches.
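The description above covers the standard scaled dot-product attention computation, softmax(QKᵀ/√d_k)·V. A minimal NumPy sketch of that computation (not this repository's actual code; function and variable names here are illustrative), including the kind of dimension check the description mentions:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, applied over the last two axes of batched inputs."""
    d_k = Q.shape[-1]
    # Scores: (..., seq_q, seq_k); scale by sqrt(d_k) to keep logits well-conditioned
    scores = Q @ np.swapaxes(K, -1, -2) / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)   # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ V                              # (..., seq_q, d_v)

# Dimension check for a typical input batch: (batch=2, seq=4, d_model=8)
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4, 8))
K = rng.standard_normal((2, 4, 8))
V = rng.standard_normal((2, 4, 8))
out = scaled_dot_product_attention(Q, K, V)
assert out.shape == (2, 4, 8)
```

Multi-head attention, as described, would split `d_model` into several such attention computations run in parallel and concatenate the results before a final linear projection.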
SaeedSaravani
Machine learning course HW5
jaminahabiba
No description available
shaozhed
No description available
korivernon
No description available