Found 16 repositories (showing 16)
M-Taghizadeh
In this repository, based on recent NLP papers, we researched sequential data and time series, developing NLP tasks such as stock price prediction, time series prediction, sentiment analysis from text, and language modeling. The research builds on recurrent neural networks, LSTM networks, and the newer Transformer architecture with its attention mechanism.
Anharo
This repository is a complete Generative AI learning hub, containing my implementation of classic and modern deep learning architectures used in NLP and sequence generation. The goal of this repo is to learn, build, and experiment with models that power today’s AI systems — from simple RNNs to full-scale Transformers.
Ratnesh-181998
4+ years of experience across the full AI/ML lifecycle, from data engineering, data science, and model development to API-driven deployment, cloud infrastructure, and monitoring. Strong hands-on expertise in ML (linear regression, SVM, Random Forest, XGBoost, ARIMA, SARIMA, time series forecasting, RL), DL (CNNs, RNNs, GANs, YOLOv8, Transformers), Computer Vision, NLP, GenAI, and Agentic AI.
TraitYoung
My re-implementation of NLP basics course.
aliakyurek
NLP examples from simple RNN to Transformers
phenammar
Applying my NLP learning from simple RNN to transformers.
mariorizki-lang
A practical, end-to-end guide to TensorFlow 2: from tensors, Keras APIs, and data pipelines to deep learning models (CNNs, RNNs, Transformers) for vision and NLP, plus TensorBoard and TFX for monitoring and production deployment.
AdelNamani
From-Scratch Implementation of different DL NLP algorithms, ranging from RNNs to Transformers. 💻📚📃
Mohannadwaleed
A machine translation project using NLP models such as RNNs and Transformers; it translates from one language to another (here, from English to Arabic).
ThoKimHuynh
Different supervised and unsupervised learning algorithms (including an RNN deep learning network), together with NLP Transformers, are used for text-author classification on the Gutenberg Project corpus.
holaholu
Deep learning projects implemented from scratch. Demonstrates the core architectures in Computer Vision, NLP, and Sequence Models, ranging from fundamental CNNs/RNNs to modern Transformers and Generative models.
drisskhattabi6
This repository contains a collection of hands-on labs and experiments from my Natural Language Processing (NLP) module. Each lab focuses on a specific aspect of NLP, ranging from text preprocessing and rule-based methods to advanced deep learning techniques like RNNs, LSTMs, and Transformers.
s20488
The course covers key NLP topics—from basic methods (tokenization, Word2Vec) to modern models (Transformers, BERT, GPT), their optimization, and applications (NER, chatbots). It explores both theory (RNN, LSTM architectures) and practice (fine-tuning, quantization), concluding with a discussion on NLP trends and future directions.
FarahR01
This repository documents my journey learning Natural Language Processing from the ground up — starting with text preprocessing and statistical models, then moving to embeddings, RNNs, Transformers, and LLMs. My goal is to deeply understand how language models work and grow toward mastering and teaching modern NLP systems.
deepdubey197
Next-word prediction is a machine learning task: forecast the most likely word to follow a given sequence of words in a text or sentence. Such predictive models are typically built with natural language processing (NLP) techniques, specifically recurrent neural networks (RNNs) or Transformer architectures.
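This project's actual model is not shown here, but the task itself can be illustrated with the simplest count-based variant: a bigram model that predicts the most frequent follower of the current word. This is a minimal sketch for illustration only (the function names and toy corpus are my own, not from the repository); an RNN or Transformer replaces these counts with learned probabilities over the whole preceding context.

```python
from collections import Counter, defaultdict

def train_bigram_model(corpus):
    """Count word-pair frequencies: model[w] maps word w to a Counter of its followers."""
    model = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def predict_next(model, word):
    """Return the most frequent word observed after `word`, or None if unseen."""
    followers = model.get(word.lower())
    if not followers:
        return None
    return followers.most_common(1)[0][0]

corpus = [
    "the cat sat on the mat",
    "the cat chased the mouse",
    "the dog sat on the rug",
]
model = train_bigram_model(corpus)
print(predict_next(model, "sat"))  # "on" follows "sat" in every training sentence
```

Neural approaches generalize this idea: instead of a lookup keyed on one preceding word, they condition on the full sequence and assign a probability to every vocabulary word.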
This project aims to compare traditional machine learning methods for tabular data classification, such as ensemble methods, decision trees, and Naive Bayes, with NLP classification methods like Multinomial Naive Bayes, RNNs, and Transformers. We use survey data from the CDC's Behavioral Risk Factor Surveillance System (BRFSS).