Found 32 repositories (showing 30)
AvulaBhumika
An advanced and interactive text summarization application built using Hugging Face’s `BART` transformer. This project showcases the power of abstractive summarization using state-of-the-art NLP models and offers both a Gradio web interface and command-line usage.
🚀 Abstractive Text Summarization using Transformer-based NLP models. This project leverages deep learning and attention mechanisms to generate human-like summaries from long text inputs
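As a hedged sketch of what such BART-based summarization pipelines typically look like (the chunk size and generation lengths below are illustrative assumptions, not taken from any one repository):

```python
def chunk_words(text, max_words=400):
    """Split long input into word-bounded chunks that fit a model's context window."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

def summarize_long(text):
    """Summarize each chunk with BART and join the pieces.
    Calling this downloads the facebook/bart-large-cnn checkpoint."""
    from transformers import pipeline  # deferred import: heavy dependency
    summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
    return " ".join(
        summarizer(chunk, max_length=130, min_length=30)[0]["summary_text"]
        for chunk in chunk_words(text)
    )
```

Chunking first keeps each call under the model's input limit; joining the per-chunk summaries is a common, if lossy, way to handle documents longer than the context window.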
JanhaviChede2204
Emotion-aware abstractive text summarization using transformer-based NLP models.
Saillut5
An NLP project for abstractive and extractive text summarization using transformer models.
Thowed5
An NLP project for abstractive and extractive text summarization using advanced transformer models.
Scontel
An advanced NLP project focusing on abstractive and extractive text summarization techniques using transformer models.
Saillut5
An advanced NLP project for abstractive and extractive text summarization using transformer models and deep learning.
Himanshu-1324
NLP-based text summarization using extractive and abstractive methods. Extractive models select key sentences, while abstractive models generate new text using transformers. Extractive output is more factual; abstractive output is more fluent.
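The extractive idea described above can be sketched without any model at all: score each sentence by the corpus frequency of its words and keep the top-scoring ones (a toy illustration, not the method of any listed repository):

```python
from collections import Counter

def extractive_summary(text, k=1):
    """Score sentences by summed word frequency and return the top-k,
    kept in their original order."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    freq = Counter(w.lower() for s in sentences for w in s.split())
    ranked = sorted(range(len(sentences)),
                    key=lambda i: sum(freq[w.lower()] for w in sentences[i].split()),
                    reverse=True)
    top = sorted(ranked[:k])  # restore document order
    return ". ".join(sentences[i] for i in top) + "."
```

Real extractive systems (spaCy-, LSA-, or TextRank-based) use richer sentence representations, but the select-and-copy structure is the same, which is why extractive output stays factual: every emitted sentence already existed in the source.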
DreadPirate5
Abstractive text summarization of YouTube transcripts using transformer-based NLP models. Focuses on attention mechanisms, long-context modeling, and practical deployment considerations
isanthoshrajan
Text Summarization Model using LLMs (Large Language Models) to generate concise summaries from long text. Implements transformer-based models for abstractive and extractive summarization, with Python and NLP libraries for preprocessing, model training, and evaluation.
Developed a Text Summarizer using Python, NLP, and Streamlit that generates summaries from text, PDFs, and YouTube transcripts. Implemented the BART transformer model for abstractive summarization and built a user-friendly web interface for real-time summarization
rudra2955
This is a Python-based Natural Language Processing (NLP) tool designed to perform text summarization using both extractive and abstractive methods. It leverages spaCy for extractive summarization and pretrained models from Hugging Face Transformers (like T5 or BART) for high-quality abstractive summarization.
Priyanshu312003
This is a Python-based Natural Language Processing (NLP) tool designed to perform text summarization using both extractive and abstractive methods. It leverages spaCy for extractive summarization and pretrained models from Hugging Face Transformers (like T5 or BART) for high-quality abstractive summarization.
rudra2954
This is a Python-based Natural Language Processing (NLP) tool designed to perform text summarization using both extractive and abstractive methods. It leverages spaCy for extractive summarization and pretrained models from Hugging Face Transformers (like T5 or BART) for high-quality abstractive summarization.
A document analysis system combining OCR with extractive and abstractive text summarization. Converts scanned images into text using OCR, then summarizes content using NLP models (spaCy, Transformers). Built with Python, delivering concise insights from lengthy documents.
1shikapandey
Abstractive text summarization using Facebook’s BART transformer model. Tokenizes long text, generates concise summaries with beam search, and evaluates performance using ROUGE metrics. Ideal for NLP, AI research, and real-world text compression tasks.
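The ROUGE-1 metric mentioned above reduces to clipped unigram overlap between a generated summary and a reference; a minimal sketch (real projects typically use the `rouge-score` or `evaluate` packages rather than hand-rolling this):

```python
from collections import Counter

def rouge1_f1(candidate, reference):
    """ROUGE-1 F1: clipped unigram overlap between candidate and reference."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum(min(cand[w], ref[w]) for w in cand)  # clip repeats
    if not overlap:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```

Clipping (`min(cand[w], ref[w])`) prevents a candidate from inflating its score by repeating a reference word more times than the reference contains it.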
YogeshLalchandani
This model uses Google's pegasus-xsum for abstractive text summarization. Abstractive summarization generates a summary from a text's main ideas rather than copying its most salient sentences verbatim. google/pegasus-xsum is a Natural Language Processing (NLP) model implemented in the Transformers library, generally used from Python for abstractive text summarization. https://huggingface.co/google/pegasus-xsum
Geethika0606
AI-Based Meeting Summarizer: a web-based application that converts user-uploaded meeting audio into text using speech-to-text models and generates concise summaries using transformer-based NLP models. Implements an end-to-end pipeline integrating audio processing and abstractive summarization.
mshashank559
This is a Data Science project that performs abstractive text summarization using Hugging Face’s transformers library. It uses pre-trained NLP models to generate concise summaries from user-provided input.
saahilk1511
The project focuses on text summarization using NLP techniques. It employs models like transformers (BERT, GPT) for abstractive summarization and uses Python, Jupyter, NLTK, and spaCy for development. Evaluation is done using ROUGE metrics, and tools like Hugging Face's Transformers are used for model implementation and fine-tuning
medabdellahihabib
This project implements abstractive text summarization using the BART Transformer model from Hugging Face. We fine-tune and evaluate BART on sample datasets to automatically generate concise and coherent summaries from long documents, showcasing the power of transformer-based NLP for real-world summarization tasks.
monal26621
Abstractive Text Summarization project using Hugging Face Transformers with the BART (facebook/bart-large-cnn) model. Generates concise summaries from long text inputs and evaluates performance using ROUGE scores. Demonstrates end-to-end NLP workflow including preprocessing, model inference, and quality evaluation.
dhk-12
An advanced AI-powered NLP toolkit featuring Named Entity Recognition (NER) with spaCy and dual-mode Text Summarization. It uniquely combines traditional extractive methods (LSA, TextRank) with modern abstractive AI using the BART Transformer model.
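The TextRank method named above runs PageRank over a sentence-similarity graph; a minimal, dependency-free sketch (the overlap-based similarity function is a simplified assumption, not this toolkit's implementation):

```python
import math

def overlap_similarity(a, b):
    # TextRank-style similarity: shared words, damped by sentence lengths.
    wa, wb = set(a.lower().split()), set(b.lower().split())
    shared = len(wa & wb)
    if not shared:
        return 0.0
    return shared / (math.log(len(wa) + 1) + math.log(len(wb) + 1))

def textrank_scores(sentences, d=0.85, iterations=50):
    """Score sentences by iterating PageRank over their similarity graph."""
    n = len(sentences)
    sim = [[overlap_similarity(sentences[i], sentences[j]) if i != j else 0.0
            for j in range(n)] for i in range(n)]
    scores = [1.0] * n
    for _ in range(iterations):
        scores = [
            (1 - d) + d * sum(
                sim[j][i] / sum(sim[j]) * scores[j]
                for j in range(n) if sim[j][i] > 0
            )
            for i in range(n)
        ]
    return scores
```

Sentences that share vocabulary with many others accumulate rank; the summary keeps the highest-scoring ones. An isolated sentence bottoms out at the damping floor `1 - d`.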
Nahilel
📝 A lightweight NLP project focused on extreme text summarization using Transformer-based models. It generates one-line, headline-style summaries from longer texts and includes both an interactive Gradio interface and command-line support. Ideal for fast, abstractive summarization tasks like article titles or content previews.
Me-Vish
Document Summarization & Q/A System using BERT. This project uses transformer-based NLP models to automatically summarize documents and answer user questions. BART is used for abstractive summarization and BERT for context-aware question answering, helping users quickly extract key information from large texts.
jhilmitasri
This project implements and compares transformer models (BERT and T5) for abstractive and extractive text summarization using the WikiHow dataset. The goal is to condense long-form articles into meaningful headlines while evaluating various transformer-based and traditional NLP methods.
sha-md
Transformer-based medical text summarization using the PubMed dataset from Hugging Face. Fine-tuned a T5-small model to generate concise, factual summaries of biomedical research abstracts. This project demonstrates the application of NLP Transformers to summarizing long-form scientific text.
abdulghaffaransari
Fine-tunes Facebook's BART-large model on the SAMSum dataset for abstractive text summarization, using Hugging Face Transformers, Evaluate, and Weights & Biases for training and experiment tracking. It demonstrates advanced NLP techniques with GPU acceleration for efficient processing.
Bishwarup-Das
Hey folks, this is a text summarization (NLP) model built with a text-to-text Transformer (deep learning). It takes an abstractive approach to summarization. Because the modules used here are deep learning modules, the input size is limited to 512 tokens. Hope you like it.
An IST 664 NLP course project at Syracuse University on abstractive text summarization using multi-head attention Transformers. It aimed to create a top-performing model for generating concise summaries, surpassing a baseline ROUGE score of 0.35. This contribution enhances NLP by improving information extraction and comprehension.