Found 574 repositories (showing 30)
wasiahmad
Official implementation of our work, A Transformer-based Approach for Source Code Summarization [ACL 2020].
pszemraj
CLI & Python API to easily summarize text-based files with transformers
quanghuy0497
A summary of Transformer-based architectures for CV tasks, including image classification, object detection, segmentation, and few-shot learning. Updated frequently.
Ajmeer-007
An AI-powered learning assistant that summarizes educational documents, answers learning queries, and generates quizzes to assess understanding. Built using NLP and transformer-based models to enable faster comprehension and active learning.
This project extracts audio from YouTube videos, converts speech to text using OpenAI Whisper, and summarizes the content using transformer-based NLP models to save time and improve content accessibility.
tm4roon
An implementation of transformer-based language model for sentence rewriting tasks such as summarization, simplification, and grammatical error correction.
yz1019117968
Source Code for "A multi-modal transformer-based code summarization approach for smart contracts"
shmpanski
Abstractive summarization model based on pure transformer architecture
domyounglee
An optimized Transformer-based abstractive summarization model built with TensorFlow
Skyline-9
Multi-modal transformer approach for natural language query based joint video summarization and highlight detection
ngoquanghuy99
An abstractive text summarization model based on Transformer Decoder (GPT-2) using Google/Trax.
HopLee6
PyTorch implementation of "Video Joint Modelling Based on Hierarchical Transformer for Co-summarization"
NirmalenduPrakash
Summarizes documents using RNN and Transformer models (based on a pointer-generator network)
In this paper, we propose A3SUT, a sequential hybrid transformer-based model for summarizing Arabic articles. The model combines two summarization approaches. The first is extractive: it selects the most important sentences from the article to form the summary, using transformer models such as AraBERT. The second is abstractive: like a human summarizer, it can paraphrase the original text using different words with the same meaning, which we implement with an Arabic pre-trained mT5 model. We apply the two approaches sequentially, feeding the output of the extractive module into the abstractive module; this brings the quality of the summary closer to a human-written one. We tested the model on the ESAC dataset. The extractive summary, evaluated with the ROUGE score, achieved a precision of 0.5348, a recall of 0.5515, and an F1 score of 0.4932; the abstractive summary was evaluated by user satisfaction. To make the summary more understandable, we added metadata generation ("data about data") and text classification. Metadata generation supports summary identification and organization, providing essential contextual details since not all summaries are self-describing, while classification determines the summary's topic before reading. Using a Support Vector Machine (SVM) trained on the NADA corpus, we achieved a classification accuracy of 97.5%.
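The extractive stage described above keeps the highest-scoring sentences from the article. As an illustration only, a minimal frequency-based sentence scorer (a plain-Python stand-in, not the AraBERT model the paper actually uses) might look like:

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=2):
    """Score each sentence by the corpus frequency of its words and
    keep the top-n sentences in their original order.
    Illustrative stand-in for a transformer-based extractive stage."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'\w+', text.lower()))
    # Rank sentence indices by total word-frequency score, highest first
    ranked = sorted(
        range(len(sentences)),
        key=lambda i: -sum(freq[w] for w in re.findall(r'\w+', sentences[i].lower())),
    )
    keep = sorted(ranked[:n_sentences])  # restore document order
    return ' '.join(sentences[i] for i in keep)
```

In the paper's pipeline, the output of this extractive step would then be passed to the abstractive (mT5) module for paraphrasing.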
MohanKrishnaGR
This repository contains the implementation of a Transformer-based model for abstractive text summarization and a rule-based approach for extractive text summarization.
dotrann1412
Small text-summarization application using transformer-based model.
Wendy-Xiao
The official code for the NAACL paper: Predicting Discourse Trees from pre-trained Transformer-based Neural Summarizers
LorenzoMinto
An Extractive-Abstractive Summarization Framework with a Sentence Embeddings Twist. Based on GPT-2 transformer fine-tuned on CNN/DailyMail dataset
Video summarization using keyframe extraction and visual attention-based transformer
Jackthebighead
Transformer based abstractive summarization models: mT5, T5 Pegasus, GPT-2 are implemented for Chinese text summarization.
SwagatSBhuyan
An LSTM-based textual entailment system built on a HuggingFace Transformer-based abstractive text summarizer, with a gold-label evaluation track that includes metrics such as BLEU and ROUGE-n scores.
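Several of these projects evaluate summaries with ROUGE-n. A minimal sketch of ROUGE-N as n-gram overlap between a candidate and a reference summary (an illustrative stand-in, not the evaluation code these repositories actually use):

```python
from collections import Counter

def rouge_n(candidate, reference, n=1):
    """Return (precision, recall, F1) for ROUGE-N,
    computed as clipped n-gram overlap on whitespace tokens."""
    def ngrams(text, n):
        toks = text.lower().split()
        return Counter(tuple(toks[i:i + n]) for i in range(len(toks) - n + 1))
    cand, ref = ngrams(candidate, n), ngrams(reference, n)
    overlap = sum((cand & ref).values())  # Counter & takes per-ngram minimum counts
    p = overlap / max(sum(cand.values()), 1)
    r = overlap / max(sum(ref.values()), 1)
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1
```

For example, `rouge_n("the cat sat", "the cat sat on the mat")` gives perfect unigram precision but only 0.5 recall, since the candidate covers half of the reference's tokens.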
porcelainruler
A Python-based text summarizer that extracts summaries from texts and long paragraphs. Includes a variety of approaches, from plain LSTM architectures and LSTM with attention through Transformers such as BERT and its variants. Also includes a Python Flask web app providing a clear, user-friendly interface for summarization.
Transformer based Bangla Abstractive Text Summarization System
maanvithag
An LLM-based chatbot trained on philosophical texts, built with OpenAI GPT models, LangChain tokenizers, HuggingFace Transformers, and Meta's BART summarization model, wrapped in a Next.js web app hosted entirely on AWS
shuvodoes
A web app built with Streamlit and the T5 Transformer model for real-time text summarization.
maxymkuz
Transformer-based large-scale news summarizer with an interactive Web UI
saloni-1919
AI-powered biomedical text summarization using extractive NLP, biomedical entity recognition, and transformer-based abstractive summarization.
Hjxin02AIsharing-Wust
Our application of the Transformer network architecture to the text summarization task.
knightRehman
AI Study Buddy is a Python-based tool that helps students learn smarter by summarizing PDFs, answering questions, and generating flashcards using NLP and Transformer models.
farazkh80
A search-and-summarize engine combining a custom-built search engine for news keyword queries with a pre-trained Transformer-based T5 model, fine-tuned on news articles and summaries to achieve state-of-the-art results on text summarization