Found 21 repositories (showing 21)
An Abstractive Summarization Implementation with Transformer and Pointer-generator
steph1793
Build a summarizer model combining transformers and a pointing mechanism
AssafSinger94
Pointer-generator transformer and plain transformer models for the morphological inflection task, adapted to the SIGMORPHON 2019 shared task.
ChenghaoMou
Transformer with pointer generator for machine translation
An implementation of a pointer-generator network using BERT as the encoder and a transformer decoder.
shengqiangzhang
An Abstractive Summarization (for Datasets in English format) Implementation with Transformer and Pointer-generator
NirmalenduPrakash
Summarizes documents using RNN and Transformer (based on pointer-generator network)
edumunozsala
Examples of models for text summarization using AWS SageMaker and W&B, built with Bi-LSTMs, attention, pointer-generator, Transformers, T5, and the transformers library
nala-cub
Code implementation for the paper "Making a Point: Pointer-Generator Transformers for Disjoint Vocabularies" (https://www.aclweb.org/anthology/2020.aacl-srw.13.pdf).
No description available
BlingBlingss
No description available
AssafSinger94
Transformer and pointer-generator transformer models for morphological inflection. Submission of the NYU-CUBoulder team to the SIGMORPHON 2020 shared task.
Thezone-1
Get to the Point
multi-summarization
Pointer generator transformer model for summarization
No description available
No description available
dbagal
Implementation of summarization using transformers and hybrid pointer generator networks
nkarline
A hybrid code-switched text generator model based on a pointer-generator network and a transformer
nguyenduc13475
A PyTorch framework implementing and comparing advanced abstractive text summarization architectures (Transformer, Pointer-Generator, Neural Intra-Attention) from scratch.
Shaurya-Sethi
My Implementation of an enhanced encoder-decoder Transformer. Combines the original Attention Is All You Need architecture with RAT-SQL-style relation-aware encoding and a pointer-generator decoder to turn natural-language questions into accurate SQL queries.
Rare-Word Retention in Abstractive Summarisation via Hybrid Copy-Aware Transformer. This project explores the integration of pointer–generator and coverage mechanisms into BART-base, combined with span-aware decoding and entity-focused evaluation, to improve factuality and rare-word/entity retention in CNN/DailyMail summarisation.