Found 22 repositories (showing 22)
chakki-works
A Python framework for sequence labeling evaluation (named-entity recognition, POS tagging, etc.)
tareknaous
A package to evaluate Seq2Seq models
LuisMiSanVe
Python script to train a NER AI Model
rbedyakin
Implementation of seqeval in torchmetrics
miftahurrrizki
Custom Named Entity Recognition (NER) with BiLSTM-CRF and spaCy
ishwari215
Modular PyTorch pipeline for biomedical NER using the PLOD-CW-25 dataset. Implements FastText-based RNN and BiLSTM, CRF with word/char embeddings, and fine-tuned RoBERTa with optimizer comparisons. Features BIO tagging, subword OOV handling, early stopping, and seqeval evaluation for scientific abbreviation and long-form entity recognition.
Pranam2002
No description available
conda-forge
A conda-smithy repository for seqeval.
yuxiao-ww
No description available
mbcladwell
Evaluate sequencing run quality
chengchingwen
No description available
haukelicht
Python implementation of evaluation of sequence annotations (e.g., NER) that is more forgiving than the classic strict `seqeval` metric
DavidBulger
Branch of Biopython including AbiTracer and SeqEval programs
anjanrp
Pretrained transformer NER evaluation on BC5CDR (Chemical/Disease) with seqeval metrics + examples
Using the pycrfsuite, sklearn, nltk, and seqeval libraries for named entity recognition on the Twitter-NER dataset.
Ajaydhiman07
Fine-tuning XLM-RoBERTa for Hindi Named Entity Recognition (NER) on the HiNER dataset using Hugging Face Transformers and Seqeval for evaluation.
AadityaArunSingh
This repo explores token classification for abbreviation and long-form detection using RoBERTa. We evaluate the impact of adding 50% of the PLODv2-filtered dataset, achieving improved F1 and recall. The repo includes methodology, evaluation using seqeval, and confusion matrix analysis.
mohaaliothman
Fine-tuned AraBERT (aubmindlab/bert-base-arabertv2) for Arabic token classification on an aspect-tagged dataset loaded from Excel. Includes label encoding, tokenization with label alignment for subwords, training with Hugging Face Trainer, and evaluation using seqeval (precision/recall/F1/accuracy).
puspah-ghimire
This exercise demonstrates how to fine-tune the bert-base-cased model on the CoNLL-2003 dataset for sequence labeling (NER). It includes data preparation, model training with Hugging Face’s transformers, evaluation with seqeval, and visualization of entity predictions on custom text.
lutfiozark
RoBERTa-based CTI NER training, evaluation, and Gradio demo on the APTNER dataset.
janzna
Pretrained PyTorch model for Named Entity Recognition with XLM-RoBERTa. The model was trained as described in https://github.com/mczuk/xlm-roberta-ner#training-and-evaluating, using fairseq 0.10.2, numpy 1.21.5, pytorch-transformers 1.2.0, and seqeval 1.2.2.
Fatemerjn
End-to-end NLP experiments for movie-genre prediction from plot summaries in English & Persian—covering document-level baselines, BERT fine-tuning, and NER+LSTM-CRF pipelines, with datasets and evaluation code.
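Nearly all of the repositories above use seqeval for entity-level (not token-level) scoring: predictions only count when both the entity type and its exact span match. A minimal standard-library sketch of that strict matching, assuming IOB2-style tags (function names here are illustrative, not seqeval's API):

```python
def extract_entities(tags):
    """Collect (type, start, end) spans from an IOB2-tagged sequence."""
    entities, start, etype = [], None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if etype is not None:          # close the previous entity
                entities.append((etype, start, i))
            etype, start = tag[2:], i      # open a new one
        elif tag == "O" or not tag.startswith("I-") or tag[2:] != etype:
            if etype is not None:
                entities.append((etype, start, i))
            etype, start = None, None
    if etype is not None:                  # entity running to end of sequence
        entities.append((etype, start, len(tags)))
    return entities

def entity_f1(y_true, y_pred):
    """Micro-averaged entity-level F1 with strict (type + span) matching."""
    true_set = {(s, e) for s, seq in enumerate(y_true) for e in extract_entities(seq)}
    pred_set = {(s, e) for s, seq in enumerate(y_pred) for e in extract_entities(seq)}
    tp = len(true_set & pred_set)
    if tp == 0:
        return 0.0
    p, r = tp / len(pred_set), tp / len(true_set)
    return 2 * p * r / (p + r)

# Example: one correct PER span, one correct LOC span → F1 = 1.0
gold = [["B-PER", "I-PER", "O", "B-LOC"]]
pred = [["B-PER", "I-PER", "O", "B-LOC"]]
print(entity_f1(gold, pred))  # 1.0
```

This is why a prediction that gets only half of a multi-token entity scores zero for that entity under strict matching, which is the behavior the "more forgiving" alternative listed above (haukelicht) relaxes.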