Named entity recognition with BERT
Esuli A.
2025
Abstract
Two scripts for training and testing BERT-like pretrained models on Named Entity Recognition (NER) tasks. Train (fine-tune) a BERT-based NER model (from Hugging Face or a model stored locally) on a dataset in CoNLL-like format. Use the trained model to predict NER tags on test data, producing an evaluation report, or on new data, producing a tagged version of the data. The scripts handle sequences longer than the maximum length supported by the model by splitting them into multiple chunks with a context overlap.

Files in this product:
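The overlapping-chunk strategy mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the scripts' actual code: the function name `chunk_tokens` and the default values for `max_len` and `overlap` are hypothetical (BERT models typically cap input at 512 tokens).

```python
def chunk_tokens(tokens, max_len=512, overlap=64):
    """Split a token sequence into chunks of at most max_len tokens,
    with `overlap` tokens of shared context between consecutive chunks.

    Illustrative sketch only; the released scripts may differ.
    """
    if len(tokens) <= max_len:
        return [tokens]
    step = max_len - overlap  # advance by this many tokens per chunk
    chunks = []
    for start in range(0, len(tokens), step):
        chunks.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break  # last chunk reaches the end of the sequence
    return chunks
```

After tagging, the predictions for the overlapping regions must be merged back (e.g. keeping, for each token, the prediction from the chunk where it sits farther from a boundary), so that every token of the original sequence receives exactly one tag.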
File | Description | Type | Access | License | Size | Format
---|---|---|---|---|---|---
bert_ner_README.pdf | NER with BERT | Other attached material | Open access | Creative Commons | 76.92 kB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.