arXiv Analytics

arXiv:1906.09978 [cs.CL]

Multilingual Named Entity Recognition Using Pretrained Embeddings, Attention Mechanism and NCRF

Anton A. Emelyanov, Ekaterina Artemova

Published 2019-06-21 (Version 1)

In this paper we tackle the multilingual named entity recognition task. We use the BERT language model as an embedder, with a bidirectional recurrent network, attention, and NCRF on top. We apply multilingual BERT only as an embedder, without any fine-tuning. We test our model on the dataset of the BSNLP shared task, which consists of texts in the Bulgarian, Czech, Polish and Russian languages.
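The described stack can be sketched in PyTorch. This is a minimal illustration, not the authors' code: frozen embeddings stand in for multilingual BERT (used here as an embedder only, with no fine-tuning), followed by a bidirectional LSTM, a self-attention layer, and a linear emission layer where the paper's NCRF decoder would sit. The hidden size, head count, and tag count are illustrative assumptions.

```python
import torch
import torch.nn as nn

class NERTagger(nn.Module):
    """Sketch of BERT-embeddings -> BiLSTM -> attention -> emissions.

    In the paper an NCRF layer decodes the emission scores; here a plain
    linear layer stands in for it to keep the sketch self-contained.
    """

    def __init__(self, emb_dim=768, hidden=256, num_tags=9):
        super().__init__()
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                              bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden, num_heads=4,
                                          batch_first=True)
        self.emissions = nn.Linear(2 * hidden, num_tags)

    def forward(self, embeddings):
        # embeddings: (batch, seq_len, emb_dim), e.g. frozen BERT outputs
        h, _ = self.bilstm(embeddings)
        a, _ = self.attn(h, h, h)          # self-attention over the sequence
        return self.emissions(a)           # per-token tag scores

# Random tensors stand in for frozen multilingual BERT embeddings.
x = torch.randn(2, 16, 768)                # (batch, seq_len, emb_dim)
scores = NERTagger()(x)                    # (batch, seq_len, num_tags)
```

In the full model the emission scores would be passed to an NCRF layer for structured decoding of the tag sequence.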

Comments: BSNLP Shared Task 2019 paper. arXiv admin note: text overlap with arXiv:1806.05626 by other authors
Categories: cs.CL
Related articles:
arXiv:2012.02030 [cs.CL] (Published 2020-11-20)
Data-Informed Global Sparseness in Attention Mechanisms for Deep Neural Networks
arXiv:2204.13353 [cs.CL] (Published 2022-04-28)
Attention Mechanism with Energy-Friendly Operations
Yu Wan et al.
arXiv:1811.05544 [cs.CL] (Published 2018-11-12)
An Introductory Survey on Attention Mechanisms in NLP Problems