Joint Self-Attention and Multi-Embeddings for Chinese Named Entity Recognition

EasyChair Preprint 3340
5 pages • Date: May 6, 2020

Abstract

Named Entity Recognition (NER) is a fundamental task in Natural Language Processing (NLP), but it remains especially challenging in Chinese due to the particular characteristics and complexity of the language. Traditional Chinese NER methods require cumbersome feature engineering and domain-specific knowledge to achieve high performance. In this paper, we propose a simple yet effective neural network framework for Chinese NER, named A-NER. A-NER is the first Bidirectional Gated Recurrent Unit - Conditional Random Field (BiGRU-CRF) model that combines a self-attention mechanism with multi-embedding technology. It extracts richer linguistic information about characters at different granularities (e.g., radical, character, word) and captures the correlations between characters in the sequence. Moreover, A-NER does not rely on any external resources or hand-crafted features. Experimental results show that our model outperforms (or approaches) existing state-of-the-art methods on datasets from different domains.

Keyphrases: BiGRU-CRF, Chinese NER, multi-embeddings, self-attention
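The abstract gives the pipeline but no implementation details; those are in the full PDF. The following is a minimal PyTorch sketch of the architecture as described: three embedding tables (radical, character, word) concatenated per character, a self-attention layer, a BiGRU, and per-token emission scores that a CRF layer would decode. The class name ANERSketch, all dimensions, and the assumption that word ids are pre-aligned to character positions are illustrative, not taken from the paper; the CRF decoding step (e.g., via the pytorch-crf package) is omitted for brevity.

```python
import torch
import torch.nn as nn

class ANERSketch(nn.Module):
    """Hypothetical sketch of the A-NER encoder described in the abstract:
    multi-granularity embeddings (radical, character, word) -> self-attention
    -> BiGRU -> per-token emission scores. A CRF layer would normally decode
    these scores into tag sequences; it is omitted here for brevity."""

    def __init__(self, n_radicals, n_chars, n_words, n_tags,
                 emb_dim=64, hidden_dim=128, n_heads=4):
        super().__init__()
        # One embedding table per granularity; the outputs are concatenated.
        self.radical_emb = nn.Embedding(n_radicals, emb_dim)
        self.char_emb = nn.Embedding(n_chars, emb_dim)
        self.word_emb = nn.Embedding(n_words, emb_dim)
        fused_dim = 3 * emb_dim
        # Self-attention over the fused per-character representations.
        self.self_attn = nn.MultiheadAttention(fused_dim, n_heads, batch_first=True)
        # Bidirectional GRU encoder over the attended sequence.
        self.bigru = nn.GRU(fused_dim, hidden_dim,
                            bidirectional=True, batch_first=True)
        # Emission scores per tag; a CRF would model tag transitions on top.
        self.emit = nn.Linear(2 * hidden_dim, n_tags)

    def forward(self, radical_ids, char_ids, word_ids):
        # (batch, seq_len, 3 * emb_dim): concatenation of the three granularities.
        x = torch.cat([self.radical_emb(radical_ids),
                       self.char_emb(char_ids),
                       self.word_emb(word_ids)], dim=-1)
        # Self-attention captures correlations between characters in the sequence.
        attn_out, _ = self.self_attn(x, x, x)
        # The BiGRU encodes left and right context for each position.
        gru_out, _ = self.bigru(attn_out)
        return self.emit(gru_out)  # emission scores for a CRF decoder

# Usage: a batch of 2 sentences, 10 characters each (random ids for illustration).
model = ANERSketch(n_radicals=300, n_chars=5000, n_words=20000, n_tags=9)
ids = [torch.randint(0, n, (2, 10)) for n in (300, 5000, 20000)]
print(model(*ids).shape)  # torch.Size([2, 10, 9])
```

Concatenating the three embedding tables is one plausible reading of "multi-embeddings"; the paper may fuse the granularities differently, and its vocabulary sizes, tag set, and training objective are specified only in the full text.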