*Article* **Bidirectional Recurrent Neural Network Approach for Arabic Named Entity Recognition**

**Mohammed N. A. Ali <sup>1</sup>, Guanzheng Tan <sup>1,\*</sup> and Aamir Hussain <sup>2</sup>**


Received: 28 October 2018; Accepted: 10 December 2018; Published: 13 December 2018

**Abstract:** Recurrent neural networks (RNNs) have achieved remarkable success in sequence labeling tasks that require memory. An RNN can retain information from earlier parts of a sequence and can thus be applied to natural language processing (NLP) tasks. Named entity recognition (NER), a common NLP task, can be framed as a classification problem. We propose a bidirectional long short-term memory (LSTM) model for this entity recognition task on Arabic text. The LSTM network can process a sequence while relating each element to its surrounding context, which makes it well suited to NER. Moreover, we use pre-trained word embeddings to represent the inputs fed into the LSTM network. The proposed model is evaluated on the popular "ANERcorp" dataset. Experimental results show that the model with word embeddings achieves a high F-score of approximately 88.01%.
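The architecture summarized in the abstract — pre-trained word embeddings feeding a bidirectional LSTM whose per-token outputs are classified into entity tags — can be sketched as follows. This is a minimal illustrative sketch in PyTorch, not the authors' exact model; the vocabulary size, embedding dimension, hidden size, and tag count are hypothetical placeholders.

```python
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Minimal bidirectional LSTM sequence tagger (illustrative sketch)."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, num_tags):
        super().__init__()
        # Embedding lookup; in the paper's setting these weights would be
        # initialized from pre-trained word embeddings.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # bidirectional=True concatenates forward and backward hidden states,
        # so each token sees both its left and right context.
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Project the 2*hidden_dim BiLSTM output to per-token tag scores.
        self.fc = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq_len, embed_dim)
        h, _ = self.lstm(x)         # (batch, seq_len, 2 * hidden_dim)
        return self.fc(h)           # (batch, seq_len, num_tags)

# Toy usage: a batch of 2 sentences, 7 tokens each, 9 entity-tag classes
# (hypothetical numbers, e.g. B-/I- tags for PERS, LOC, ORG, MISC plus O).
model = BiLSTMTagger(vocab_size=1000, embed_dim=50, hidden_dim=64, num_tags=9)
tokens = torch.randint(0, 1000, (2, 7))
scores = model(tokens)
print(tuple(scores.shape))  # (2, 7, 9): one score vector per token per tag
```

At inference time, the predicted tag for each token is simply the arg-max over the last dimension of `scores`.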

**Keywords:** Arabic named entity recognition; bidirectional recurrent neural network; GRU; LSTM; natural language processing; word embedding
