Title | Inflectional Review of Deep Learning on Natural Language Processing |
Publication Type | Conference Paper |
Year of Publication | 2018 |
Authors | Fahad, S.K. Ahammad, Yahya, Abdulsamad Ebrahim |
Conference Name | 2018 International Conference on Smart Computing and Electronic Enterprise (ICSCEE) |
Keywords | Deep Learning, artificial neural networks, convolutional neural networks, deep neural networks, Human Behavior, learning (artificial intelligence), multilayer neural networks, multitask learning, natural language processing, NLP tools, nonlinear processing, pubcrawl, recurrent neural networks, Resiliency, Scalability, semantics, tagging, Task Analysis, text analysis, time delay neural networks
Abstract | In the age of knowledge, Natural Language Processing (NLP) is in demand across a wide range of applications. NLP previously dealt with static data; today it works extensively with corpora, lexicon databases, and pattern recognition. Because Deep Learning (DL) methods use artificial Neural Networks (NN) for nonlinear processing, NLP tools have become increasingly accurate and efficient. Multi-Layer Neural Networks have gained importance in NLP for their capability, including consistent speed and reliable output. Hierarchical arrangements of data pass through recurring processing layers to learn, and with this arrangement DL methods handle several tasks. This paper strives to review the tools and the necessary methodology to present a clear understanding of the association between NLP and DL. In NLP, both efficiency and execution are improved by Part-of-Speech Tagging (POST), Morphological Analysis, Named Entity Recognition (NER), Semantic Role Labeling (SRL), Syntactic Parsing, and Coreference Resolution. Artificial Neural Networks (ANN), Time Delay Neural Networks (TDNN), Recurrent Neural Networks (RNN), Convolutional Neural Networks (CNN), and Long Short-Term Memory (LSTM) networks operate with Dense Vectors (DV), the Window Approach (WA), and Multitask Learning (MTL) as characteristics of Deep Learning. After statistical methods, once DL began to influence NLP, a fundamental connection formed between individual NLP processes and DL techniques.
DOI | 10.1109/ICSCEE.2018.8538416 |
Citation Key | fahad_inflectional_2018 |