Few-Shot Transfer Learning for Text Classification With Lightweight Word Embedding Based Models

Title: Few-Shot Transfer Learning for Text Classification With Lightweight Word Embedding Based Models
Publication Type: Journal Article
Year of Publication: 2019
Authors: Pan, C., Huang, J., Gong, J., Yuan, X.
Journal: IEEE Access
Volume: 7
Pagination: 53296–53304
ISSN: 2169-3536
Keywords: compositionality, Computational modeling, Computing Theory and Compositionality, Data models, data-hungry deep models, Deep Learning, deep learning architectures, feature extraction, Few-shot learning, few-shot transfer learning tasks, Human Behavior, human factors, lightweight word embedding-based models, modified hierarchical pooling strategy, parameter-free pooling operations, parameter-free property, parameters training, pattern classification, plug-and-play way, pooling strategy, pubcrawl, semantic compositionality, semantic networks, semantic text, simple word embedding-based models, supervised data, supervised learning, Task Analysis, text analysis, text categorization, text classification, Training, transfer learning, unseen text sequences, word embedding based models, word processing
Abstract: Many deep learning architectures have been employed to model semantic compositionality for text sequences, but they require huge amounts of supervised data for parameter training, making them infeasible in situations where large numbers of annotated samples are unavailable or simply do not exist. Unlike data-hungry deep models, lightweight word embedding-based models can represent text sequences in a plug-and-play way owing to their parameter-free property. In this paper, a modified hierarchical pooling strategy over pre-trained word embeddings is proposed for text classification in a few-shot transfer learning setting. The model leverages and transfers knowledge obtained from source domains to recognize and classify unseen text sequences with just a handful of support examples in the target problem domain. Extensive experiments on five datasets covering both English and Chinese text demonstrate that simple word embedding-based models (SWEMs) with parameter-free pooling operations are able to abstract and represent text semantics. The proposed modified hierarchical pooling method achieves significantly better classification performance in few-shot transfer learning tasks than the alternative methods compared.
DOI: 10.1109/ACCESS.2019.2911850
Citation Key: pan_few-shot_2019
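
The abstract describes parameter-free hierarchical pooling over pre-trained word embeddings and nearest-support classification from a handful of examples. Below is a minimal sketch of that general idea: local mean pooling over sliding windows followed by global max pooling, with a cosine nearest-centroid classifier for the few-shot step. The window size, the centroid classifier, and all function names are illustrative assumptions, not the authors' exact modified method.

```python
# Sketch of SWEM-style hierarchical pooling and few-shot classification.
# All choices here (window size, cosine nearest-centroid) are assumptions
# for illustration; the paper's modified strategy may differ in detail.
import numpy as np

def hierarchical_pool(word_vectors: np.ndarray, window: int = 3) -> np.ndarray:
    """Mean-pool within local windows, then max-pool across the window means.

    word_vectors: (seq_len, dim) matrix of pre-trained embeddings for one text.
    Returns a single (dim,) sequence representation; no parameters are learned.
    """
    seq_len, _ = word_vectors.shape
    if seq_len <= window:
        return word_vectors.mean(axis=0)
    # Local mean pooling over each sliding window of `window` tokens.
    local_means = np.stack([
        word_vectors[i:i + window].mean(axis=0)
        for i in range(seq_len - window + 1)
    ])
    # Global max pooling over the window means (parameter-free).
    return local_means.max(axis=0)

def classify_few_shot(query_vec, support_vecs, support_labels):
    """Assign the label of the nearest class centroid by cosine similarity."""
    labels = sorted(set(support_labels))
    centroids = np.stack([
        np.mean([v for v, y in zip(support_vecs, support_labels) if y == lab], axis=0)
        for lab in labels
    ])
    centroids /= np.linalg.norm(centroids, axis=1, keepdims=True)
    q = query_vec / np.linalg.norm(query_vec)
    return labels[int(np.argmax(centroids @ q))]
```

Because both pooling steps are parameter-free, representations built this way can be transferred from a source domain to a target domain without any retraining; only the handful of support examples in the target domain is needed to form class centroids.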