Stochastic Gradient Descent for Large-Scale Linear Nonparallel SVM

Title: Stochastic Gradient Descent for Large-Scale Linear Nonparallel SVM
Publication Type: Conference Paper
Year of Publication: 2017
Authors: Tang, Jingjing; Tian, Yingjie; Wu, Guoqiang; Li, Dewei
Conference Name: Proceedings of the International Conference on Web Intelligence
Publisher: ACM
Conference Location: New York, NY, USA
ISBN Number: 978-1-4503-4951-2
Keywords: composability, large-scale, Metrics, nonparallel support vector machine, pubcrawl, resilience, Resiliency, stochastic gradient descent, Support vector machines
Abstract

In recent years, the nonparallel support vector machine (NPSVM) has been proposed as a nonparallel-hyperplane classifier with performance superior to the standard SVM and to existing nonparallel classifiers such as the twin support vector machine (TWSVM). With solid theoretical underpinnings and great practical success, NPSVM has been applied to classification tasks at various scales. Tackling large-scale classification problems remains a challenging yet significant task. Although the large-scale linear NPSVM model has already been solved efficiently by the dual coordinate descent (DCD) algorithm and by the alternating direction method of multipliers (ADMM), in this paper we present a new strategy, different from existing work, that solves the primal form of linear NPSVM. Our algorithm is designed in the framework of stochastic gradient descent (SGD), which is well suited to large-scale problems. Experiments conducted on five large-scale data sets confirm the effectiveness of our method.
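To give a rough sense of the abstract's approach, the sketch below runs Pegasos-style SGD on the primal objective for one hyperplane of a linear NPSVM: an epsilon-insensitive loss pulls the plane close to its own class while a hinge loss keeps the other class at margin distance, with L2 regularization on w. This is an illustrative assumption, not the paper's exact formulation; the function name, step-size schedule, and default parameters here are hypothetical.

```python
import numpy as np

def sgd_npsvm_plane(X_own, X_other, lam=0.1, eps=0.1, epochs=30, seed=0):
    """Pegasos-style SGD for one hyperplane (w, b) of a linear NPSVM (sketch).

    Minimizes, over (w, b):
        lam/2 * ||w||^2
        + mean over x in X_own   of max(0, |w.x + b| - eps)   # plane near own class
        + mean over x in X_other of max(0, 1 + (w.x + b))     # other class at w.x+b <= -1
    Hypothetical illustration; not the authors' exact update rule.
    """
    rng = np.random.default_rng(seed)
    X = np.vstack([X_own, X_other])
    own = np.array([True] * len(X_own) + [False] * len(X_other))
    w, b = np.zeros(X.shape[1]), 0.0
    t = int(1.0 / lam)  # offset the 1/(lam*t) schedule so the first step is ~1
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)
            m = w @ X[i] + b
            if own[i]:
                # subgradient of the eps-insensitive loss max(0, |m| - eps)
                g = float(np.sign(m)) if abs(m) > eps else 0.0
            else:
                # subgradient of the hinge loss max(0, 1 + m)
                g = 1.0 if m > -1.0 else 0.0
            w = (1.0 - eta * lam) * w - eta * g * X[i]
            b = b - eta * g
    return w, b
```

A full nonparallel classifier would fit one such plane per class (swapping the roles of X_own and X_other) and assign each test point to the class whose plane it lies closest to.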

URL: https://dl.acm.org/citation.cfm?doid=3106426.3109427
DOI: 10.1145/3106426.3109427
Citation Key: tang_stochastic_2017