Stochastic Gradient Descent for Large-Scale Linear Nonparallel SVM
Title | Stochastic Gradient Descent for Large-Scale Linear Nonparallel SVM |
Publication Type | Conference Paper |
Year of Publication | 2017 |
Authors | Tang, Jingjing, Tian, Yingjie, Wu, Guoqiang, Li, Dewei |
Conference Name | Proceedings of the International Conference on Web Intelligence |
Publisher | ACM |
Conference Location | New York, NY, USA |
ISBN Number | 978-1-4503-4951-2 |
Keywords | composability, large-scale, Metrics, nonparallel support vector machine, resilience, stochastic gradient descent, Support vector machines |
Abstract | In recent years, the nonparallel support vector machine (NPSVM) has been proposed as a nonparallel-hyperplane classifier with performance superior to the standard SVM and to existing nonparallel classifiers such as the twin support vector machine (TWSVM). With solid theoretical underpinnings and great practical success, NPSVM has been used to deal with classification tasks at different scales. Tackling large-scale classification problems is a challenging yet significant task. Although the large-scale linear NPSVM model has already been solved efficiently by the dual coordinate descent (DCD) algorithm and the alternating direction method of multipliers (ADMM), in this paper we present a new strategy that solves the primal form of linear NPSVM, differing from existing work. Our algorithm is designed in the framework of stochastic gradient descent (SGD), which is well suited to large-scale problems. Experiments conducted on five large-scale data sets confirm the effectiveness of our method. |
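To make the abstract's approach concrete, the sketch below illustrates the general idea of SGD applied to the primal of one nonparallel hyperplane: keep one class close to the plane (here via an ε-insensitive loss) while pushing the other class past a margin (hinge loss). This is a minimal illustration under assumed loss choices, regularization, and a Pegasos-style step size; it is not the paper's actual primal formulation or algorithm, and the function name and all parameters are hypothetical.

```python
import numpy as np

def sgd_npsvm_plane(X_pos, X_neg, lam=0.01, eps=0.1, epochs=20, seed=0):
    """Illustrative SGD for one nonparallel hyperplane (w, b).

    Simplified stand-in objective (NOT the paper's exact primal):
        lam/2 * ||w||^2
        + mean over X_pos of eps-insensitive loss |w.x + b|_eps  (own class near plane)
        + mean over X_neg of hinge loss max(0, 1 + (w.x + b))    (other class at w.x+b <= -1)
    """
    rng = np.random.default_rng(seed)
    w, b = np.zeros(X_pos.shape[1]), 0.0
    m, n = len(X_pos), len(X_neg)
    t = 0
    for _ in range(epochs):
        for _ in range(m + n):
            t += 1
            # Pegasos-style decaying step, offset by 1/lam to avoid huge early steps
            eta = 1.0 / (lam * (t + 1.0 / lam))
            w *= 1.0 - eta * lam                # subgradient of the L2 regularizer
            if rng.random() < m / (m + n):      # sample a point from either class
                x = X_pos[rng.integers(m)]
                r = w @ x + b
                if abs(r) > eps:                # eps-insensitive subgradient
                    g = np.sign(r)
                    w -= eta * g * x
                    b -= eta * g
            else:
                x = X_neg[rng.integers(n)]
                if 1.0 + (w @ x + b) > 0.0:     # hinge subgradient
                    w -= eta * x
                    b -= eta
    return w, b
```

On two well-separated synthetic clusters, the returned plane passes near the "own" class while scoring the other class negatively; a full NPSVM would train one such plane per class and classify by which plane a point lies closer to.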
URL | https://dl.acm.org/citation.cfm?doid=3106426.3109427 |
DOI | 10.1145/3106426.3109427 |
Citation Key | tang_stochastic_2017 |