Tang, Jingjing, Tian, Yingjie, Wu, Guoqiang, Li, Dewei. 2017. Stochastic Gradient Descent for Large-Scale Linear Nonparallel SVM. Proceedings of the International Conference on Web Intelligence, pp. 980–983.

In recent years, the nonparallel support vector machine (NPSVM) has been proposed as a nonparallel-hyperplane classifier with performance superior to the standard SVM and to existing nonparallel classifiers such as the twin support vector machine (TWSVM). With solid theoretical underpinnings and great practical success, NPSVM has been applied to classification tasks at different scales. Tackling large-scale classification problems is a challenging yet significant task. Although the large-scale linear NPSVM model has already been solved efficiently by the dual coordinate descent (DCD) algorithm or the alternating direction method of multipliers (ADMM), in this paper we present a new strategy, different from existing work, that solves the primal form of linear NPSVM. Our algorithm is designed in the framework of stochastic gradient descent (SGD), which is well suited to large-scale problems. Experiments are conducted on five large-scale data sets to confirm the effectiveness of our method.
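To illustrate the SGD framework the abstract refers to, the sketch below trains a single linear max-margin classifier by stochastic subgradient descent on the regularized hinge loss (a Pegasos-style update). This is only an illustrative assumption about the general approach; the paper's actual method optimizes the primal NPSVM objective, which involves two nonparallel hyperplanes and different loss terms.

```python
import numpy as np

def sgd_linear_svm(X, y, lam=0.01, epochs=20, seed=0):
    """Train (w, b) by SGD on the L2-regularized hinge loss.

    X: (n, d) feature matrix; y: labels in {-1, +1}.
    Illustrative sketch only, not the paper's NPSVM solver.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)              # decaying step size
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:                      # hinge loss is active
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:                               # only the regularizer acts
                w = (1 - eta * lam) * w
    return w, b

# Tiny synthetic check on a well-separated two-class problem.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(+2.0, 1.0, size=(50, 2)),
               rng.normal(-2.0, 1.0, size=(50, 2))])
y = np.array([1] * 50 + [-1] * 50)

w, b = sgd_linear_svm(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
print(f"training accuracy: {acc:.2f}")
```

Each update touches a single example, so one pass costs O(nd) regardless of problem size; this per-example cost is what makes SGD attractive for large-scale linear models compared with batch solvers.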