Abstract—In this paper, we introduce a new general definition of the L1-norm SVM (GL1-SVM) for feature selection and represent it as a polynomial mixed 0-1 programming problem. We prove that solving the newly proposed optimization problem reduces the error penalty and enlarges the margin between the two support vector hyperplanes, thus potentially providing better generalization capability of the SVM than solving the traditional L1-norm SVM proposed by Bradley and Mangasarian. We also propose a new search method that guarantees a globally optimal feature subset with respect to the new GL1-SVM. The proposed search method is based on solving a mixed 0-1 linear programming (M01LP) problem by means of the branch-and-bound algorithm. In this M01LP problem, the number of constraints and variables is linear in the number of features in the full set. Experimental results obtained on the UCI, LIBSVM, UNM and MIT Lincoln Lab data sets show that the new general L1-norm SVM provides better generalization capability, while in many cases selecting fewer features than the traditional L1-norm SVM.
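To make the underlying idea concrete, the following is a minimal sketch of the classical mechanism the paper builds on: an L1-norm penalty on the SVM weight vector drives the weights of uninformative features to (near) zero, so feature selection falls out of training. This is an illustrative subgradient-descent implementation of a standard L1-regularized hinge-loss classifier, not the authors' GL1-SVM or their M01LP/branch-and-bound formulation; the function name, hyperparameters, and threshold `tol` are all assumptions made for the example.

```python
import numpy as np

def l1_svm_feature_select(X, y, C=1.0, lr=0.01, epochs=500, tol=0.05, seed=0):
    """Illustrative L1-norm SVM via subgradient descent on
       min ||w||_1 + C * sum_i max(0, 1 - y_i (w.x_i + b)).
       Features whose learned weight stays near zero are discarded.
       (A sketch only -- not the GL1-SVM of the paper.)"""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.normal(scale=0.01, size=d)
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        active = margins < 1                      # margin-violating samples
        # subgradient of the L1 term plus the hinge term
        grad_w = np.sign(w) - C * (y[active, None] * X[active]).sum(axis=0)
        grad_b = -C * y[active].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    selected = np.flatnonzero(np.abs(w) > tol)    # surviving features
    return w, b, selected

# Toy data: only feature 0 determines the label; the L1 penalty
# should shrink the four irrelevant weights toward zero.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = np.where(X[:, 0] > 0, 1, -1)
w, b, selected = l1_svm_feature_select(X, y, C=0.5)
print("selected features:", selected)
```

The paper's contribution differs from this embedded-style sketch in that the GL1-SVM search is cast as an exact mixed 0-1 linear program solved by branch and bound, rather than relying on the soft shrinkage shown here.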
Index Terms—branch and bound, feature selection, L1-norm support vector machine, mixed 0-1 linear programming problem.
Authors are with the Norwegian Information Security Laboratory, P.O.Box 191, N-2802 Gjøvik, Norway. (e-mail: hai.nguyen@hig.no). (e-mail: firstname.lastname@example.org). (e-mail: email@example.com).
Cite: Hai Thanh Nguyen, Katrin Franke, and Slobodan Petrović, "On General Definition of L1-norm Support Vector Machines for Feature Selection," International Journal of Machine Learning and Computing, vol. 1, no. 3, pp. 279-283, 2011.