
Article citations


S. Abe, “Support Vector Machines for Pattern Classification,” 2nd Edition, Springer-Verlag, New York, 2010.

has been cited by the following article:

  • TITLE: Fast Variable Selection by Block Addition and Block Deletion

    AUTHORS: Takashi Nagatani, Seiichi Ozawa, Shigeo Abe

    KEYWORDS: Backward Selection, Forward Selection, Least Squares Support Vector Machines, Linear Programming Support Vector Machines, Support Vector Machines, Variable Selection

    JOURNAL NAME: Journal of Intelligent Learning Systems and Applications, Vol.2 No.4, December 14, 2010

    ABSTRACT: We propose a threshold updating method for terminating variable selection and two variable selection methods. In the threshold updating method, we update the threshold value whenever an approximation error smaller than the current threshold value is obtained. The first variable selection method is the combination of forward selection by block addition and backward selection by block deletion. In this method, starting from the empty set of input variables, we add several input variables at a time until the approximation error falls below the threshold value. Then we search for deletable variables by block deletion. The second method is the combination of the first method and variable selection by Linear Programming Support Vector Regressors (LPSVRs). By training an LPSVR with linear kernels, we evaluate the weights of the decision function and delete the input variables whose associated absolute weights are zero. Then we carry out block addition and block deletion. In computer experiments using benchmark data sets, we show that the proposed methods perform variable selection faster than the method using block deletion alone, and that the threshold updating method yields a lower approximation error than the fixed threshold method. We also compare our method with an embedded method, which determines the optimal variables during training, and show that our method gives comparable or better variable selection performance.
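
The abstract describes the procedure compactly; the Python sketch below makes the control flow concrete. It is not the authors' implementation: `approx_error` is a hypothetical callback standing in for training a regressor (e.g., an LS-SVM) on a candidate variable subset and returning its approximation error, the candidate ranking inside block addition is an assumed heuristic, and deletion here removes one variable at a time rather than whole blocks, purely for brevity.

```python
def block_addition_deletion(X, y, approx_error, block_size=5, theta=0.1):
    """Sketch of block addition followed by block deletion with
    threshold updating.

    `approx_error(X, y, subset)` is a hypothetical callback that trains
    a regressor on the listed variables and returns its approximation
    error (it should return a large value for the empty subset).
    """
    selected = []
    remaining = list(range(X.shape[1]))

    # Block addition: grow the subset until the error drops below theta.
    while remaining and approx_error(X, y, selected) > theta:
        # Assumed heuristic: rank candidates by the error obtained when
        # each is added individually, then add the best block_size at once.
        ranked = sorted(remaining,
                        key=lambda v: approx_error(X, y, selected + [v]))
        block = ranked[:block_size]
        selected += block
        remaining = [v for v in remaining if v not in block]

    # Block deletion with threshold updating: drop variables whose
    # removal keeps the error at or below theta, and tighten theta
    # whenever a smaller error is observed.
    improved = True
    while improved:
        improved = False
        for v in list(selected):
            trial = [u for u in selected if u != v]
            err = approx_error(X, y, trial)
            if err <= theta:
                selected, theta = trial, min(theta, err)
                improved = True

    return selected
```

The second method pre-screens variables with an LPSVR before running the steps above. LPSVR is not available off the shelf in scikit-learn, so the toy snippet below substitutes an L1-regularized linear model (Lasso), which likewise drives many weights to exactly zero; only the pruning idea carries over.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))          # toy data: 20 candidate variables
y = X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.normal(size=100)

# Stand-in for the LPSVR pre-screening: keep only variables whose
# learned weight is nonzero, then run block addition / block deletion
# on X[:, survivors].
model = Lasso(alpha=0.05).fit(X, y)
survivors = [i for i, w in enumerate(model.coef_) if w != 0.0]
```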