Adaptive pruning algorithm for least squares support vector machine classifier

UTSePress Research/Manakin Repository

dc.contributor.author Yang, Xiaowei en_US
dc.contributor.author Lu, Jie en_US
dc.contributor.author Zhang, Guangquan en_US
dc.date.accessioned 2012-02-02T11:04:30Z
dc.date.available 2012-02-02T11:04:30Z
dc.date.issued 2010 en_US
dc.identifier 2010001501 en_US
dc.identifier.citation Yang, X., Lu, J. & Zhang, G. 2010, 'Adaptive pruning algorithm for least squares support vector machine classifier', Soft Computing - A Fusion of Foundations, Methodologies and Applications, vol. 14, no. 7, pp. 667-680. en_US
dc.identifier.issn 1432-7643 en_US
dc.identifier.other C1 en_US
dc.identifier.uri http://hdl.handle.net/10453/15942
dc.description.abstract As a new version of the support vector machine (SVM), the least squares SVM (LS-SVM) involves equality instead of inequality constraints and works with a least squares cost function. A well-known drawback in LS-SVM applications is that sparseness is lost. In this paper, we develop an adaptive pruning algorithm based on a bottom-to-top strategy that deals with this drawback. In the proposed algorithm, incremental and decremental learning procedures are used alternately, and a small support vector set, which covers most of the information in the training set, is formed adaptively. The final classifier is then constructed from this set. In general, the number of elements in the support vector set is much smaller than that in the training set, so a sparse solution is obtained. To test the efficiency of the proposed algorithm, we apply it to eight UCI datasets and one benchmark dataset. The experimental results show that the presented algorithm adaptively obtains sparse solutions with only a small loss of generalization performance on classification problems with or without noise, and that its training speed is much faster than that of the sequential minimal optimization (SMO) algorithm on large-scale noise-free classification problems. (A minimal LS-SVM sketch follows this record.) en_US
dc.language en_US
dc.publisher Springer Berlin / Heidelberg en_US
dc.title Adaptive pruning algorithm for least squares support vector machine classifier en_US
dc.parent Soft Computing - A Fusion of Foundations, Methodologies and Applications en_US
dc.journal.volume 14 en_US
dc.journal.number 7 en_US
dc.publocation Germany en_US
dc.identifier.startpage 667 en_US
dc.identifier.endpage 680 en_US
dc.cauo.name FEIT.School of Systems, Management and Leadership en_US
dc.conference Verified OK en_US
dc.for 170200 en_US
dc.personcode 0000035940 en_US
dc.personcode 001038 en_US
dc.personcode 020014 en_US
dc.percentage 100 en_US
dc.classification.name Cognitive Sciences en_US
dc.classification.type FOR-08 en_US
dc.description.keywords Support vector machine, Least squares support vector machine, Pruning, Incremental learning, Decremental learning, Adaptive en_US
dc.staffid 020014 en_US
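
The abstract refers to the standard LS-SVM classifier, whose training reduces to solving a linear system rather than a quadratic programme. The sketch below illustrates only that baseline formulation, not the paper's adaptive pruning algorithm; the RBF kernel, the gamma and sigma values, and all function names are assumptions introduced here for illustration.

import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Pairwise RBF kernel matrix: K[i, j] = exp(-||A_i - B_j||^2 / (2 sigma^2)).
    sq_dist = (np.sum(A ** 2, axis=1)[:, None]
               + np.sum(B ** 2, axis=1)[None, :]
               - 2.0 * A @ B.T)
    return np.exp(-sq_dist / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # Solve the LS-SVM KKT linear system
    #   [ 0   y^T               ] [ b     ]   [ 0 ]
    #   [ y   Omega + I/gamma   ] [ alpha ] = [ 1 ]
    # with Omega_ij = y_i y_j K(x_i, x_j). The equality constraints and the
    # least squares cost replace the usual SVM quadratic programme with this
    # single linear solve.
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    # Every alpha_i is generically nonzero, so all training points remain
    # "support vectors" -- the lost-sparseness drawback that the paper's
    # adaptive pruning algorithm targets.
    return b, alpha

def lssvm_predict(X_new, X_train, y_train, b, alpha, sigma=1.0):
    # Decision function: sign( sum_i alpha_i y_i K(x, x_i) + b ).
    return np.sign(rbf_kernel(X_new, X_train, sigma) @ (alpha * y_train) + b)

if __name__ == "__main__":
    # Toy check on two Gaussian blobs with labels in {-1, +1}.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1.0, 0.5, (20, 2)), rng.normal(1.0, 0.5, (20, 2))])
    y = np.concatenate([-np.ones(20), np.ones(20)])
    b, alpha = lssvm_train(X, y)
    accuracy = np.mean(lssvm_predict(X, X, y, b, alpha) == y)
    print(f"training accuracy: {accuracy:.2f}")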

