Any-Cost Discovery: Learning Optimal Classification Rules

UTSePress Research/Manakin Repository


dc.contributor.author Ni, Ailing en_US
dc.contributor.author Zhu, Xiaofeng en_US
dc.contributor.author Zhang, Chengqi en_US
dc.contributor.editor Zhang, S; Jarvis, R en_US
dc.date.accessioned 2009-11-09T02:45:51Z
dc.date.available 2009-11-09T02:45:51Z
dc.date.issued 2005 en_US
dc.identifier 2005003081 en_US
dc.identifier.citation Ni Ailing, Zhu Xiaofeng, and Zhang Chengqi 2005, 'Any-Cost Discovery: Learning Optimal Classification Rules', Springer, Berlin, Germany, pp. 123-132. en_US
dc.identifier.isbn 3-540-30462-2 en_US
dc.identifier.other E1 en_US
dc.identifier.uri http://hdl.handle.net/10453/1966
dc.description.abstract Fully taking into account the hints possibly hidden in absent data, this paper proposes a new criterion for selecting splitting attributes when building a decision tree for a given dataset. In our approach, a certain cost must be paid to obtain an attribute value, and a cost is incurred when a prediction is erroneous. We use different scales for these two kinds of cost instead of the single cost scale defined by previous works. We propose a new algorithm that builds a decision tree with a null branch strategy to minimize the misclassification cost. When the consumer offers finite resources, the tree makes the best use of those resources while still yielding optimal results. We also consider discounts in test costs when groups of attributes are tested together. In addition, we offer advice on whether it is worthwhile to increase the resources. Our results can be readily applied to real-world diagnosis tasks, such as medical diagnosis, where doctors must determine which tests to perform for a patient so as to minimize the misclassification cost within the given resources. en_US
dc.publisher Springer en_US
dc.relation.isbasedon http://dx.doi.org/10.1007/11589990_15 en_US
dc.title Any-Cost Discovery: Learning Optimal Classification Rules en_US
dc.parent AI 2005: Advances in Artificial Intelligence, 18th Australian Joint Conference on Artificial Intelligence Proceedings en_US
dc.journal.volume en_US
dc.journal.number en_US
dc.publocation Berlin, Germany en_US
dc.identifier.startpage 123 en_US
dc.identifier.endpage 132 en_US
dc.cauo.name FEIT.School of Software en_US
dc.conference Verified OK en_US
dc.conference.location Sydney, Australia en_US
dc.for 080105 en_US
dc.personcode 0000026251 en_US
dc.personcode 0000026252 en_US
dc.personcode 011221 en_US
dc.percentage 100 en_US
dc.classification.name Expert Systems en_US
dc.classification.type FOR-08 en_US
dc.custom Australasian Joint Conference on Artificial Intelligence en_US
dc.date.activity 20051205 en_US
dc.location.activity Sydney, Australia en_US
dc.description.keywords N/A en_US
dc.staffid 011221 en_US
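
The abstract describes trading off two differently scaled costs, the cost of obtaining an attribute value (test cost) and the cost of a wrong prediction (misclassification cost), under a finite resource budget. The Python sketch below illustrates that general trade-off only, under stated assumptions: the function names, the scoring rule (misclassification-cost reduction per unit of test cost), the example cost figures, and the treatment of missing values as a separate branch are choices of this illustration, not the splitting criterion or the null branch strategy defined in the paper.

# Minimal sketch (illustrative only): cost-sensitive split selection under a test-cost budget.
from collections import Counter

def misclassification_cost(labels, cost_fp=1.0, cost_fn=5.0):
    # Cost of assigning the cheaper single prediction to this group of examples.
    counts = Counter(labels)
    pos, neg = counts.get(1, 0), counts.get(0, 0)
    # Predicting positive misclassifies the negatives (false positives);
    # predicting negative misclassifies the positives (false negatives).
    return min(neg * cost_fp, pos * cost_fn)

def choose_split(rows, labels, attributes, test_costs, budget):
    # Pick the affordable attribute with the best cost reduction per unit of test cost.
    base = misclassification_cost(labels)
    best_attr, best_score = None, 0.0
    for attr in attributes:
        if test_costs[attr] > budget:
            continue  # this test cannot be afforded with the remaining resources
        groups = {}
        for row, label in zip(rows, labels):
            # Unknown ("null") attribute values are grouped into their own branch.
            groups.setdefault(row.get(attr), []).append(label)
        split_cost = sum(misclassification_cost(g) for g in groups.values())
        score = (base - split_cost) / test_costs[attr]
        if score > best_score:
            best_attr, best_score = attr, score
    return best_attr

# Tiny usage example with made-up data and costs.
rows = [{"fever": 1, "xray": 1}, {"fever": 1, "xray": None},
        {"fever": 0, "xray": 0}, {"fever": 0, "xray": None}]
labels = [1, 1, 0, 0]
print(choose_split(rows, labels, ["fever", "xray"],
                   test_costs={"fever": 2.0, "xray": 50.0}, budget=10.0))  # -> fever

Running the example selects the cheap "fever" test, because the "xray" test exceeds the remaining budget and is skipped.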

