A closed-form reduction of multi-class cost-sensitive learning to weighted multi-class learning
2009
In cost-sensitive learning, misclassification costs can vary across classes. This paper investigates an approach that reduces multi-class cost-sensitive learning to a standard classification task, based on the data space expansion technique developed by Abe et al.; for binary classification it coincides with Elkan's reduction. Under this reduction, a cost-sensitive learning problem can be solved as a standard 0/1-loss classification problem on a new distribution determined by the cost matrix. We also propose a new weighting mechanism for solving the reduced classification problem, based on a theorem stating that the empirical loss on independent and identically distributed samples from the new distribution is essentially the same as the weighted loss on the expanded training set. Experimental results on several synthetic and benchmark datasets show that our weighting approach is more effective than existing representative approaches for cost-sensitive learning.
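To illustrate the expansion idea the abstract refers to, the following is a minimal sketch, not the paper's exact construction: each training example with true class y and cost row C[y] is replicated once per candidate label j, weighted by the cost that predicting j avoids, w = max_k C[y, k] - C[y, j]. The function name and this particular weighting are assumptions for illustration; the paper's closed-form reduction may differ in detail.

```python
import numpy as np

def expand_cost_sensitive(X, y, C):
    """Expand a cost-sensitive dataset into a weighted multi-class dataset.

    For each example (x, true class y) with cost row C[y], emit one copy per
    candidate label j with weight
        w_j = max_k C[y, k] - C[y, j]
    (an assumed weighting for illustration). Zero-weight copies carry no
    information and are dropped. A standard weighted classifier can then be
    trained on the expanded set (Xe, ye) with sample weights we.
    """
    Xe, ye, we = [], [], []
    for xi, yi in zip(X, y):
        row = C[yi]
        wmax = row.max()
        for j in range(C.shape[1]):
            w = wmax - row[j]
            if w > 0:
                Xe.append(xi)
                ye.append(j)
                we.append(w)
    return np.array(Xe), np.array(ye), np.array(we)
```

Note that with a 0/1 cost matrix (unit cost for every misclassification, zero for correct labels), the expansion reduces to the original dataset with unit weights, recovering ordinary classification as a special case.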
Keywords:
- Online machine learning
- Semi-supervised learning
- Machine learning
- Active learning (machine learning)
- Artificial intelligence
- Multi-task learning
- Empirical risk minimization
- Stability (learning theory)
- One-class classification
- Linear classifier
- Pattern recognition
- Mathematics
- Instance-based learning
- Statistical learning theory
References: 26
Citations: 18