Abstract
We propose a supervised learning algorithm for multi-label decision procedures, which map examples to finite sets of labels. Our algorithm extends Schapire and Singer's AdaBoost.MH and produces sets of rules that can be viewed as trees, in the spirit of Freund and Mason's alternating decision trees. Experiments show that combining boosting techniques with tree representations of large sets of rules yields both good predictive performance and readability. Moreover, a key feature of our algorithm is its ability to handle heterogeneous input data: discrete values, continuous values, and text.
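To make the setting concrete, the following sketch (not the authors' implementation; all names and scores are hypothetical) shows how a multi-label alternating decision tree can be evaluated in the AdaBoost.MH style: each rule carries a precondition, a condition, and per-label score vectors; an instance accumulates the scores of every rule whose tests it satisfies, and a label is predicted when its total score is positive.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

Instance = Dict[str, float]
Predicate = Callable[[Instance], bool]

@dataclass
class Rule:
    precondition: Predicate            # conjunction of ancestor tests
    condition: Predicate               # the decision test at this node
    scores_if_true: Dict[str, float]   # per-label score when condition holds
    scores_if_false: Dict[str, float]  # per-label score otherwise

@dataclass
class MultiLabelADT:
    labels: List[str]
    root_scores: Dict[str, float]      # scores of the root prediction node
    rules: List[Rule] = field(default_factory=list)

    def scores(self, x: Instance) -> Dict[str, float]:
        # Sum the score vectors along every path whose tests x satisfies.
        total = dict(self.root_scores)
        for r in self.rules:
            if r.precondition(x):
                branch = r.scores_if_true if r.condition(x) else r.scores_if_false
                for label, s in branch.items():
                    total[label] += s
        return total

    def predict(self, x: Instance) -> List[str]:
        # A label is assigned exactly when its accumulated score is positive.
        return [lab for lab, s in self.scores(x).items() if s > 0]

# Hypothetical toy tree over two labels.
tree = MultiLabelADT(
    labels=["sports", "politics"],
    root_scores={"sports": -0.1, "politics": -0.2},
    rules=[
        Rule(lambda x: True, lambda x: x["ball_count"] > 2,
             {"sports": 0.8, "politics": -0.3},
             {"sports": -0.4, "politics": 0.1}),
    ],
)
print(tree.predict({"ball_count": 5}))  # → ['sports']
```

The signed per-label scores are what distinguishes this from a single-label tree: one traversal produces a score for every label at once, so an example can receive several labels, or none.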
Partially supported by project DATADIAB: “ACI télémédecine et technologies pour la santé” and project TACT/TIC Feder & CPER Région-Nord Pas de Calais.
References
Erin L. Allwein, Robert E. Schapire, and Yoram Singer. Reducing multiclass to binary: a unifying approach for margin classifiers. In Proc. 17th International Conf. on Machine Learning, pages 9–16, 2000.
L. Breiman. Combining predictors. Technical report, Statistics Department, 1998.
T.G. Dietterich and G. Bakiri. Solving multiclass learning problems via error-correcting output codes. Journal of Artificial Intelligence Research, 2:263–286, 1995.
T. Dietterich, M. Kearns, and Y. Mansour. Applying the weak learning framework to understand and improve C4.5. In Proc. 13th International Conference on Machine Learning, pages 96–104. Morgan Kaufmann, 1996.
Yoav Freund and Llew Mason. The alternating decision tree learning algorithm. In Proc. 16th International Conf. on Machine Learning, pages 124–133, 1999.
Yoav Freund and Robert E. Schapire. Experiments with a new boosting algorithm. In Proc. 13th International Conference on Machine Learning, pages 148–156. Morgan Kaufmann, 1996.
Yoav Freund and Robert E. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55(1):119–139, August 1997.
G. Holmes, B. Pfahringer, R. Kirkby, E. Frank, and M. Hall. Multiclass alternating decision trees. In Proceedings of the European Conference on Machine Learning. Springer Verlag, 2002.
Ron Kohavi and Clayton Kunz. Option decision trees with majority votes. In Proc. 14th International Conference on Machine Learning, pages 161–169. Morgan Kaufmann, 1997.
M. Kearns and Y. Mansour. On the boosting ability of top-down decision tree learning algorithms. In Proceedings of the Twenty-Eighth Annual ACM Symposium on the Theory of Computing, pages 459–468, 1996.
J.R. Quinlan. C4.5: Programs for Machine Learning. Morgan Kaufmann, San Mateo, CA, 1993.
Robert E. Schapire and Yoram Singer. Improved boosting algorithms using confidence-rated predictions. In Proceedings of the 11th Annual Conference on Computational Learning Theory (COLT-98), pages 80–91, New York, July 24–26 1998. ACM Press.
Robert E. Schapire and Yoram Singer. BoosTexter: A boosting-based system for text categorization. Machine Learning, 39(2/3):135–168, 2000.
© 2003 Springer-Verlag Berlin Heidelberg
De Comité, F., Gilleron, R., Tommasi, M. (2003). Learning Multi-label Alternating Decision Trees from Texts and Data. In: Perner, P., Rosenfeld, A. (eds) Machine Learning and Data Mining in Pattern Recognition. MLDM 2003. Lecture Notes in Computer Science, vol 2734. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45065-3_4
Print ISBN: 978-3-540-40504-7
Online ISBN: 978-3-540-45065-8