Learning Multi-label Alternating Decision Trees from Texts and Data

  • Conference paper
  • In: Machine Learning and Data Mining in Pattern Recognition (MLDM 2003)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2734)

Abstract

Multi-label decision procedures are the target of the supervised learning algorithm we propose in this paper. A multi-label decision procedure maps each example to a finite set of labels. Our learning algorithm extends Schapire and Singer's AdaBoost.MH and produces sets of rules that can be viewed as trees, in the manner of the Alternating Decision Trees introduced by Freund and Mason. Experiments show that this combination yields both the predictive performance of boosting and the readability of a tree representation of a large set of rules. Moreover, a key feature of our algorithm is its ability to handle heterogeneous input data: discrete and continuous values as well as text.
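Only the abstract of the paper is reproduced here, so the authors' exact ADTree-growing procedure is not shown. As a rough illustration of the boosting machinery the abstract builds on, the sketch below implements plain AdaBoost.MH with confidence-rated threshold stumps over (example, label) pairs, in the spirit of Schapire and Singer's BoosTexter; it is not the authors' multi-label ADTree learner, and all names in it (adaboost_mh, Stump, predict_labels) are hypothetical. Numeric features are assumed; text data would first have to be mapped to numeric indicators such as word presence.

```python
# A minimal sketch of AdaBoost.MH with confidence-rated threshold stumps
# (Schapire & Singer style). This is NOT the paper's multi-label ADTree
# learner; it only illustrates the boosting scheme the abstract refers to.
import numpy as np


class Stump:
    """One threshold test on a numeric feature, with a per-label
    confidence-rated vote for each outcome of the test."""

    def __init__(self, feature, threshold, c_true, c_false):
        self.feature = feature      # index of the tested feature
        self.threshold = threshold  # split point
        self.c_true = c_true        # votes, shape (k,), when x[feature] > threshold
        self.c_false = c_false      # votes otherwise

    def predict(self, X):
        mask = X[:, self.feature] > self.threshold
        return np.where(mask[:, None], self.c_true, self.c_false)


def adaboost_mh(X, Y, n_rounds=20, eps=1e-8):
    """X: (n, d) numeric features; Y: (n, k) label matrix in {-1, +1}.
    Maintains a distribution D over (example, label) pairs and, each round,
    greedily picks the stump minimising the Schapire-Singer Z criterion."""
    n, k = Y.shape
    D = np.full((n, k), 1.0 / (n * k))
    stumps = []
    for _ in range(n_rounds):
        best = None
        for f in range(X.shape[1]):
            for thr in np.unique(X[:, f]):
                above = (X[:, f] > thr)[:, None]
                # W+ / W- per label on each side of the split
                wp_t = (D * (Y > 0) * above).sum(axis=0)
                wm_t = (D * (Y < 0) * above).sum(axis=0)
                wp_f = (D * (Y > 0) * ~above).sum(axis=0)
                wm_f = (D * (Y < 0) * ~above).sum(axis=0)
                z = 2 * (np.sqrt(wp_t * wm_t) + np.sqrt(wp_f * wm_f)).sum()
                if best is None or z < best[0]:
                    c_t = 0.5 * np.log((wp_t + eps) / (wm_t + eps))
                    c_f = 0.5 * np.log((wp_f + eps) / (wm_f + eps))
                    best = (z, Stump(f, thr, c_t, c_f))
        stump = best[1]
        stumps.append(stump)
        # Reweight: (example, label) pairs the new rule gets wrong gain mass.
        D *= np.exp(-Y * stump.predict(X))
        D /= D.sum()
    return stumps


def predict_labels(stumps, X):
    """Predicted label set = labels whose summed score is positive."""
    return sum(s.predict(X) for s in stumps) > 0
```

In the algorithm described in the abstract, the tests selected by this kind of weight update are grafted as new decision nodes onto an alternating decision tree, so the final set of rules reads as a single tree rather than a flat list of stumps.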

Partially supported by the project DATADIAB ("ACI télémédecine et technologies pour la santé") and the project TACT/TIC, FEDER & CPER Région Nord-Pas-de-Calais.


References

  1. Erin L. Allwein, Robert E. Schapire, and Yoram Singer. Reducing multiclass to binary: a unifying approach for margin classifiers. In Proc. 17th International Conf. on Machine Learning, pages 9–16, 2000.

  2. L. Breiman. Combining predictors. Technical report, Statistics Department, 1998.

  3. T. G. Dietterich and G. Bakiri. Solving multiclass learning problems via error-correcting output codes. Journal of Artificial Intelligence Research, 2:263–286, 1995.

  4. T. Dietterich, M. Kearns, and Y. Mansour. Applying the weak learning framework to understand and improve C4.5. In Proc. 13th International Conference on Machine Learning, pages 96–104. Morgan Kaufmann, 1996.

  5. Yoav Freund and Llew Mason. The alternating decision tree learning algorithm. In Proc. 16th International Conf. on Machine Learning, pages 124–133, 1999.

  6. Yoav Freund and Robert E. Schapire. Experiments with a new boosting algorithm. In Proc. 13th International Conference on Machine Learning, pages 148–156. Morgan Kaufmann, 1996.

  7. Yoav Freund and Robert E. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55(1):119–139, August 1997.

  8. G. Holmes, B. Pfahringer, R. Kirkby, E. Frank, and M. Hall. Multiclass alternating decision trees. In Proceedings of the European Conference on Machine Learning. Springer Verlag, 2002.

  9. Ron Kohavi and Clayton Kunz. Option decision trees with majority votes. In Proc. 14th International Conference on Machine Learning, pages 161–169. Morgan Kaufmann, 1997.

  10. M. Kearns and Y. Mansour. On the boosting ability of top-down decision tree learning algorithms. In Proceedings of the Twenty-Eighth Annual ACM Symposium on the Theory of Computing, pages 459–468, 1996.

  11. J. R. Quinlan. C4.5: Programs for Machine Learning. Morgan Kaufmann, San Mateo, CA, 1993.

  12. Robert E. Schapire and Yoram Singer. Improved boosting algorithms using confidence-rated predictions. In Proceedings of the 11th Annual Conference on Computational Learning Theory (COLT-98), pages 80–91, New York, July 24–26 1998. ACM Press.

  13. Robert E. Schapire and Yoram Singer. BoosTexter: A boosting-based system for text categorization. Machine Learning, 39(2/3):135–168, 2000.




Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

De Comité, F., Gilleron, R., Tommasi, M. (2003). Learning Multi-label Alternating Decision Trees from Texts and Data. In: Perner, P., Rosenfeld, A. (eds) Machine Learning and Data Mining in Pattern Recognition. MLDM 2003. Lecture Notes in Computer Science, vol 2734. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45065-3_4


  • DOI: https://doi.org/10.1007/3-540-45065-3_4

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-40504-7

  • Online ISBN: 978-3-540-45065-8

  • eBook Packages: Springer Book Archive
