DOI: 10.1145/1066677.1066787

Article

DrC4.5: Improving C4.5 by means of prior knowledge

Published: 13 March 2005

ABSTRACT

Classification is one of the most useful techniques for extracting meaningful knowledge from databases. Classifiers, e.g. decision trees, are usually induced from a table of records, each of which represents an example. However, in real applications there is often additional knowledge, e.g. held by domain experts, that can usefully complement the knowledge hidden in the examples. As a concrete example of this kind of knowledge we consider causal dependencies among the attributes of the data records. In this paper we discuss how to use such knowledge to improve the construction of classifiers. The causal dependencies are represented via Bayesian Causal Maps (BCMs), and our method is implemented as an adaptation of the well-known C4.5 algorithm.
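The abstract does not specify how the BCM enters the tree induction, so the following is only an illustrative sketch of one plausible scheme: a C4.5-style split selection in which attributes that the causal map marks as direct causes of the class receive a weighted information gain. All names here (`select_attribute`, `causal_parents`, the `boost` factor) are assumptions for illustration, not the paper's actual method.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, attr, target):
    """Information gain of splitting `rows` (list of dicts) on `attr`."""
    base = entropy([r[target] for r in rows])
    n = len(rows)
    for v in {r[attr] for r in rows}:
        subset = [r[target] for r in rows if r[attr] == v]
        base -= len(subset) / n * entropy(subset)
    return base

def select_attribute(rows, attrs, target, causal_parents, boost=2.0):
    """Pick the split attribute, multiplying the gain of attributes that
    the causal map lists as direct causes of the class by `boost`.
    (Hypothetical integration point; the paper's own weighting may differ.)"""
    def score(a):
        g = info_gain(rows, a, target)
        return g * boost if a in causal_parents else g
    return max(attrs, key=score)
```

With two equally informative attributes, passing `causal_parents={'y'}` breaks the tie in favour of the causally preferred attribute, which is the kind of bias prior knowledge could inject into C4.5's greedy selection.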


Published in:

SAC '05: Proceedings of the 2005 ACM Symposium on Applied Computing
March 2005
1814 pages
ISBN: 1581139640
DOI: 10.1145/1066677

    Copyright © 2005 ACM


Publisher: Association for Computing Machinery, New York, NY, United States



Acceptance Rates

Overall acceptance rate: 1,650 of 6,669 submissions, 25%
