Abstract
Testing of rule-based expert systems has become a high priority for many organizations as the use of such systems proliferates. Traditional software testing techniques apply to some components of rule-based systems, e.g., the inference engine. However, structurally testing the rule base component requires new techniques or adaptations of existing ones. This paper describes one such adaptation: an extension of data flow path selection in which a graphical representation of a rule base is defined and evaluated. This graphical form, called a logical path graph, captures logical paths through a rule base. These logical paths create precisely the abstractions needed in the testing process. An algorithm for the construction of logical path graphs is analyzed.
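To make the idea concrete, here is a minimal sketch, not the paper's algorithm, of one way a logical-path-graph-style structure can be approximated: link rules whose consequents enable other rules' antecedents, then enumerate rule-chaining paths as candidate test abstractions. The rule names (`R1`–`R4`) and fact symbols are hypothetical.

```python
# Hedged sketch: approximate a logical path graph over a toy rule base.
# Each rule maps a set of antecedent facts to a set of consequent facts.
RULES = {
    "R1": ({"a"}, {"b"}),
    "R2": ({"b"}, {"c"}),
    "R3": ({"b", "x"}, {"d"}),
    "R4": ({"c"}, {"e"}),
}

def build_enablement_graph(rules):
    """Add edge Ri -> Rj when a consequent of Ri appears in Rj's antecedent."""
    graph = {r: [] for r in rules}
    for ri, (_, cons_i) in rules.items():
        for rj, (ante_j, _) in rules.items():
            if ri != rj and cons_i & ante_j:
                graph[ri].append(rj)
    return graph

def logical_paths(graph, start):
    """Enumerate simple rule-firing chains from a starting rule via DFS."""
    paths = []
    def dfs(node, path):
        succs = [s for s in graph[node] if s not in path]
        if not succs:
            paths.append(path)          # maximal chain: record it
        for s in succs:
            dfs(s, path + [s])
        return paths
    return dfs(start, [start])

if __name__ == "__main__":
    g = build_enablement_graph(RULES)
    print(logical_paths(g, "R1"))       # prints [['R1', 'R2', 'R4'], ['R1', 'R3']]
```

Each enumerated chain plays the role of one "logical path" that a structural test case would exercise; a path-selection criterion would then pick which chains to cover.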
Index Terms
- Structural testing of rule-based expert systems