3. TECHNIQUES, PRACTICES, METHODOLOGIES, AND APPLICATIONS

Explanation is an important function in symbolic artificial intelligence. It is used in machine learning, in case-based reasoning and, most importantly, in explaining the results of a reasoning process. Experience with expert systems has shown that the ability to generate explanations is important for user acceptance of AI systems.

3.1. USER INTERFACE AND EXPLANATION

One of the main challenges in the development of user interfaces to expert systems is to provide both the end user and the knowledge engineer with means for applying the knowledge in the system in different ways. For example, the same medical knowledge base may be consulted by a physician to solve a diagnostic problem, browsed to determine which findings are typical of a given disease, or even used to instruct a medical student by explaining why the disease he or she suspects in a patient does or does not fit the available patient data. Such different ways of exploiting a knowledge base clearly require different user interfaces.

Expert systems capable of adapting their behavior to the user are usually associated with a user model. The goals and roles of user behavior modeling are the major factors that determine how user behavior should be modeled, how the information is to be acquired and presented, and how the resultant model is to be used. The application of user models in expert systems is a subject of ongoing research; however, there appears to be a substantial gap, as well as conceptual discrepancies, between HCI and AI in terms of user modeling (McTear, 1993).

There are various possible dialog forms for expert system users. A user-initiated dialog is one in which the interaction is always initiated by the user, whereas a computer-initiated dialog is one in which the computer asks the user questions.
Most current expert systems have a dialog form lying somewhere between these two extremes, and the initiative can be switched between the computer and the user. Such a system is said to support a mixed-initiative dialog. Expert systems developed primarily for inexperienced users usually take the computer-initiated form, while systems for experienced users generally give the user more control over the discourse.

An important aspect of the interaction between user and expert system is the explanation of the line of reasoning undertaken by the system during a specific consultation. A clear and understandable explanation can be a valuable means for justifying the recommendations of the expert system, for indicating its limitations to the user, and for instructing users about the problem domain covered by the system. Designing an expert system that can provide understandable and helpful explanations involves issues such as the level of detail of the information presented to the user, the structuring of that information, and the distinction between various types of knowledge.

Explanation is a very complicated form of human communication that is not well understood. Most conventional expert systems provide a form of explanation limited to a description of the reasoning steps that were undertaken in confirming or rejecting a set of hypotheses, and they are usually unable to adapt their behavior to the user's experience in the problem domain. Hendler (1988) remains a main source of information on the development of user interfaces for expert systems.

3.2. DIALOG MODELS

More directly relevant to user interface design for expert systems is the view of explanations as communications, or dialogs, between users and expert systems. Human verbal explanations, for example, are essentially interactive.
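The reasoning-trace style of explanation described above can be sketched in a few lines. The rules, fact names, and helper functions below are hypothetical, not drawn from any particular system; the sketch only illustrates the general idea that a forward-chaining engine records which rules fired, and answers a HOW question by replaying that record.

```python
# Hypothetical rule base: each rule maps premise facts to a conclusion.
RULES = [
    {"id": "R1", "if": ["fever", "stiff_neck"], "then": "possible_meningitis"},
    {"id": "R2", "if": ["possible_meningitis", "rash"], "then": "suspect_bacterial"},
]

def forward_chain(initial_facts):
    """Fire rules until no new conclusions appear; keep a trace for explanation."""
    facts = set(initial_facts)
    trace = []  # (rule_id, premises, conclusion), in firing order
    changed = True
    while changed:
        changed = False
        for rule in RULES:
            if rule["then"] not in facts and all(p in facts for p in rule["if"]):
                facts.add(rule["then"])
                trace.append((rule["id"], rule["if"], rule["then"]))
                changed = True
    return facts, trace

def explain_how(conclusion, trace):
    """Answer a HOW question by replaying the steps that led to `conclusion`."""
    lines = []
    for rule_id, premises, concl in trace:
        if concl == conclusion:
            lines.append(f"{concl} was concluded by {rule_id} from: "
                         + ", ".join(premises))
            for p in premises:  # recurse into premises that were themselves derived
                lines.extend(explain_how(p, trace))
    return lines

facts, trace = forward_chain(["fever", "stiff_neck", "rash"])
for line in explain_how("suspect_bacterial", trace):
    print(line)
```

Note that this kind of explanation is exactly what the text criticizes: it describes which rules fired, but it cannot adapt the wording or the level of detail to the user's experience in the domain.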
If someone is giving a complex explanation, the listener is given the opportunity to indicate whether he or she is following as the explanation proceeds, and if necessary to interrupt with clarification questions. These interactions allow the speaker both to clear up the listener's immediate difficulties as they arise and to update assumptions about the listener's level of understanding. A better model of the listener's level of understanding in turn allows the speaker to continue the explanation in a more appropriate manner, lessening the risk of continuing confusion.

Despite the apparent importance of such interaction, most existing explanation and text generation systems fail to allow for it. Although some systems allow follow-up questions at the end of an explanation, they assume that a complete explanation has been planned and generated before such interactions are allowed. For complex explanations, however, interactions with the user should take place as the explanation progresses and should influence how that explanation continues. Cawsey (1993) described the EDGE system, which is able to plan complex, extended explanations that allow such interactions with the user. The system can update assumptions about the user's knowledge on the basis of these interactions and uses this information to influence the detailed further planning of the explanation. When the user appears confused, the system can attempt to fill in missing knowledge or to explain things another way.

In recent years the emphasis in natural language understanding research has shifted from studying mechanisms for understanding isolated utterances to developing strategies for interpreting sentences within the context of a discourse or an extended dialog. A very fruitful approach to this problem derives from a view of human behavior as goal-directed and of understanding as explanation-based. According to this view, people perform actions and communicate to advance their goals, and language understanding therefore involves recognizing and reasoning about the goals and plans of others.
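The interactive explanation strategy described above can be illustrated with a minimal sketch. This is not the actual EDGE implementation; the explanation plan, the terse/detailed wordings, and the simulated yes/no replies are all invented for illustration. The point is only the control structure: the explainer pauses between steps, checks whether the listener is following, and updates a simple user model that governs how much detail later steps receive.

```python
# Hypothetical explanation plan: each step carries a terse and a detailed form.
PLAN = [
    ("rule",      "R1 links fever and stiff neck to possible meningitis.",
                  "R1: IF fever AND stiff neck THEN possible meningitis. "
                  "Stiff neck suggests meningeal irritation."),
    ("inference", "R2 then raises suspicion of a bacterial cause.",
                  "R2: IF possible meningitis AND rash THEN suspect bacterial. "
                  "A petechial rash is typical of meningococcal infection."),
]

def explain(plan, answers):
    """Deliver the plan step by step. `answers` simulates the user's yes/no
    replies to 'Do you follow?' after each step. A 'no' switches the user
    model to novice, so the confusing step is restated and all subsequent
    steps use the detailed wording."""
    user_is_novice = False
    transcript = []
    for (topic, terse, detailed), followed in zip(plan, answers):
        transcript.append(detailed if user_is_novice else terse)
        if not followed:            # confusion detected: adapt the user model
            user_is_novice = True
            transcript.append("Let me restate that in more detail: " + detailed)
    return transcript

# Usage: the user gets lost on the first step, so the second is detailed.
out = explain(PLAN, answers=[False, True])
```

The design choice to interleave the follow-up check with generation, rather than waiting until the whole explanation is finished, is what distinguishes this style from the conventional systems criticized earlier in the section.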