Explanatory dialogues
- 1 April 1989
- journal article
- Published by Oxford University Press (OUP) in Interacting with Computers
- Vol. 1 (1), 69-92
- https://doi.org/10.1016/0953-5438(89)90008-8
Abstract
Explanations are important in many areas of human-computer interaction. In help systems, tutoring systems and within expert systems, lengthy explanations of some topic or justifications of some reasoning process may be required. If a long explanation is given, there is a good chance that at some point the user will ‘lose track’ and fail to grasp the main content of the explanation. There has therefore been recent emphasis on generating explanations and textual descriptions that are tailored to the knowledge and goals of the particular user. However, there is no guarantee that such a model will be accurate. By allowing interactions with the user within the explanation, the accuracy of the model is no longer crucial: if users are confused in the middle of an explanation they can interrupt and seek clarification, and the system may provide explicit checks on the user's understanding. This paper therefore presents an approach to explanation generation based on the assumption that explanations must both use and track a model of what the user knows, and also involve interactions with the user. The framework is based on sociolinguistic studies of human-human interaction as well as artificial intelligence work on explanation, text planning, tutoring and user modelling. It has been implemented and used for generating tutorial explanatory dialogues in electronics.