Model-based contrastive explanations for explainable planning

Benjamin Krarup, Michael Cashmore, Daniele Magazzeni, Tim Miller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

An important type of question that arises in Explainable Planning is a contrastive question, of the form “Why action A instead of action B?”. These kinds of questions can be answered with a contrastive explanation that compares properties of the original plan containing A against the contrastive plan containing B. An effective explanation of this type serves to highlight the differences between the decisions that have been made by the planner and what the user would expect, as well as to provide further insight into the model and the planning process. Producing this kind of explanation requires the generation of the contrastive plan. This paper introduces domain-independent compilations of user questions into constraints. These constraints are added to the planning model, so that a solution to the new model represents the contrastive plan. We introduce a formal description of the compilation from user question to constraints in a temporal and numeric PDDL2.1 planning setting.
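
To make the idea of the compilation concrete, the following is a minimal sketch, not the paper's actual compilation: it shows how a question "Why action A rather than action B?" might be turned into constraints on a simplified, non-temporal model. All names here (PlanningModel, compile_why_a_not_b, the constraint syntax) are hypothetical; the paper itself defines the compilation formally for temporal and numeric PDDL2.1 models.

# Sketch only: compile a contrastive question into constraints on a planning model.
# Assumptions (not from the paper): a simplified dict-based model and hypothetical
# helper names; the paper targets temporal and numeric PDDL2.1.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class PlanningModel:
    actions: Dict[str, dict]                              # action name -> schema
    constraints: List[str] = field(default_factory=list)  # constraints imposed by user questions

def compile_why_a_not_b(model: PlanningModel, action_a: str, action_b: str) -> PlanningModel:
    """Compile "Why action A rather than action B?" into a new, constrained model.

    The new model forbids A and requires B, so any solution to it is a
    contrastive plan that can be compared against the original plan.
    """
    constrained = PlanningModel(actions=dict(model.actions),
                                constraints=list(model.constraints))
    constrained.constraints.append(f"(not (occurs {action_a}))")       # A must never be applied
    constrained.constraints.append(f"(sometime (occurs {action_b}))")  # B must appear in the plan
    return constrained

# Usage: solve the constrained model with any planner, then contrast the two plans.
original = PlanningModel(actions={"load": {}, "fly": {}, "drive": {}})
contrastive_model = compile_why_a_not_b(original, action_a="fly", action_b="drive")
print(contrastive_model.constraints)
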
Original language: English
Title of host publication: ICAPS 2019 Workshop on Explainable AI Planning (XAIP)
Place of publication: Menlo Park, CA, United States
Number of pages: 9
Publication status: Published - 15 Jul 2019
Event: 29th International Conference on Automated Planning and Scheduling - ICAPS 2019 - Berkeley, United States
Duration: 11 Jul 2019 – 15 Jul 2019

Conference

Conference: 29th International Conference on Automated Planning and Scheduling - ICAPS 2019
Abbreviated title: ICAPS 2019
Country/Territory: United States
City: Berkeley
Period: 11/07/19 – 15/07/19

Keywords

  • contrastive explanation
  • domain-independent compilations
