Submitted: March 2018
Abstract
Markov decision processes are of major interest in the planning community as well as in the model checking community. The latter focuses on distributed and concurrent systems, as they appear in real-time and cyber-physical control engineering, which are often modeled as networks of extended finite state machines (automata) running concurrently. Powerful analysis algorithms and tools have been developed by both communities. This thesis bridges these two areas. We present a translation from Jani, an overarching input language for quantitative model checkers including the Modest and Prism toolsets, into the Probabilistic Planning Domain Definition Language (PPDDL), as it is used in many planning systems, including Fast Downward. The translation makes it possible to treat existing case studies from the model checking domain as inputs to planning systems, thereby enlarging the benchmark base for planners considerably. It also provides a way to cross-compare the efficiency and efficacy of algorithms developed in the planning community with those developed in the model checking community, so as to pinpoint to what extent their respective strengths are linked to domain-specific modeling features. Concretely, we study the existing algorithms in the Modest toolset and in Fast Downward on a selection of examples. We present suggestions for improvements on both sides and enhance the Modest toolset with a new heuristic search algorithm, FRET-π LRTDP, which performs very well in Fast Downward. In addition, we report analysis results for several case studies using this algorithm. Our findings open a new area of joint research between these two communities, in which heuristic search in model checking tools can exploit both the expressiveness of their input languages and the strong performance of algorithms developed in the planning community.