The design of artificial agents that autonomously decide what to do in order to fulfill their goals has become an important problem at the intersection of applied logic, computer science, game theory and artificial intelligence. This problem is a rephrasing in contemporary terms of older questions on practical reasoning and rationality, with an added emphasis on the need for social abilities in multi-agent systems.
The topics of this thesis are logics and methods for planning. Historically, these two areas independently introduced the same agency-related concepts (actions, time, causality, observations, beliefs, non-determinism, intention), but they developed completely different tools and models for them in order to solve the tasks of inference and plan search (e.g. logical calculi and search methods).
As a consequence, it is difficult to obtain formal theories for practical reasoning that combine the strengths of the two fields. For example, the tools used in planning do not easily generalize to planning problems that demand sophisticated reasoning tasks. In turn, logics have experienced considerable difficulties in accomplishing tasks that plan search algorithms routinely perform. Combining the strengths of the two areas would greatly extend the class of scenarios that can be addressed by self-motivated artificial agents.
The aim of this thesis is the study of methods for extending a logic into a planning system with the expressivity of that logic, and the use of planning techniques to achieve practical goals in a logically sound way. To this end, two particular logics are considered: temporal defeasible logic programming and dynamic epistemic logics. These logics are oriented, respectively, towards the physical or causal aspects of actions and towards their epistemic and social dimensions. Although we focus on these particular logics, the proposed techniques appear to generalize to other logics with similar characteristics.