102 (1) 2010


  1. Concurrency Specification and Programming (CS&P).  Preface i-i
  2. Time-independent Liveness in Time Petri Nets
    Jörg Peter Bachmann,  Louchka Popova-Zeugmann 1-17
    In this paper we consider a class of Time Petri nets defined by structural restrictions. Each Time Petri net belonging to this class has the property that its liveness behaviour does not depend on time. Therefore, such a Time Petri net is live when its skeleton is live.
  3. A Notion of Biological Diagnosability Inspired by the Notion of Opacity in Systems Security
    Roberto Barbuti,  Andrea Maggiolo-Schettini,  Paolo Milazzo, Damas P. Gruska 19-34
    A formal model for diagnostics of biological systems modelled as P systems is presented. We assume the presence of some biologically motivated changes (frequently pathological) in the system's behaviour and investigate when these changes can be diagnosed by an external observer by exploiting techniques originally developed for reasoning about system security.
  4. Update of Probabilistic Beliefs: Implementation and Parametric Verification
    Katarzyna Budzyńska,  Magdalena Kacprzak,  Paweł Rembelski 35-48
    The aim of the paper is to propose how to enrich a formal model of persuasion with a specification of actions typical of the persuasion process. First, since these actions are verbal, they influence a receiver but do not change the agent's environment. In a formal framework, we represent them as actions that change not a particular state of a model, but the whole model. Second, the effects of those actions depend on how much the receiver trusts the persuader. To formally model this phenomenon, we use a trust function. Finally, we want to represent uncertainty in terms of probability. Thus far, our model did not allow us to express those properties of the persuasion process. Therefore, in this paper we extend the Multimodal Logic of Actions and Graded Beliefs (AGn) with Probabilistic Dynamic Epistemic Logic (PDEL) and elements of the Reputation Management framework (RM). Incorporating PDEL into the model of persuasion requires some modifications of PDEL. The extended model is then used to enrich Perseus, our software tool for examining persuasive multi-agent systems. New components of the tool allow us to perform parametric verification of different properties related to updating probabilistic beliefs in persuasion.
  5. Agent Oriented Techniques for Programming Autonomous Robots
    Hans-Dieter Burkhard 49-62
    Robots and agents both interact autonomously with their environment. It is even argued that robots can be regarded as embodied agents. This paper discusses the use of Agent Oriented Techniques for controlling robots, their benefits and their limitations. Basic skills need much more emphasis for robots than the related skills of agents. Algorithms for perception in the real world are a key problem for robot programming, with many (still unsolved) challenges. More deliberative tasks like planning can benefit from techniques developed for agents. Since real environments are highly dynamic, classical planning techniques are not flexible enough, but there exist more sophisticated agent techniques which are useful for robotics as well.
  6. Process Algebra Contexts and Security Properties
    Damas P. Gruska 63-76
    A general framework for defining security properties is presented. It allows us to model many traditional security properties as well as to define new ones. The framework is based on process algebra contexts and process relations. By an appropriate choice of both, we can also model probabilistic and quantified security properties.
  7. Model Checking Optimisation Based Congestion Control Algorithms
    Alessio Lomuscio,  Ben Strulo,  Nigel G. Walker,  Peng Wu 77-96
    Model checking has been widely applied to the verification of network protocols. Alternatively, optimisation based approaches have been proposed to reason about the large scale dynamics of networks, particularly with regard to congestion and rate control protocols such as TCP. This paper intends to provide a first bridge and explore synergies between these two approaches. We consider a series of discrete approximations to the optimisation based congestion control algorithms. Then we use branching time temporal logic to specify formally the convergence criteria for the system dynamics and present results from implementing these algorithms on a state-of-the-art model checker. We report on our experiences in using the abstraction of model checking to capture features of the continuous dynamics typical of optimisation based approaches.
  8. Checking Consistency of an ABox w.r.t. Global Assumptions in PDL
    Linh Anh Nguyen,  Andrzej Szałas 97-113
    We reformulate Pratt's tableau decision procedure for checking satisfiability of a set of formulas in PDL. Our formulation is simpler and its implementation is more direct. Extending the method, we give the first EXPTIME (optimal) tableau decision procedure, not based on transformation, for checking consistency of an ABox w.r.t. a TBox in PDL (here, PDL is treated as a description logic). We also prove a new result that the data complexity of the instance checking problem in PDL is coNP-complete.
  9. SAT as a Programming Environment for Linear Algebra
    Marian Srebrny, Lidia Stępień 115-127
    In this paper we pursue the propositional calculus and SATisfiability solvers as a powerful declarative programming environment that makes it possible to create and run propositional declarative programs for computational tasks in various areas of mathematics. We report some experimental results on our application of the propositional SATisfiability environment to computing some simple orthogonal matrices and the orders of some orthogonal groups. Some encouraging (and some not very encouraging) experiments are reported for the proposed propositional search procedures using off-the-shelf general-purpose SAT solvers. Our new software toolkit SAT4Alg is announced.
  10. A Categorical View of Timed Behaviours
    Irina Virbitskaite,  Natalya Gribovskaya,  Eike Best 129-143
    Timed transition systems are a widely studied model for real-time systems. The intention of the paper is to show how several categorical (open maps, path-bisimilarity and coalgebraic) approaches to an abstract characterization of bisimulation relate to each other, and to the numerous suggested behavioural equivalences of the linear time - branching time spectrum, in the setting of timed transition systems.

102 (2) 2010


  1. Dependently Typed Programming.  Preface i-i
  2. Correct-by-Construction Concurrency: Using Dependent Types to Verify Implementations of Effectful Resource Usage Protocols
    Edwin Brady,  Kevin Hammond 145-176
    In the modern, multi-threaded, multi-core programming environment, correctly managing system resources, including locks and shared variables, can be especially difficult and error-prone. A simple mistake, such as forgetting to release a lock, can have major consequences for the correct operation of a program, for example by inducing deadlock, often at a time and location isolated from the original error. In this paper, we propose a new type-based approach to resource management, based on the use of dependent types to construct a Domain-Specific Embedded Language (DSEL) whose typing rules directly enforce the formal program properties that we require. In this way, we ensure strong static guarantees of correctness-by-construction, without requiring the development of a new special-purpose type system or the associated special-purpose soundness proofs. We also reduce the need for "over-serialisation", the overly-conservative use of locks that often occurs in manually constructed software, where formal guarantees cannot be exploited. We illustrate our approach by implementing a DSEL for concurrent programming and demonstrate its applicability with reference to an example based on simple bank account transactions.
  3. A Tutorial Implementation of a Dependently Typed Lambda Calculus
    Andres Löh,  Conor McBride,  Wouter Swierstra 177-207
    We present the type rules for a dependently typed core calculus together with a straightforward implementation in Haskell. We explicitly highlight the changes necessary to shift from a simply-typed lambda calculus to the dependently typed lambda calculus. We also describe how to extend our core language with data types and write several small example programs. The article is accompanied by an executable interpreter and example code that allows immediate experimentation with the system we describe.
  4. Automation for Dependently Typed Functional Programming
    Sean Wilson,  Jacques Fleuriot,  Alan Smaill 209-228
    Writing dependently typed functional programs that capture non-trivial program properties is difficult in current systems due to the lack of proof automation. We identify proof patterns that occur when programming with dependent types and detail how automating such patterns allows us to work more comfortably with types that capture, for example, membership, ordering and non-linear arithmetic properties. We describe the role of the rippling heuristic, both for inductive and non-inductive proofs, and of generalisation in providing such automation. We then discuss an implementation of our ideas in Coq with practical examples of dependently typed programs that capture useful program properties and can be verified automatically. We demonstrate that our proof automation is generic in that it can provide support for working with theorems involving user-defined functions and inductive data types.

102 (3-4) 2010


  1. RCRA 2008 - Experimental Evaluation of Algorithms for Solving Problems with Combinatorial Explosion.  Preface i-ii
  2. Evaluating and Improving Modern Variable and Revision Ordering Strategies in CSPs
    Thanasis Balafoutis,  Kostas Stergiou 229-261
    A key factor that can dramatically reduce the search space during constraint solving is the criterion under which the variable to be instantiated next is selected. Numerous heuristics have been proposed for this purpose. Some of the best such heuristics exploit information about failures gathered throughout search and recorded in the form of constraint weights, while others measure the importance of variable assignments in reducing the search space. In this work we experimentally evaluate the most recent and powerful variable ordering heuristics, and new variants of them, over a wide range of benchmarks. Results demonstrate that heuristics based on failures are in general more efficient. Based on this, we then derive new revision ordering heuristics that exploit recorded failures to efficiently order the propagation list when arc consistency is maintained during search. Interestingly, in addition to reducing the number of constraint checks and list operations, these heuristics are also able to cut down the size of the explored search tree.
  3. Cluster Tree Elimination for Distributed Constraint Optimization with Quality Guarantees
    Ismel Brito,  Pedro Meseguer 263-286
    Some distributed constraint optimization algorithms use a number of messages that is linear in the number of agents, but of exponential size. This is often the main limitation for their practical applicability. Here we present some distributed algorithms for these problems when the agents are arranged in a tree. The exact algorithm, DCTE, computes the optimal solution but requires messages of size exp(s), where s is a structural parameter. Its approximate version, DMCTE(r), requires smaller messages of size exp(r), r < s, at the cost of computing approximate solutions. It provides a cost interval that bounds the error of the approximation. Using the technique of cost function filtering, we obtain DMCTEf(r). Combining cost function filtering with bound reasoning, we propose DIMCTEf, an algorithm based on repeated executions of DMCTEf(r) with increasing r. DIMCTEf uses messages of previous iterations to decrease the size of messages in the current iteration, which helps to alleviate their large size. We provide evidence of the benefits of our approach on two benchmarks.
  4. Efficient Plan Adaptation through Replanning Windows and Heuristic Goals
    Alfonso E. Gerevini,  Ivan Serina 287-323
    Fast plan adaptation is important in many AI applications. From a theoretical point of view, in the worst case adapting an existing plan to solve a new problem is no more efficient than a complete regeneration of the plan. However, in practice plan adaptation can be much more efficient than plan generation, especially when the adapted plan can be obtained by performing a limited amount of changes to the original plan. In this paper, we investigate a domain-independent method for plan adaptation that modifies the original plan by replanning within limited temporal windows containing portions of the plan that need to be revised. Each window is associated with a particular replanning subproblem that contains some "heuristic goals" facilitating the plan adaptation, and that can be solved using different planning methods. An experimental analysis shows that, in practice, adapting a given plan for solving a new problem using our techniques can be much more efficient than replanning from scratch.
  5. Abductive Logic Programming as an Effective Technology for the Static Verification of Declarative Business Processes
    Marco Montali, Paolo Torroni,  Federico Chesani,  Paola Mello,   Marco Alberti,  Evelina Lamma 325-361
    We discuss the static verification of declarative Business Processes. We identify four desiderata for verifiers and propose a concrete framework which satisfies them. The framework is based on the ConDec graphical notation for modeling Business Processes, and on Abductive Logic Programming technology for the verification of properties. Empirical evidence suggests that our verification method performs and scales better, in most cases, than other state-of-the-art techniques (model checkers, in particular). A detailed study of our framework's theoretical properties proves that our approach is sound and complete when applied to ConDec models that do not contain loops, and that it is guaranteed to terminate when applied to models that contain loops.
  6. Combinatorial Optimization Solutions for the Maximum Quartet Consistency Problem
    António Morgado,  Joao Marques-Silva 363-389
    Phylogenetic analysis is a widely used technique, for example in biology and biomedical sciences. The construction of phylogenies can be computationally hard. A commonly used approach to constructing phylogenies is to start from a set of biological species and relations among those species. This work addresses the case where the relations among species are specified as quartet topologies. Moreover, the problem to be solved consists of computing a phylogeny that satisfies the maximum number of quartet topologies. This is referred to as the Maximum Quartet Consistency (MQC) problem, and it is an NP-hard optimization problem. MQC has been solved both heuristically and exactly. Exact solutions for MQC include those based on Constraint Programming, Answer Set Programming, Pseudo-Boolean Optimization (PBO), and Satisfiability Modulo Theories (SMT). This paper provides a comprehensive overview of the use of PBO and SMT for solving MQC, and builds on recent work in this area. Moreover, the paper provides new insights on how to use SMT for solving optimization problems, by focusing on the concrete case of MQC. The solutions based on PBO and SMT were experimentally compared with other exact solutions. The results show that for instances with a small percentage of quartet errors, the models based on SMT can be competitive, whereas for instances with a higher number of quartet errors the PBO models are more efficient.
  7. An Empirical Study of QBF Encodings: from Treewidth Estimation to Useful Preprocessing
    Luca Pulina,  Armando Tacchella 391-427
    From an empirical point of view, the hardness of quantified Boolean formulas (QBFs) can be characterized by the (in)ability of current state-of-the-art QBF solvers to decide the truth of formulas given limited computational resources. In this paper, we start from the problem of computing empirical hardness markers, i.e., features that can discriminate between hard and easy QBFs, and we end up showing that such markers can be useful to improve our understanding of QBF preprocessors. In particular, considering the connection between classes of tractable QBFs and the treewidth of associated graphs, we show that (an approximation of) treewidth is indeed a marker of empirical hardness, and that it is the only parameter which succeeds consistently in being so, even considering several other purely syntactic candidates which have been successfully employed to characterize QBFs in other contexts. We also show that treewidth approximations can be useful to describe the effect of QBF preprocessors, in that some QBF solvers benefit from a preprocessing phase when it reduces the treewidth of their input. Our experiments suggest that structural simplifications reducing treewidth are a potential enabler for the solution of hard QBF encodings.
  8. SLGAD Resolution for Inference on Logic Programs with Annotated Disjunctions
    Fabrizio Riguzzi 429-466
    Logic Programs with Annotated Disjunctions (LPADs) allow probabilistic information to be expressed in logic programming. The semantics of an LPAD is given in terms of the well-founded models of the normal logic programs obtained by selecting one disjunct from each ground LPAD clause.
    Inference on LPADs can be performed using either the system Ailog2, which was developed for the Independent Choice Logic, or SLDNFAD, an algorithm based on SLDNF. However, both of these algorithms run the risk of going into infinite loops and of performing redundant computations.
    In order to avoid these problems, we present SLGAD resolution, which computes the (conditional) probability of a ground query from a range-restricted LPAD and is based on SLG resolution for normal logic programs. Like SLG, it uses tabling to avoid some infinite loops and redundant computations.
    The performance of SLGAD is evaluated on classical benchmarks for normal logic programs under the well-founded semantics, namely a 2-person game and the ancestor relation, and on games of dice.
    SLGAD is compared with Ailog2 and SLDNFAD on the problems in which they do not go into infinite loops, namely those that are described by a modularly acyclic program.
    The results show that SLGAD is sometimes slower than Ailog2 and SLDNFAD but, if the program requires repeated computation of the same goals, as in the dice games, then SLGAD is faster than both.
  9. A SAT-based Method for Solving the Two-dimensional Strip Packing Problem
    Takehide Soh, Katsumi Inoue,  Naoyuki Tamura,  Mutsunori Banbara,  Hidetomo Nabeshima 467-487
    We propose a satisfiability testing (SAT) based exact approach for solving the two-dimensional strip packing problem (2SPP). In this problem, we are given a set of rectangles and one large rectangle called a strip. The goal is to pack all rectangles into the strip without overlap, minimizing the overall height of the packing. Although the 2SPP has been studied in Operations Research, some instances are still hard to solve. Our method solves the 2SPP by translating it into a SAT problem through a SAT encoding called the order encoding. The translated SAT problems tend to be large; thus, we apply several techniques to reduce the search space using symmetry breaking and positional relations of rectangles. To solve a 2SPP, that is, to compute the minimum height of a 2SPP, we need to repeatedly solve similar SAT problems. We therefore reuse learned clauses and assumptions from previously solved SAT problems. To evaluate our approach, we obtained results for 38 instances from the literature and made comparisons with a constraint satisfaction solver and an ad-hoc 2SPP solver.