## Abstracts of *Fundamenta Informaticae*
Volume 28.

**Number 1-2**

**J. J. Alferes, L. M. Pereira, T. C. Przymusinski**
*Belief Revision in Non-Monotonic Reasoning and Logic Programming.*
pages 1-22
In order to be able to reason explicitly about beliefs, we introduced
a non-monotonic formalism, called the Autoepistemic Logic of Beliefs, *AEB*,
obtained by augmenting classical propositional logic with a belief operator,
*B*. For this language we defined the static autoepistemic expansions
semantics. The resulting non-monotonic knowledge representation framework
turned out to be rather simple and yet quite powerful. Moreover, it has
some very natural properties which sharply contrast with those of Moore's
*AEL*.
While static expansions seem to provide a natural and intuitive semantics
for many belief theories, and, in particular, for all affirmative belief
theories (which include the class of all normal and disjunctive logic programs),
they often can lead to inconsistent expansions for theories in which (subjective)
beliefs clash with the known (objective) information or with some other
beliefs. In particular, this applies to belief theories (and to logic programs)
with strong or explicit negation.
In this paper we generalize *AEB* to avoid the acceptance of inconsistency-provoking
beliefs. We show how such *AEB* theories can be revised
to prevent belief-originated inconsistencies, introduce declarative
language-level control over the revision level of beliefs, and apply it
to the domains of diagnosis and declarative debugging. Our *AEB*
framework is general enough to capture and justify the methods that have
been deployed to solve similar revision problems within the logic programming
paradigm.

**K. R. Apt, R. Ben-Eliyahu**
*Meta-variables in Logic Programming, or in Praise of Ambivalent
Syntax.*
pages 23-36
We show here that meta-variables of Prolog admit a simple declarative
interpretation. This allows us to extend the usual theory of SLD-resolution
to the case of logic programs with meta-variables, and to establish soundness
and strong completeness of the corresponding extension of the SLD-resolution.
The key idea is the use of ambivalent syntax which allows us to use the
same symbols as function and relation symbols.
We also study the problem of absence of run-time errors in the presence
of meta-variables. We prove that this problem is undecidable. However,
we also provide some sufficient and polynomial-time-decidable conditions
which imply absence of run-time errors.

**H. A. Blair**
*Designing Dependencies.*
pages 37-54
Given a binary recursively enumerable relation *R*, one or more
logic programs over a language **L** can be constructed and interconnected
to produce a dependency relation *D* on selected predicates within
the Herbrand base **B**_{**L**} isomorphic to *R*.
Optionally, *D* can be a positive, negative, or mixed dependency relation.
The construction is applied to representing any effective game of the type
introduced by Gurevich and Harrington, which they used to prove Rabin's
decision method for S2S, as the dependency relation of a logic program.
We allow games over an infinite alphabet of possible moves. We use this
representation to reveal a common underlying reason, having to do with
the shape of a program's dependency relation, for the complexity of several
logic program properties.

**A. Blikle**
*Why Denotational? Remarks on Applied Denotational Semantics.*
pages 55-85
This is an essay in which the author expresses his views on applied denotational
semantics. In the author's opinion, whether or not a software system has
a sufficiently abstract denotational semantics should be regarded
as a pragmatic attribute of the system rather than merely as a mathematical
attribute of its description. In a software system with denotational semantics,
structured programming is feasible, and for such systems there is a routine
method of developing a program-correctness logic. All that may not be the
case if denotationality is not ensured. On the other hand, a non-denotational
semantics can always be made denotational artificially, at the expense of
lowering its level of abstraction. This leads to an important pragmatic
question: to what extent, and in which situations, can we sacrifice the denotationality
and/or abstraction of a semantics? All discussions are carried out on algebraic
ground, but the paper is not very technical and contains a short introduction
to the algebraic theory of denotational semantics.

**J. Dix, G. Gottlob, W. Marek**
*Reducing Disjunctive to Non-Disjunctive Semantics by Shift-Operations.*
pages 87-100
It is well known that Minker's semantics GCWA for positive disjunctive
programs *P* (i.e., deciding whether a literal is true in all minimal models
of *P*) is Π^P_2-complete. This is in contrast to the same entailment
problem for semantics of non-disjunctive programs such as STABLE and SUPPORTED
(both are co-NP-complete), as well as M^{supp}_P and WFS (which are even
polynomial).
Recently, the idea of reducing disjunctive to non-disjunctive programs
by using so-called *shift-operations* was introduced independently
by *Bonatti*, *Dix/Gottlob/Marek*, and *Schaerf*. In fact,
Schaerf associated with each semantics SEM for normal programs a corresponding
semantics Weak-SEM for disjunctive programs and asked for the properties
of these weak semantics, in particular for the complexity of their entailment
relations. While Schaerf concentrated on Weak-STABLE and Weak-SUPPORTED,
we investigate the weak versions of Apt/Blair/Walker's stratified semantics
M^{supp}_P and of Van Gelder/Ross/Schlipf's well-founded semantics WFS.
We show that credulous entailment for both semantics is NP-complete
(consequently, sceptical entailment is co-NP-complete). Thus, unlike GCWA,
the complexity of these semantics belongs to the *first* level of
the polynomial hierarchy. Note that, unlike Weak-WFS, the semantics Weak-M^{supp}_P
is not always defined: testing consistency of Weak-M^{supp}_P is also NP-complete.
We also show that Weak-WFS and Weak-M^{supp}_P are cumulative and rational
and that, in addition, Weak-WFS satisfies some of the *well-behaved principles*
introduced by Dix.
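To give a flavour of the idea, here is one common form of shifting, sketched in our own notation (the paper's exact definitions may differ in detail): a disjunctive head is replaced by a set of normal rules in which the remaining head disjuncts reappear as default-negated body literals.

```python
# Hedged sketch of a shift-operation on a positive disjunctive rule.
# A rule is a pair (head, body): head is a list of head atoms (a disjunction),
# body a list of body atoms (a conjunction). Shifting replaces
#   a1 v ... v an <- body
# by the n normal rules
#   ai <- body, not a1, ..., not a(i-1), not a(i+1), ..., not an.

def shift(rule):
    head, body = rule
    shifted = []
    for a in head:
        # every other head atom moves to the body, default-negated
        negs = ["not " + b for b in head if b != a]
        shifted.append((a, list(body) + negs))
    return shifted

# Example: p v q <- r  yields  p <- r, not q  and  q <- r, not p.
rules = shift((["p", "q"], ["r"]))
```

Applying such a transformation to every rule of a disjunctive program yields a normal program, to which a semantics SEM for normal programs (STABLE, SUPPORTED, M^{supp}_P, WFS) can then be applied.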

**M. Fitting**
*A Modal Herbrand Theorem.*
pages 101-122
We state and prove a modal Herbrand theorem that is, we believe, a
more natural analog of the classical version than has appeared before.
The statement itself requires the enlargement of the usual machinery of
first-order modal logic---we use the device of *predicate abstraction*,
something that has been considered elsewhere as well. This expands the
expressive power of modal logic in a natural way. Our proof of the modal
version of Herbrand's theorem uses a tableau system that takes predicate
abstraction into account. It is somewhat simpler than other systems for
the same purpose that have previously appeared.

**G. Gottlob, M. Truszczynski**
*Approximating the Stable Model Semantics is Hard.*
pages 123-128
In this paper we investigate the complexity of problems concerned with
approximating the stable model semantics. We show that under rather weak
assumptions it is NP-hard to decide whether the size of a polynomially
computable approximation is within a constant factor from the size of the
intersection (union) of stable models of a program. We also show that unless
P=NP, no approximation exists that uniformly bounds the intersection (union)
of stable models.
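For readers unfamiliar with the objects being approximated, the following brute-force sketch (our own illustration, not one of the approximation algorithms discussed in the paper) computes the stable models of a small propositional normal program, together with the intersection and union referred to above:

```python
from itertools import combinations

# A rule is (head, pos_body, neg_body), all propositional atoms.

def least_model(definite):
    # Least Herbrand model of a definite program, by naive fixpoint iteration.
    m, changed = set(), True
    while changed:
        changed = False
        for head, pos in definite:
            if set(pos) <= m and head not in m:
                m.add(head)
                changed = True
    return m

def stable_models(program, atoms):
    models = []
    for r in range(len(atoms) + 1):
        for cand in combinations(sorted(atoms), r):
            m = set(cand)
            # Gelfond-Lifschitz reduct: delete rules whose negative body
            # intersects m; drop negative literals from the remaining rules.
            reduct = [(h, p) for h, p, n in program if not (set(n) & m)]
            if least_model(reduct) == m:
                models.append(m)
    return models

# Program: a <- not b.   b <- not a.   c <- a.   c <- b.
prog = [("a", [], ["b"]), ("b", [], ["a"]),
        ("c", ["a"], []), ("c", ["b"], [])]
ms = stable_models(prog, {"a", "b", "c"})
inter = set.intersection(*ms)  # atoms true in every stable model
union = set.union(*ms)         # atoms true in some stable model
```

Here the stable models are {a, c} and {b, c}, so the intersection is {c} and the union is {a, b, c}; the paper shows that even approximating the sizes of such intersections and unions is NP-hard.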

**J. Hsiang, A. Wasilewska**
*Automating Algebraic Proofs in Algebraic Logic.*
pages 129-140
We present here an effective proof theory that allows one to reason
within algebras of algebraic logic in a purely syntactic, algebraic fashion.
We demonstrate the effectiveness of the method by discussing our automated
proofs of problems and theorems taken from Professor Helena Rasiowa's book
*An Algebraic Approach to Non-Classical Logics*, Studies in Logic
and the Foundations of Mathematics, Volume 78, North-Holland Publishing Company,
Amsterdam, London; PWN, Warsaw (1974). We include a detailed proof of
a problem presented to us by Professor Helena Rasiowa in June 1993. Most
of these proofs are the first *direct* proofs ever discovered, and
they are produced by the computer without human assistance.

**G. Mirkowska, A. Salwicki**
*The Algebraic Specifications do not have the Tennenbaum Property.*
pages 141-152
It is commonly believed that a programmable model satisfying the axioms
of a given algebraic specification guarantees good properties and is a
correct implementation of the specification. This conviction might be
related to Tennenbaum's property of arithmetic: every computable
model of the Peano arithmetic of natural numbers is isomorphic to the standard
model.
Here, using the example of stacks, we exhibit a model which satisfies all axioms
of the algebraic specification of stacks but cannot be accepted as a
good model, in spite of the fact that it is defined by a program: it
enables one to ``pop'' a stack infinitely many times.

**J. A. Plaza**
*Logic Programming from the Perspective of Algebraic Semantics.*
pages 153-164
We present an approach to foundations of logic programming in which
the connection with algebraic semantics becomes apparent. The approach
is based on omega-Herbrand models instead of conventional Herbrand models.
We give a proof of Clark's theorem on completeness of SLD-resolution by
methods of the algebraic semantics. We prove the existence property for
definite programs.

**V. Pratt, J. Tiuryn**
*Satisfiability of Inequalities in a Poset.*
pages 165-182
We consider tractable and intractable cases of the satisfiability problem
for conjunctions of inequalities between variables and constants in a fixed
finite poset. We show that crowns are intractable. We study members and
closure properties of the class of tractable posets. We define a feasible
poset to be one whose potential obstacles to satisfiability are representable
by a certain formula of first-order logic extended by the least fixpoint operator.
For bipartite posets we give a complete classification into hard posets and
feasible ones.
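The satisfiability problem in question can be illustrated by a naive exponential check (our own sketch, not one of the paper's algorithms; the example poset and variable names below are ours and are not claimed to be a crown in the paper's sense):

```python
from itertools import product

# Satisfiability of a conjunction of inequalities u <= v over a fixed finite
# poset, where u and v are variables or constants (elements of the poset).

def satisfiable(elements, leq, variables, constraints):
    # leq: set of pairs (a, b) with a <= b, given as a reflexive relation.
    for assign in product(elements, repeat=len(variables)):
        env = dict(zip(variables, assign))
        val = lambda t: env.get(t, t)  # variables via env, constants unchanged
        if all((val(u), val(v)) in leq for u, v in constraints):
            return True
    return False

# Small example poset: minimal elements a, b below maximal elements c, d.
elems = ["a", "b", "c", "d"]
leq = ({(x, x) for x in elems}
       | {("a", "c"), ("a", "d"), ("b", "c"), ("b", "d")})

# x <= a and x <= b is unsatisfiable here: a and b have no common lower bound.
sat1 = satisfiable(elems, leq, ["x"], [("x", "a"), ("x", "b")])
# a <= x and b <= x is satisfiable: take x = c (or x = d).
sat2 = satisfiable(elems, leq, ["x"], [("a", "x"), ("b", "x")])
```

Since the poset is fixed and only the conjunction varies, the question studied in the paper is for which posets this problem admits a polynomial-time algorithm rather than such exhaustive search.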

**B. A. Trakhtenbrot**
*On the Power of Compositional Proofs for Nets: Relationships
Between Completeness and Modularity.*
pages 183-195
The intention of the present paper is to elucidate at a more abstract
(and hopefully more persuasive) level the relationship between modularity
of net semantics and completeness of compositional proofs for nets. This
is to be achieved by a separation of concerns which is not reflected in
common terminology.

**P. Urzyczyn**
*Positive Recursive Type Assignment.*
pages 197-209
We consider several different definitions of type assignment with positive
recursive types, from the point of view of their typing ability. We discuss
the relationships between these systems. In particular, we show that the
class of typable pure lambda terms remains the same for different type
disciplines involving positive type fixpoints, and that type reconstruction
is decidable.

**Number 3-4**

**M. Banerjee, M. K. Chakraborty**
*Rough Sets Through Algebraic Logic*
While studying rough equality within the framework of the modal system
S_{5}, an algebraic structure called rough algebra [1] came up.
Its features were abstracted to yield a *topological quasi-Boolean algebra*
(tqBa). In this paper, it is observed that rough algebra is more structured
than a tqBa. Thus, enriching the tqBa with additional axioms, two more
structures, viz. *pre-rough algebra* and *rough algebra*,
are defined. Representation theorems for these algebras are also obtained.
Further, the corresponding logical systems **L**_{1} and **L**_{2}
are proposed and, eventually, **L**_{2} is proved to be
sound and complete with respect to a rough set semantics.

**Alexander Bochman**
*On a Logical Basis of Normal Logic Programs*
pages 223--245
In this paper we develop a logical framework for normal logic programs,
called default consequence relations. We give a representation of major
semantics for logic programs in this formalism and study the question of
which logics are adequate for the latter. It is shown that, in general, default
consequence relations based on three-valued inference are appropriate for
these semantics, though different semantics admit different kinds of inference.

**Nguyen Cat Ho**
*A Method in Linguistic Reasoning on a Knowledge Base Represented
by Sentences with Linguistic Belief*
pages 247--259
In this paper we introduce a notion of a formal knowledge base
consisting of statements associated with linguistic truth degrees, and a
set of inference rules handling this kind of statement. A deductive reasoning
method based on these inference rules is examined. The logical foundation
of this method is a linguistic-valued logic based on extended hedge algebras,
and on this basis the consistency of the knowledge base is
considered.

**P. Doherty, W. Lukaszewicz, A. Szalas**
*A Reduction Result for Circumscribed Semi-Horn Formulas*
pages 261--271
Circumscription has been perceived as an elegant mathematical technique
for modeling nonmonotonic and commonsense reasoning, but difficult to apply
in practice due to the use of second-order formulas. One proposal for dealing
with the computational problems is to identify classes of first-order formulas
whose circumscription can be shown to be equivalent to a first-order formula.
In previous work, we presented an algorithm which reduces certain classes
of second-order circumscription axioms to logically equivalent first-order
formulas. The basis for the algorithm is an elimination lemma due to Ackermann.
In this paper, we capitalize on the use of a generalization of Ackermann's
Lemma in order to deal with a subclass of universal formulas called *semi-Horn
formulas*. Our results subsume previous results by Kolaitis and Papadimitriou
regarding a characterization of circumscribed definite logic programs which
are first-order expressible. The method for distinguishing which formulas
are reducible is based on a boundedness criterion. The approach we use
is to first reduce a circumscribed semi-Horn formula to a fixpoint formula
which is reducible if the formula is bounded, otherwise not. In addition
to a number of other extensions, we also present a fixpoint calculus which
is shown to be sound and complete for bounded fixpoint formulas.

**J. Green, N. Horne, E. Orlowska, P. Siemens**
*A Rough Set Model of Information Retrieval*
pages 273--296
The purpose of this paper is to define and investigate document retrieval
strategies based on rough sets.
A subject classification index is modelled by means of a family of
set-theoretic information systems. The family induces a power set hierarchy
built from the set of keywords.
In the paper, the task of information retrieval in response to a query
that refers to several levels of that hierarchy is defined. Retrieval strategies
are proposed and discussed. The strategies are, roughly speaking, based
on various types of similarity between sets of keywords included in a query
and in a document. The document is not required to match the query exactly;
we allow for approximate matching. Measures of relevance of
documents to a query are proposed for the given strategies.

**D. Grigoriev, M. Karpinski, A. M. Odlyzko**
*Short Proofs for Nondivisibility of Sparse Polynomials under
the Extended Riemann Hypothesis*
pages 297--301
We prove for the first time the existence of short (polynomial-size)
proofs for *nondivisibility of two sparse polynomials* (thus putting
this problem in the class NP) under the Extended Riemann Hypothesis.
The *divisibility problem* is closely related to the problem of *rational
interpolation*. Its computational complexity was studied in [5], [4],
and [6].
We also prove, somewhat surprisingly, that the problem of deciding whether
a rational function given by a black box equals a polynomial belongs
to the *parallel class* NC (see, e.g., [KR 90]), provided we know
the degree of its sparse representation.

**A. Gomolinska**
*On Semantics for the Nonmonotonic Modal Formalization of the
Logic of Acceptance and Rejection*
pages 303--313
The logic of acceptance and rejection (AEL2) is a nonmonotonic formalism
in the spirit of Moore's autoepistemic logic (AEL). AEL2 is intended to represent
the knowledge of an ideal rational agent who makes decisions about pieces of
information. The case of an uncertain agent is considered as well;
thus the agent's choices are acceptance, rejection, and the lack
of decision. AEL is treated as the nonmonotonic KD45 modal logic, and AEL2
may be formalized in a similar way. In the paper the problem of an appropriate
semantics for the modal formalization of AEL2 is discussed.

**G. Paun, L. Polkowski, A. Skowron**
*Parallel Communicating Grammar Systems with Negotiation*
pages 315--330
In a parallel communicating grammar system, several grammars work together,
synchronously, on their own sentential forms, and communicate on request.
Usually, no restriction is imposed on the communicated strings. We consider
here two types of restrictions, as models of the negotiation process in
multi-agent systems: (1) conditions formulated on the transmitted strings,
and (2) conditions formulated on the resulting string in the receiving
components. These conditions are expressed in terms of regular languages
to which the strings in question should belong. Moreover, we allow the
communicated string to be a part (an arbitrary one, a prefix, a maximal
one, a minimal one, etc.) of the string of the communicating component. We investigate
these variants from the point of view of the generative capacity of the
corresponding parallel communicating grammar systems.

**S. Radev**
*Argumentation Systems*
pages 331--346
Argumentation systems are useful instruments for decision support when
the arguments change dynamically depending on the user and the
situation. In argumentation systems, different strategies are used when
the system prepares the argumentation for a given situation and/or user.
These strategies correspond to certain many-valued logics, and some of these
strategies have fuzzy and rough interpretations. Many information systems
have interpretations in different argumentation systems. For the lower and
upper approximations there are equivalent argumentation strategies that
correspond to well-known logical systems. Moreover, when a practical
system chooses the arguments for the user, it needs a more compact argumentation
than the list of all the procedures it uses. The argumentation system then
finds arguments with the same persuasive power as its argumentation.

**W. Skarbek**
*Dynamic Behaviour of Spatial Signal OR-Graphs*
pages 347--352
Spatial Signal OR-Graphs (*SSOG*) are defined and specialized
to fractal pattern generation (*FSSOG*) and cellular array pattern
generation (*CSSOG*). It is proved that fractal patterns always stabilize
after a predictable transient time, while cellular pattern graphs, in
parallel mode, enter a stable state if the initial state is even, and otherwise
enter a cycle of length 2. If the nondeterministic mode is considered,
then any *G* in *SSOG* enters a stable state with probability one.

**Z. Suraj**
*Discovery of Concurrent Data Models from Experimental Tables:
A Rough Set Approach*
pages 353--376
The main objective of machine discovery is the determination of relations
between data and of data models. In the paper we describe a method for
the discovery of data models, represented by concurrent systems, from experimental
tables. The basic step consists in determining rules which yield
a decomposition of experimental data tables; the components are then used
to define fragments of the global system corresponding to a table. The
method has been applied to the automatic discovery of data models from experimental
tables, with Petri nets as models of concurrency.

**K. Szlechta**
*A Two-Stage Approach to First Order Default Reasoning*
pages 377--402
Our subject is the representation and analysis of simple first-order
default statements of ordinary language, such as ``normally, birds fly''.
There are, among other approaches, two kinds of analysis, both semantic
in style. One interprets ``normally, birds fly'' along the lines of: for
every item *x* in the domain of discourse, the most normal models
of ``x is a bird'' are models of ``x flies''. This is the *preferential
models* approach, first outlined by Bossu/Siegel and Shoham, and
studied by Kraus, Lehmann, Magidor and others. The other interprets ``normally,
birds fly'' along the lines of ``there is an important subset of the birds,
all of whose elements fly''. This is the *generalized quantifier*
approach, formulated and developed by the author. The purpose of the present
paper is to show how the two approaches may usefully be combined into a
single two-stage approach, and how such a combination provides an elegant
account of certain problematic examples.

**J. Winkowski**
*Concatenable Weighted Pomsets and Their Applications to Modelling
Processes of Petri Nets*
pages 403--421
Structures called concatenable weighted pomsets are introduced which
can serve as models of processes of Petri nets, including nets with time
features. Operations on such structures are defined which allow them to be
combined sequentially and in parallel. These operations correspond to natural
operations on processes. They make the universe of concatenable weighted
pomsets a partial algebra which turns out to be a symmetric strict monoidal
category. Sets of processes of timed and time Petri nets are characterized
as subsets of this algebra.

**J. Wroblewski**
*Theoretical Foundations of Order-Based Genetic Algorithms*
pages 423--430
A lot of research on the theory of genetic algorithms is concentrated on the
classical, binary case. However, there are many other types of useful genetic
algorithms (GA), e.g. tree-based (genetic programming), or order-based ones. This
paper shows that many of the classical results can be transferred to the
order-based GAs. The analysis includes the Schema Theorem and Markov chain
modelling of order-based GAs.