Logic for Computer Science/Logics
Sub-FO Logics
From Propositional Logic to FO
Propositional Logic
Monadic FO
Existential FO
Universal FO
Modal Logic
Modal logic extends propositional logic with operators for the modalities of possibility and necessity. These basic modal operators are usually written □ (or L) for "necessarily" and ◇ (or M) for "possibly". Each can be defined from the other in the following way: ◇φ ≡ ¬□¬φ and □φ ≡ ¬◇¬φ.
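This duality can be checked mechanically on a small Kripke model. The following Python sketch is illustrative only — the tuple encoding of formulas and the two-world example model are assumptions, not part of the text:

```python
def holds(model, world, formula):
    """Evaluate a modal formula at a world of a Kripke model.

    Formulas are nested tuples (an illustrative encoding):
      ("atom", p), ("not", f), ("and", f, g),
      ("box", f)  -- necessarily f: f holds in ALL accessible worlds
      ("dia", f)  -- possibly  f: f holds in SOME accessible world
    """
    worlds, access, val = model
    op = formula[0]
    if op == "atom":
        return formula[1] in val[world]
    if op == "not":
        return not holds(model, world, formula[1])
    if op == "and":
        return holds(model, world, formula[1]) and holds(model, world, formula[2])
    if op == "box":
        return all(holds(model, v, formula[1]) for v in access[world])
    if op == "dia":
        return any(holds(model, v, formula[1]) for v in access[world])
    raise ValueError(f"unknown operator: {op}")

# Example model: two worlds; p holds only in world 1.
model = ({0, 1}, {0: {0, 1}, 1: {1}}, {0: set(), 1: {"p"}})
p = ("atom", "p")

# Duality: <>p is equivalent to ~[]~p at every world.
for w in (0, 1):
    assert holds(model, w, ("dia", p)) == holds(model, w, ("not", ("box", ("not", p))))
```

The loop at the end confirms the duality ◇φ ≡ ¬□¬φ on this particular model; the symmetric equivalence □φ ≡ ¬◇¬φ can be checked the same way.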
Conjunctive Logic
Conjunctive Logic is a fragment of FO restricted to atomic FO formulas joined by conjunctions, which may be preceded by existential quantifiers. It is especially important in database theory.
- In other words: a CON-formula is a (relational) FO-formula built from atomic formulas using only conjunction and existential quantification.
- The semantics of FO applies here correspondingly.
- The usual notions of free and bound variables also apply here.
- A formula of the form ∃x₁ … ∃xₖ (φ₁ ∧ … ∧ φₙ), where each φᵢ is quantifier-free (atomic), is said to be in normal form.
- Each CON-formula is equivalent to one in normal form.
- All CON-formulas are satisfiable.
- ∃x E(x, y) has a bound variable x and a free variable y.
- (∃x E(x, y)) ∧ (∃x E(y, x)) is equivalent to ∃x₁ ∃x₂ (E(x₁, y) ∧ E(y, x₂)). The latter is said to be in normal form.
Additionally, equality between variables or constants can be introduced by admitting atomic formulas of the form x = y or x = c.
Example: with equality, CON-formulas are not always satisfiable; for instance, ∃x (x = c ∧ x = c′) is unsatisfiable whenever the constants c and c′ denote distinct elements. Each satisfiable CON-formula, however, is equivalent to one without equality.
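CON-formulas in normal form correspond to conjunctive (select-project-join) queries in database theory, and their satisfaction over a finite structure can be decided by searching for a witnessing assignment of the quantified variables. The following sketch makes this concrete; the function name, the relation E, and the example query are assumptions for illustration:

```python
from itertools import product

def eval_conjunctive(domain, relations, atoms, variables, assignment=None):
    """Evaluate  Ex1 ... Exk (a1 /\ ... /\ an)  over a finite structure.

    `atoms` is a list of (relation_name, argument_tuple); each argument is
    either a variable name or a value fixed in `assignment` (free variables).
    Returns a witnessing assignment, or None if no witness exists.
    """
    assignment = dict(assignment or {})
    free = [v for v in variables if v not in assignment]
    # Exhaustively try every assignment of domain values to the quantified
    # variables (fine for small structures; real query engines use joins).
    for values in product(sorted(domain), repeat=len(free)):
        a = {**assignment, **dict(zip(free, values))}
        if all(tuple(a.get(arg, arg) for arg in args) in relations[rel]
               for rel, args in atoms):
            return a
    return None

# Structure: a small directed graph with edge relation E = {(1,2), (2,3)}.
domain = {1, 2, 3}
relations = {"E": {(1, 2), (2, 3)}}

# Query: Ex Ey (E(x, y) /\ E(y, z)), with the free variable z fixed to 3.
atoms = [("E", ("x", "y")), ("E", ("y", "z"))]
witness = eval_conjunctive(domain, relations, atoms, ["x", "y"], {"z": 3})
# witness is {"z": 3, "x": 1, "y": 2}: the path 1 -> 2 -> 3 reaches z = 3.
```

With z fixed to 1 instead, no edge ends in 1, so the same query returns None — mirroring the remark that with constants (or, analogously, equality constraints) satisfiability is no longer automatic.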
Higher-order Logics
Fixed-Point Logics
Fixed-Point Logics augment the syntax of FO by a fixed-point operator, a mechanism that enables recursion on expressions and thus yields higher expressive power. The idea is quite similar to adding a while-loop construct to an imperative programming language.
Example: Transitive Closure
Consider the transitive closure of a graph G with edge relation E. For distance one we have:
φ₁(x, y) ≡ E(x, y)
for distance up to two:
φ₂(x, y) ≡ E(x, y) ∨ ∃z (E(x, z) ∧ E(z, y))
and we could go on expanding this disjunction for further fixed distances (only). In order to express transitive closure for an arbitrary distance (which is vital in most situations), recursion can be introduced by employing:
φ(T)(x, y) ≡ E(x, y) ∨ ∃z (T(x, z) ∧ E(z, y))
where T is the transitive closure up to some distance i; thus φ(T) gives the transitive closure up to distance i + 1. So we can define a sequence
J₀ = ∅,  Jᵢ₊₁ = φ(Jᵢ)
that yields the transitive closure for some k, i.e. the sequence converges (Jₖ = Jₖ₊ⱼ for all j > 0) and its limit is the transitive closure.
In the illustration we have an initial graph G as in (a), with J₁ given by its edge relation E, and the fixed point (c) reached after three iterations, so that J₃ = J₄ = ⋯.
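The sequence Jᵢ₊₁ = φ(Jᵢ) from the example can be computed directly on any finite graph. A minimal Python sketch (the function name and example edge set are assumptions for illustration):

```python
def transitive_closure(edges):
    """Iterate J_{i+1} = phi(J_i) = E  union  {(x, y) | Ez: (x, z) in J_i and (z, y) in E}
    until a fixpoint J_k = J_{k+1} is reached.  On a finite graph the sequence
    is increasing and bounded, so the iteration always terminates."""
    J = set()
    while True:
        J_next = set(edges) | {(x, y) for (x, z) in J for (z2, y) in edges if z == z2}
        if J_next == J:          # fixpoint reached: J_k = J_{k+1} = ...
            return J
        J = J_next

# Path 1 -> 2 -> 3 -> 4: the closure adds (1,3), (2,4), and finally (1,4).
tc = transitive_closure({(1, 2), (2, 3), (3, 4)})
```

On this input the iteration stabilizes after three steps, just as J₃ = J₄ = ⋯ in the illustrated example.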
As we have seen in this example, recursion can be used to express properties that would otherwise require formulas of arbitrary length. But since we always deal with finite formulas, some terminator must be given for the recursion; this role is played by the fixpoint. In the above example such a fixpoint exists, but this is not always the case, so we must be careful when defining a recursion-based logic. We first extend FO by a fixpoint operator to PFP, whose semantics covers both cases: if the iteration reaches a fixpoint, the result is that fixpoint; otherwise the result is taken to be the empty relation.
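A standard illustration of a recursion with no fixpoint (the particular operator is an assumption here, not taken from the text) is the complement operator φ(T)(x) ≡ ¬T(x): starting from J₀ = ∅, the iteration oscillates between ∅ and the whole domain and never becomes stationary. A minimal sketch:

```python
def iterate(phi, domain, max_steps=10):
    """Iterate J_{i+1} = phi(J_i) from J_0 = empty set.  Return the fixpoint
    if the sequence becomes stationary within max_steps, else None
    (PFP semantics would then yield the empty relation)."""
    J = frozenset()
    for _ in range(max_steps):
        J_next = phi(J, domain)
        if J_next == J:
            return J
        J = J_next
    return None

domain = {0, 1, 2}
# phi(T)(x) = not T(x): the complement operator oscillates, so no fixpoint.
complement = lambda J, d: frozenset(d - J)
result = iterate(complement, domain)   # None: the sequence never stabilizes
```

Here J₀ = ∅, J₁ = {0, 1, 2}, J₂ = ∅, and so on forever, which is exactly the situation the partial fixed-point semantics must account for.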
Partial Fixed-Point Logics (PFP)
The above assumption that a fixpoint exists is not really satisfying, so one can now think of restrictions on the recursion that enforce its existence in every case. To this end three ideas are introduced: monotonicity, positivity, and inflationary recursion.
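Inflationary recursion, for instance, forces convergence by only ever adding tuples: one iterates Jᵢ₊₁ = Jᵢ ∪ φ(Jᵢ), a sequence that is increasing and bounded by the finite domain, so it always reaches a fixpoint. A sketch under these assumptions, reusing the complement operator (which has no ordinary fixpoint) as the test case:

```python
def inflationary_fixpoint(phi, domain):
    """Iterate J_{i+1} = J_i | phi(J_i).  The sequence only grows and is
    bounded by the finite domain, so a fixpoint is always reached."""
    J = frozenset()
    while True:
        J_next = J | phi(J, domain)   # inflationary step: never remove tuples
        if J_next == J:
            return J
        J = J_next

domain = {0, 1, 2}
complement = lambda J, d: frozenset(d - J)   # has no ordinary fixpoint
result = inflationary_fixpoint(complement, domain)
# result is the whole domain: J_1 = {0, 1, 2}, and J_2 = J_1.
```

The contrast with the plain iteration above shows why inflation guarantees termination where the unrestricted recursion oscillates.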
Inflationary Fixed-Point Logics (IFP)
Least Fixed-Point Logics (LFP)
Comparison of IFP and LFP