- Dec 4th 2011, 02:37 PM, complexity9: Complexity of SAT in First-order logic

I have been thinking about this question for weeks and can't figure it out! I reckon it's decidable and in EXPTIME, but I'm not sure how to prove it. Any help would be really appreciated!

(Note: the question is in the attachment)

-Peter

- Dec 5th 2011, 08:18 PM, Annatala: Re: Complexity of SAT in First-order logic
Don't call it SAT. That name refers to a very particular decision problem that is known to be NP-complete. This problem may or may not be similar (we'll see).

(Also: someone check my argument here just to make sure I'm not full of ****. Thanks!)

To approach this I would consider atomic formulas and work my way up. If there's an effective procedure that computes the answer without running forever on any input, the problem is decidable. Since there are no function symbols, this is a relational language without equality. For each predicate P(x0, x1, ..., xn) (where the xi can only be atomic objects), there is an assignment of true or false to the result. To a limited extent, we can model functions by using predicates. For example, I can define + by listing the triples (a, b, c) with a + b = c: {(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 2), ...}. The problem is that I can't define + so that it can take *terms*, unless I allow an infinite number of predicates to handle every possible equation (and without function symbols that would require an infinite language, which is generally not allowed unless expressly stated).
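The relation-for-function idea can be sketched in Python (the names PLUS and holds are mine, not from the thread): a function symbol becomes a table of triples, and "evaluating" an atomic formula is just membership testing.

```python
# Sketch: encoding the function + as a relation (a table of triples),
# since the language has predicates but no function symbols.
# Names here (PLUS, holds) are illustrative, not from the thread.

# Finite fragment of the relation {(a, b, c) : a + b = c}.
PLUS = {(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 2), (2, 1, 3)}

def holds(relation, args):
    """Evaluate an atomic formula P(args) by table lookup."""
    return tuple(args) in relation

print(holds(PLUS, (1, 1, 2)))  # True:  1 + 1 = 2
print(holds(PLUS, (1, 1, 3)))  # False
# Note the limitation from the post: the relation applies only to
# atomic objects, so there is no way to write PLUS(PLUS(x, y), z).
```

The last comment is the point of the paragraph: without function symbols, terms cannot nest, so each application of + needs its own tuple in the table.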

The question is: can we decidably answer "there exists x with property P" in this restricted language? Well, each evaluation of P(a) for some predicate P and some finite tuple a is an atomic operation that takes constant time (by table lookup), so the cost is linear in the number of predicate occurrences. The only thing that takes a variable amount of time is the evaluation of the logical connectives. This is already known to be decidable by Gödel's completeness theorem, since completeness applies to a complete set of logical connectives and therefore also to any subset of them.
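The "table lookup plus enumeration" reading can be sketched as follows; the structure here (DOMAIN and the interpretation of P) is an assumed toy example, not anything from the thread.

```python
# Sketch (toy structure, my own): deciding an existential sentence
# over a *finite* structure by exhaustive search. Each atomic check
# P(a) is a constant-time table lookup; enumerating the domain is
# where the running time actually goes.

DOMAIN = {0, 1, 2}
P = {(1,), (2,)}  # interpretation of a unary predicate P

def exists_P():
    """Decide 'there exists x with P(x)' by enumerating the domain."""
    return any((a,) in P for a in DOMAIN)

print(exists_P())  # True, witnessed by x = 1 (or x = 2)
```

For a finite structure this search always terminates, which is the decidability point being made; the cost grows with the domain size and the number of quantifiers.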

You should be able to frame any SAT sentence as an existentially-bounded sentence in a vocabulary where each variable becomes a predicate or its negation, with an additional predicate keeping the two mutually exclusive. You have to design the vocabulary after the fact, but that's not a problem since we're trying to prove something about *any* existentially-bounded sentence. Similarly, you should be able to translate any existentially-bounded sentence into SAT by trading predicates for variables. Both translations can be done in polynomial time (this is intuitive but not hard to show).
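A minimal sketch of the SAT-to-existentially-bounded direction, at the level of strings, using one unary predicate A over a {0, 1}-style model (the function name and notation are mine, not the thread's): each propositional variable v becomes a first-order variable x_v, the literal v becomes A(x_v), and ~v becomes ~A(x_v).

```python
# Hypothetical translation sketch (my names): a propositional CNF
# formula becomes an existentially quantified sentence over one
# unary predicate A. Input: list of clauses, each a list of ints,
# where -k means variable k negated (DIMACS-style).

def cnf_to_sentence(cnf):
    """Translate a CNF formula into an existentially-bounded sentence."""
    vars_ = sorted({abs(lit) for clause in cnf for lit in clause})
    prefix = "".join(f"Ex{v} " for v in vars_)       # one Ex per variable
    def lit(l):
        return f"A(x{abs(l)})" if l > 0 else f"~A(x{abs(l)})"
    body = " & ".join("(" + " | ".join(lit(l) for l in clause) + ")"
                      for clause in cnf)
    return prefix + body

print(cnf_to_sentence([[1, -2], [2, 3]]))
# Ex1 Ex2 Ex3 (A(x1) | ~A(x2)) & (A(x2) | A(x3))
```

The translation makes one pass over the formula, which supports the claim that it runs in polynomial (in fact linear) time.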

So existentially-bounded first-order sentences are reducible to SAT and vice versa, which means they are NP-complete, and thus they fall within EXPTIME. However, NP is not known to be a strict subset of EXPTIME (that is widely believed but not proven), so for all we know the problem could even be EXPTIME-complete.

- Dec 6th 2011, 08:11 AM, complexity9: Re: Complexity of SAT in First-order logic
I understand that given a SAT sentence, we can write it as an existentially-bounded sentence by making our model {0, 1} with A(0) = False and A(1) = True, then replacing each variable x with A(x) and each ~x with ~A(x). Of course, we must put many existential quantifiers in front, one for each variable of the SAT sentence (these become first-order variables in our new sentence).
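One way to convince yourself the two readings match is to brute-force both sides; this sketch (my own, with made-up helper names) checks a small CNF directly as SAT and again via the {0, 1}-model reading, where picking a witness for each Ex amounts to picking a truth value.

```python
# Sanity-check sketch (my own): a SAT assignment corresponds to
# choosing, for each variable, the element 0 or 1 of the model,
# with A(1) = True and A(0) = False.
from itertools import product

def sat_brute(cnf, n):
    """Is the CNF over variables 1..n satisfiable? (brute force)"""
    for assign in product([False, True], repeat=n):
        if all(any(assign[abs(l) - 1] == (l > 0) for l in clause)
               for clause in cnf):
            return True
    return False

def eb_brute(cnf, n):
    """Decide the translated sentence over model {0, 1} with A = {1}."""
    A = {1}
    for choice in product([0, 1], repeat=n):  # witnesses for Ex1..Exn
        if all(any((choice[abs(l) - 1] in A) == (l > 0) for l in clause)
               for clause in cnf):
            return True
    return False

cnf = [[1, -2], [2]]
print(sat_brute(cnf, 2), eb_brute(cnf, 2))  # True True
```

The two functions are structurally identical, which is exactly the point of the encoding: the existential witnesses play the role of the truth assignment.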

But the other way round, translating an existentially-bounded sentence into SAT: I have a problem with this one, since we don't know the model (or even whether it is finite), and we don't know the outcomes of the predicate symbols.

If you have a method of doing this, can you show it to me in more detail? I can't find a way to turn an existentially-bounded sentence into SAT.

- Dec 6th 2011, 09:01 AM, Annatala: Re: Complexity of SAT in First-order logic
Caveat: I'm not 100% certain about what I'm writing below...I may need to rethink my approach here. But here are my current thoughts anyway. Maybe someone else can step in and assist.

I feel like the predicate symbol outcomes are not relevant to the computational complexity when they're not quantified. Each existentially-bound atomic formula in the EBS becomes a variable in SAT, and each unquantified one is just a constant. For example, Ex P(x) ^ Ey ~P(y) ^ P(z) (with z free) becomes either a ^ ~b ^ true or a ^ ~b ^ false, depending on the structure. Either way it doesn't change the difficulty of computing SAT in the worst case.
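The constants can be simplified away before any real SAT work begins, which is why they don't affect the worst case; a tiny sketch (my own notation, not the thread's):

```python
# Sketch (my notation): an atomic formula whose arguments are not
# existentially quantified has a fixed truth value under the
# structure, so in the propositional translation it becomes the
# constant True or False and can be eliminated up front.

def simplify_conjunction(literals, constants):
    """Conjunction of SAT literals plus fixed truth constants.
    A False constant makes the whole conjunction false outright;
    True constants simply drop out."""
    if not all(constants):
        return "false"
    return " ^ ".join(literals)

# Ex P(x) ^ Ey ~P(y) ^ P(z):  the free conjunct P(z) is fixed.
print(simplify_conjunction(["a", "~b"], [True]))   # a ^ ~b
print(simplify_conjunction(["a", "~b"], [False]))  # false
```

Either way the hard part left over is an ordinary SAT instance on the remaining literals.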

The model has to be finite. Since there are no function symbols, the language can only consist of logical symbols, predicates, and constants. The only way to get an infinite number of objects out of a finite language is by applying function symbols an unbounded number of times, or by starting with an infinite number of constants or predicates. I would assume a finite language and a finite set of predicates unless otherwise specified.