Any hints on the strategy to derive ∃y∀x(Ax & By) from ∀x∃y(Ax & By) in Fitch format? I've been stuck on this one for weeks.
Starting from ∀x∃y(Ax & By), we can instantiate x with an arbitrary term t (e.g., a fresh variable) and get a y such that At & By; in particular, By. So we are left to show ∀x(Ax & By). Fix an arbitrary x, instantiate the universal variable in ∀x∃y(Ax & By) with x, and thus get ∃y(Ax & By), from which Ax follows (Ax does not contain y). Combining Ax with By and generalizing on x, which is not free in By, gives ∀x(Ax & By), hence ∃y∀x(Ax & By).
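For what it's worth, the inference itself is valid over a nonempty domain, precisely because A mentions only x and B mentions only y, so the witness for B does not depend on x. The argument above can be checked mechanically; here is a sketch in Lean 4, where the domain α, the predicates A and B, and the element d witnessing nonemptiness are assumptions of the formalization:

```
-- Sketch: ∀x∃y(Ax ∧ By) ⊢ ∃y∀x(Ax ∧ By), assuming the domain α
-- is nonempty (witnessed by d), as standard first-order logic does.
example {α : Type} (A B : α → Prop) (d : α)
    (h : ∀ x, ∃ y, A x ∧ B y) : ∃ y, ∀ x, A x ∧ B y :=
  -- Instantiate h at d to obtain one fixed witness b with B b ...
  match h d with
  | ⟨b, _, hBb⟩ =>
    -- ... then, for an arbitrary x, extract A x from h x; this
    -- ∃-elimination is legal because A x does not mention the
    -- existential variable.
    ⟨b, fun x =>
      match h x with
      | ⟨_, hAx, _⟩ => ⟨hAx, hBb⟩⟩
```

The key move, mirrored in the informal proof, is that b is chosen once, before x is generalized.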
emakarov, I'm not sure it's that easy to do in a Fitch-format derivation. A violation of the rules involving the eigenvariable occurs. So, for example, I can't do this:
∀x∃y(Ax & By)
∃y(Az & By)
- Az & Bb
- ∀x(Ax & Bb)
('-' indicates a sub-derivation). The last line in the sub-derivation is a violation: getting ∀x(Ax & Bb) means universally quantifying on z, but z is free in the assumption Az & Bb, and I can't universally quantify anything that's free in an assumption.
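The way around that restriction is to extract the B-witness first and only generalize afterwards, so that the variable being universally quantified is not free in any open assumption. A sketch of the shape of the derivation (rule names are the usual Fitch ones; the names t, c, d are mine, with c and d fresh):

```
1   ∀x∃y(Ax & By)        Premise
2   ∃y(At & By)          ∀E 1  (t any term)
3   - At & Bc            Assumption (for ∃E, c fresh)
4   - Bc                 &E 3
5   - - ∃y(Ax & By)      ∀E 1  (x arbitrary)
6   - - - Ax & Bd        Assumption (for ∃E, d fresh)
7   - - - Ax             &E 6
8   - - Ax               ∃E 5, 6–7  (d not free in Ax)
9   - - Ax & Bc          &I 8, 4
10  - ∀x(Ax & Bc)        ∀I 5–9  (x not free in any open assumption)
11  - ∃y∀x(Ax & By)      ∃I 10
12  ∃y∀x(Ax & By)        ∃E 2, 3–11  (c not free in the conclusion)
```

The ∀I at line 10 is now legal: x does not occur free in the assumption At & Bc, because the witness c for B was fixed before x was introduced.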