Problem in asymptotic theory

Let $(X_i)_{i=1,\dots,n}$ be iid according to the uniform distribution $U(g-t,\, g+t)$. Assuming $g$ is known, the null hypothesis $H_0\colon t=1$ is rejected in favor of $H_1\colon t>1$ at significance level $\alpha$ when:

$$\max_{1\le i\le n} |X_i-g| > 1+\frac{1}{n}\log(1-\alpha). \tag{1}$$

i) Show that the test with rejection region (1) has asymptotic level $\alpha$.
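For (i), here is a quick Monte Carlo sanity check I ran (a sketch using NumPy; the function name and parameters are my own). Note that $\log(1-\alpha)<0$, so the cutoff in (1) is slightly below $1$:

```python
import numpy as np

def rejection_rate(n, alpha=0.05, g=0.0, t=1.0, reps=100_000, seed=0):
    """Monte Carlo estimate of P(reject) for the test with rejection
    region max_i |X_i - g| > 1 + log(1 - alpha)/n, with X_i ~ U(g-t, g+t)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(g - t, g + t, size=(reps, n))
    stat = np.abs(x - g).max(axis=1)       # max_i |X_i - g|
    cutoff = 1.0 + np.log(1.0 - alpha) / n
    return (stat > cutoff).mean()

# Under H0 (t = 1), |X_i - g| ~ U(0, 1), so the exact level is
# 1 - (1 + log(1 - alpha)/n)^n  ->  1 - (1 - alpha) = alpha.
```

For $n=100$ and $\alpha=0.05$ the simulated rejection rate is already very close to $0.05$, matching the exact level $1-(1+\log(1-\alpha)/n)^n$.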

ii) Determine the power of the test in the direction of the local alternatives

$t=1+\delta/\sqrt{n}$ and $t=1+\delta/n$, with $\delta>0$.
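For (ii), the power has a closed form when $g$ is known, since under $t$ the variables $|X_i-g|$ are $U(0,t)$; a small sketch (pure Python, names my own) that can be used to check the two local rates numerically:

```python
import math

def exact_power(n, t, alpha=0.05):
    """Exact power of test (1) when g is known: under U(g-t, g+t),
    |X_i - g| ~ U(0, t), so P(max_i |X_i - g| <= c) = (c/t)^n
    for 0 <= c <= t."""
    c = 1.0 + math.log(1.0 - alpha) / n
    ratio = min(max(c / t, 0.0), 1.0)
    return 1.0 - ratio ** n

# t = 1 + delta/n gives the nondegenerate limit 1 - (1 - alpha)*exp(-delta),
# while t = 1 + delta/sqrt(n) gives power tending to 1: the 1/n rate is the
# contiguous rate for this boundary problem, not the usual 1/sqrt(n).
```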

iii) If $g$ is unknown, replace $g$ in (1) by the midrange $\hat g_n=(X_{(1)}+X_{(n)})/2$, where $X_{(1)}$ and $X_{(n)}$ are the minimum and the maximum of the sample, respectively. Show that the test (1) with $g$ replaced by $\hat g_n$ no longer has asymptotic level $\alpha$. Provide details on the intermediate results used.

I solved the first two questions, but I can't solve the last one. Can anyone help me?
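As a sanity check for (iii), I ran a simulation (reading $\hat g_n$ as the midrange $(X_{(1)}+X_{(n)})/2$, the natural estimator of the center $g$; names are my own). Note that $\max_i |X_i-\hat g_n|$ then equals half the sample range:

```python
import numpy as np

def plugin_rejection_rate(n, alpha=0.05, g=0.0, reps=100_000, seed=0):
    """Monte Carlo level under H0 (t = 1) when g in (1) is replaced by
    the midrange (X_(1) + X_(n))/2.  The test statistic
    max_i |X_i - ghat_n| equals (X_(n) - X_(1))/2, half the range."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(g - 1.0, g + 1.0, size=(reps, n))
    stat = (x.max(axis=1) - x.min(axis=1)) / 2.0
    cutoff = 1.0 + np.log(1.0 - alpha) / n
    return (stat > cutoff).mean()
```

In my runs the rejection rate stays far below $\alpha$ even for large $n$, which is consistent with the claim that the plug-in test no longer has asymptotic level $\alpha$. The intermediate result that seems relevant is the joint behavior of the extremes: $n\,(X_{(1)}-(g-1))$ and $n\,((g+1)-X_{(n)})$ converge to independent exponential limits, which should give the limiting distribution of the rescaled range, but I don't see how to finish the argument.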