Suppose $X_1, X_2, \ldots, X_n$ are independent, each with density
$$f(x; a) = \frac{1}{\pi\left(1 + (x - a)^2\right)},$$
where $x, a \in \mathbb{R}$.

Essentially I've got to the stage in a question where I need to show that the Fisher information for $a$ is $n/2$.

I keep getting $2n$ instead, so I'm wondering whether I've made a mistake or whether there is an error in the question.

I've found that the log-likelihood is
$$\ell(a) = -n \log \pi - \sum_{i=1}^{n} \log\left(1 + (x_i - a)^2\right).$$
Differentiating twice and introducing a minus sign, I've found the observed information (after simplifying):

To find the expected value of this (the Fisher information), I multiplied by the pdf and integrated, as usual, and got that the Fisher information for one observation is

where the integral runs over $(-\infty, \infty)$. This gives an answer of $2$, since the integral evaluates to $\pi/2$.
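As a sanity check on this expectation step, here is a symbolic computation with sympy. This is a sketch under my assumed setup: the Cauchy density $f(x; a) = 1/\bigl(\pi(1 + (x - a)^2)\bigr)$, with the per-observation observed information $2\bigl(1 - (x - a)^2\bigr)/\bigl(1 + (x - a)^2\bigr)^2$ obtained from $-\partial^2 \log f / \partial a^2$, evaluated at $a = 0$ since the integral is shift-invariant:

```python
import sympy as sp

x = sp.symbols('x', real=True)

# Assumed density: standard Cauchy with location a, evaluated at a = 0
# (the expectation does not depend on a, so this loses no generality).
pdf = 1 / (sp.pi * (1 + x**2))

# Observed information for one observation, -d^2/da^2 log f, at a = 0:
obs_info = 2 * (1 - x**2) / (1 + x**2)**2

# Fisher information for one observation: E[observed information]
fisher_one = sp.integrate(obs_info * pdf, (x, -sp.oo, sp.oo))
print(fisher_one)  # 1/2
```

Under these assumptions the symbolic value of the integral-times-constant comes out to $1/2$, not $2$.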

So the Fisher information for the whole sample is $2n$, not $n/2$.
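A Monte Carlo cross-check points the same way, using the score-variance form of the information. Again this assumes the Cauchy model, for which the score of one observation is $\partial_a \log f = 2(x - a)/\bigl(1 + (x - a)^2\bigr)$, so the per-observation Fisher information equals $E[\text{score}^2]$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed density: standard Cauchy with location a; simulate at a = 0.
samples = rng.standard_cauchy(1_000_000)

# Score for one observation: d/da log f = 2(x - a) / (1 + (x - a)^2).
score = 2 * samples / (1 + samples**2)

# Fisher information per observation = Var(score) = E[score^2]
# (the score has mean 0 here by symmetry).
est = np.mean(score**2)
print(est)  # close to 0.5
```

The estimate lands near $0.5$ per observation, i.e. $n/2$ for the sample, rather than $2n$.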

Have I made a mistake somewhere above? Thanks.