I am truly stuck. I'm having trouble using the Fisher information to apply the information inequality to this question:

The question is:
Suppose that X1,...,Xn form a random sample from a Bernoulli distribution for which the parameter p is unknown. Show that the variance of every unbiased estimator of (1-p)^2 must be at least 4p(1-p)^3/n.
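
Here is what I have so far, assuming I've set it up correctly. Each observation has pmf f(x | p) = p^x (1-p)^(1-x) for x in {0, 1}, and I get the Fisher information of a single observation to be

$$ I(p) \;=\; \mathbb{E}_p\!\left[\left(\frac{\partial}{\partial p}\log f(X \mid p)\right)^{2}\right] \;=\; \mathbb{E}_p\!\left[\left(\frac{X - p}{p(1-p)}\right)^{2}\right] \;=\; \frac{1}{p(1-p)}. $$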

I've tried using the information inequality, but I don't know how to compute the Fisher information for (1-p)^2. Could someone show me how this could be done?
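
For reference, the version of the inequality I've been trying to apply is the Cramér–Rao bound for an unbiased estimator T of a function g(p) of the parameter (I'm not certain this is the right form to use here):

$$ \operatorname{Var}_p(T) \;\ge\; \frac{\left[g'(p)\right]^{2}}{n\,I(p)}, $$

but I can't see whether I should just be plugging g(p) = (1-p)^2 into this, or whether (1-p)^2 needs a Fisher information of its own.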

Thank you