Hi guys,

I have a dilemma about a log-transformed regression

$\displaystyle y=cd^x$

And so the transformation is

$\displaystyle \ln y = \ln c + x \ln d$

Assuming I have obtained the coefficient estimates of $\displaystyle \ln c$ and $\displaystyle \ln d$, I am required to test the hypothesis that $\displaystyle c = 1$ and $\displaystyle d = 1$ by building 95% confidence intervals using $\displaystyle \ln c$ and $\displaystyle \ln d$.

I am unsure whether to transform back or keep the interval on the log scale. Here is my thinking:

1) I use the standard errors of the estimated coefficients $\displaystyle \ln c$ and $\displaystyle \ln d$ from the regression output and build a confidence interval:

$\displaystyle \ln c \pm SE_{\ln c} \times t^*$, where $\displaystyle t^*$ is the critical value of the $t$-distribution

2) Take the exponential of the endpoints to get back to the original scale, and if the resulting confidence interval contains 1, fail to reject the null hypothesis

Would this be the correct procedure?
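In code, the procedure I have in mind looks something like this (a minimal sketch in Python with NumPy/SciPy; the data here are made up, with true values $c = 2$ and $d = 1.5$, just to illustrate the two steps):

```python
import numpy as np
from scipy import stats

# Hypothetical data: y = c * d**x with c = 2, d = 1.5, times lognormal noise
rng = np.random.default_rng(0)
x = np.arange(1, 21, dtype=float)
y = 2.0 * 1.5**x * rng.lognormal(mean=0.0, sigma=0.1, size=x.size)

# Step 0: OLS on the log scale, ln y = ln c + x * ln d
ly = np.log(y)
X = np.column_stack([np.ones_like(x), x])   # intercept = ln c, slope = ln d
beta, *_ = np.linalg.lstsq(X, ly, rcond=None)
n, k = X.shape
resid = ly - X @ beta
s2 = resid @ resid / (n - k)                # residual variance
cov = s2 * np.linalg.inv(X.T @ X)
se = np.sqrt(np.diag(cov))                  # SEs of ln c and ln d
tcrit = stats.t.ppf(0.975, df=n - k)        # two-sided 95% critical value

# Step 1: CI on the log scale; Step 2: exponentiate the endpoints
for name, b, s in zip(["c", "d"], beta, se):
    lo, hi = b - tcrit * s, b + tcrit * s
    print(f"{name}: estimate {np.exp(b):.3f}, "
          f"95% CI ({np.exp(lo):.3f}, {np.exp(hi):.3f})")
```

Checking whether the exponentiated interval contains 1 is then the same as checking whether the log-scale interval contains 0, since the exponential is monotone.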

Some people have told me that this distorts the standard errors of the coefficients.

I would also like to note that the original (pre-transformation) data had no particular units; it was just raw numbers.

Thanks in advance for any feedback

Linda