Hi,

I have a scattergram of a sample of data with a regression line marked on it. The x data vary over a range of nearly 3 decades, and the y data for any given x value vary over a range of about 5 to 1. The regression line follows a cube-root law, and both axes are logarithmic.

What I would like to do is estimate the accuracy/confidence interval of the given regression line, and work out what size of sample it would take to detect a specified change in the regression line with a specified degree of confidence.
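To make the first part concrete, this is roughly the kind of calculation I had in mind, written as a sketch in Python on made-up stand-in data (the simulated x and y, the 200-point sample size, and the scatter level are placeholders, not my real sample):

import numpy as np
import statsmodels.api as sm

# Stand-in data: x spanning ~3 decades, y following a cube-root law with
# multiplicative scatter of very roughly 5:1 (the real sample would go here)
rng = np.random.default_rng(0)
x = 10 ** rng.uniform(0, 3, size=200)
y = 2.0 * x ** (1 / 3) * rng.lognormal(sigma=0.4, size=200)

# Fit log10(y) = b0 + b1 * log10(x); for a cube-root law b1 should be near 1/3
X = sm.add_constant(np.log10(x))
fit = sm.OLS(np.log10(y), X).fit()

print(fit.params)                 # fitted intercept and slope
print(fit.conf_int(alpha=0.05))   # 95% confidence intervals for both coefficients

# Pointwise 95% confidence band for the regression line across the x range
grid = sm.add_constant(np.log10(np.logspace(0, 3, 50)))
band = fit.get_prediction(grid).summary_frame(alpha=0.05)
print(band[["mean", "mean_ci_lower", "mean_ci_upper"]].head())

For the sample-size part, my rough guess is that the standard error of the fitted slope shrinks like 1/sqrt(n) for a sample with a comparable spread of x, so one could scale up from the fitted standard errors to the n needed to resolve a given change, but I don't know whether that reasoning is legitimate here.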

How would I go about attempting this? Should I use the coordinates of the data as they are, or would it make more sense to use linear coordinates measured off the graph? My intuition has always made me uncomfortable about the validity of concepts such as standard deviation and Gaussian distributions when logarithmic scales are used to accommodate a range of variation that is very large relative to the mean.
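In case it clarifies what I mean by the choice of coordinates, this is the comparison I was thinking of making, again only as a sketch on the same kind of stand-in data:

import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

# Stand-in data as before (the real sample would replace this)
rng = np.random.default_rng(1)
x = 10 ** rng.uniform(0, 3, size=200)
y = 2.0 * x ** (1 / 3) * rng.lognormal(sigma=0.4, size=200)

# Option 1: fit y = a * x**(1/3) on the raw scale, i.e. assume additive scatter
popt, _ = curve_fit(lambda xv, a: a * xv ** (1 / 3), x, y)
resid_raw = y - popt[0] * x ** (1 / 3)

# Option 2: fit log10(y) = b + (1/3) * log10(x), i.e. assume multiplicative scatter
b = np.mean(np.log10(y) - (1 / 3) * np.log10(x))
resid_log = np.log10(y) - (b + (1 / 3) * np.log10(x))

# A normality check on each set of residuals to see on which scale the usual
# standard-deviation / Gaussian machinery looks more defensible
print("raw-scale residuals:", stats.shapiro(resid_raw))
print("log-scale residuals:", stats.shapiro(resid_log))

Is that a sensible way to decide which scale to work on, or is there a more standard approach?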

Thanks.