I have a claim that seems very logical, yet is hard to prove. Namely: if nonlinear regression is performed on data affected only by normally distributed noise, using a corresponding (and of course nonlinear) model whose parameters are highly correlated, the regression will yield parameters with larger variances (equivalently, wider confidence intervals) than when an equivalent model whose parameters are less correlated or uncorrelated is employed.
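To make the claim concrete, here is a small simulation sketch (my own illustration, not from any of the books I found): the same exponential data are fitted with the raw model a*exp(b*x) and with the centered reparameterization a*exp(b*(x - x̄)), which is a standard trick for reducing the correlation between the scale and rate parameters. The parameter names, seed, and noise level are arbitrary choices of mine.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 50)
y = 2.0 * np.exp(0.3 * x) + rng.normal(0.0, 0.5, size=x.size)  # Gaussian noise only

def model_raw(x, a, b):
    # raw parameterization: a and b end up highly (negatively) correlated
    return a * np.exp(b * x)

xbar = x.mean()
def model_centered(x, a, b):
    # centered parameterization: a now estimates the curve value at xbar
    return a * np.exp(b * (x - xbar))

p_raw, cov_raw = curve_fit(model_raw, x, y, p0=[1.0, 0.1])
p_cen, cov_cen = curve_fit(model_centered, x, y, p0=[10.0, 0.1])

def param_corr(cov):
    # convert covariance matrix to a correlation coefficient between a and b
    d = np.sqrt(np.diag(cov))
    return cov[0, 1] / (d[0] * d[1])

# relative standard error of the scale parameter in each parameterization
rel_se_raw = np.sqrt(cov_raw[0, 0]) / p_raw[0]
rel_se_cen = np.sqrt(cov_cen[0, 0]) / p_cen[0]

print(f"corr(a,b) raw:      {param_corr(cov_raw):+.3f}")
print(f"corr(a,b) centered: {param_corr(cov_cen):+.3f}")
print(f"relative SE of a, raw:      {rel_se_raw:.4f}")
print(f"relative SE of a, centered: {rel_se_cen:.4f}")
```

In runs like this the raw fit shows |corr(a,b)| close to 1 and a much larger relative standard error on the scale parameter than the centered fit, even though both models describe the data equally well; the slope parameter b is identical in both forms. This is exactly the variance-inflation effect the claim describes.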
I searched a ton of literature (books and papers), and the closest I could get was page 109 of:
Nonlinear regression - Google Books
Do you have any ideas where to find more on this topic, or how to stand more firmly behind the statement? I am writing a paper (the topic is solid-state electronics) and I would not like reviewers to call this into question.