This is a point of confusion for me...
When r = 1 or -1 for a set of bivariate data, does that mean all the points must fall on the SD line (which has slope +SDy/SDx or -SDy/SDx and passes through the point of averages)? And conversely, if all the points fall on the SD line, does that mean r is 1 or -1?
I'm just a little confused on how the SD line relates to correlation. Thanks!
Conversion to standard units: (x-value - x-average)/SDx
So standard units is the number of standard deviations a particular value is above or below the average. To reword my question: if all points are perfectly correlated (r = 1), then if one of the x-values is one SDx above the x-average, its corresponding y-value is also one SDy above the y-average (and so on for every point). Is this true?
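One way I convinced myself is to check it numerically. The sketch below (made-up data, not from my book) puts every y exactly on the +SD line — each y is the same number of SDs above or below its average as the corresponding x — and then computes r:

```python
import numpy as np

# Made-up x-values (purely illustrative)
x = np.array([1.0, 2.0, 4.0, 7.0, 11.0])

# Convert x to standard units: (x-value - x-average)/SDx
zx = (x - x.mean()) / x.std()

# Construct y so every point sits exactly on the +SD line:
# y-average = 5, SDy = 3, and each y is zx standard units from its average
y = 5.0 + 3.0 * zx

r = np.corrcoef(x, y)[0, 1]
print(round(r, 6))  # r for points that all lie on the SD line
```

Since y is an increasing linear function of x here, the printed r comes out as 1.0 (and flipping the sign of the slope gives -1.0), which matches the intuition in the question.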
Sorry, my book must use different terminology