Page:Proceedings of the Royal Society of London Vol 60.djvu/516

… r is Bravais' value of the coefficient of correlation. Rewriting b₁ in terms of these symbols, we have

b₁ = r σ₁/σ₂ .................................... (3).

respectively minima, whatever be the form of the correlation between the two variables. Again, whatever the form of the correlation, if the regression be really linear, the equations to the lines of regression are those given above (as we pointed out in the introduction). This theorem admits of a very simple and direct geometrical proof. Let n be the number of correlated pairs in any one array taken parallel to the axis of x, and let θ be the angle that the line of regression makes with the axis of y. Then, for a single array,

S(xy) = y S(x) = n y² tan θ,

or, extending the significance of S to summation over the whole surface,

S(xy) = N σ₂² tan θ, that is, tan θ = r σ₁/σ₂.
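The geometrical argument may be set out step by step in modern notation (a recap, not part of the original text; θ is the angle the regression line of x on y makes with the axis of y, so that the line is x = y tan θ):

```latex
% For a single array taken parallel to the axis of x
% (n pairs, all sharing the same ordinate y):
S(xy) = y\,S(x) = y \cdot n\,y\tan\theta = n y^{2}\tan\theta .

% Summing over all arrays, and using S(y^2) = N\sigma_2^{2}:
S(xy) = \tan\theta\, S(y^{2}) = N\sigma_2^{2}\tan\theta .

% Comparing with S(xy) = r N \sigma_1 \sigma_2:
\tan\theta = r\,\frac{\sigma_1}{\sigma_2}.
```

Nothing in the argument appeals to the form of the frequency surface, only to the linearity of the regression.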

In any case, then, where the regression appears to be linear, the formulæ may be used at once without troubling to investigate the normality of the distribution. The exponential character of the surface appears to have nothing whatever to do with the result.
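The irrelevance of normality is easy to check numerically. Below is a minimal sketch (my own illustration, not from the paper; the data-generating choices are arbitrary) using markedly non-normal, exponential variates: the least-squares slope of x on y coincides with r σ₁/σ₂, since b₁ = S(xy)/S(y²) = r σ₁/σ₂ is a purely algebraic identity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two correlated but strongly skewed (exponential) variables --
# nothing like a normal correlation surface.
u = rng.exponential(1.0, 100_000)
v = rng.exponential(1.0, 100_000)
y = u
x = 0.6 * u + v          # true regression of x on y has slope 0.6

# Work with deviations from the means, as in the text.
dx, dy = x - x.mean(), y - y.mean()

# Least-squares regression coefficient of x on y: b1 = S(xy)/S(y^2).
b1 = (dx * dy).sum() / (dy ** 2).sum()

# Bravais' r and the two standard deviations.
r = (dx * dy).sum() / np.sqrt((dx ** 2).sum() * (dy ** 2).sum())
s1, s2 = x.std(), y.std()

print(b1, r * s1 / s2)   # identical, whatever the distribution
```

The two printed numbers agree to machine precision, and both lie near the true slope 0.6, even though the surface is exponential rather than normal.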

To return, again, to the most general case, we see that both coefficients of regression must have the same sign, namely, the sign of r. Hence, either regression will serve to indicate whether there is correlation or no, for there is no reason, à priori, why the values of b₁ and b₂, as determined above, should be positive rather than negative. But, nevertheless, the regressions are not convenient measures of correlation, for, on comparing two similar cases, we may find, say,
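That the two regressions always share the sign of r, while their product is r², can likewise be seen numerically. A small sketch (my own illustration, with an arbitrary negatively correlated skew pair, not data from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# A skew, negatively correlated pair (gamma variates; choices arbitrary).
y = rng.gamma(2.0, 1.0, 50_000)
x = -0.4 * y + rng.gamma(2.0, 1.0, 50_000)

dx, dy = x - x.mean(), y - y.mean()
b1 = (dx * dy).sum() / (dy ** 2).sum()   # regression of x on y
b2 = (dx * dy).sum() / (dx ** 2).sum()   # regression of y on x
r = (dx * dy).sum() / np.sqrt((dx ** 2).sum() * (dy ** 2).sum())

# Both coefficients carry the sign of r, and b1 * b2 = r^2.
print(np.sign(b1) == np.sign(b2) == np.sign(r))
print(abs(b1 * b2 - r ** 2) < 1e-12)
```

Both printed lines are True: the shared sign follows because b₁ and b₂ each have S(xy) as numerator over a positive sum of squares, and b₁b₂ = r² by the same algebra.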