Page:Proceedings of the Royal Society of London Vol 60.djvu/515

…tion this “ characteristic relation ” must become the “ equation of regression ” which gives the means of any $x$-array, as only in this way can $S(d^2)$ be made a minimum, i.e., zero.

It might be said that it would be more natural to form a “ characteristic relation ” between the absolute values of the variables and not their deviations from the mean. This may, however, be most conveniently done by working with the mean as origin until the characteristic is obtained, and then transferring the equation to zero as origin. It would be much more laborious and would only lead to the same result if zero were used ab initio as origin.

We may now proceed to the discussion of the special cases of two, three, or more variables. The actual formulae obtained are not, it will be found, novel in themselves, but throw an unexpected light on the meaning of the expressions previously given by Bravais* for the case of normal correlation.

(1) Case of Two Variables.—Since $x$ and $y$ represent deviations from their respective means, we have, using S to denote summation over the whole surface, $S(x) = S(y) = 0$.

The characteristic or regression equations which we have to find are of the form

$$x = a_1 + b_1 y, \qquad y = a_2 + b_2 x.$$

Taking the equation for $x$ first, the normal equations for $a_1$ and $b_1$ are

$$S(x) = N a_1 + b_1 S(y),$$
$$S(xy) = a_1 S(y) + b_1 S(y^2),$$

$N$ being the total number of correlated pairs. From the first of these equations we have at once $a_1 = 0$. From the second,

$$b_1 = \frac{S(xy)}{S(y^2)}.$$
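The least-squares step above is easy to verify numerically. The following sketch (a modern illustration; the data are invented for the example and not taken from the paper) forms the deviations from the means, applies the two normal equations, and recovers $a_1 = 0$ and $b_1 = S(xy)/S(y^2)$:

```python
# Illustrative data (hypothetical, chosen only to exercise the formulae).
xs_raw = [1.0, 2.0, 4.0, 7.0, 6.0]
ys_raw = [2.0, 1.0, 5.0, 8.0, 9.0]
N = len(xs_raw)

mx = sum(xs_raw) / N
my = sum(ys_raw) / N
x = [v - mx for v in xs_raw]   # deviations from the mean, so S(x) = 0
y = [v - my for v in ys_raw]   # deviations from the mean, so S(y) = 0

S_xy = sum(a * b for a, b in zip(x, y))   # S(xy)
S_yy = sum(b * b for b in y)              # S(y^2)

# First normal equation: S(x) = N*a1 + b1*S(y); both sums vanish, so a1 = 0.
a1 = 0.0
# Second normal equation: S(xy) = a1*S(y) + b1*S(y^2), giving b1 = S(xy)/S(y^2).
b1 = S_xy / S_yy
```

Working with deviations rather than absolute values is what makes the first normal equation collapse to $a_1 = 0$, exactly as the text argues.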

To simplify our notation let us write

$$S(x^2) = N\sigma_1^2, \qquad S(y^2) = N\sigma_2^2, \qquad S(xy) = N r \sigma_1 \sigma_2.$$

$\sigma_1$ and $\sigma_2$ are then the two standard-deviations or errors of mean
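The notation just introduced can be sketched numerically in the same way (again with invented illustrative data; `s1`, `s2` stand for $\sigma_1$, $\sigma_2$). The check at the end uses the standard identity $b_1 = S(xy)/S(y^2) = r\,\sigma_1/\sigma_2$, which follows at once from the definitions above:

```python
import math

# Illustrative data (hypothetical), reduced to deviations from the means.
xs_raw = [1.0, 2.0, 4.0, 7.0, 6.0]
ys_raw = [2.0, 1.0, 5.0, 8.0, 9.0]
N = len(xs_raw)
mx = sum(xs_raw) / N
my = sum(ys_raw) / N
x = [v - mx for v in xs_raw]
y = [v - my for v in ys_raw]

s1 = math.sqrt(sum(v * v for v in x) / N)   # S(x^2) = N*s1^2
s2 = math.sqrt(sum(v * v for v in y) / N)   # S(y^2) = N*s2^2
S_xy = sum(a * b for a, b in zip(x, y))
r = S_xy / (N * s1 * s2)                    # S(xy) = N*r*s1*s2

# Regression coefficient from the normal equations, for comparison:
b1 = S_xy / sum(v * v for v in y)
# b1 and r*s1/s2 coincide, term by term, by substituting the definitions.
```

The coefficient $r$ defined this way is, of course, the correlation coefficient of the normal theory discussed in the papers cited below.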

* “ Mémoires par divers Savants,” 1846, p. 255, and Professor Pearson’s paper on “ Regression, Heredity, &c.,” ‘ Phil. Trans.,’ A, vol. 187 (1896), p. 261 et seq.