
 expression for the determinant becomes $$\Sigma(-)^k a_{1\beta} a_{2\alpha} a_{3\gamma} ... a_{n\nu}$$, viz. $$\alpha$$ and $$\beta$$ are transposed, and it is clear that the number of transpositions necessary to convert the permutation $$\beta \alpha \gamma ... \nu$$ of the second suffixes to the natural order is changed by unity. Hence the transposition of columns merely changes the sign of the determinant. Similarly it is shown that the transposition of any two columns or of any two rows merely changes the sign of the determinant.

Theorem.—Interchange of any two rows or of any two columns merely changes the sign of the determinant.

Corollary.—If any two rows or any two columns of a determinant be identical the value of the determinant is zero.
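The theorem and its corollary admit a quick numerical check. The following sketch (Python; the matrix and the helper `det` are illustrative, `det` forming the sum of signed products over all permutations of the column suffixes) verifies both the change of sign and the vanishing:

```python
from itertools import permutations

def det(m):
    """Determinant as the sum of signed products: each permutation of
    the column suffixes contributes its product, with sign fixed by
    the number of transpositions (inversions)."""
    n = len(m)
    total = 0
    for p in permutations(range(n)):
        sign = 1
        for a in range(n):
            for b in range(a + 1, n):
                if p[a] > p[b]:
                    sign = -sign
        prod = 1
        for r in range(n):
            prod *= m[r][p[r]]
        total += sign * prod
    return total

M = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]

# Interchange of two rows changes only the sign.
assert det([M[1], M[0], M[2]]) == -det(M)
# Interchange of two columns likewise.
assert det([[r[1], r[0], r[2]] for r in M]) == -det(M)
# Two identical rows: the determinant equals its own negative, hence zero.
assert det([M[0], M[0], M[2]]) == 0
```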

Minors of a Determinant.—From the value of $$\Delta$$ we may separate those members which contain a particular element $$a_{ik}$$ as a factor, and write the portion $$a_{ik} \text{A}_{ik}$$; $$\text{A}_{ik}$$, the cofactor of $$a_{ik}$$, is called a minor of order $$n-1$$ of the determinant.

Now $$a_{11} \text{A}_{11} = \Sigma\pm a_{11} a_{22} a_{33} ... a_{nn}$$, wherein $$a_{11}$$ is not to be changed, but the second suffixes in the product $$a_{22} a_{33} ... a_{nn}$$ assume all permutations, the number of transpositions necessary determining the sign to be affixed to the member.

Hence $$a_{11} \text{A}_{11} = a_{11} \Sigma\pm a_{22} a_{33} ... a_{nn}$$, where the cofactor of $$a_{11}$$ is clearly the determinant obtained by erasing the first row and the first column.

Hence $\text{A}_{11} = \begin{vmatrix} a_{22} & a_{23} & ... & a_{2n} \\ a_{32} & a_{33} & ... & a_{3n} \\. & . & ... & . \\ a_{n2} & a_{n3} & ... & a_{nn} \end{vmatrix}$

Similarly $$\text{A}_{ik}$$, the cofactor of $$a_{ik}$$, is shown to be the product of $$(-)^{i+k}$$ and the determinant obtained by erasing from $$\Delta$$ the ith row and kth column. No member of a determinant can involve more than one element from the first row. Hence we have the development

$\Delta = a_{11}\text{A}_{11} + a_{12}\text{A}_{12} + a_{13}\text{A}_{13} + ... + a_{1n}\text{A}_{1n}$,

proceeding according to the elements of the first row and the corresponding minors.

Similarly we have a development proceeding according to the elements contained in any row or in any column, viz.

$\Delta = a_{i1}\text{A}_{i1} + a_{i2}\text{A}_{i2} + ... + a_{in}\text{A}_{in} = a_{1k}\text{A}_{1k} + a_{2k}\text{A}_{2k} + ... + a_{nk}\text{A}_{nk} \qquad (\text{A})$

This theory enables the evaluation of a determinant by successive reduction of the orders of the determinants involved.
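This successive reduction can be sketched as follows (Python; the function name and the example matrix are illustrative only). A determinant of order $$n$$ is developed along its first row into $$n$$ determinants of order $$n-1$$, each reduced in turn the same way:

```python
def det_laplace(m):
    """Development along the first row:
    D = a11*A11 + a12*A12 + ... + a1n*A1n,
    each cofactor A1k being the appropriately signed determinant of
    order n-1 got by erasing the first row and the kth column."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for k in range(n):
        # minor: erase the first row and the kth column
        minor = [row[:k] + row[k + 1:] for row in m[1:]]
        total += (-1) ** k * m[0][k] * det_laplace(minor)
    return total

M = [[2, 0, 1],
     [3, 5, 2],
     [1, 4, 6]]
# 2*(5*6 - 2*4) - 0*(3*6 - 2*1) + 1*(3*4 - 5*1) = 44 + 7 = 51
assert det_laplace(M) == 51
```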

Since the determinant

$\begin{vmatrix} a_{21} & a_{22} & a_{23} & ... & a_{2n} \\ a_{21} & a_{22} & a_{23} & ... & a_{2n} \\ a_{31} & a_{32} & a_{33} & ... & a_{3n} \\. & . & . & ... & . \\ a_{n1} & a_{n2} & a_{n3} & ... & a_{nn} \end{vmatrix}$, having two identical rows,

vanishes identically; we have by development according to the elements of the first row

$a_{21}\text{A}_{11} + a_{22}\text{A}_{12} + a_{23}\text{A}_{13} + ... + a_{2n}\text{A}_{1n} = 0$;

and, in general, since

$a_{i1}\text{A}_{i1} + a_{i2}\text{A}_{i2} + a_{i3}\text{A}_{i3} + ... + a_{in}\text{A}_{in} = \Delta$,

if we suppose the ith and kth rows identical

$a_{k1}\text{A}_{i1} + a_{k2}\text{A}_{i2} + a_{k3}\text{A}_{i3} + ... + a_{kn}\text{A}_{in} = 0 \qquad (k \gtrless i)$;

and proceeding by columns instead of rows,

$a_{1i}\text{A}_{1k} + a_{2i}\text{A}_{2k} + a_{3i}\text{A}_{3k} + ... + a_{ni}\text{A}_{nk} = 0 \qquad (k \gtrless i)$

identical relations always satisfied by these minors.
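These identical relations, together with the genuine developments, can be verified on a numerical example — a sketch (Python; `det` and `cofactor` are illustrative helpers with 0-based indices):

```python
from itertools import permutations

def det(m):
    """Leibniz sum of signed products over permutations of the suffixes."""
    n = len(m)
    total = 0
    for p in permutations(range(n)):
        sign = 1
        for a in range(n):
            for b in range(a + 1, n):
                if p[a] > p[b]:
                    sign = -sign
        prod = 1
        for r in range(n):
            prod *= m[r][p[r]]
        total += sign * prod
    return total

def cofactor(m, i, k):
    """A_ik: (-1)^(i+k) times the determinant with row i, column k erased."""
    minor = [row[:k] + row[k + 1:] for r, row in enumerate(m) if r != i]
    return (-1) ** (i + k) * det(minor)

M = [[3, 1, 4],
     [1, 5, 9],
     [2, 6, 5]]
d = det(M)
for i in range(3):
    for k in range(3):
        # Row k against the minors of row i: equals D when k = i, else 0.
        assert sum(M[k][j] * cofactor(M, i, j) for j in range(3)) == (d if k == i else 0)
        # The same proceeding by columns.
        assert sum(M[j][k] * cofactor(M, j, i) for j in range(3)) == (d if k == i else 0)
```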

If in the first relation of $$(A)$$ we write $$a_{is} = b_{is} + c_{is} + d_{is} + ...$$ we find that $$\Sigma a_{is}\text{A}_{is} = \Sigma b_{is}\text{A}_{is} + \Sigma c_{is}\text{A}_{is} + \Sigma d_{is}\text{A}_{is} + ...$$ so that $$\Delta$$ breaks up into a sum of determinants, and we also obtain a theorem for the addition of determinants which have $$n-1$$ rows in common. If we multiply the elements of the second row by an arbitrary magnitude $$\lambda$$, and add to the corresponding elements of the first row, $$\Delta$$ becomes $$\Sigma a_{1s}\text{A}_{1s} + \lambda \Sigma a_{2s}\text{A}_{1s} = \Delta$$, showing that the value of the determinant is unchanged. In general we can prove in the same way the—

Theorem.—The value of a determinant is unchanged if we add to the elements of any row or column the corresponding elements of the other rows or other columns respectively each multiplied by an arbitrary magnitude, such magnitude remaining constant in respect of the elements in a particular row or a particular column.

Observation.—Every factor common to all the elements of a row or of a column is obviously a factor of the determinant, and may be taken outside the determinant brackets.
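Both the theorem and the observation admit a direct numerical check — a sketch (Python; the matrix and the arbitrary magnitudes $$\lambda$$, $$\mu$$ are our own choices):

```python
from itertools import permutations

def det(m):
    """Determinant as the sum of signed products over permutations."""
    n = len(m)
    total = 0
    for p in permutations(range(n)):
        sign = 1
        for a in range(n):
            for b in range(a + 1, n):
                if p[a] > p[b]:
                    sign = -sign
        prod = 1
        for r in range(n):
            prod *= m[r][p[r]]
        total += sign * prod
    return total

M = [[1, 3, 2],
     [4, 1, 5],
     [2, 6, 3]]
lam, mu = 7, -2  # arbitrary magnitudes, one for each added row

# Add lam*(second row) + mu*(third row) to the first row: value unchanged.
M2 = [[M[0][j] + lam * M[1][j] + mu * M[2][j] for j in range(3)], M[1], M[2]]
assert det(M2) == det(M)

# A factor common to every element of a row comes outside the determinant.
M3 = [[5 * x for x in M[0]], M[1], M[2]]
assert det(M3) == 5 * det(M)
```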

The minor $$\text{A}_{ik}$$ is $$\frac{\partial \Delta}{\partial a_{ik}}$$, and is itself a determinant of order $$n-1$$. We may therefore differentiate again in regard to any element $$a_{rs}$$ where $$r \gtrless i$$, $$s \gtrless k$$; we thus obtain a minor of $$\text{A}_{ik}$$, which is a minor also of $$\Delta$$ of order $$n-2$$. It will be $$\text{A}_{ik \atop rs} = \frac{\partial \text{A}_{ik}}{\partial a_{rs}} = \frac{\partial^2 \Delta}{\partial a_{ik} \partial a_{rs}}$$ and will be obtained by erasing from the determinant $$\text{A}_{ik}$$ the row and column containing the element $$a_{rs}$$; this was originally the rth row and the sth column of $$\Delta$$; the rth row of $$\Delta$$ is the rth or (r-1)th row of $$\text{A}_{ik}$$ according as $$r \lessgtr i$$, and the sth column of $$\Delta$$ is the sth or (s-1)th column of $$\text{A}_{ik}$$ according as $$s \lessgtr k$$. Hence, if $$T_{ri}$$ denote the number of transpositions necessary to bring the succession $$ri$$ into ascending order of magnitude, the sign to be attached to the determinant arrived at by erasing the ith and rth rows and the kth and sth columns from $$\Delta$$ in order to produce $$\text{A}_{ik \atop rs}$$ will be $$-1$$ raised to the power of $$T_{ri} + T_{sk} + i + k + r + s$$.
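The sign rule can be tested by comparing it against the second differential coefficient read directly off the sum of signed products — a sketch (Python, with 0-based indices, whose parity in $$i+k+r+s$$ agrees with the 1-based rule; all helper names are ours):

```python
from itertools import permutations
from math import prod

def perm_sign(p):
    """Sign fixed by the number of inversions of the permutation."""
    s = 1
    for a in range(len(p)):
        for b in range(a + 1, len(p)):
            if p[a] > p[b]:
                s = -s
    return s

def det(m):
    n = len(m)
    return sum(perm_sign(p) * prod(m[r][p[r]] for r in range(n))
               for p in permutations(range(n)))

def second_minor(m, i, k, r, s):
    """A_{ik,rs} = d^2(Delta)/d(a_ik)d(a_rs), read off the Leibniz sum
    by fixing suffix k in row i and suffix s in row r (r != i, s != k)."""
    n = len(m)
    return sum(perm_sign(p) * prod(m[row][p[row]]
                                   for row in range(n) if row not in (i, r))
               for p in permutations(range(n)) if p[i] == k and p[r] == s)

def second_minor_by_rule(m, i, k, r, s):
    """Erase rows i, r and columns k, s; attach the sign
    (-1)^(T_ri + T_sk + i + k + r + s), T being 1 when the pair stands
    in descending order and 0 otherwise."""
    erased = [[x for j, x in enumerate(row) if j not in (k, s)]
              for t, row in enumerate(m) if t not in (i, r)]
    T = (1 if r > i else 0) + (1 if s > k else 0)
    return (-1) ** (T + i + k + r + s) * det(erased)

M = [[2, 7, 1, 8],
     [2, 8, 1, 8],
     [4, 5, 9, 0],
     [4, 5, 2, 3]]
for quad in [(0, 0, 1, 1), (1, 2, 3, 0), (0, 1, 3, 2), (2, 3, 1, 0)]:
    assert second_minor(M, *quad) == second_minor_by_rule(M, *quad)
```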

Similarly proceeding to the minors of order $$n-3$$, we find that $$\text{A}_{ik \atop{rs \atop tu}} = \frac{ \partial } { \partial a_{tu} } \text{A}_{ik \atop rs} = \frac{ \partial^2 }{ \partial a_{rs} \partial a_{tu} } \text{A}_{ik} = \frac{ \partial^3 }{ \partial a_{ik} \partial a_{rs} \partial a_{tu}} \Delta$$ is obtained from $$\Delta$$ by erasing the ith, rth, tth rows and the kth, sth, uth columns, and multiplying the resulting determinant by $$-1$$ raised to the power $$T_{tri} + T_{usk} + i + k + r + s + t + u$$; and the general law is clear.

Corresponding Minors.—In obtaining the minor $$\text{A}_{ik \atop rs}$$ in the form of a determinant we erased certain rows and columns, and we would have erased in an exactly similar manner had we been forming the determinant associated with $$\text{A}_{is \atop rk}$$, since the deleting lines intersect in two pairs of points. In the latter case the sign is determined by $$-1$$ raised to the same power as before, with the exception that $$T_{ks}$$ replaces $$T_{sk}$$; but if one of these numbers be even the other must be uneven; hence

$\text{A}_{ik \atop rs} = -\text{A}_{is \atop rk}$.

Moreover

$a_{ik} a_{rs} \text{A}_{ik \atop rs} + a_{is} a_{rk} \text{A}_{is \atop rk} = \begin{vmatrix} a_{ik} & a_{is} \\ a_{rk} & a_{rs} \end{vmatrix} \text{A}_{ik \atop rs}$,

where the determinant factor is given by the four points in which the deleting lines intersect. This determinant and that associated with $$\text{A}_{ik \atop rs}$$ are termed corresponding determinants. Similarly $$p$$ lines of deletion intersecting in $$p^2$$ points yield corresponding determinants of orders $$p$$ and $$n - p$$ respectively. Recalling the formula

$\Delta = a_{11}\text{A}_{11} + a_{12}\text{A}_{12} + a_{13}\text{A}_{13} + ... + a_{1n}\text{A}_{1n}$,

it will be seen that $$a_{1k}$$ and $$\text{A}_{1k}$$ involve corresponding determinants. Since $$\text{A}_{1k}$$ is a determinant we similarly obtain

$\text{A}_{1k} = a_{21}\text{A}_{1k \atop 21} + ... + a_{2, k-1}\text{A}_{1, k \atop 2, k-1} + a_{2, k+1}\text{A}_{1, k \atop 2, k+1} + ... + a_{2, n} \text{A}_{1, k \atop 2, n}$,

and thence

$\Delta = \Sigma a_{1k} a_{2s} \text{A}_{1k \atop 2s} \qquad (s \gtrless k)$,

and as before

$\Delta = \sum_{i, k} \begin{vmatrix} a_{1i} & a_{2i} \\ a_{1k} & a_{2k} \end{vmatrix} \text{A}_{1i \atop 2k} \quad i > k$,

an important expansion of $$\Delta$$.
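This expansion also checks out on a numerical example — a sketch (Python; the second minors $$\text{A}_{1i \atop 2k}$$ are read off the sum of signed products by fixing the suffixes of the first two rows, all indices 0-based, matrix and names ours):

```python
from itertools import permutations
from math import prod

def perm_sign(p):
    s = 1
    for a in range(len(p)):
        for b in range(a + 1, len(p)):
            if p[a] > p[b]:
                s = -s
    return s

def det(m):
    n = len(m)
    return sum(perm_sign(p) * prod(m[r][p[r]] for r in range(n))
               for p in permutations(range(n)))

def A12(m, i, k):
    """A_{1i,2k} = d^2(Delta)/d(a_1i)d(a_2k): fix suffix i in the first
    row and suffix k in the second, sum the remaining signed products."""
    n = len(m)
    return sum(perm_sign(p) * prod(m[r][p[r]] for r in range(2, n))
               for p in permutations(range(n)) if p[0] == i and p[1] == k)

M = [[1, 4, 2, 7],
     [3, 1, 5, 2],
     [6, 2, 8, 1],
     [2, 9, 3, 4]]
n = 4
# Delta = sum over i > k of |a_1i a_2i; a_1k a_2k| * A_{1i,2k}
expansion = sum((M[0][i] * M[1][k] - M[1][i] * M[0][k]) * A12(M, i, k)
                for i in range(n) for k in range(i))
assert expansion == det(M)
```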

Similarly

and the general theorem is manifest, and yields a development in a sum of products of corresponding determinants. If the jth column be identical with the ith the determinant $$\Delta$$ vanishes identically; hence if $$j$$ be not equal to $$i$$, $$k$$ or $$r$$,

Similarly, by putting one or more of the deleted rows or columns equal to rows or columns which are not deleted, we obtain, with Laplace, a number of identities between products of determinants of complementary orders.

Multiplication.—From the theorem given above for the expansion of a determinant as a sum of products of pairs of corresponding determinants it will be plain that the product of $$\Delta = (a_{11}, a_{22}, ... a_{nn})$$ and $$D = (b_{11}, b_{22}, ... b_{nn})$$ may be written as a determinant of order $$2n$$, viz.

$\begin{vmatrix} a_{11} & a_{21} & a_{31} & ... & a_{n1} & -1 & 0 & 0 & ...& 0 \\ a_{12} & a_{22} & a_{32} & ... & a_{n2} & 0 & -1 & 0 & ...& 0 \\ a_{13} & a_{23} & a_{33} & ... & a_{n3} & 0 & 0 & -1 & ... & 0 \\. & . & . & ... & . & . & . & . & ... & . \\ a_{1n} & a_{2n} & a_{3n} & ... & a_{nn} & 0 & 0 & 0 & ... & -1 \\ 0 & 0 & 0 & ... & 0 & b_{11} & b_{12} & b_{13} & ...& b_{1n} \\ 0 & 0 & 0 & ... & 0 & b_{21} & b_{22} & b_{23} & ... & b_{2n} \\ 0 & 0 & 0 & ... & 0 & b_{31} & b_{32} & b_{33} & ... & b_{3n} \\. & . & . & ... & . & . & . & . & ... & . \\ 0 & 0 & 0 & ... & 0 & b_{n1} & b_{n2} & b_{n3} & ... & b_{nn} \\ \end{vmatrix} ~ \begin{matrix} = \begin{vmatrix} \text{A} & \text{B} \\ \text{C} & \text{D} \end{vmatrix} \\ \text{for brevity.} \end{matrix}$
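That this determinant of order $$2n$$ equals $$\Delta D$$ may be checked numerically for a small case — a sketch (Python; the block is assembled exactly as printed, the a-array transposed in the upper-left corner, $$-1$$ down the diagonal of the upper-right block, zeros below, and the b-array in the lower-right):

```python
from itertools import permutations
from math import prod

def det(m):
    """Determinant as the sum of signed products over permutations."""
    n = len(m)
    total = 0
    for p in permutations(range(n)):
        sign = 1
        for a in range(n):
            for b in range(a + 1, n):
                if p[a] > p[b]:
                    sign = -sign
        total += sign * prod(m[r][p[r]] for r in range(n))
    return total

A = [[1, 2, 3],
     [4, 5, 7],
     [2, 1, 6]]
B = [[2, 7, 1],
     [1, 4, 3],
     [5, 2, 2]]
n = 3
# Row i of the top half carries column i of A, then -1 in position n+i.
big = [[A[j][i] for j in range(n)] + [-1 if j == i else 0 for j in range(n)]
       for i in range(n)]
# The bottom half carries zeros, then the rows of B.
big += [[0] * n + row for row in B]
assert det(big) == det(A) * det(B)
```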

Multiply the 1st, 2nd ... nth rows by $$b_{11}, b_{12}, ... b_{1n}$$ respectively, and