# Linear Algebra/Laplace's Theorem

Here we present a more general theorem on the expansion of a determinant, based on more than one row or column.

Let M be a square matrix of order n. Select k rows and k columns, where 0 < k < n. The intersection of those rows and columns forms a square matrix of order k. Such a matrix is called a **minor** of order k. We shall denote such a minor with the following notation:

$$M\begin{pmatrix} i_1 & i_2 & \cdots & i_k \\ j_1 & j_2 & \cdots & j_k \end{pmatrix}$$

where $i_1 < i_2 < \cdots < i_k$ specify the selected rows and $j_1 < j_2 < \cdots < j_k$ specify the selected columns.

Now suppose that we delete the selected columns and rows. Then, again, we have a square matrix, this time of order n-k. This is called the **complementary minor** of the previous minor.
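To make the definitions concrete, here is a small Python sketch that extracts a minor and its complementary minor from a square matrix; the matrix and the index choices are illustrative examples, and indices are 0-based.

```python
def minor(M, rows, cols):
    """Submatrix of M on the selected rows and columns (0-based indices)."""
    return [[M[i][j] for j in cols] for i in rows]

def complementary_minor(M, rows, cols):
    """Submatrix obtained by deleting the selected rows and columns."""
    n = len(M)
    other_rows = [i for i in range(n) if i not in rows]
    other_cols = [j for j in range(n) if j not in cols]
    return [[M[i][j] for j in other_cols] for i in other_rows]

# An arbitrary 4x4 example: select rows 0, 2 and columns 1, 3
M = [[1, 2, 3, 4],
     [5, 6, 7, 8],
     [9, 10, 11, 12],
     [13, 14, 15, 16]]

print(minor(M, [0, 2], [1, 3]))                # [[2, 4], [10, 12]]
print(complementary_minor(M, [0, 2], [1, 3]))  # [[5, 7], [13, 15]]
```

Note that the complementary minor of a minor of order k is square of order n − k, as stated above.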

Consider first the minor $M\begin{pmatrix} 1 & 2 & \cdots & k \\ 1 & 2 & \cdots & k \end{pmatrix}$ consisting of the first k rows and columns.

Now consider all the terms of the determinant of M in which the factors taken from the first k columns also belong to the first k rows. Let such a term be denoted q. Its first k factors lie in the minor $M\begin{pmatrix} 1 & 2 & \cdots & k \\ 1 & 2 & \cdots & k \end{pmatrix}$. We shall now determine the sign in front of this term in the determinant. The first k factors are all in the minor, so the number of reversals among the first k factors is the same as the number of reversals in the corresponding term of the minor. The last $n-k$ factors are all in the complementary minor, so the number of reversals among the last $n-k$ factors is the same as the number of reversals in the corresponding term of the complementary minor. Note that there is no reversal between the first k factors and the last $n-k$ factors, since the first k factors take their row and column indices from $1, \ldots, k$ and the last $n-k$ factors from $k+1, \ldots, n$. Thus, the total number of reversals is the sum of the reversals in the minor and in the complementary minor.

Let the number of reversals in the minor be denoted $r_1$, and let the number of reversals in the complementary minor be denoted $r_2$. Then the sign of the term in the determinant of M is simply $(-1)^{r_1 + r_2} = (-1)^{r_1}(-1)^{r_2}$, so it is equal to the product of the signs of the term in the minor and the term in the complementary minor. Thus, we can conclude that the product of one term of the minor and one term of the complementary minor gives a term of the determinant of M. Moreover, every term of the determinant of M whose first k factors lie in the first k rows arises from exactly one such pair of terms. Thus, the sum of all terms whose first k factors are also in the first k rows is exactly equal to the product of the determinant of the minor and the determinant of the complementary minor.
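This claim can be checked numerically. The Python sketch below (the matrix and the choice k = 2 are arbitrary examples) sums the terms of the Leibniz permutation formula whose entries in the first k columns lie in the first k rows, and compares the result with the product of the determinants of the leading k×k minor and its complementary minor.

```python
from itertools import permutations

def sign(perm):
    # Parity of a permutation via its number of reversals (inversions).
    s = 1
    for x in range(len(perm)):
        for y in range(x + 1, len(perm)):
            if perm[x] > perm[y]:
                s = -s
    return s

def det(a):
    # Determinant by the Leibniz (permutation) formula; exact for integers.
    n = len(a)
    total = 0
    for p in permutations(range(n)):
        prod = 1
        for i in range(n):
            prod *= a[i][p[i]]
        total += sign(p) * prod
    return total

# An arbitrary 4x4 integer example with k = 2.
M = [[2, 1, 0, 3],
     [4, 5, 1, 2],
     [1, 0, 3, 1],
     [2, 2, 1, 4]]
n, k = 4, 2

# Sum of the terms whose entries in the first k columns lie in the first
# k rows, i.e. the permutation maps {0, ..., k-1} onto itself.
partial = 0
for p in permutations(range(n)):
    if all(p[i] < k for i in range(k)):
        prod = 1
        for i in range(n):
            prod *= M[i][p[i]]
        partial += sign(p) * prod

leading = [row[:k] for row in M[:k]]        # minor on the first k rows and columns
comp = [row[k:] for row in M[k:]]           # its complementary minor
assert partial == det(leading) * det(comp)  # 66 == 6 * 11
```

The filter `all(p[i] < k for i in range(k))` is exactly the condition in the text: since a permutation is a bijection, sending the first k positions into the first k columns is the same as the entries from the first k columns lying in the first k rows.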

Now we consider an arbitrary minor

$$M\begin{pmatrix} i_1 & i_2 & \cdots & i_k \\ j_1 & j_2 & \cdots & j_k \end{pmatrix}.$$

We can exchange adjacent rows and columns of the matrix M until all the rows and columns of the minor are in the top-left corner, preserving their relative order. Moving row $i_1$ to position 1 requires $i_1 - 1$ exchanges of adjacent rows, moving row $i_2$ to position 2 requires $i_2 - 2$ exchanges, and so on; the columns are moved in the same way. This requires a total of

$$(i_1 - 1) + (i_2 - 2) + \cdots + (i_k - k) + (j_1 - 1) + (j_2 - 2) + \cdots + (j_k - k) = \sum_{t=1}^{k} (i_t + j_t) - k(k+1)$$

exchanges of rows and columns. Thus, in performing the exchanges, the determinant is multiplied by $-1$ that many times, or, since $k(k+1)$ is even and even powers of $-1$ can be factored out, it is multiplied by

$$(-1)^{i_1 + i_2 + \cdots + i_k + j_1 + j_2 + \cdots + j_k}.$$

We have already proved that the sum of all terms whose k factors from the first k columns lie in the first k rows is the product of the minor with its complementary minor. Thus, after the exchanges above, the sum of all terms of the determinant whose k factors from the selected columns lie in the selected rows of the minor is the product of the determinant of the minor and the determinant of the complementary minor, multiplied by $(-1)^{i_1 + \cdots + i_k + j_1 + \cdots + j_k}$.

If we denote the complementary minor of $M\begin{pmatrix} i_1 & i_2 & \cdots & i_k \\ j_1 & j_2 & \cdots & j_k \end{pmatrix}$ as

$$\overline{M}\begin{pmatrix} i_1 & i_2 & \cdots & i_k \\ j_1 & j_2 & \cdots & j_k \end{pmatrix},$$

then we can express this sum as

$$(-1)^{i_1 + \cdots + i_k + j_1 + \cdots + j_k} \, M\begin{pmatrix} i_1 & i_2 & \cdots & i_k \\ j_1 & j_2 & \cdots & j_k \end{pmatrix} \overline{M}\begin{pmatrix} i_1 & i_2 & \cdots & i_k \\ j_1 & j_2 & \cdots & j_k \end{pmatrix}$$

or

$$M\begin{pmatrix} i_1 & i_2 & \cdots & i_k \\ j_1 & j_2 & \cdots & j_k \end{pmatrix} A\begin{pmatrix} i_1 & i_2 & \cdots & i_k \\ j_1 & j_2 & \cdots & j_k \end{pmatrix},$$

where

$$A\begin{pmatrix} i_1 & i_2 & \cdots & i_k \\ j_1 & j_2 & \cdots & j_k \end{pmatrix} = (-1)^{i_1 + \cdots + i_k + j_1 + \cdots + j_k} \, \overline{M}\begin{pmatrix} i_1 & i_2 & \cdots & i_k \\ j_1 & j_2 & \cdots & j_k \end{pmatrix}$$

is called the **cofactor** of the minor $M\begin{pmatrix} i_1 & i_2 & \cdots & i_k \\ j_1 & j_2 & \cdots & j_k \end{pmatrix}$.

Since every term of the determinant of M has exactly k factors in the selected k rows, one in each, we can fix a selection of rows and then group together all the terms whose factors in those rows pass through a given selection of k columns. We have already proven that the sum of all the terms in one group is the product of the corresponding minor with its cofactor. Since every term belongs to exactly one group, determined by the columns its factors occupy in the selected rows, the groups together contain all the terms of the determinant. Therefore, given a selection of k rows $i_1 < i_2 < \cdots < i_k$, the determinant can be expressed as the sum

$$\det M = \sum_{j_1 < j_2 < \cdots < j_k} M\begin{pmatrix} i_1 & i_2 & \cdots & i_k \\ j_1 & j_2 & \cdots & j_k \end{pmatrix} A\begin{pmatrix} i_1 & i_2 & \cdots & i_k \\ j_1 & j_2 & \cdots & j_k \end{pmatrix},$$

where the columns $j_1 < j_2 < \cdots < j_k$ vary over all possible combinations of k columns.
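As a sanity check, the expansion can be verified numerically on a small example. The Python sketch below (the matrix and the row selections are arbitrary examples) expands the determinant along a fixed set of rows, summing minor times cofactor over all column combinations, and compares the result with a direct determinant computation.

```python
from itertools import combinations, permutations

def det(a):
    # Determinant by the Leibniz (permutation) formula; exact for integers.
    n = len(a)
    total = 0
    for p in permutations(range(n)):
        sign, prod = 1, 1
        for x in range(n):
            prod *= a[x][p[x]]
            for y in range(x + 1, n):
                if p[x] > p[y]:
                    sign = -sign
        total += sign * prod
    return total

def laplace_along_rows(M, rows):
    """Expand det(M) along the fixed row set `rows` (0-based), summing
    minor * cofactor over all column selections of the same size."""
    n, k = len(M), len(rows)
    other_rows = [i for i in range(n) if i not in rows]
    total = 0
    for cols in combinations(range(n), k):
        other_cols = [j for j in range(n) if j not in cols]
        minor = [[M[i][j] for j in cols] for i in rows]
        comp = [[M[i][j] for j in other_cols] for i in other_rows]
        # Cofactor sign (-1)^(sum of row and column indices), 1-based as in the text.
        sign = (-1) ** (sum(i + 1 for i in rows) + sum(j + 1 for j in cols))
        total += det(minor) * sign * det(comp)
    return total

# An arbitrary 4x4 integer example, expanded along two different row pairs.
M = [[2, 1, 0, 3],
     [4, 5, 1, 2],
     [1, 0, 3, 1],
     [2, 2, 1, 4]]
assert laplace_along_rows(M, [0, 1]) == det(M)
assert laplace_along_rows(M, [1, 3]) == det(M)
```

Note that the familiar cofactor expansion along a single row is the special case k = 1 of this theorem.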

Of course, the analogous result, that given a selection of k columns the determinant can also be expressed as the sum of all products of minors with their cofactors, where the rows vary over all combinations of k rows, can be proved by the same method.