# 12. Bilinear spaces

### Bilinear forms

A bilinear form is a function $h\colon V \times V \to K$, where $V$ is a vector space over a field $K$, such that for any $v, w, u \in V$ and $a, b \in K$,

$$h(av + bw, u) = a\,h(v, u) + b\,h(w, u)$$

and

$$h(u, av + bw) = a\,h(u, v) + b\,h(u, w).$$

### Matrix of a bilinear form

If $\mathcal{A} = (v_1, \ldots, v_n)$ is a basis of $V$, then

$$A = [h(v_i, v_j)]_{1 \leq i, j \leq n}$$

is the matrix of $h$ with respect to $\mathcal{A}$.

Notice that

$$h(v, w) = X^{T} A Y,$$

where $X$ and $Y$ are the column coordinates of $v$ and $w$ respectively in $\mathcal{A}$.
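As a small numerical sketch (the form, basis and vectors below are my own illustrative choices, not from the text), one can build the matrix $A$ entry by entry as $h(v_i, v_j)$ and check the identity $h(v, w) = X^{T} A Y$:

```python
import numpy as np

# Illustrative bilinear form on R^2: h(v, w) = v1*w1 + 2*v1*w2 + 2*v2*w1.
def h(v, w):
    return v[0]*w[0] + 2*v[0]*w[1] + 2*v[1]*w[0]

# Standard basis e1, e2.
basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

# Matrix of h: A[i, j] = h(v_i, v_j).
A = np.array([[h(vi, vj) for vj in basis] for vi in basis])

# h(v, w) = X^T A Y, where X, Y are the coordinate columns of v and w.
v, w = np.array([3.0, -1.0]), np.array([2.0, 5.0])
assert np.isclose(v @ A @ w, h(v, w))
```

In the standard basis the coordinates of a vector are the vector itself, which keeps the check one line long.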

### Congruent matrices

Since $h(v, w) = X^{T} A Y$, one can easily notice that if $\mathcal{A}$ and $\mathcal{B}$ are bases, $C$ is the change-of-basis matrix from $\mathcal{B}$ to $\mathcal{A}$, and $X$ are the coordinates of $v$ in $\mathcal{B}$, then

$$(CX)^{T} = X^{T} C^{T}$$

gives the coordinates of this vector in $\mathcal{A}$ written horizontally. Thus,

$$h(v, w) = (CX)^{T} A\, (CY) = X^{T} (C^{T} A C)\, Y,$$

which motivates the following definition. Matrices $A$ and $B$ are congruent if and only if there exists an invertible matrix $C$ such that $B = C^{T} A C$. In other words, $A$ and $B$ are congruent if they are matrices of the same bilinear form in two different bases.

It is easy to notice that if two matrices are congruent, they have the same rank, and one of them is invertible if and only if the other is. A form is nondegenerate if its matrix (in any basis) is invertible.
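A quick numerical sketch of these invariants (the matrices below are my own illustrative choices): congruence $B = C^{T} A C$ with invertible $C$ preserves rank, and $\det B = (\det C)^2 \det A$, so invertibility is preserved as well.

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 3.0]])   # an illustrative symmetric matrix
C = np.array([[1.0, 1.0], [0.0, 2.0]])   # invertible: det C = 2
B = C.T @ A @ C                          # congruent to A

# Congruent matrices have the same rank...
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B)
# ...and det B = (det C)^2 * det A, so B is invertible iff A is.
assert np.isclose(np.linalg.det(B), np.linalg.det(C)**2 * np.linalg.det(A))
```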

A form is symmetric if and only if $h(v, w) = h(w, v)$ for any vectors $v, w$. Notice that if $A$ is the matrix of such a form in any basis, then $A$ is a symmetric matrix, i.e. $A = A^{T}$.

### Bilinear spaces

A vector space $V$ over $K$ along with a symmetric bilinear form $h$ is called a bilinear (or orthogonal) space. We shall say that a space or a subspace is nondegenerate, if the form restricted to this space is nondegenerate. In other words, a subspace $W$ is degenerate, if and only if there exists a non-zero vector $w \in W$, such that $h(w, v) = 0$ for all $v \in W$. A subspace $W$ is completely degenerate, if $h(w, v) = 0$ for all $w, v \in W$.

The rank of a bilinear space is the rank of its bilinear form.

### Perpendicularity

We shall say that $v$ is perpendicular to $w$ ($v \perp w$), if $h(v, w) = 0$. A vector perpendicular to itself is called a null vector.

If $W \subseteq V$ is a subspace, then

$$W^{\perp} = \{v \in V : h(v, w) = 0 \text{ for all } w \in W\}$$

is also a subspace.
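Concretely, if the rows of a matrix span $W$, then $W^{\perp}$ is the null space of (that matrix) $\cdot A$, since $v \in W^{\perp}$ exactly when $h(w_i, v) = 0$ for every spanning vector $w_i$. A sketch with my own illustrative choice of form and subspace:

```python
import numpy as np

A = np.diag([1.0, -1.0, 2.0])          # illustrative symmetric matrix of h
W_basis = np.array([[1.0, 1.0, 0.0]])  # W spanned by (1, 1, 0)

M = W_basis @ A                        # rows: w_i^T A;  W^perp = null space of M
_, s, Vt = np.linalg.svd(M)            # null space read off from the SVD
rank = int(np.sum(s > 1e-10))
perp_basis = Vt[rank:]                 # rows span W^perp

assert perp_basis.shape[0] == 2        # dim W^perp = 3 - dim W = 2 here
for w in W_basis:
    for v in perp_basis:
        assert np.isclose(w @ A @ v, 0.0)   # every v is h-perpendicular to W
```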

The following theorem plays an important role: $W$ is a nondegenerate subspace of a bilinear space $V$ if and only if $V = W \oplus W^{\perp}$. Indeed, if $V = W \oplus W^{\perp}$, then in particular $W \cap W^{\perp} = \{0\}$; but if $W$ is degenerate, then there exists $w \in W$, $w \neq 0$, such that $h(w, v) = 0$ for any $v \in W$, so $w \in W \cap W^{\perp}$. Conversely, we have to prove that if $W$ is nondegenerate, then for every $v \in V$ there exists a unique decomposition $v = w + u$, where $w \in W$ and $u \in W^{\perp}$. If $w_1, \ldots, w_k$ is a basis of $W$, then the existence and uniqueness of such a decomposition is equivalent to the existence and uniqueness of coefficients $a_1, \ldots, a_k$ such that $v - (a_1 w_1 + \cdots + a_k w_k) \in W^{\perp}$, which gives exactly a system of equations of the form

$$a_1 h(w_1, w_i) + a_2 h(w_2, w_i) + \cdots + a_k h(w_k, w_i) = h(v, w_i), \qquad i = 1, \ldots, k;$$

but the matrix of this system of equations is exactly the matrix of $h$ restricted to $W$, and if it is invertible (which means that $W$ is nondegenerate), there exists a unique solution to this system of equations.
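The computation in the proof can be carried out directly: solve the Gram-matrix system for the coefficients $a_i$ and check that the remainder really lands in $W^{\perp}$. The form and subspace below are my own illustrative choices.

```python
import numpy as np

A = np.eye(3)                          # illustrative form: the standard dot product
W_basis = np.array([[1.0, 0.0, 1.0],
                    [0.0, 1.0, 0.0]])  # basis w_1, w_2 of W

G = W_basis @ A @ W_basis.T            # Gram matrix G[i, j] = h(w_i, w_j)
v = np.array([2.0, 3.0, 4.0])
a = np.linalg.solve(G, W_basis @ A @ v)  # unique solution since G is invertible

w = a @ W_basis                        # the component of v lying in W
u = v - w                              # the component lying in W^perp
for wi in W_basis:
    assert np.isclose(wi @ A @ u, 0.0) # u is h-perpendicular to all of W
```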

### Orthogonal bases

Similarly to the case of Euclidean spaces, a basis $v_1, \ldots, v_n$ of $V$ is orthogonal (or perpendicular), if $h(v_i, v_j) = 0$ for $i \neq j$.

If $K$ is a field of characteristic not equal to $2$, then an orthogonal basis exists (over $\mathbb{Z}_2$ it may not exist!). The proof is inductive and shows the procedure to find such a basis. If there exists a non-null vector $v$, then we take it into the basis. By the above theorem, if $W = \langle v \rangle$, then $V = W \oplus W^{\perp}$, because obviously $W$ is nondegenerate. So we can then recursively find an orthogonal basis of $W^{\perp}$. If, on the other hand, we encounter a space in which there are no more non-null vectors, then any basis is orthogonal (every two vectors are perpendicular). Indeed, $0 = h(v + w, v + w) = h(v, v) + 2h(v, w) + h(w, w) = 2h(v, w)$, so $h(v, w) = 0$ (here we make use of the fact that we are not in characteristic $2$, so $2 \neq 0$).

### Diagonalization

Notice also that the matrix of a given form in an orthogonal basis is diagonal, so to find a diagonal matrix congruent to a given symmetric matrix, it is enough to find an orthogonal basis of the form defined by this matrix. We then get zeros everywhere except on the diagonal, where we get $h(v_i, v_i)$ for the subsequent vectors $v_i$ of the orthogonal basis.
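The inductive procedure above can be run on the matrix itself: simultaneous row and column operations realize the congruence $D = C^{T} A C$, and when all remaining diagonal entries vanish, adding one basis vector to another supplies a non-null vector (the characteristic $\neq 2$ trick). This is a sketch with a function name of my own; the hyperbolic-plane test matrix is an illustrative choice.

```python
import numpy as np

def congruence_diagonalize(A, tol=1e-10):
    """Return (D, C) with C invertible and D = C^T A C diagonal (a sketch)."""
    A = A.astype(float).copy()
    n = A.shape[0]
    C = np.eye(n)
    for k in range(n):
        if abs(A[k, k]) < tol:
            # try to swap in a later basis vector that is non-null
            j = next((j for j in range(k + 1, n) if abs(A[j, j]) > tol), None)
            if j is not None:
                A[[k, j]] = A[[j, k]]
                A[:, [k, j]] = A[:, [j, k]]
                C[:, [k, j]] = C[:, [j, k]]
            else:
                # all remaining vectors are null: v_k + v_j is non-null
                # whenever h(v_k, v_j) != 0, since char != 2
                j = next((j for j in range(k + 1, n) if abs(A[k, j]) > tol), None)
                if j is None:
                    continue  # row and column k are already zero
                A[k, :] += A[j, :]
                A[:, k] += A[:, j]
                C[:, k] += C[:, j]
        # simultaneous row + column operations clear the rest of row/column k
        for j in range(k + 1, n):
            f = A[k, j] / A[k, k]
            A[j, :] -= f * A[k, :]
            A[:, j] -= f * A[:, k]
            C[:, j] -= f * C[:, k]
    return A, C

A0 = np.array([[0.0, 1.0], [1.0, 0.0]])  # hyperbolic plane: both e1, e2 are null
D, C = congruence_diagonalize(A0)
assert np.allclose(C.T @ A0 @ C, D)      # D is diagonal and congruent to A0
```

The columns of $C$ are the coordinates of the orthogonal basis found by the procedure.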

Notice also that if $K = \mathbb{R}$, then we can divide every non-null vector $v_i$ from the orthogonal basis by $\sqrt{|h(v_i, v_i)|}$, i.e. we may consider the vectors $w_i = \frac{v_i}{\sqrt{|h(v_i, v_i)|}}$ (for non-null $v_i$, and $w_i = v_i$, if $v_i$ is null). Then for non-null vectors we get:

$$h(w_i, w_i) = \frac{h(v_i, v_i)}{|h(v_i, v_i)|} = \pm 1,$$

and the sign is the same as the sign of $h(v_i, v_i)$. In other words, the matrix of $h$ with respect to $w_1, \ldots, w_n$ has only $1$, $-1$ and $0$ on the diagonal. This means that every real symmetric matrix is congruent over the reals to a matrix with only $1$, $-1$ and $0$ on the diagonal. If $A$ is a symmetric matrix, then the number of $1$'s in such a congruent diagonal matrix is denoted by $r_{+}(A)$, the number of $-1$'s is denoted by $r_{-}(A)$, and the signature of $A$ is $r_{+}(A) - r_{-}(A)$. Notice that two symmetric matrices $A$ and $B$ are congruent over the reals if and only if $r_{+}(A) = r_{+}(B)$ and $r_{-}(A) = r_{-}(B)$.
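Since the diagonal matrix of eigenvalues is congruent to $A$ (see the last section of this chapter) and rescaling basis vectors does not change signs on the diagonal, $r_{+}(A)$ and $r_{-}(A)$ can be read off from the signs of the eigenvalues. A sketch (the function name and test matrices are my own):

```python
import numpy as np

def inertia(A, tol=1e-10):
    """Return (r_plus, r_minus) for a real symmetric matrix A (a sketch)."""
    eig = np.linalg.eigvalsh(A)            # real eigenvalues of symmetric A
    r_plus = int(np.sum(eig > tol))        # count of +1's in the diagonal form
    r_minus = int(np.sum(eig < -tol))      # count of -1's
    return r_plus, r_minus

A = np.array([[0.0, 1.0], [1.0, 0.0]])     # eigenvalues 1 and -1
assert inertia(A) == (1, 1)                # signature 1 - 1 = 0
```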

Over the complex numbers it is even easier, because we can divide by $\sqrt{h(v_i, v_i)}$ without the absolute value. Thus, for all non-null vectors we get $h(w_i, w_i) = 1$. This means that the matrix of $h$ with respect to $w_1, \ldots, w_n$ has only $1$ and $0$ on the diagonal. In other words, every symmetric matrix is congruent over the complex numbers to a diagonal matrix with only $1$ and $0$ on the diagonal. Therefore, two symmetric matrices $A$ and $B$ are congruent over the complex numbers if and only if $r(A) = r(B)$.

### Congruent diagonal matrices and similar diagonal matrices

What is the relation between finding a diagonal congruent matrix (i.e. $D = C^{T} A C$) and finding a diagonal similar matrix (i.e. $D = C^{-1} A C$)? There is a relation!

Recall that a symmetric matrix $A$ can be understood not only as the matrix of a bilinear form, but also as the matrix of a self-adjoint endomorphism (with respect to the standard inner product in $\mathbb{R}^{n}$). Moreover, we know that such an endomorphism has an orthonormal basis consisting of eigenvectors, i.e. one such that the change-of-basis matrix $C$ is orthogonal, meaning $C^{T} = C^{-1}$. Thus, the diagonal matrix $D = C^{-1} A C$ has the property that $D = C^{T} A C$, so the matrices $A$ and $D$ are not only similar but also congruent! This means that the diagonal matrix consisting of the eigenvalues of $A$ (with appropriate multiplicities) is congruent to $A$ (notice that this is the only similar diagonal matrix up to the order of elements on the diagonal, but not the only such congruent matrix). On the other hand, a basis which is orthonormal with respect to the standard inner product and consists of eigenvectors of $A$ is also orthogonal with respect to the form defined by $A$, which yields a new method of finding an orthogonal basis for a given form.
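This observation is immediate to verify numerically with an orthonormal eigendecomposition (the matrix below is my own illustrative choice):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # an illustrative symmetric matrix
eigvals, C = np.linalg.eigh(A)           # columns of C: orthonormal eigenvectors
D = np.diag(eigvals)

assert np.allclose(C.T @ C, np.eye(2))           # C is orthogonal: C^T = C^{-1}
assert np.allclose(C.T @ A @ C, D)               # D is congruent to A...
assert np.allclose(np.linalg.inv(C) @ A @ C, D)  # ...and similar to A
```

The columns of $C$ form exactly a basis that is orthonormal for the standard inner product and orthogonal for the form defined by $A$.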