# 13. Scalar product and perpendicularity

### Scalar product

To study angles between vectors it will be convenient to use the scalar product. The scalar product of two vectors $\alpha = (a_1, \ldots, a_n)$ and $\beta = (b_1, \ldots, b_n)$ in $\mathbb{R}^n$ is the sum of products of coordinates on subsequent places:

$$\langle \alpha, \beta \rangle = a_1 b_1 + a_2 b_2 + \ldots + a_n b_n.$$
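As a quick sanity check, the standard scalar product takes only a few lines of Python (the example vectors below are hypothetical, chosen only for illustration):

```python
# Standard scalar product in R^n: sum of products of corresponding coordinates.
def scalar_product(a, b):
    assert len(a) == len(b), "vectors must have the same dimension"
    return sum(x * y for x, y in zip(a, b))

print(scalar_product([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```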

### Length of a vector and angles between vectors

By the Pythagorean theorem it is easy to see that $\langle \alpha, \alpha \rangle$ is the square of the length of a vector. The length of a vector $\alpha$, also called the norm of $\alpha$, will be denoted by $\|\alpha\|$. We get that:

$$\|\alpha\| = \sqrt{\langle \alpha, \alpha \rangle} = \sqrt{a_1^2 + \ldots + a_n^2}.$$
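The norm formula translates directly into code; a minimal sketch (the example vector is hypothetical):

```python
import math

def scalar_product(a, b):
    return sum(x * y for x, y in zip(a, b))

# ||a|| = sqrt(<a, a>), by the Pythagorean theorem
def norm(a):
    return math.sqrt(scalar_product(a, a))

print(norm([3, 4]))  # sqrt(3^2 + 4^2) = 5.0
```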

Assume now that we are given three vectors $\alpha$, $\beta$ and $\gamma = \alpha - \beta$ forming a triangle. Let $\theta$ be the angle between $\alpha$ and $\beta$. The law of cosines states that:

$$\|\gamma\|^2 = \|\alpha\|^2 + \|\beta\|^2 - 2 \|\alpha\| \|\beta\| \cos\theta.$$

Therefore:

$$\langle \alpha - \beta, \alpha - \beta \rangle = \langle \alpha, \alpha \rangle - 2\langle \alpha, \beta \rangle + \langle \beta, \beta \rangle = \|\alpha\|^2 + \|\beta\|^2 - 2 \|\alpha\| \|\beta\| \cos\theta.$$

So:

$$\langle \alpha, \beta \rangle = \|\alpha\| \|\beta\| \cos\theta.$$

So the cosine of the angle between two vectors is given by the following formula:

$$\cos\theta = \frac{\langle \alpha, \beta \rangle}{\|\alpha\| \, \|\beta\|}.$$
One more application of the scalar product is calculating the perpendicular projection of a vector onto a direction given by a second vector. Let $\pi$ be the perpendicular projection of $\beta$ onto the direction given by $\alpha$. It will have the same direction as $\alpha$ and length $\|\beta\| \cos\theta$, where $\theta$ is the angle between $\alpha$ and $\beta$. Therefore:

$$\pi = \|\beta\| \cos\theta \cdot \frac{\alpha}{\|\alpha\|} = \frac{\langle \alpha, \beta \rangle}{\langle \alpha, \alpha \rangle} \, \alpha,$$

because $\frac{\alpha}{\|\alpha\|}$ is the vector of length $1$ in the direction of $\alpha$.
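Both the angle formula and the projection formula can be sketched in Python; the vectors in the usage examples are hypothetical:

```python
import math

def scalar_product(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(scalar_product(a, a))

# cos(theta) = <a, b> / (||a|| ||b||)
def angle_between(a, b):
    return math.acos(scalar_product(a, b) / (norm(a) * norm(b)))

# perpendicular projection of b onto the direction given by a:
# (<a, b> / <a, a>) * a
def project(b, a):
    c = scalar_product(a, b) / scalar_product(a, a)
    return [c * x for x in a]

print(math.degrees(angle_between([1, 0], [1, 1])))  # approximately 45 degrees
print(project([3, 4], [1, 0]))  # [3.0, 0.0]
```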

### Perpendicularity, perpendicular spaces

We know that two vectors are perpendicular if and only if the cosine of the angle between them equals zero. Therefore $\alpha \perp \beta$ if and only if $\langle \alpha, \beta \rangle = 0$.

Notice that if we would like to find all vectors perpendicular to a given vector, then the above is the equation we have to solve. Moreover, this is a homogeneous linear equation. If we would like to find the set of vectors perpendicular to all vectors from a given list, then we get a system of homogeneous linear equations. So given a linear subspace $V$, the set $V^{\perp}$ (called the orthogonal complement of $V$) of all vectors perpendicular to all the vectors from $V$ is also a linear subspace! It is the space of solutions of some system of linear equations.
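The defining condition is easy to test in code: a vector lies in the orthogonal complement exactly when it satisfies the homogeneous equation $\langle \alpha, \beta \rangle = 0$ for every spanning vector $\alpha$. A minimal sketch (the spanning vectors here are hypothetical):

```python
def scalar_product(a, b):
    return sum(x * y for x, y in zip(a, b))

# Hypothetical spanning vectors of a subspace V of R^3.
spanning = [[1, 0, 1], [0, 1, 0]]

# v lies in the orthogonal complement of V iff it solves the homogeneous
# system <alpha, v> = 0 for every spanning vector alpha.
def in_orthogonal_complement(v, spanning_vectors):
    return all(scalar_product(a, v) == 0 for a in spanning_vectors)

print(in_orthogonal_complement([1, 0, -1], spanning))  # True
print(in_orthogonal_complement([1, 1, 1], spanning))   # False
```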

For example, let . A vector is perpendicular to those vectors (and so also to every vector of ), if and , in other words, if it satisfies the following system of equations:

so:

So the general solution has the following form: , and therefore we have the following basis of : .

Notice that the coefficients in a system of equations describing a given linear space are vectors which span the perpendicular space! This gives a new insight into our method of finding a system of equations for a space given by its spanning vectors.

### Orthogonal and orthonormal bases

A basis of a space will be called orthogonal if every pair of vectors in it is perpendicular. E.g. is an orthogonal basis of — indeed, it is easy to check that all pairs are perpendicular, e.g. .

We will learn how to find such a basis later on. For now, let us notice that calculating the coordinates of a vector in such a basis is fairly simple. Actually, we are calculating its projections onto the vectors of the basis. Given a vector $\beta$, its $i$-th coordinate in an orthogonal basis which has $\alpha_i$ as its $i$-th vector is simply

$$\frac{\langle \beta, \alpha_i \rangle}{\langle \alpha_i, \alpha_i \rangle}.$$

Therefore, the coordinates of $\beta$ in the above exemplary basis are: , and .

An orthonormal basis is an orthogonal basis in which additionally all the vectors have length $1$. Given an orthogonal basis we can simply divide each of its vectors by its length to get an orthonormal basis. So is an orthonormal basis of . Notice that calculating the coordinates of a vector in an orthonormal basis is even simpler. Since $\langle \alpha_i, \alpha_i \rangle = 1$, we get that the $i$-th coordinate of $\beta$ is simply $\langle \beta, \alpha_i \rangle$, where $\alpha_i$ is the $i$-th vector of the basis. Therefore the coordinates of $\beta$ in the basis in the example are: , and .
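The coordinate formulas for orthogonal and orthonormal bases can be sketched as follows (the basis of $\mathbb{R}^2$ used here is hypothetical, not the one from the example):

```python
import math

def scalar_product(a, b):
    return sum(x * y for x, y in zip(a, b))

# i-th coordinate in an orthogonal basis: <beta, alpha_i> / <alpha_i, alpha_i>
def coordinates_in_orthogonal_basis(beta, basis):
    return [scalar_product(beta, a) / scalar_product(a, a) for a in basis]

def normalize(a):
    n = math.sqrt(scalar_product(a, a))
    return [x / n for x in a]

basis = [[1, 1], [1, -1]]  # a hypothetical orthogonal basis of R^2
print(coordinates_in_orthogonal_basis([3, 1], basis))  # [2.0, 1.0]

# After normalizing, the denominators equal 1, so the coordinates
# are plain scalar products.
orthonormal = [normalize(a) for a in basis]
print([scalar_product([3, 1], e) for e in orthonormal])
```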

### Projection onto a linear subspace

We already know how to calculate the projection of a vector onto a line defined by a given vector. To calculate a projection onto a higher-dimensional subspace, we first find an orthogonal basis of that subspace, then calculate the projections onto the directions defined by each of the basis vectors and sum those projections.

Notice also that if $\beta'$ is the projection of a vector $\beta$ onto $V$, and $\beta''$ is its projection onto $V^{\perp}$, then $\beta = \beta' + \beta''$. This can be nicely used when calculating the projection of a vector onto a plane in the case of three-dimensional space. Indeed, if $V$ is a plane in $\mathbb{R}^3$, then the perpendicular space $V^{\perp}$ is a line and is given by a single vector. Denote this vector by $\gamma$. Then the projection of $\beta$ onto the plane $V$ is:

$$\beta' = \beta - \frac{\langle \beta, \gamma \rangle}{\langle \gamma, \gamma \rangle} \gamma.$$
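In code, projecting onto a plane in three-dimensional space only needs the vector spanning the perpendicular line; a sketch with a hypothetical plane ($z = 0$, perpendicular vector $(0, 0, 1)$):

```python
def scalar_product(a, b):
    return sum(x * y for x, y in zip(a, b))

# projection onto the plane: beta - (<beta, gamma> / <gamma, gamma>) * gamma,
# where gamma spans the line perpendicular to the plane
def project_onto_plane(beta, gamma):
    c = scalar_product(beta, gamma) / scalar_product(gamma, gamma)
    return [b - c * g for b, g in zip(beta, gamma)]

print(project_onto_plane([1, 2, 3], [0, 0, 1]))  # [1.0, 2.0, 0.0]
```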

### Orthogonal reflection across a linear subspace

We can use our ability to calculate the projection $\beta'$ of a vector $\beta$ onto $V$ to easily calculate (see the image below) its image under the reflection across $V$:

$$2\beta' - \beta.$$
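A sketch of the reflection formula, reusing the plane projection (the plane $z = 0$ here is again hypothetical):

```python
def scalar_product(a, b):
    return sum(x * y for x, y in zip(a, b))

def project_onto_plane(beta, gamma):
    c = scalar_product(beta, gamma) / scalar_product(gamma, gamma)
    return [b - c * g for b, g in zip(beta, gamma)]

# reflection across the plane: 2 * projection - original vector
def reflect_across_plane(beta, gamma):
    p = project_onto_plane(beta, gamma)
    return [2 * pi - bi for pi, bi in zip(p, beta)]

print(reflect_across_plane([1, 2, 3], [0, 0, 1]))  # [1.0, 2.0, -3.0]
```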

### Gram-Schmidt orthogonalization

Assume that we are given a space spanned by some vectors. We would like to find an orthogonal basis of this space. The method of finding such a basis is called Gram-Schmidt orthogonalization. The idea is to take as the first vector of the new basis the first vector of the original basis:

$$\beta_1 = \alpha_1.$$

The second vector has to be perpendicular to the first one, so we take the second vector and subtract its projection onto the first one:

$$\beta_2 = \alpha_2 - \frac{\langle \alpha_2, \beta_1 \rangle}{\langle \beta_1, \beta_1 \rangle} \beta_1,$$

so only the perpendicular "part" is left. In the case of the third vector we need to subtract the projections onto both already constructed vectors:

$$\beta_3 = \alpha_3 - \frac{\langle \alpha_3, \beta_1 \rangle}{\langle \beta_1, \beta_1 \rangle} \beta_1 - \frac{\langle \alpha_3, \beta_2 \rangle}{\langle \beta_2, \beta_2 \rangle} \beta_2.$$

In the case of more dimensions we will continue this procedure further.
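The whole procedure fits in a short function; a minimal sketch (the input vectors here are hypothetical, not the ones from the example below):

```python
def scalar_product(a, b):
    return sum(x * y for x, y in zip(a, b))

# Gram-Schmidt: subtract from each vector its projections onto the
# already constructed vectors; only the perpendicular "part" remains.
def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            c = scalar_product(v, b) / scalar_product(b, b)
            w = [wi - c * bi for wi, bi in zip(w, b)]
        basis.append(w)
    return basis

print(gram_schmidt([[1, 1, 0], [1, 0, 1]]))  # [[1, 1, 0], [0.5, -0.5, 1.0]]
```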

In our case:

So the basis we were looking for is (we can drop the fractions, since multiplication by a number does not change angles).

### Linear maps: projection onto a linear subspace and reflection across a linear subspace

Notice that the projection onto a linear subspace $V$ and the reflection across $V$ are linear mappings. Moreover, it is easy to see their eigenvectors:

• since the projection does not change vectors in $V$, they are eigenvectors with eigenvalue $1$. On the other hand, vectors from $V^{\perp}$ are multiplied by zero, so they are eigenvectors with eigenvalue $0$.
• since the reflection does not change vectors in $V$, they are eigenvectors with eigenvalue $1$. On the other hand, vectors from $V^{\perp}$ are multiplied by $-1$, so they are eigenvectors with eigenvalue $-1$.

So a basis consisting of vectors from a basis of $V$ together with vectors from a basis of $V^{\perp}$ is a basis of eigenvectors of both those maps, which makes it possible to calculate their formulas.
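This eigenvector-basis description can be turned into matrices directly. A sketch with a hypothetical plane $V$ in $\mathbb{R}^3$; its spanning pair is chosen orthogonal (e.g. as produced by Gram-Schmidt), so the orthogonal-basis coordinate formula applies:

```python
def scalar_product(a, b):
    return sum(x * y for x, y in zip(a, b))

# Hypothetical example in R^3: V spanned by an orthogonal pair,
# V-perp spanned by a vector perpendicular to both.
basis_V = [[1, 0, 1], [0, 1, 0]]
basis_V_perp = [[1, 0, -1]]
eigenbasis = basis_V + basis_V_perp  # orthogonal basis of eigenvectors

def map_matrix(eigenvalues):
    # Column j of the matrix is the image of the j-th standard basis
    # vector: express e_j in the eigenbasis (orthogonal-basis coordinates),
    # scale each coordinate by its eigenvalue, and recombine.
    n = len(eigenbasis[0])
    cols = []
    for j in range(n):
        e = [1 if i == j else 0 for i in range(n)]
        img = [0.0] * n
        for lam, a in zip(eigenvalues, eigenbasis):
            c = lam * scalar_product(e, a) / scalar_product(a, a)
            img = [x + c * ai for x, ai in zip(img, a)]
        cols.append(img)
    return [list(row) for row in zip(*cols)]  # transpose columns into rows

projection = map_matrix([1, 1, 0])   # eigenvalues 1, 1 on V and 0 on V-perp
reflection = map_matrix([1, 1, -1])  # eigenvalues 1, 1 on V and -1 on V-perp
print(projection)
print(reflection)
```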

E.g. let . Therefore basis is . So if is the projection onto , and is the reflection across , then are eigenvectors with eigenvalue of both maps. Also is an eigenvector with eigenvalue zero for , and for . Therefore basis is a basis of eigenvectors of both maps, and:

Let us calculate their formulas. We have:

So:

Therefore:

and

so:

and