6. Describing a vector space

How can we describe a given vector subspace of K^n mathematically? There are a few ways:

  1. list a set of vectors which spans it,
  2. write out its basis,
  3. give a homogeneous system of linear equations which describes it.

The above needs a few more words of explanation. We would also like to know how to move from one of these descriptions to another.

A spanning system of vectors

It is easy to notice that the set of all linear combinations of a given set of vectors is always a vector space. Let us check it in the case of three vectors. Let \alpha,\beta\in\text{lin}(v_1,v_2,v_3); then \alpha=av_1+bv_2+cv_3 and \beta=dv_1+fv_2+gv_3 for some numbers a, b, c, d, f, g. Therefore, \alpha+\beta= (a+d)v_1+(b+f)v_2+(c+g)v_3 and, for any number k, k\alpha=kav_1+kbv_2+kcv_3 are also linear combinations of v_1, v_2, v_3, so they are in \text{lin}(v_1,v_2,v_3), and we are indeed dealing with a vector space.
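This closure check can also be carried out symbolically; here is a minimal sketch in Python using sympy (the library choice is mine, not the text's):

```python
from sympy import symbols, Matrix

a, b, c, d, f, g, k = symbols('a b c d f g k')

# Three concrete vectors from R^3 (any vectors would do)
v1, v2, v3 = Matrix([2, 2, 3]), Matrix([1, 1, 1]), Matrix([-1, -1, 0])

alpha = a*v1 + b*v2 + c*v3
beta = d*v1 + f*v2 + g*v3

# alpha + beta is the combination with coefficients (a+d, b+f, c+g)
assert ((alpha + beta) - ((a+d)*v1 + (b+f)*v2 + (c+g)*v3)).expand() == Matrix([0, 0, 0])

# k*alpha is the combination with coefficients (k*a, k*b, k*c)
assert (k*alpha - ((k*a)*v1 + (k*b)*v2 + (k*c)*v3)).expand() == Matrix([0, 0, 0])
print("closure verified")
```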

Therefore, \text{lin}((2,2,3),(1,1,1),(-1,-1,0)) and \text{lin}((1,0),(0,1)) are examples of vector spaces, each defined by listing a set of vectors which spans it.

Basis

A system of vectors which spans a given vector space and additionally is linearly independent is called its basis. A basis of a vector space is not unique: e.g. (1,0),(0,1) (the so-called standard basis) and (-1,0),(2,1) are both bases of the plane. But every basis of a given vector space contains the same number of vectors. This number is called the dimension of the vector space V and is denoted by \text{dim}V.

In other words, a basis is a maximal linearly independent system of vectors in a given space, or equivalently a minimal system of vectors which spans the space.

Every vector in the given space is a linear combination of vectors from the basis. Furthermore, the coefficients in this combination are uniquely determined and are called coordinates. So, e.g., the vector (1,1) has coordinates 1,1 with respect to the basis (-1,0),(2,1), and (0,-1) has coordinates -2,-1 with respect to the same basis. Obviously, we calculate the coordinates of a given vector with respect to a given basis by solving a system of equations. We have done this already when determining whether a vector is a linear combination of other vectors. This time the system will always have exactly one solution.
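The coordinate computation above can be sketched in Python with sympy (assuming it is available): we solve the linear system whose coefficient matrix has the basis vectors as columns.

```python
from sympy import Matrix

# Basis vectors (-1, 0) and (2, 1) written as the columns of a matrix
B = Matrix([[-1, 2],
            [0, 1]])

# The coordinates of a vector v are the unique solution of B * coords = v
coords_of_11 = B.solve(Matrix([1, 1]))      # vector (1, 1)
coords_of_0m1 = B.solve(Matrix([0, -1]))    # vector (0, -1)

print(coords_of_11.T)   # coordinates 1, 1
print(coords_of_0m1.T)  # coordinates -2, -1
```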

It follows from the Steinitz theorem that a system of vectors in a finite-dimensional space is a basis if and only if it satisfies at least two of the following conditions (the third one is then satisfied as well): it is linearly independent, it spans the whole space, and the number of vectors in it equals the dimension of the space.

A system of equations describing a space

Notice that given a homogeneous system of linear equations and two of its solutions, both their sum and any solution multiplied by a number are also solutions of the system. Let us check this in the following example. The vectors (1,-1,0) and (0,2,2) are solutions of the equation x+y-z=0. Indeed, 1+(-1)-0=0 and 0+2-2=0, so also (1+0)+(-1+2)-(0+2)=(1+(-1)-0)+(0+2-2)=0 and 5\cdot 1+ 5\cdot(-1)-5\cdot 0= 5\cdot(1+(-1)-0)=0.

Therefore, the set of solutions of a homogeneous system of linear equations is always a vector space! And a homogeneous system itself is one of the possible descriptions of a vector space.
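A quick numerical check of this closure property for the equation above (plain Python, no libraries needed):

```python
def satisfies(v):
    """Check whether v = (x, y, z) solves x + y - z = 0."""
    x, y, z = v
    return x + y - z == 0

u = (1, -1, 0)
w = (0, 2, 2)
assert satisfies(u) and satisfies(w)

# The sum and any scalar multiple are again solutions
s = tuple(p + q for p, q in zip(u, w))
m = tuple(5 * p for p in u)
assert satisfies(s) and satisfies(m)
print(s, m)  # (1, 1, 2) (5, -5, 0)
```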

Now let us learn how, given one form of description, to find another one.

Finding a basis from a set of vectors which spans the space

Given a set of vectors, we would like to find a linearly independent set which spans the same space. We have mentioned that carrying out row operations on a matrix does not change the set of possible linear combinations of the vectors written in its rows. Furthermore, notice that in an echelon form of a matrix the vectors in the non-zero rows are linearly independent. Indeed, to get the zero vector as a combination of them, we need to multiply the first one by zero because of its leading entry. Then we need to multiply the second one by zero because of its leading entry, and so on. Thus, all we need to do is to write the given vectors in the rows of a matrix and compute its echelon form. The non-zero rows form a basis of the space.

For example, let us find a basis of the space \text{lin}((1,0,1),(1,1,1),(-2,-3,-2)).

    \[\left[\begin{array}{ccc}1&0&1\\1&1&1\\-2&-3&-2\end{array}\right]\underrightarrow{w_2-w_1, w_3+2w_1}  \left[\begin{array}{ccc}1&0&1\\0&1&0\\0&-1&0\end{array}\right]\underrightarrow{w_3+w_2}  \left[\begin{array}{ccc}1&0&1\\0&1&0\\0&0&0\end{array}\right]\]

So (1,0,1),(0,1,0) is a basis of the given space, and the space has dimension 2.
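The row reduction above can be reproduced with sympy's rref, which computes the reduced echelon form (here it coincides with the echelon form obtained by hand):

```python
from sympy import Matrix

# Spanning vectors written as the rows of a matrix
A = Matrix([[1, 0, 1],
            [1, 1, 1],
            [-2, -3, -2]])

R, pivots = A.rref()     # reduced row echelon form
print(R)                 # non-zero rows (1,0,1), (0,1,0) form a basis
print(A.rank())          # dimension of the space: 2
```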

Finding a basis of a space described by a system of equations

We start by finding a general solution to the given system in the parametrized form. E.g.

    \[\begin{cases}x-z+w-t=0\\x+y+w=0\end{cases}\]

    \[\left[\begin{array}{ccccc|c}1&0&-1&1&-1&0\\1&1&0&1&0&0\end{array}\right]\underrightarrow{w_2-w_1} \left[\begin{array}{ccccc|c}1&0&-1&1&-1&0\\0&1&1&0&1&0\end{array}\right]\]

so the general solution is of the form (z-w+t,-z-t,z,w,t); that is, every vector in the given space is of the form (z-w+t,-z-t,z,w,t) for some z,w,t. But notice that (z-w+t,-z-t,z,w,t)=z(1,-1,1,0,0)+w(-1,0,0,1,0)+t(1,-1,0,0,1). Therefore every vector in the space is a linear combination of the vectors (1,-1,1,0,0),(-1,0,0,1,0),(1,-1,0,0,1), which we calculate by substituting 1 for one parameter and zero for the others. It is also easy to see that these three vectors are linearly independent. So (1,-1,1,0,0),(-1,0,0,1,0),(1,-1,0,0,1) is a basis of the space and the space has dimension 3. Notice also that 3=5-2, where 5 is the number of variables and 2 is the number of independent equations in the system.
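sympy's nullspace method performs exactly this computation for us (it returns one basis vector per free parameter, each obtained by setting that parameter to 1 and the others to 0):

```python
from sympy import Matrix

# Coefficient matrix of the homogeneous system
#   x - z + w - t = 0
#   x + y     + w     = 0
M = Matrix([[1, 0, -1, 1, -1],
            [1, 1, 0, 1, 0]])

basis = M.nullspace()    # basis of the solution space
for v in basis:
    print(v.T)           # (1,-1,1,0,0), (-1,0,0,1,0), (1,-1,0,0,1)

print(len(basis))        # dimension 3 = 5 variables - 2 independent equations
```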

Finding a system of equations describing a vector space given by a system of vectors which spans it (or a basis)

Assume that we study the vector space \text{lin}((1,2,-1,0),(1,1,0,1),(0,1,-1,-1)). Our system of equations will therefore have 4 variables. Assume that it contains an equation of the form ax+by+cz+dw=0. The three given vectors have to satisfy this equation, so a+2b-c=0, a+b+d=0, b-c-d=0, which gives a system of equations that has to be fulfilled by the coefficients of the system we are looking for. A system of equations describing the coefficients of a system of equations may look a bit confusing, but bear with me. Let us write this system down:

    \[\begin{cases}a+2b-c=0\\ a+b+d=0\\ b-c-d=0\end{cases}\]

And solve it:

    \[\left[\begin{array}{cccc|c}1&2&-1&0&0\\1&1&0&1&0\\0&1&-1&-1&0\end{array}\right]\underrightarrow{w_2-w_1}\left[\begin{array}{cccc|c}1&2&-1&0&0\\0&-1&1&1&0\\0&1&-1&-1&0\end{array}\right]\underrightarrow{w_3+w_2}\]

    \[\left[\begin{array}{cccc|c}1&2&-1&0&0\\0&-1&1&1&0\\0&0&0&0&0\end{array}\right]\underrightarrow{w_2\cdot(-1)}\]

    \[\left[\begin{array}{cccc|c}1&2&-1&0&0\\0&1&-1&-1&0\\0&0&0&0&0\end{array}\right]\underrightarrow{w_1-2w_2}\left[\begin{array}{cccc|c}1&0&1&2&0\\0&1&-1&-1&0\\0&0&0&0&0\end{array}\right]\]

So the general solution, which tells us what the coefficients in the system we are looking for can be, is the following: (-c-2d,c+d,c,d). Notice that the system we are trying to find is not unique; many systems of equations are equivalent. Actually, the set of possible coefficient vectors of an equation fulfilled by every vector in the given space also forms a vector space. We write down its basis using the calculated general solution: (-1,1,1,0),(-2,1,0,1). If we take these two vectors and write down two equations with such coefficients, we will be able to get any equation fulfilled by every vector of our space by combining them linearly. It means the system we look for is the system with coefficients from the calculated basis, namely:

    \[\begin{cases}-x+y+z=0\\-2x+y+w=0\end{cases}\]

And we are done!
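In matrix language, the coefficient vectors we have found form a basis of the solution space of the system whose matrix has the spanning vectors as rows; a sketch with sympy:

```python
from sympy import Matrix

# Spanning vectors of the space, written as rows
A = Matrix([[1, 2, -1, 0],
            [1, 1, 0, 1],
            [0, 1, -1, -1]])

# Each solution (a, b, c, d) of A * (a,b,c,d)^T = 0 gives one equation
# a*x + b*y + c*z + d*w = 0 fulfilled by every vector of the space
coeffs = A.nullspace()
for v in coeffs:
    assert A * v == Matrix([0, 0, 0])  # every spanning vector satisfies it
    print(v.T)  # (-1, 1, 1, 0) and (-2, 1, 0, 1)
```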

Rank of a matrix

It is easy to prove that in a square matrix the rows are linearly independent if and only if the columns are linearly independent. This implies that the dimension of the space spanned by the rows of a matrix is the same as the dimension of the space spanned by the columns of this matrix. This dimension is called the rank of the matrix A and is denoted by r(A).
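With sympy, the rank is computed by the rank method; since transposing a matrix swaps its rows and columns, the equality of row rank and column rank can be checked directly:

```python
from sympy import Matrix

# The matrix from the basis-finding example above
A = Matrix([[1, 0, 1],
            [1, 1, 1],
            [-2, -3, -2]])

# Row rank equals column rank
assert A.rank() == A.T.rank() == 2
print(A.rank())
```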

Kronecker-Capelli Theorem

This is the theorem which describes, in terms of ranks of matrices, when a system of equations has a solution.

Let M be the matrix of a system of equations U without the column of free coefficients, and let M_u be the matrix of this system with this column. Then the system U has a solution if and only if r(M)=r(M_u). Moreover, the dimension of the space of solutions of the homogeneous system of equations U' described by the matrix M equals n-r(M), where n is the number of variables. Also, if v is a solution of U and W is the space of solutions of the homogeneous system U', then the set of solutions of U has the form \{v+w\colon w\in W\}.
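A small illustration with sympy (the system here is my own toy example, not one from the text): the system x+y=1, 2x+2y=b is solvable exactly when the augmented matrix has the same rank as M.

```python
from sympy import Matrix

M = Matrix([[1, 1],
            [2, 2]])

# Augmented matrices M_u for b = 2 (consistent) and b = 3 (inconsistent)
consistent = M.row_join(Matrix([1, 2]))
inconsistent = M.row_join(Matrix([1, 3]))

assert M.rank() == consistent.rank() == 1     # solvable
assert inconsistent.rank() == 2 != M.rank()   # no solution

# Dimension of the homogeneous solution space: n - r(M) = 2 - 1 = 1
assert len(M.nullspace()) == 2 - M.rank()
print("Kronecker-Capelli illustrated")
```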