Linear Algebra/OLD/Vector Spaces


A vector space is a way of generalizing the concept of a set of vectors. For example, the complex number 2+3i can be considered a vector, since in some way it is the vector (2, 3).

The vector space is a "space" of such abstract objects, which we term "vectors".

Some familiar friends

Currently in our study of vectors we have looked at vectors with real entries: R2, R3, and so on. These are all vector spaces. The advantage we gain in abstracting to vector spaces is a way of talking about a space without any particular choice of objects (which define our vectors), operations (which act on our vectors), or coordinates (which identify our vectors in the space). Further, results may be applied to more general spaces which may have infinite dimension, such as those arising in Functional Analysis.

Notations and concepts

We write a vector, as we have before, in bold, but you should write these on paper underlined or with an arrow on top. So we write v (in bold) for that vector.

When we multiply a vector by a scalar, we usually denote the scalar by a Greek letter, writing λv for the multiplication of v by a scalar λ. We write addition and subtraction of vectors as we have been doing before: x+y for the sum of vectors x and y.

With scalar multiplication and adding vectors, we can move to our definition of a vector space.

When we refer to an operation being 'closed' in a definition, we are saying that the result of the operation stays inside the set in question. For example, the set of all integers is closed under addition, because adding any two integers results in an integer. However, the set of integers is not closed under division, because dividing 3 by 2 (for example) does not result in an integer.
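
To make this concrete, here is a quick sketch in Python (not part of the original text) using the built-in number types:

    # Integers are closed under addition: the sum of two ints is an int.
    a, b = 3, 2
    print(type(a + b))   # <class 'int'>
    # They are not closed under division: 3/2 leaves the integers.
    print(type(a / b))   # <class 'float'>, since 3/2 = 1.5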

Definition

A vector space is a nonempty set V of objects, called vectors, on which are defined two operations, called vector addition and scalar multiplication, such that, for all x, y in V and all α in F, where F is a field of scalars, x+y and αx are well-defined elements of V with the following properties:

  1. commutativity of addition: x+y = y+x
  2. associativity of addition: x+(y+z) = (x+y)+z
  3. additive identity: there is a vector 0 such that 0+x = x for all x
  4. additive inverse: for each vector x, there exists another vector y such that x+y = 0
  5. scalar associativity: α(βx) = (αβ)x
  6. scalar distributivity: (α+β)x = αx + βx
  7. vector distributivity: α(x+y) = αx + αy
  8. scalar identity: 1x = x
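
As a sanity check, the eight properties can be spot-checked numerically for R2, with vectors represented as Python tuples. This is a minimal sketch with illustrative values, not a proof:

    def add(x, y):
        # vector addition in R2
        return (x[0] + y[0], x[1] + y[1])

    def scale(a, x):
        # scalar multiplication in R2
        return (a * x[0], a * x[1])

    x, y, z = (1.0, 2.0), (3.0, -1.0), (0.5, 4.0)
    a, b = 2.0, -3.0
    zero = (0.0, 0.0)

    assert add(x, y) == add(y, x)                                # 1. commutativity
    assert add(x, add(y, z)) == add(add(x, y), z)                # 2. associativity
    assert add(zero, x) == x                                     # 3. additive identity
    assert add(x, scale(-1.0, x)) == zero                        # 4. additive inverse
    assert scale(a, scale(b, x)) == scale(a * b, x)              # 5. scalar associativity
    assert scale(a + b, x) == add(scale(a, x), scale(b, x))      # 6. scalar distributivity
    assert scale(a, add(x, y)) == add(scale(a, x), scale(a, y))  # 7. vector distributivity
    assert scale(1.0, x) == x                                    # 8. scalar identity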

Alternative Definition

People who are familiar with group theory and field theory may find the following alternative definition more compact:

  • (V, +) is an Abelian group.
  • Scalar multiplication by elements of the field F is compatible with field multiplication, α(βx) = (αβ)x with 1x = x, and distributes over both vector addition and field addition.

Some Basic Theorems

  1. The 0 vector is unique.
    Proof: Let 01 and 02 both be 0 vectors. Then 01 = 01+02 = 02.
  2. The additive inverse is unique.
    Proof: Suppose y1 and y2 are both additive inverses of x. Then y1 = y1+0 = y1+(x+y2) = (y1+x)+y2 = 0+y2 = y2.
  3. 0x = 0.
    Proof: Let y be the additive inverse of x. Then 0x = 0x+0 = 0x+(x+y) = (0x+x)+y = (0+1)x+y = 1x+y = x+y = 0.
  4. (-1)x is the additive inverse of x.
    Proof: x+(-1)x = 1x+(-1)x = (1+(-1))x = 0x = 0.

Linear Spaces

The linear space is a very important vector space. Let n1, n2, n3, ..., nk be k elements of a field F. Then the ordered k-tuples (n1, n2, n3, ..., nk) form a vector space, with addition being the componentwise sum of corresponding entries, and scalar multiplication by an element of F multiplying each entry of the k-tuple. This is the k-dimensional linear space over F.
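
A minimal sketch of these two operations in Python, with tuples standing in for k-tuples over F = R (the sample values are illustrative):

    def tuple_add(u, v):
        # componentwise addition of two k-tuples
        return tuple(a + b for a, b in zip(u, v))

    def tuple_scale(c, u):
        # multiply each entry of the k-tuple by the scalar c
        return tuple(c * a for a in u)

    u = (1, 2, 3, 4)           # a 4-tuple, i.e. an element of R^4
    v = (0, -1, 5, 2)
    print(tuple_add(u, v))     # (1, 1, 8, 6)
    print(tuple_scale(3, u))   # (3, 6, 9, 12)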

Subspaces

A subspace is a vector space inside a vector space. When we look at various vector spaces, it is often useful to examine their subspaces.

A subspace S of a vector space V is a subset of V with the following key characteristics:

  • S is closed under scalar multiplication: if λ is a scalar and v∈S, then λv∈S
  • S is closed under addition: if u, v∈S, then u+v∈S.

Any nonempty subset with these characteristics is itself a vector space: the remaining axioms are inherited from V, and closure under scalar multiplication puts 0 = 0v and the inverse -v = (-1)v in S.
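
The two conditions lend themselves to a numerical spot-check. Below is a sketch (random sampling is evidence, not a proof) that tests closure for the subset S = {(0, α) : α in R} of R2, which is examined as an example further down:

    import random

    def in_S(v):
        # membership test for S: first coordinate must be zero
        return v[0] == 0.0

    for _ in range(1000):
        u = (0.0, random.uniform(-10, 10))
        v = (0.0, random.uniform(-10, 10))
        lam = random.uniform(-10, 10)
        assert in_S((lam * u[0], lam * u[1]))      # closed under scalar multiplication
        assert in_S((u[0] + v[0], u[1] + v[1]))    # closed under addition
    print("closure spot-checks passed")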

The trivial subspace

The singleton set with the zero vector ({0}) is a subspace of every vector space.

Scalar multiplication closure: a0 = 0 for all scalars a.

Addition closure: 0+0 = 0. Since 0 is the only member of the set, we need to check only 0.

Zero vector: 0 is the only member of the set, and it is the zero vector.

Examples

Let us examine some subspaces of some familiar vector spaces, and see how we can prove that a certain subset of a vector space is in fact a subspace.

A slightly less trivial subspace

In R2, the set V of all vectors of the form (0, α), where α is in R, is a subspace.

Scalar multiplication closure: a(0, α) = (0, aα), and aα is in R, so a(0, α) is in V.

Addition closure: (0, α) + (0, β) = (0, α+β), and α+β is in R, so the sum is in V.

Zero vector: taking α to be zero in our definition of (0, α) in V, we get the zero vector (0, 0).

A whole family of subspaces

Pick any number from R, say ρ. Then the set V of all vectors of the form (α, ρα) is a subspace of R2.

Scalar multiplication closure: a(α, ρα) = (aα, ρ(aα)), which is in V.

Addition closure: (α, ρα) +(β, ρβ) =(α + β, ρα + ρβ) = (α+β, ρ(α+β)) which is in V

Zero vector: taking α to be zero in our definition we get (0, ρ0) = (0,0) in V.

That means V2 = the set of all vectors of the form (α,2α) is a subspace of R2

and V3 = the set of all vectors of the form (α,3α) is a subspace of R2

and V4 = the set of all vectors of the form (α,4α) is a subspace of R2

and V5 = the set of all vectors of the form (α,5α) is a subspace of R2

and Vπ = the set of all vectors of the form (α,πα) is a subspace of R2

and V√2 = the set of all vectors of the form (α, √2α) is a subspace of R2

As you can see, even a simple vector space like R2 can have many different subspaces.
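
The whole family can be spot-checked at once. The sketch below (illustrative values, with a small floating-point tolerance) verifies closure for V_ρ for several values of ρ, including π and √2:

    import math
    import random

    def in_V(v, rho, tol=1e-9):
        # membership test for V_rho: second entry equals rho times the first
        return abs(v[1] - rho * v[0]) < tol

    for rho in (2, 3, 4, 5, math.pi, math.sqrt(2)):
        for _ in range(100):
            a, b = random.uniform(-10, 10), random.uniform(-10, 10)
            u, v = (a, rho * a), (b, rho * b)
            lam = random.uniform(-10, 10)
            assert in_V((lam * u[0], lam * u[1]), rho)      # closed under scaling
            assert in_V((u[0] + v[0], u[1] + v[1]), rho)    # closed under addition
    print("every V_rho tested is closed under both operations")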

Linear Combinations, Spans and Spanning Sets, Linear Dependence, and Linear Independence

Linear Combinations

Definition: Assume V is a vector space over a field F and S is a nonempty subset of V. Then a vector v in V is said to be a linear combination of elements of S if there exist a finite number of elements x1, x2, ..., xn in S and scalars α1, α2, ..., αn in F such that v = α1x1 + α2x2 + ... + αnxn.
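
A small worked example in Python with numpy (the particular vectors and scalars are illustrative choices, not from the text):

    import numpy as np

    x1 = np.array([1.0, 0.0, 2.0])
    x2 = np.array([0.0, 1.0, -1.0])
    a1, a2 = 3.0, -2.0

    # v is a linear combination of x1 and x2: v = 3*x1 - 2*x2
    v = a1 * x1 + a2 * x2
    print(v)                  # [ 3. -2.  8.]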

Spans

Definition: Assume V is a vector space over a field F and S is a nonempty subset of V. The set of all linear combinations of elements of S is called the span of S. This is sometimes denoted by span(S).


Note that span(S) is a subspace of V.

Proof: Consider closure under addition and scalar multiplication for two vectors x and y in the span of S. Write x = α1x1 + ... + αnxn and y = β1y1 + ... + βmym, where the xi and yj are elements of S. Then

x + y = α1x1 + ... + αnxn + β1y1 + ... + βmym, which is also a linear combination of elements of S, and so contained in the set.

λx = (λα1)x1 + ... + (λαn)xn, which is also a linear combination of elements of S, and so contained in the set.


Spanning Sets

Definition: Assume V is a vector space over a field F and v1, v2, ..., vn are vectors in V. The set {v1, v2, ..., vn} is a spanning set for the vector space if and only if every vector in V is a linear combination of v1, v2, ..., vn. Alternately, span({v1, v2, ..., vn}) = V.

Linear Independence

Definition: Assume V is a vector space over a field F and S = {x1, x2, ..., xn} is a finite subset of V. Then we say S is linearly independent if α1x1 + α2x2 + ... + αnxn = 0 implies α1 = α2 = ... = αn = 0.

Linear independence is a very important topic in Linear Algebra. The definition implies that linearly dependent vectors can form the null vector as a non-trivial linear combination, from which we may conclude that one of the vectors can be expressed as a linear combination of the others.

If we have a vector space V spanned by 3 vectors v1, v2, and v3, we say that the vectors are linearly dependent if there is a combination of one or two of them that can produce a third. For instance, if one of the following equations

v1 = αv2 + βv3
v2 = αv1 + βv3
v3 = αv1 + βv2

can be satisfied for some scalars α and β, then the vectors are said to be linearly dependent.

How can we test for linear independence? The definition sets it out for us. If V is a vector space spanned by 3 vectors v1, v2, v3 of length N, and we try to test whether these 3 vectors are linearly independent, we form the equation

α1v1 + α2v2 + α3v3 = 0

(component by component, this is a system of N linear equations in the unknowns α1, α2, α3) and solve it. If the only solution is

α1 = α2 = α3 = 0,

then the 3 vectors are linearly independent. If there is another solution, they are linearly dependent.
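
This test is easy to carry out numerically. The sketch below (with illustrative vectors) forms the matrix whose columns are the three vectors and checks whether it has full column rank, which is equivalent to the homogeneous system having only the zero solution:

    import numpy as np

    v1 = np.array([1.0, 0.0, 1.0])
    v2 = np.array([0.0, 1.0, 1.0])
    v3 = np.array([1.0, 1.0, 2.0])     # note v3 = v1 + v2

    A = np.column_stack([v1, v2, v3])  # columns are the vectors
    print(np.linalg.matrix_rank(A) == A.shape[1])
    # False: rank is 2 < 3, so the vectors are linearly dependent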


We can restate this test in matrix form. Let A be the matrix whose columns are the vectors being tested, and let a = (α1, ..., αk) be the vector of unknown coefficients. We can say that for the vectors to be linearly independent the condition

Aa = 0

must hold only for a = 0, where we are using 0 to denote the null vector. If A is square and invertible, we can solve this equation directly:

a = A^(-1)0 = 0

And since a must be zero, the vectors are linearly independent. If, however, A is not square, or if it is not invertible, we can try the following technique:

Multiply through by the transpose matrix:

A^T A a = A^T 0 = 0

Find the inverse of A^T A (assuming it exists), and multiply through by the inverse:

(A^T A)^(-1) (A^T A) a = (A^T A)^(-1) 0

Cancel the terms:

a = 0

This again means that the vectors are linearly independent. In fact, A^T A is invertible exactly when the columns of A are linearly independent, so checking whether A^T A is invertible (for example, via its determinant) is itself a test for independence.
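
A sketch of this in numpy (the vectors are illustrative): for a non-square A, the Gram matrix A^T A is square, and its determinant is nonzero exactly when the columns of A are linearly independent:

    import numpy as np

    v1 = np.array([1.0, 0.0, 1.0, 2.0])
    v2 = np.array([0.0, 1.0, 1.0, 0.0])

    A = np.column_stack([v1, v2])      # 4x2, so A itself is not invertible
    gram = A.T @ A                     # 2x2 Gram matrix
    print(np.linalg.det(gram))         # about 11.0, nonzero: columns independent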

Span

The span of a set of vectors is the set of all vectors that can be written as linear combinations of the vectors in the set; a set spans a given vector space when its span is the whole space.

Basis

A basis for a vector space is a smallest set of linearly independent vectors that can be used to describe the vector space completely: the basis vectors are linearly independent and their span is the whole space. The most common basis vectors are the Kronecker vectors, also called the canonical basis:

e1 = (1, 0, ..., 0), e2 = (0, 1, ..., 0), ..., en = (0, 0, ..., 1)

In the Cartesian graphing space, an ordered triple of coordinates is expressed in terms of the canonical basis vectors

e1 = (1, 0, 0), e2 = (0, 1, 0), e3 = (0, 0, 1)

and we can make any point (x, y, z) by combining the Kronecker basis vectors:

(x, y, z) = x e1 + y e2 + z e3
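
In numpy this combination looks as follows (x, y, z are sample values):

    import numpy as np

    e1 = np.array([1.0, 0.0, 0.0])
    e2 = np.array([0.0, 1.0, 0.0])
    e3 = np.array([0.0, 0.0, 1.0])

    x, y, z = 2.0, -1.0, 4.0
    p = x * e1 + y * e2 + z * e3       # (x, y, z) as a combination of e1, e2, e3
    print(p)                           # [ 2. -1.  4.]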

Some theorems:

  • A basis of a vector space V is a maximal set of linearly independent vectors: adjoining any further vector of V makes the set linearly dependent.
  • (Converse) Any maximal set of linearly independent vectors in a vector space is a basis.

Bases and Dimension

If a vector space V is such that:

  • it contains a linearly independent set B of N vectors, and
  • any set of N + 1 or more vectors in V is linearly dependent,

then V is said to have dimension N, and B is said to be a basis of V.
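
Numerically, the dimension of the span of a finite set of vectors can be read off as the rank of the matrix holding them, as in this sketch (illustrative vectors):

    import numpy as np

    vectors = [np.array([1.0, 0.0, 1.0]),
               np.array([0.0, 1.0, 1.0]),
               np.array([1.0, 1.0, 2.0])]   # third = first + second

    A = np.column_stack(vectors)
    print(np.linalg.matrix_rank(A))          # 2: the span is a plane in R^3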



TODO

Explain what a basis in a vector space is and discuss coordinate transformations. (The article referred to here contains an abstract definition of a basis, a generalization of a basis in a vector space, which can be used as the foundation to explain bases and coordinate transformations.)


Discuss the geometry of subspaces (points, lines, planes, hypersurfaces) and connect them to the geometry of solutions of linear systems. Connect the algebra of subspaces and linear combinations of vectors to the algebra of linear systems.