MATH 323 Lecture 12


Thursday, October 4, 2012

Substitute lecture

Chapter 3.4: Basis and Dimension

Definitions

We say that vectors $v_1, v_2, \ldots, v_n$ are linearly independent if $c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0$ implies $c_1 = c_2 = \cdots = c_n = 0$.

We say that $v_1, v_2, \ldots, v_n$ span $V$ if any $v \in V$ can be written as a linear combination of $v_1, v_2, \ldots, v_n$.
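
A quick worked check of these definitions (an illustrative example, not from the lecture): in $\mathbb{R}^2$,

\[
1 \cdot (1,0) + 1 \cdot (0,1) - 1 \cdot (1,1) = (0,0),
\]

so $(1,0), (0,1), (1,1)$ are linearly dependent, while $(1,0)$ and $(0,1)$ alone are linearly independent and span $\mathbb{R}^2$, since $(x,y) = x(1,0) + y(0,1)$.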

Let $V$ be a vector space.

The set $\{v_1, v_2, \ldots, v_n\}$ is called a basis in $V$ if

  1. $v_1, v_2, \ldots, v_n$ are linearly independent
  2. they span $V$

The number of vectors in the basis is called the dimension of $V$: we write $\dim V = n$.

Theorem

All bases in $V$ have the same number of elements, called the dimension of $V$.

Note: it is possible for a "finitely defined" vector space to have infinite dimension (e.g., the space of all polynomials).

Theorem 3.4.1

If $\{v_1, \ldots, v_n\}$ is a spanning set of $V$ and $m > n$, then any $m$ vectors $u_1, \ldots, u_m \in V$ are linearly dependent.

Proof (substitution/elimination process).

If $u_1 = 0$, then there is nothing to prove: $1 \cdot u_1 + 0 \cdot u_2 + \cdots + 0 \cdot u_m = 0$ is already a nontrivial dependence relation.

If $u_1 \neq 0$, then $u_1 = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n$. At least one of the $c_i$'s is nonzero. Without loss of generality, we assume that it is $c_1$ which is nonzero.

Therefore $v_1 = \frac{1}{c_1} \left( u_1 - c_2 v_2 - \cdots - c_n v_n \right)$, so $\{u_1, v_2, \ldots, v_n\}$ is a spanning set for $V$.

Repeat the process with the above spanning set and $u_2$. When we can no longer continue, we end up with $\{u_1, u_2, \ldots, u_n\}$ as a spanning set for $V$, and then $u_{n+1}$ is a linear combination of $u_1, \ldots, u_n$, so $u_1, \ldots, u_m$ are linearly dependent. $\blacksquare$
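
A small worked instance of the substitution step (illustrative, not from the lecture): suppose $\{v_1, v_2\} = \{(1,0), (0,1)\}$ spans $\mathbb{R}^2$ and $u_1 = (2,3)$. Then

\[
u_1 = 2 v_1 + 3 v_2 \quad\Longrightarrow\quad v_1 = \tfrac{1}{2}\left( u_1 - 3 v_2 \right),
\]

so $\{u_1, v_2\}$ is again a spanning set: wherever $v_1$ appeared in a linear combination, it can be replaced using the right-hand expression.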

Example

$\mathbb{R}^2$ has bases $\{(1,0), (0,1)\}$ and $\{(1,1), (1,-1)\}$, for instance.

For example, take $u_1 = (1,0)$, $u_2 = (0,1)$, and $u_3 = (1,1)$.

Note: $\{u_1, u_2, u_3\}$ is a spanning set of $\mathbb{R}^2$, but with three vectors in a space spanned by two, Theorem 3.4.1 says they must be linearly dependent.
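
To make the dependence explicit (using the illustrative vectors above):

\[
u_1 + u_2 - u_3 = (1,0) + (0,1) - (1,1) = (0,0).
\]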

Corollary 3.4.2

If $\{v_1, \ldots, v_n\}$ and $\{u_1, \ldots, u_m\}$ are two bases in $V$, then $m = n$.

Proof. Assume the $v_i$'s are a spanning set and $m > n$; then from Thm. 3.4.1, we know that $u_1, \ldots, u_m$ are linearly dependent, so they are not a basis (contradiction).

Flip the roles of the $u$'s and $v$'s; by the same logic, $n > m$ also gives a contradiction. Therefore $m = n$. $\blacksquare$

Infinite Dimension

If the vector space $V$ has no basis of finitely many vectors, we say that $V$ has infinite dimension.

If $V = \{0\}$, we say that $\dim V = 0$.

Example

Find the dimension of the space of solutions to a homogeneous system of linear equations.

Three variables, two constraints: the coefficient matrix $A$ is $2 \times 3$.

Note that the solutions form a vector space, and the set of solutions will be the nullspace $N(A)$ of the coefficient matrix $A$.

The solution is of the form $t\,\mathbf{x}_0$ for a single nonzero vector $\mathbf{x}_0$ (one free variable), so the basis has only one vector. Therefore $\dim N(A) = 1$.
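
A minimal concrete sketch, assuming an illustrative system of this shape (the particular equations are an assumption, not necessarily the lecture's):

\[
\begin{cases} x_1 + x_2 + x_3 = 0 \\ x_2 - x_3 = 0 \end{cases}
\quad\Longrightarrow\quad
x_2 = x_3 = t, \qquad x_1 = -2t,
\]

so every solution is $t\,(-2, 1, 1)$, the basis is $\{(-2, 1, 1)\}$, and $\dim N(A) = 1$.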

Example

Find the dimension of the nullspace of the differentiation operator $D = \frac{d}{dx}$ over $P_2$ (polynomials of degree less than 2).

In other words, for which $p$ does $D p = 0$? For a polynomial of degree ≤ 1, $p(x) = a x + b$, we get $D p = a$. If $D p = 0$, then $a = 0$, and $b$ is anything. Therefore $N(D)$ is the set of constant polynomials, the basis is $\{1\}$, and $\dim N(D) = 1$.

Example

A basis in $P_n$ would be: $\{1, x, x^2, \ldots, x^{n-1}\}$, so $\dim P_n = n$.
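
A quick sketch of why this is a basis (assuming the $P_n$ reading above): any $p \in P_n$ can be written as

\[
p(x) = a_0 \cdot 1 + a_1 \cdot x + \cdots + a_{n-1} \cdot x^{n-1},
\]

so the monomials span $P_n$; and if this expression is the zero polynomial, then every $a_i = 0$, so the monomials are linearly independent.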

Theorem 3.4.3

Let $V$ be a vector space with $\dim V = n$. Then

  1. any $n$ linearly independent vectors span $V$
  2. any $n$ vectors that span $V$ are linearly independent.


Proof.

  1. Assume they do not span $V$: there is a vector $u$ which is not a linear combination of the $v_i$'s; then $v_1, \ldots, v_n, u$ would be linearly independent. This contradicts #Theorem 3.4.1, since these are $n + 1$ linearly independent vectors while a spanning set (a basis) has only $n$.
  2. Assume they are linearly dependent: then $c_1 v_1 + \cdots + c_n v_n = 0$, where the $c_i$'s are not all 0. This means that one vector could be written as a linear combination of the other vectors, so $n - 1$ vectors would span $V$. By #Theorem 3.4.1, the $n$ vectors of a basis would then be linearly dependent, contradicting the definition of a basis, which states that its vectors must be linearly independent.
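
As a concrete instance of part 1 (an illustrative example): in $\mathbb{R}^2$, where $n = 2$, the vectors $(1,1)$ and $(1,-1)$ are linearly independent, so by the theorem they automatically span $\mathbb{R}^2$; indeed

\[
(x, y) = \frac{x+y}{2}\,(1,1) + \frac{x-y}{2}\,(1,-1).
\]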

Theorem 3.4.4

  1. No set of fewer than $n$ vectors spans $V$.
  2. Any $m < n$ linearly independent vectors could be extended (by $n - m$ more vectors) to form a basis.
  3. If $v_1, \ldots, v_m$, $m > n$, span $V$, we can pare them down to $n$ vectors, which would be a basis (see the sketch below).
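
A sketch of part 3, reusing the illustrative vectors from the earlier example: $\{(1,0), (0,1), (1,1)\}$ spans $\mathbb{R}^2$ with $m = 3 > n = 2$. Since

\[
(1,1) = (1,0) + (0,1),
\]

the vector $(1,1)$ is redundant, and discarding it leaves the basis $\{(1,0), (0,1)\}$.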