__Systems of Linear Equations__

The basic approach that we will take in this course is to start with simple, specialized examples that are designed to illustrate the concept before the concept is introduced with all of its generality.

__Example # 1__: Solve this system of 2 equations with 2 unknowns.

_{}

_{}

This system can be stated in matrix form, Ax = b.

_{}

_{ }

First we look at the "*row picture*". This calls for us to plot the implications of the first row and then the second row.

Page 1 of 18

_{}

Both rows produce straight lines. The *row 1* and *row 2* lines intersect at the point (1, 2). Therefore, the system has a solution, namely, x = 1 and y = 2.

Now we look at the "*column picture*". This calls for us to form this linear combination of the two columns of the matrix, "A".

_{}

The solution to this equation requires us to make the right choices of "x" and "y" so that "x" copies of column 1 plus "y" copies of column 2 give a *vector sum* equal to the vector "_{}".


We already know the answer from the row picture. We need to add 1 copy of column 1 to 2 copies of column 2 to produce our particular vector, _{}.

_{}

This vector equation calls for us to plot the sum of the 2 vectors on the left side of the equation to confirm that the result is indeed the vector on the right side.

_{}

Here we placed the tail of 2 times the *column 2 vector* at the tip of the *column 1 vector* to obtain the *vector sum*, whose tip resides at the coordinates (1, 2).
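The row-picture and column-picture results can be checked numerically. Here is a minimal Python sketch using a stand-in 2×2 system (the coefficients below are illustrative assumptions, since the actual numbers live in the original figures); it verifies that taking 1 of column 1 plus 2 of column 2 reproduces the right-hand side:

```python
# Hypothetical 2x2 system chosen so that the solution is x = 1, y = 2.
# Row picture: each row is a line; the lines cross at (1, 2).
# Column picture: 1*(column 1) + 2*(column 2) must equal b.
A = [[2, -1],
     [-1, 2]]
b = [0, 3]

x, y = 1, 2  # solution read off from the row picture

col1 = [A[0][0], A[1][0]]
col2 = [A[0][1], A[1][1]]

vector_sum = [x * col1[0] + y * col2[0],
              x * col1[1] + y * col2[1]]
print(vector_sum)  # the linear combination reproduces b: [0, 3]
```

The same check works for any invertible 2×2 matrix: once the row picture hands us (x, y), the weighted column sum must land exactly on b.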


The idea of a LINEAR COMBINATION of vectors is the CENTRAL THEME of Linear Algebra. That's why we emphasize the "*Column Picture*".

What if we chose a different vector, _{}? Could we find a new linear combination of the column vectors of the matrix, "A", to satisfy the matrix equation, Ax = b? The answer is YES for our particular matrix because it is invertible, or non-singular. (We'll define these terms later.)

The simple explanation for why we can always find a linear combination of the columns to produce any 2-dimensional vector we choose is that the two columns are *Linearly Independent* of each other. In our 2-dimensional example, that means that the two column vectors of the matrix, "A", are NOT PARALLEL!

Since the two vectors point in different directions, you can always find a unique linear combination of them that results in the vector, _{}, that you chose. Accordingly, all points in 2 dimensions can be "reached" by the appropriate choice of how much weight to give each column vector in the sum. Thus, the column vectors of our matrix "A" are said to __Span__ the whole plane of 2-space.
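The "not parallel" test can be stated computationally: two column vectors are linearly independent, and hence span the plane, exactly when the 2×2 determinant they form is nonzero. A short sketch, with illustrative vectors:

```python
def det2(u, v):
    """Determinant of the 2x2 matrix with columns u and v.
    Nonzero exactly when u and v are not parallel."""
    return u[0] * v[1] - u[1] * v[0]

col1 = [2, -1]   # illustrative columns, not necessarily those in the figure
col2 = [-1, 2]

print(det2(col1, col2))        # 3 -> independent, columns span the plane
print(det2([1, 2], [2, 4]))    # 0 -> parallel, they span only a line
```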


__Example # 2__: Solve this system of 3 equations with 3 unknowns.

_{}

_{}

_{}

_{ }

First we look at the "*row picture*". This calls for us to plot the implications of each of the three rows individually in 3-dimensional space.

_{}


Note that two intersecting planes create a line along their intersection. So each pair of planes creates a line of intersection, and those lines of intersection meet at a single point. That point, (0, 0, 1), is the "solution" to the system of equations.

Without the aid of computer graphics, I would have had considerable difficulty in sketching this 3-dimensional situation. For higher-dimensional systems, we must rely on our imaginations and on analogies with 2- and 3-dimensional spaces.

Now we look at the "*column picture*".

_{}

What linear combination of the three columns of "A" do we need in order to construct the vector b? In this case, it is obvious that we need none of the first or second columns and precisely one of the third column. Accordingly, the solution is x = 0, y = 0, and z = 1, which is the intersection point (0, 0, 1).


Do all of the possible linear combinations of the three column vectors fill all of 3-dimensional space? Here a picture will answer the question for us.

_{}

Since the three column vectors do NOT lie in the same plane, every point in 3-space can be "reached" by a vector that is a unique linear combination of the three column vectors. So the answer is YES!
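The "not in the same plane" condition has the same kind of numerical test in 3 dimensions: the scalar triple product of the three columns is nonzero exactly when they are not coplanar. A sketch with illustrative vectors:

```python
def triple_product(u, v, w):
    """Scalar triple product u . (v x w); zero exactly when
    the three vectors lie in a common plane."""
    cross = [v[1] * w[2] - v[2] * w[1],
             v[2] * w[0] - v[0] * w[2],
             v[0] * w[1] - v[1] * w[0]]
    return u[0] * cross[0] + u[1] * cross[1] + u[2] * cross[2]

# Illustrative columns (the standard basis): clearly not coplanar.
print(triple_product([1, 0, 0], [0, 1, 0], [0, 0, 1]))  # 1 -> spans 3-space
# Three coplanar vectors (all in the xy-plane):
print(triple_product([1, 0, 0], [0, 1, 0], [1, 1, 0]))  # 0 -> spans only a plane
```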

Before we begin our discussion of a systematic approach to solving the equation Ax = b, we should show the 2 ways to multiply a matrix times a vector.


__Example # 3__: Perform the indicated multiplication.

_{}

Here is the "*row method*", which features the inner product of each row of "A" with the vector.

_{}

Here is multiplication by the "*column method*".

_{}
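Both methods can be sketched in a few lines of Python and checked against each other. The matrix and vector below are illustrative stand-ins, not the ones in the worked example:

```python
def matvec_row(A, x):
    """Row method: each output entry is the inner product of a row with x."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def matvec_col(A, x):
    """Column method: the output is a linear combination of the columns,
    weighted by the entries of x."""
    n_rows, n_cols = len(A), len(x)
    result = [0] * n_rows
    for j in range(n_cols):
        for i in range(n_rows):
            result[i] += x[j] * A[i][j]
    return result

A = [[1, 2], [3, 4]]   # illustrative matrix and vector
x = [5, 6]
print(matvec_row(A, x), matvec_col(A, x))  # both give [17, 39]
```

Both routines compute exactly the same numbers; they just organize the arithmetic differently, which is the point of the two "pictures".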


Now we introduce a systematic procedure for solving Systems of Linear Equations.

A system of linear equations may have a unique solution, no solution, or an infinity of solutions.

__Example # 4__: Determine the solution(s), if any, of the given system of linear equations.

_{}

_{}

_{}

_{}

_{ }

Form the *Augmented Matrix*, "_{}", by including the vector, _{}, as another column of the matrix "A".

_{}


We will use *elementary row operations* to obtain an upper triangular matrix for the "A" portion of "_{}".

*Scale* row 3 by
"2".

_{}

*Replace* row 3 by
the sum of itself and row 1.

_{}

*Replace* row 3 by
the sum of itself and _{} times row 2.

_{}

We have now achieved our objective of obtaining an upper triangular matrix for the "A" portion of "_{}". This is termed *Echelon Form*. It is shorthand for the following system, which is equivalent to our original system of Linear Equations.


_{}

_{}

_{}

_{}

_{}

We can now readily solve this equivalent system by *back-substitution*.

_{}

_{}

_{}
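Back-substitution is mechanical enough to sketch in code. Here is a minimal Python version, run on an illustrative upper-triangular system (not the one above, whose entries are in the original figures):

```python
def back_substitute(U, c):
    """Solve U x = c for an upper-triangular U with nonzero diagonal,
    working from the last equation up."""
    n = len(c)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        # Everything to the right of the diagonal is already known.
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (c[i] - s) / U[i][i]
    return x

# Illustrative upper-triangular system:
U = [[2, 1, -1],
     [0, 3, 2],
     [0, 0, 4]]
c = [3, 7, 8]
print(back_substitute(U, c))  # [2.0, 1.0, 2.0]
```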

We could have further simplified the Echelon Form to the *Reduced Echelon Form*.

Here is our Echelon Form.

_{}


*Replace* row 1 by
the sum of itself and _{} times row 2.

_{}

*Replace* row 1 by
the sum of 4 times itself and 1 times row 3.

_{}

*Replace* row 2 by
the sum of 4 times itself and 1 times row 3.

_{}

*Scale* all three
rows to achieve "1" in the *pivot positions*.

_{}


We can now readily solve this equivalent system, again by *back-substitution*, but now we have decoupled all of the variables and can thus cite the solution by inspection.

_{}

_{}

_{}

_{ }

_{}

This is an example where there is a unique solution to the given Linear System of Equations. It corresponds geometrically to the intersection of the three nonparallel planes.
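The whole procedure, elimination to Echelon Form followed by back-substitution, can be sketched as one routine. This is a minimal version, with partial pivoting added for numerical safety; the 3×3 system at the end is an illustrative stand-in with solution (1, 2, 3):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting, then back-substitution;
    a sketch for square systems with a unique solution."""
    n = len(b)
    # Build the augmented matrix [A | b].
    M = [row[:] + [b_i] for row, b_i in zip(A, b)]
    for k in range(n):
        # Pivot: swap up the row with the largest entry in column k.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        # Eliminate below the pivot.
        for i in range(k + 1, n):
            factor = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= factor * M[k][j]
    # Back-substitute on the resulting echelon form.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

# Illustrative 3x3 system:
A = [[1, 1, 1],
     [2, -1, 1],
     [1, 2, -1]]
b = [6, 3, 2]
print(solve(A, b))
```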

__Example # 5__: Determine the solution(s), if any, of the given system of linear equations.

_{}

_{}

_{}

_{}

_{}

_{ }

As before, form the augmented matrix, "_{}".

_{}

Using elementary row operations we obtain an Echelon Form.

_{}

_{}

_{}

Note that the last row, which corresponds to the first equation in back-substitution, is a contradiction.

_{}


Thus, this System of Linear Equations has no solution and is
said to be *inconsistent*. That is
because there are one or more contradictory constraints in the system.
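The tell-tale sign of inconsistency is easy to detect mechanically: an echelon-form row whose coefficients are all zero but whose right-hand side is not. A sketch, using an illustrative augmented matrix:

```python
def is_inconsistent_row(row):
    """An echelon-form row [0 0 ... 0 | c] with c != 0 encodes the
    contradiction 0 = c, so the system has no solution."""
    *coeffs, rhs = row
    return all(a == 0 for a in coeffs) and rhs != 0

# Illustrative echelon form of an augmented matrix [A | b]:
echelon = [[1, 2, -1, 4],
           [0, 3, 1, 5],
           [0, 0, 0, 7]]   # last row says 0x + 0y + 0z = 7: impossible
print(any(is_inconsistent_row(r) for r in echelon))  # True -> inconsistent
```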

__Example # 6__: Determine the solution(s), if any, of the given system of linear equations.

_{}

_{}

_{}

_{}

_{}

Use elementary row operations to obtain the reduced row echelon form.

_{}

The *pivot columns* are 1, 2, and 4. The variables associated with those columns are termed *basic variables*. The other two variables, which are associated with columns 3 and 5, are termed *free variables*.


Express the basic variables in terms of the free variables.

_{}

_{}

_{}

_{}

There are thus infinitely many solutions because there is at least one free variable. In this case, we have two.
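Identifying pivot columns, and hence basic versus free variables, from a reduced echelon form can also be sketched in code. The matrix below is an illustrative stand-in with the same pivot pattern (columns 1, 2, and 4; that is, 0, 1, and 3 in 0-based indexing):

```python
def pivot_columns(R):
    """Column indices (0-based) of the first nonzero entry in each
    nonzero row of a reduced-echelon-form matrix R; the last column
    is taken to be the augmented right-hand side and is skipped."""
    pivots = []
    for row in R:
        for j, entry in enumerate(row[:-1]):
            if entry != 0:
                pivots.append(j)
                break
    return pivots

# Illustrative RREF of an augmented matrix (original entries not shown):
R = [[1, 0, 2, 0, -1, 4],
     [0, 1, -3, 0, 2, 1],
     [0, 0, 0, 1, 5, -2]]
pivots = pivot_columns(R)
free = [j for j in range(len(R[0]) - 1) if j not in pivots]
print(pivots, free)  # [0, 1, 3] basic; [2, 4] free -> infinitely many solutions
```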

Earlier, we briefly made reference to Linear Combinations of vectors and the concept of space spanning. Now, we examine these topics more closely.


__Definition__: If v1, ..., vp are in R^n, then the set of all linear combinations of v1, ..., vp is denoted by Span{v1, ..., vp} and is called the subset of R^n spanned by v1, ..., vp. That is, Span{v1, ..., vp} is the set of all vectors formed by Linear Combinations of v1, ..., vp. Those vectors, thus, have this form, where the ci are scalars.

c1v1 + c2v2 + ... + cpvp

__Example # 7__: Give a geometric description of

Span{_{}, _{}}.

_{ }

These vectors reside in R^3. But they do not span R^3. Instead, they span a plane in R^3 that passes through the origin. More specifically, that plane is the xz-plane.


_{}

__Theorem__: The equation, Ax = b, has a solution if and only if b is a linear combination of the columns of "A".

Let "A" have p×n dimensions with columns a1, ..., an.

Ax = x1a1 + x2a2 + ... + xnan

Thus, b must be expressible as a linear combination of the columns of "A" for Ax = b to have at least one solution.
