Systems of Linear Equations

The basic approach that we will take in this course is to start with simple, specialized examples that are designed to illustrate the concept before the concept is introduced with all of its generality.

Example # 1: Solve this system of 2 equations with 2 unknowns.

This system can be stated in matrix form as Ax = b.

First we look at the "row picture". This calls for us to plot the implications of the first row and then the second row.

Page 1 of 18

Both rows produce straight lines. The row 1 and row 2 lines intersect at the point (1, 2). Therefore, the system has a solution, namely, x = 1 and y = 2.

Now we look at the "column picture". This calls for us to form this linear combination of the two columns of the matrix, "A".

The solution to this equation requires us to make the right choices of "x" and "y" so as to yield the exact number of column 1s and the exact number of column 2s such that their vector sum produces the vector "b".

We already know the answer from the row picture. We need to add 1 copy of column 1 to 2 copies of column 2 to produce our particular vector, b.

This vector equation calls for us to plot the sum of the 2 vectors on the left side of the equation to confirm that the result is indeed the vector on the right side.

Here we placed the tail of 2 times the column 2 vector at the tip of the column 1 vector to obtain the vector sum, whose tip resides at the coordinates of b.
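The column picture can be checked numerically. Here is a minimal Python sketch; the two columns below are hypothetical (the notes' actual matrix is not reproduced here), chosen so that the weights are x = 1 and y = 2 as in the text.

```python
# Column picture: reach b with x copies of column 1 plus y copies
# of column 2. The columns are hypothetical stand-ins.
col1 = (1.0, 2.0)   # hypothetical column 1 of "A"
col2 = (1.0, 1.0)   # hypothetical column 2 of "A"
x, y = 1.0, 2.0     # weights from the solution

b = (x * col1[0] + y * col2[0],
     x * col1[1] + y * col2[1])
print(b)            # the vector sum of the two scaled columns
```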

The idea of a LINEAR COMBINATION of vectors is the CENTRAL THEME of Linear Algebra. That's why we emphasize the "Column Picture".

What if we chose a different vector, b? Could we find a new linear combination of the column vectors of the matrix, "A", to satisfy the matrix equation, Ax = b? The answer is YES for our particular matrix because it is invertible, or non-singular. (We'll define these terms later.)

The simple explanation for why we can always find a linear combination of the columns to produce any 2-dimensional vector we choose is that the two columns are Linearly Independent of each other. In our 2-dimensional example, that means that the two column vectors of the matrix, "A", are NOT PARALLEL!

Since the two vectors point in different directions, you can always find a unique linear combination of them that results in the vector, b, that you chose. Accordingly, all points in 2-dimensions can be "reached" by the appropriate choice of how much weight to give each column vector in the sum. Thus, the column vectors of our matrix "A" are said to Span the whole plane of 2-space.
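The not-parallel test can be made concrete: two 2-dimensional vectors are parallel exactly when the determinant of the 2x2 matrix having them as columns is zero. A sketch, with hypothetical columns:

```python
# Two 2-D vectors span the whole plane exactly when they are not
# parallel, i.e. when this 2x2 determinant is nonzero.
def det2(u, v):
    """Determinant of the 2x2 matrix with columns u and v."""
    return u[0] * v[1] - u[1] * v[0]

col1 = (1.0, 2.0)   # hypothetical columns
col2 = (1.0, 1.0)
print(det2(col1, col2) != 0)   # True: not parallel, so they span 2-space
```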

Example # 2: Solve this system of 3 equations with 3 unknowns.

First we look at the "row picture". This calls for us to plot the implications of each of the three rows individually in 3-dimensional space.

Note that two intersecting planes create a line along their intersection. So each of the two pairs of planes creates a line of intersection. Those two lines of intersection intersect each other at a point. That point, (0, 0, 1), is the "solution" to the system of equations.

Without the aid of computer graphics, I would have had considerable difficulty in sketching this 3-dimensional situation. For higher dimensional systems, we must rely on our imaginations and analogies with 2 & 3-dimensional spaces.

Now we look at the "column picture".

What linear combination of the three columns of "A" do we need in order to construct the vector b? In this case, it is obvious that we need none of the first or second columns and precisely one of the third column. Accordingly, the solution is x = 0, y = 0, and z = 1, which is the intersection point (0, 0, 1).

Do all of the possible linear combinations of the three column vectors fill all of 3-dimensional space? Here a picture will answer the question for us.

Since the three column vectors do NOT lie in the same plane, every point in 3-space can be "reached" by a vector that is a unique linear combination of the three column vectors. So the answer is YES!
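In 3 dimensions the analogous test uses the scalar triple product, which is zero exactly when the three vectors are coplanar. A sketch with hypothetical columns:

```python
# Three 3-D vectors span all of 3-space exactly when they do NOT
# lie in a common plane; the scalar triple product u . (v x w)
# is nonzero precisely in that case.
def triple_product(u, v, w):
    """u . (v x w): signed volume of the parallelepiped on u, v, w."""
    cx = v[1] * w[2] - v[2] * w[1]
    cy = v[2] * w[0] - v[0] * w[2]
    cz = v[0] * w[1] - v[1] * w[0]
    return u[0] * cx + u[1] * cy + u[2] * cz

u, v, w = (1, 0, 0), (0, 1, 0), (1, 1, 1)   # hypothetical columns
print(triple_product(u, v, w) != 0)          # True: they span 3-space
```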

Before we begin our discussion of a systematic approach to solving the equation Ax = b, we should show the 2 ways to multiply a matrix times a vector.

Example # 3: Perform the indicated multiplication.

Here is the "row method", which features the inner product of each row of "A" with the vector, x.

Here is multiplication by the "column method".
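Both methods can be sketched in a few lines of Python; the 2x2 matrix and vector here are hypothetical stand-ins for the ones in the example:

```python
# Matrix-vector multiplication two ways, on a hypothetical example.
A = [[2.0, 1.0],
     [1.0, 3.0]]
x = [1.0, 2.0]

# Row method: entry i of Ax is the inner product of row i with x.
row_result = [sum(a * xj for a, xj in zip(row, x)) for row in A]

# Column method: Ax is x[0] times column 0 plus x[1] times column 1,
# i.e. a linear combination of the columns of A.
col_result = [0.0] * len(A)
for j, xj in enumerate(x):
    for i in range(len(A)):
        col_result[i] += xj * A[i][j]

print(row_result, col_result)   # both give [4.0, 7.0]
```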

Now we introduce a systematic procedure for solving Systems of Linear Equations.

A system of linear equations may have a unique solution, no solution, or an infinity of solutions.

Example # 4: Determine the solution(s) if any of the given system of linear equations.

Form the Augmented Matrix, [A | b], by including the vector, b, as another column of the matrix "A".

We will use elementary row operations to obtain an upper triangular matrix for the "A" portion of [A | b].

Scale row 3 by "2".

Replace row 3 by the sum of itself and row 1.

Replace row 3 by the sum of itself and  times row 2.

We have now achieved our objective of obtaining an upper triangular matrix for the "A" portion of [A | b]. This is termed Echelon Form. It is shorthand for the following system, which is equivalent to our original system of Linear Equations.

We can now readily solve this equivalent system by back-substitution.
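Back-substitution itself is mechanical and easy to sketch in code. The upper triangular system below is a hypothetical stand-in, since the actual numbers from this example are not reproduced here:

```python
# Back-substitution on a hypothetical upper triangular system U x = c.
def back_substitute(U, c):
    """Solve U x = c for upper triangular U with nonzero diagonal."""
    n = len(c)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):          # last equation first
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (c[i] - s) / U[i][i]
    return x

U = [[2.0, 1.0, 1.0],
     [0.0, 3.0, 1.0],
     [0.0, 0.0, 4.0]]
c = [5.0, 5.0, 8.0]
print(back_substitute(U, c))   # [1.0, 1.0, 2.0]
```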

We could have further simplified the Echelon Form to the Reduced Echelon Form.

Here is our Echelon Form.

Replace row 1 by the sum of itself and  times row 2.

Replace row 1 by the sum of 4 times itself and 1 times row 3.

Replace row 2 by the sum of 4 times itself and 1 times row 3.

Scale all three rows to achieve "1" in the pivot positions.

We can now readily solve this equivalent system, again by back-substitution, but now we have decoupled all of the variables and can thus cite the solution by inspection.

This is an example where there is a unique solution to the given Linear System of Equations. It corresponds geometrically to the intersection of the three nonparallel planes.
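The elimination procedure just carried out can be sketched as a small routine that applies elementary row operations until the "A" portion of [A | b] is in Reduced Echelon Form. The augmented matrix below is hypothetical; the routine is illustrative, not the notes' exact sequence of operations:

```python
# Drive an augmented matrix [A | b] to reduced echelon form with
# elementary row operations (swap, scale, replace).
def rref(M):
    M = [row[:] for row in M]            # work on a copy
    rows, cols = len(M), len(M[0])
    r = 0                                # next pivot row
    for c in range(cols - 1):            # pivot only in the A columns
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue                     # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]  # swap the pivot row up
        M[r] = [v / M[r][c] for v in M[r]]   # scale the pivot to 1
        for i in range(rows):
            if i != r and M[i][c] != 0:  # clear the rest of the column
                f = M[i][c]
                M[i] = [a - f * p for a, p in zip(M[i], M[r])]
        r += 1
    return M

aug = [[1.0, 1.0, 1.0, 4.0],             # hypothetical [A | b]
       [2.0, 3.0, 1.0, 7.0],
       [1.0, 2.0, 4.0, 11.0]]
print(rref(aug))                         # last column holds the solution
```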

Example # 5: Determine the solution(s) if any of the given system of linear equations.

As before, form the augmented matrix, [A | b].

Using elementary row operations we obtain an Echelon Form.

Note that the last row, which corresponds to the first equation in back-substitution, is a contradiction.

Thus, this System of Linear Equations has no solution and is said to be inconsistent. That is because there are one or more contradictory constraints in the system.
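The inconsistency test is mechanical: scan the echelon form for a row whose coefficient entries are all zero but whose right-hand entry is not. A sketch with a hypothetical echelon form:

```python
# A row [0 0 ... 0 | c] with c != 0 encodes the contradiction 0 = c,
# so the system is inconsistent. The echelon form is hypothetical.
echelon = [[1.0, 2.0, 1.0, 4.0],
           [0.0, 1.0, 3.0, 2.0],
           [0.0, 0.0, 0.0, 5.0]]   # last row says 0 = 5

inconsistent = any(
    all(v == 0 for v in row[:-1]) and row[-1] != 0
    for row in echelon
)
print(inconsistent)   # True: the system has no solution
```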

Example # 6: Determine the solution(s) if any of the given system of linear equations.

Use elementary row operations to obtain the reduced row echelon form.

The pivot columns are 1, 2, & 4. The variables associated with those columns are termed basic variables. The other two variables, which are associated with columns 3 & 5, are termed free variables.

Express the basic variables in terms of the free variables.

There are thus an infinity of solutions because there is at least one free variable. In this case, we have two.
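Hypothetical illustration: suppose the reduced form gave x1 = 2 - 3*x3 - x5, x2 = 1 + x3 - 2*x5, and x4 = 4 - x5. The pivot/free split matches the text, but these particular coefficients are invented. Every choice of the free variables x3 and x5 then yields a solution:

```python
# Basic variables x1, x2, x4 expressed in terms of the free
# variables x3 and x5 (hypothetical coefficients).
def solution(x3, x5):
    """One solution for each choice of the free variables."""
    x1 = 2 - 3 * x3 - x5
    x2 = 1 + x3 - 2 * x5
    x4 = 4 - x5
    return (x1, x2, x3, x4, x5)

print(solution(0, 0))   # a particular solution
print(solution(1, 2))   # another; every choice gives one
```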

Earlier, we briefly made reference to Linear Combinations of vectors and the concept of space spanning. Now, we examine these topics more closely.

Definition: If v1, ..., vp are in R^n, then the set of all linear combinations of v1, ..., vp is denoted by Span{v1, ..., vp} and is called the subset of R^n spanned by v1, ..., vp. That is, Span{v1, ..., vp} is the set of all vectors formed by the Linear Combination of v1, ..., vp. Those vectors, thus, have the form c1*v1 + c2*v2 + ... + cp*vp, where the ci are scalars.

Example # 7: Give a geometric description of

Span{v1, v2}.

These vectors reside in R^3. But they do not span R^3. Instead, they span a plane in R^3 that passes through the origin. More specifically, that plane is the xz-plane.
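Since both spanning vectors lie in the xz-plane (zero y-component), so does every linear combination of them. A quick numerical check with hypothetical vectors that share that property (the originals are not reproduced in these notes):

```python
# Every linear combination of two vectors with zero y-component
# stays in the xz-plane. The particular vectors are hypothetical.
v1 = (1.0, 0.0, 2.0)
v2 = (3.0, 0.0, 1.0)

def combo(c1, c2):
    """The linear combination c1*v1 + c2*v2."""
    return tuple(c1 * a + c2 * b for a, b in zip(v1, v2))

print(all(combo(c1, c2)[1] == 0.0
          for c1 in (-2, 0, 1, 3) for c2 in (-1, 0, 2)))   # True
```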

Theorem: The equation Ax = b has a solution if and only if b is a linear combination of the columns of "A".

Let "A" have p x n dimensions with columns a1, ..., an.

Thus, b must be expressible as a linear combination of the columns of "A" for Ax = b to have at least one solution.
