INNER PRODUCT & ORTHOGONALITY

 

Definition: The inner (or "dot") product of the vectors u and v in R^n is defined as follows.

        u · v = u1 v1 + u2 v2 + ... + un vn = u^T v

Definition: The length (or norm) of a vector u is the square root of the dot product of the vector with itself.

        ||u|| = sqrt(u · u) = sqrt(u1^2 + u2^2 + ... + un^2)

Definition: Normalizing the vector u produces the vector (1/||u||) u, a vector of unit length that points in the same direction as u.

Definition: The distance between two vectors u and v is the length of their difference.

        dist(u, v) = ||u - v||

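These four definitions can be checked numerically. Below is a minimal NumPy sketch (the vectors u and v are made up for illustration and are not part of the original notes):

import numpy as np

# Hypothetical example vectors, chosen only to illustrate the definitions.
u = np.array([3.0, 4.0])
v = np.array([4.0, -3.0])

dot    = u @ v                     # inner ("dot") product u · v
length = np.sqrt(u @ u)            # length ||u|| = sqrt(u · u)
unit   = u / np.linalg.norm(u)     # normalized vector: unit length, same direction as u
dist   = np.linalg.norm(u - v)     # distance between u and v = length of their difference

print(dot, length, unit, dist)     # 0.0  5.0  [0.6 0.8]  7.0710678...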


 

Definition: Two vectors u and v are orthogonal to each other if their inner product is zero. That means that the projection of one vector onto the other "collapses" to a point. So the distances from u to v and from u to -v should be identical if they are orthogonal (perpendicular) to each other.

        [dist(u, v)]^2 = ||u - v||^2 = (u - v) · (u - v) = ||u||^2 - 2 u · v + ||v||^2

        [dist(u, -v)]^2 = ||u + v||^2 = (u + v) · (u + v) = ||u||^2 + 2 u · v + ||v||^2

The two distances are thus the same only if the two vectors have zero projection onto one another, that is, only if u · v = 0.
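A quick numerical check of this equal-distance characterization (again with invented vectors, where u · v = 0 by construction):

import numpy as np

u = np.array([3.0, 4.0])
v = np.array([4.0, -3.0])          # u @ v == 0, so u and v are orthogonal

d1 = np.linalg.norm(u - v)         # distance from u to v
d2 = np.linalg.norm(u - (-v))      # distance from u to -v
print(np.isclose(d1, d2))          # True: the cross term 2 u · v vanishes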

 



 

Example # 1: Determine if the two given vectors u and v are orthogonal.

The vectors are orthogonal because their inner product u · v equals zero.

 

Example # 2: Verify the parallelogram law for vectors u and v in R^n:

        ||u + v||^2 + ||u - v||^2 = 2||u||^2 + 2||v||^2

Expanding each term with the inner product gives

        ||u + v||^2 = (u + v) · (u + v) = ||u||^2 + 2 u · v + ||v||^2

        ||u - v||^2 = (u - v) · (u - v) = ||u||^2 - 2 u · v + ||v||^2

Add the last two equations and the parallelogram law in n-space is confirmed: the cross terms cancel, leaving 2||u||^2 + 2||v||^2.
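A numerical spot check of the parallelogram law (a sketch with randomly generated vectors in R^5; the dimension and seed are arbitrary):

import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(5)
v = rng.standard_normal(5)

lhs = np.linalg.norm(u + v)**2 + np.linalg.norm(u - v)**2
rhs = 2*np.linalg.norm(u)**2 + 2*np.linalg.norm(v)**2
print(np.isclose(lhs, rhs))        # True: the parallelogram law holds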

 



 

Example # 3: Let W be the subspace spanned by the vectors v1, ..., vp. Show that if y is orthogonal to each of the vectors v1, ..., vp, then it is orthogonal to every vector in "W".

Every vector w in "W" can be written as a linear combination w = c1 v1 + c2 v2 + ... + cp vp. Then

        y · w = c1 (y · v1) + c2 (y · v2) + ... + cp (y · vp) = 0,

since each y · vj = 0. Hence y is orthogonal to every vector in "W".
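A small numerical illustration of this argument (the spanning vectors and y below are invented for the sketch):

import numpy as np

v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([0.0, 0.0, 1.0])
y  = np.array([1.0, -1.0, 0.0])    # y is orthogonal to both v1 and v2

# Any w in W = Span{v1, v2} is a combination c1*v1 + c2*v2.
rng = np.random.default_rng(1)
c1, c2 = rng.standard_normal(2)
w = c1*v1 + c2*v2

print(y @ v1, y @ v2)              # 0.0 0.0
print(np.isclose(y @ w, 0.0))      # True: y is orthogonal to an arbitrary vector of W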

Definition: If a vector y is orthogonal to every vector in a subspace "W", then it is said to be orthogonal to "W". The set of all such vectors is called the orthogonal complement of "W".

 

Theorem: Let "A" be an m x n matrix. Then the orthogonal complement of the row space of "A" is the null space of "A", and the orthogonal complement of the column space of "A" is the null space of A^T.

 



 

If A x = 0, then x is orthogonal to every row of "A", because the dot product between each row of "A" and the vector x is zero. Since the rows of "A" span the row space of "A", x is orthogonal to the entire row space.

Since the columns of "A" are the rows of A^T, if A^T x = 0, then x is orthogonal to every column of "A".
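A concrete check of the theorem (the matrix A and the null-space vector x are chosen only for illustration):

import numpy as np

A = np.array([[1.0, 2.0, 1.0],
              [0.0, 1.0, 1.0]])
x = np.array([1.0, -1.0, 1.0])     # satisfies A @ x = 0, so x is in the null space of A

print(A @ x)                       # [0. 0.]
print(A[0] @ x, A[1] @ x)          # 0.0 0.0 -> x is orthogonal to every row of A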

 

Definition: A set of vectors is said to be an orthogonal set if each and every pair of different vectors in the set is orthogonal. Also, an orthogonal set of "p" nonzero vectors is linearly independent, so it is an orthogonal basis for the p-dimensional subspace it spans.

 

Example # 4: Determine if the given set of three vectors {u1, u2, u3} is orthogonal.

There are three distinct pairs to check: u1 · u2, u1 · u3, and u2 · u3.

Not every one of these dot products is zero, so they are not an orthogonal set.
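A helper for this kind of pairwise check (a sketch; the three vectors are placeholders, not the ones from the example):

import numpy as np
from itertools import combinations

def is_orthogonal_set(vectors, tol=1e-12):
    # True if every pair of distinct vectors has a (near-)zero dot product.
    return all(abs(np.dot(a, b)) < tol for a, b in combinations(vectors, 2))

u1 = np.array([1.0, 2.0, 0.0])
u2 = np.array([2.0, -1.0, 0.0])
u3 = np.array([1.0, 1.0, 1.0])
print(is_orthogonal_set([u1, u2, u3]))   # False: u1 · u3 = 3 and u2 · u3 = 1 are nonzero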

 



 

Theorem: Let {u1, ..., up} be an orthogonal basis for "W". Then each y in "W" has a unique representation as a linear combination of that basis. If y = c1 u1 + c2 u2 + ... + cp up, then the weights are

        cj = (y · uj) / (uj · uj),   j = 1, ..., p.
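A direct implementation of these weight formulas (a sketch; the orthogonal basis of R^2 below is invented for the demonstration):

import numpy as np

def weights(y, basis):
    # c_j = (y · u_j) / (u_j · u_j) for each vector u_j of an orthogonal basis.
    return [float(y @ u) / float(u @ u) for u in basis]

u1 = np.array([1.0, 1.0])
u2 = np.array([1.0, -1.0])         # u1 · u2 = 0: an orthogonal basis of R^2
y  = np.array([3.0, 5.0])

c = weights(y, [u1, u2])
print(c)                           # [4.0, -1.0]
print(c[0]*u1 + c[1]*u2)           # [3. 5.]  -- reconstructs y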

Example # 5: Show that {u1, u2} is an orthogonal basis for R^2 and express y as a linear combination of that set.

 

            

 

 

They are orthogonal (and nonzero), hence linearly independent, and thus span 2-space.

 

 

 

   

 



 

  

 

 

Example # 6: Compute the orthogonal projection of y onto the line through u and the origin.

 

The orthogonal projection of y onto u is given by ŷ = ((y · u) / (u · u)) u, and z = y - ŷ is the component of y orthogonal to u. Therefore, in our specific example, ŷ and z are obtained by substituting the given vectors into these two formulas.
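A NumPy sketch of this computation (the vectors y and u are stand-ins, not necessarily those of the example):

import numpy as np

y = np.array([7.0, 6.0])
u = np.array([4.0, 2.0])

y_hat = (y @ u) / (u @ u) * u      # projection of y onto the line through u and the origin
z = y - y_hat                      # component of y orthogonal to u

print(y_hat, z)                    # [8. 4.] [-1.  2.]
print(np.isclose(z @ u, 0.0))      # True: z is orthogonal to u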

 

 

 



 

Example # 7: Let "U" and "V" be orthogonal matrices.

Explain why "UV" is an orthogonal matrix. (That is, explain why "UV" is invertible and its inverse is (UV)^T.)

 

An orthogonal matrix, "U", is a square invertible matrix such that U^(-1) = U^T, or equivalently U^T U = I.

Since "U" and "V" are both square and invertible, the product "UV" is square and invertible, and

        (UV)^(-1) = V^(-1) U^(-1) = V^T U^T.

But V^T U^T = (UV)^T, so (UV)^(-1) = (UV)^T.

Therefore, "UV" is an orthogonal matrix.
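A numerical sanity check (a sketch: the two orthogonal matrices are generated from QR factorizations of random matrices, which is one convenient way to produce them):

import numpy as np

rng = np.random.default_rng(2)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # the Q factor of a QR factorization is orthogonal
V, _ = np.linalg.qr(rng.standard_normal((3, 3)))

UV = U @ V
print(np.allclose(UV.T @ UV, np.eye(3)))           # True: (UV)^T (UV) = I
print(np.allclose(np.linalg.inv(UV), UV.T))        # True: (UV)^(-1) = (UV)^T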

 

Definition: Let "W" be a subspace of R^n. Then each y in R^n can be written uniquely in this form: y = ŷ + z, where ŷ is in "W" and z is orthogonal to the subspace "W". If {u1, ..., up} is any orthogonal basis of "W", then this is the orthogonal decomposition of y:

        ŷ = ((y · u1)/(u1 · u1)) u1 + ... + ((y · up)/(up · up)) up        and        z = y - ŷ.
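A sketch of this decomposition as a small NumPy function (it assumes the basis handed to it is orthogonal; the vectors below are invented):

import numpy as np

def orthogonal_decomposition(y, basis):
    # Split y into y_hat (in Span{basis}) and z = y - y_hat (orthogonal to that span),
    # assuming `basis` is an orthogonal set of nonzero vectors.
    y_hat = sum((y @ u) / (u @ u) * u for u in basis)
    return y_hat, y - y_hat

u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])    # u1 · u2 = 0
y  = np.array([1.0, 2.0, 3.0])

y_hat, z = orthogonal_decomposition(y, [u1, u2])
print(y_hat, z)                                           # [-0.4  2.   0.2] [1.4 0.  2.8]
print(np.isclose(z @ u1, 0.0), np.isclose(z @ u2, 0.0))   # True True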

 

Example # 8: Assume that  {  } is an orthogonal basis for . Write  as the sum of two vectors, one in Span {  } and the other in

Span {  }.

 



 

        

 

 

     

 

         

 



 

        

 

 

 

 

 



 

 

 

Example # 9: Find the orthogonal projection of y onto the subspace spanned by the given orthogonal vectors.

 

        

 

 

 



 

   

 

 

 

 

 



 

Example # 10: Let "W" be the subspace spanned by the given vectors, and write y as the sum of a vector in "W" and a vector orthogonal to "W".

 

         

 

    

 

 

 

 



 

 

 

     

 

 



 

Example # 11: Find the closest point to y in the subspace "W" spanned by the two given vectors.

 

           

 

The orthogonal projection ŷ of y onto "W" is the closest point in "W" to y; that is, ||y - ŷ|| <= ||y - w|| for every w in "W".
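A quick numerical illustration of this best-approximation property (data invented; here W is spanned by an orthogonal pair u1, u2):

import numpy as np

u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])    # orthogonal pair spanning W
y  = np.array([1.0, 2.0, 3.0])

y_hat = (y @ u1)/(u1 @ u1)*u1 + (y @ u2)/(u2 @ u2)*u2   # projection of y onto W

# Compare the distance to y_hat with distances to a few other points of W.
rng = np.random.default_rng(3)
for _ in range(5):
    w = rng.standard_normal() * u1 + rng.standard_normal() * u2
    assert np.linalg.norm(y - y_hat) <= np.linalg.norm(y - w) + 1e-12
print("y_hat =", y_hat, "is at least as close to y as every w tested")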

 

     

       

 

 
