__INNER PRODUCT
& ORTHOGONALITY__

__Definition__: The *inner* or *"dot" product* of the vectors $\vec{u}$, $\vec{v}$ in $\mathbb{R}^n$ is defined as follows.

$$\vec{u} \cdot \vec{v} = \vec{u}^{\,T}\vec{v} = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n$$

__Definition__: The *length* of a vector is the square root of the dot product of the vector with itself.

$$\|\vec{v}\| = \sqrt{\vec{v} \cdot \vec{v}} = \sqrt{v_1^2 + v_2^2 + \cdots + v_n^2}$$

__Definition__: *Normalizing* the vector $\vec{v}$ produces a vector $\vec{u}$ of unit length that points in the same direction as $\vec{v}$.

$$\vec{u} = \frac{\vec{v}}{\|\vec{v}\|}$$

__Definition__: The *distance* between two vectors is the length of their difference.

$$\operatorname{dist}(\vec{u}, \vec{v}) = \|\vec{u} - \vec{v}\|$$
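The four definitions above can be checked numerically. Below is a minimal sketch in Python, with vectors stored as plain lists; the helper names (`dot`, `length`, `normalize`, `dist`) are illustrative, not from the notes.

```python
import math

def dot(u, v):
    # Inner ("dot") product: u1*v1 + u2*v2 + ... + un*vn
    return sum(ui * vi for ui, vi in zip(u, v))

def length(v):
    # Length = square root of the dot product of v with itself
    return math.sqrt(dot(v, v))

def normalize(v):
    # Unit vector pointing in the same direction as v
    n = length(v)
    return [vi / n for vi in v]

def dist(u, v):
    # Distance = length of the difference u - v
    return length([ui - vi for ui, vi in zip(u, v)])

u = [3, 4]
print(dot(u, u))        # 25
print(length(u))        # 5.0
print(normalize(u))     # [0.6, 0.8]
print(dist(u, [0, 0]))  # 5.0
```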

Page 1 of 15

__Definition__: Two vectors are *orthogonal* (perpendicular) to each other if their inner product is zero. Geometrically, the projection of one vector onto the other "collapses" to a point, so the distance from $\vec{u}$ to $\vec{v}$ and the distance from $\vec{u}$ to $-\vec{v}$ are identical exactly when the vectors are orthogonal:

$$[\operatorname{dist}(\vec{u}, \vec{v})]^2 = \|\vec{u} - \vec{v}\|^2 = (\vec{u} - \vec{v}) \cdot (\vec{u} - \vec{v})$$

$$= \vec{u} \cdot \vec{u} - 2\,\vec{u} \cdot \vec{v} + \vec{v} \cdot \vec{v} = \|\vec{u}\|^2 - 2\,\vec{u} \cdot \vec{v} + \|\vec{v}\|^2$$

$$[\operatorname{dist}(\vec{u}, -\vec{v})]^2 = \|\vec{u} + \vec{v}\|^2 = (\vec{u} + \vec{v}) \cdot (\vec{u} + \vec{v})$$

$$= \vec{u} \cdot \vec{u} + 2\,\vec{u} \cdot \vec{v} + \vec{v} \cdot \vec{v} = \|\vec{u}\|^2 + 2\,\vec{u} \cdot \vec{v} + \|\vec{v}\|^2$$

The two distances are thus equal only if the two vectors have zero projection onto one another, that is, only if $\vec{u} \cdot \vec{v} = 0$.
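This distance criterion can be checked numerically. A short sketch (the sample vectors are illustrative):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def dist(u, v):
    # Length of the difference u - v
    d = [a - b for a, b in zip(u, v)]
    return math.sqrt(dot(d, d))

# Orthogonal pair: dist(u, v) equals dist(u, -v)
u, v = [1, 2], [-2, 1]
assert dot(u, v) == 0
assert math.isclose(dist(u, v), dist(u, [-x for x in v]))

# Non-orthogonal pair: the two distances differ
u, v = [1, 2], [1, 1]
assert dot(u, v) != 0
assert not math.isclose(dist(u, v), dist(u, [-x for x in v]))
```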


__Example # 1__: Determine if _{} and _{} are orthogonal.

_{}

The vectors are orthogonal because $\vec{u} \cdot \vec{v} = 0$.

__Example # 2__: Verify the parallelogram law for vectors $\vec{u}$ and $\vec{v}$ in $\mathbb{R}^n$:

$$\|\vec{u} + \vec{v}\|^2 + \|\vec{u} - \vec{v}\|^2 = 2\|\vec{u}\|^2 + 2\|\vec{v}\|^2$$

$$\|\vec{u} + \vec{v}\|^2 = (\vec{u} + \vec{v}) \cdot (\vec{u} + \vec{v}) = \|\vec{u}\|^2 + 2\,\vec{u} \cdot \vec{v} + \|\vec{v}\|^2$$

$$\|\vec{u} - \vec{v}\|^2 = (\vec{u} - \vec{v}) \cdot (\vec{u} - \vec{v}) = \|\vec{u}\|^2 - 2\,\vec{u} \cdot \vec{v} + \|\vec{v}\|^2$$

Add the last two equations and the parallelogram law in n-space is confirmed.
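The parallelogram law is easy to confirm on a concrete pair of vectors. A sketch (the vectors are illustrative):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm_sq(v):
    # Squared length ||v||^2 = v . v
    return dot(v, v)

u, v = [1, -2, 3], [4, 0, -1]
lhs = (norm_sq([a + b for a, b in zip(u, v)])
       + norm_sq([a - b for a, b in zip(u, v)]))
rhs = 2 * norm_sq(u) + 2 * norm_sq(v)
# ||u+v||^2 + ||u-v||^2 == 2||u||^2 + 2||v||^2
assert lhs == rhs
```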


__Example # 3__: Let $W = \operatorname{Span}\{\vec{v}_1, \dots, \vec{v}_p\}$. Show that if $\vec{y}$ is orthogonal to each of the vectors $\vec{v}_1, \dots, \vec{v}_p$, then it is orthogonal to every vector in "W".

Any $\vec{w}$ in "W" is a linear combination of the spanning vectors:

$$\vec{w} = c_1 \vec{v}_1 + c_2 \vec{v}_2 + \cdots + c_p \vec{v}_p$$

$$\vec{y} \cdot \vec{w} = c_1 (\vec{y} \cdot \vec{v}_1) + c_2 (\vec{y} \cdot \vec{v}_2) + \cdots + c_p (\vec{y} \cdot \vec{v}_p)$$

$$\vec{y} \cdot \vec{w} = c_1 (0) + c_2 (0) + \cdots + c_p (0) = 0$$
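The argument can be spot-checked numerically: pick a vector orthogonal to two spanning vectors and verify it is orthogonal to arbitrary combinations of them. A sketch (all vectors and weights are illustrative):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# y is orthogonal to each of the spanning vectors v1 and v2 ...
v1, v2 = [1, 0, 1], [0, 1, 0]
y = [1, 0, -1]
assert dot(y, v1) == 0 and dot(y, v2) == 0

# ... so y is orthogonal to any combination c1*v1 + c2*v2 in W
for c1, c2 in [(2, 3), (-1, 5), (0.5, -4)]:
    w = [c1 * a + c2 * b for a, b in zip(v1, v2)]
    assert dot(y, w) == 0
```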

__Definition__: If $\vec{z}$ is orthogonal to every vector in a subspace "W", then it is said to be orthogonal to "W". The set of all such vectors is called the *orthogonal complement* of "W", written $W^{\perp}$.

__Theorem__: Let "A" be an m x n matrix. Then the orthogonal complement of the row space of "A" is the null space of "A", and the orthogonal complement of the column space of "A" is the null space of $A^T$.


If $A\vec{x} = \vec{0}$, then $\vec{x}$ is orthogonal to every row of "A", because the dot product between each row of "A" and the vector $\vec{x}$ is zero. Since the rows of "A" span the row space, $\vec{x}$ is orthogonal to $\operatorname{Row} A$.

Since $\operatorname{Col} A = \operatorname{Row} A^T$, if $A^T\vec{x} = \vec{0}$, then $\vec{x}$ is orthogonal to every column of "A".
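The row-space half of the theorem can be illustrated with a small matrix. A sketch (the matrix and null-space vector are illustrative):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# A is 2 x 3; x satisfies A x = 0, so x is in Nul A
A = [[1, 2, 1],
     [0, 1, 1]]
x = [1, -1, 1]
assert all(dot(row, x) == 0 for row in A)  # A x = 0, row by row

# Hence x is orthogonal to any vector in Row A, e.g. 3*row1 - 2*row2
r = [3 * a - 2 * b for a, b in zip(A[0], A[1])]
assert dot(r, x) == 0
```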

__Definition__: A set of vectors is said to be an *orthogonal set* if each and every pair of different vectors in the set is orthogonal. An orthogonal set of "p" nonzero vectors is linearly independent and is therefore an orthogonal basis for the p-dimensional subspace it spans.

__Example # 4__:
Determine if the given set of vectors is orthogonal.

_{ }

There are three distinct pairs.

_{}

The set is not orthogonal.
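Checking "each and every pair of different vectors" is exactly what `itertools.combinations` enumerates. A sketch (the function name and sample sets are illustrative, not the vectors from the example):

```python
from itertools import combinations

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def is_orthogonal_set(vectors):
    # Every pair of *different* vectors must have zero dot product;
    # three vectors give three distinct pairs, as in the example.
    return all(dot(u, v) == 0 for u, v in combinations(vectors, 2))

assert is_orthogonal_set([[1, 0, 0], [0, 2, 0], [0, 0, 3]])
assert not is_orthogonal_set([[1, 1, 0], [1, 0, 0], [0, 0, 1]])
```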


__Theorem__: Let $\{\vec{u}_1, \dots, \vec{u}_p\}$ be an orthogonal basis for "W". Then each $\vec{y}$ in "W" has a unique representation as a linear combination of that basis: if $\vec{y} = c_1 \vec{u}_1 + \cdots + c_p \vec{u}_p$, then

$$c_j = \frac{\vec{y} \cdot \vec{u}_j}{\vec{u}_j \cdot \vec{u}_j} \qquad (j = 1, \dots, p)$$
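The weight formula above translates directly into code. A sketch (the `weights` helper and the basis/target vectors are illustrative):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def weights(y, basis):
    # c_j = (y . u_j) / (u_j . u_j) for an orthogonal basis {u_j}
    return [dot(y, u) / dot(u, u) for u in basis]

# Orthogonal basis for R^2 and a target vector y
u1, u2 = [2, 1], [-1, 2]
y = [3, 4]
c1, c2 = weights(y, [u1, u2])
rebuilt = [c1 * a + c2 * b for a, b in zip(u1, u2)]
assert rebuilt == y  # y = c1*u1 + c2*u2
```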

__Example # 5__: Show that { _{} } is an orthogonal basis and express _{} as a linear combination of that set.

_{ }

_{}

They are orthogonal (and nonzero) and thus form a basis for 2-space.

_{}

_{}

_{ }


_{ }

_{}

__Example # 6__: Compute the orthogonal projection of _{} onto the line through _{} and the origin.

The orthogonal projection of $\vec{y}$ onto the line through $\vec{u}$ and the origin is given by

$$\hat{y} = \frac{\vec{y} \cdot \vec{u}}{\vec{u} \cdot \vec{u}}\,\vec{u}$$

and $\vec{z} = \vec{y} - \hat{y}$ is the component of $\vec{y}$ orthogonal to $\vec{u}$. Therefore, in our specific example we have _{} and _{}.

_{}

_{}
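The projection-onto-a-line formula can be sketched as follows (the vectors $\vec{y} = (7,6)$ and $\vec{u} = (4,2)$ are illustrative, not necessarily the ones from the example):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project_onto_line(y, u):
    # y_hat = ((y.u)/(u.u)) u ; z = y - y_hat is orthogonal to u
    c = dot(y, u) / dot(u, u)
    y_hat = [c * ui for ui in u]
    z = [yi - hi for yi, hi in zip(y, y_hat)]
    return y_hat, z

y, u = [7, 6], [4, 2]
y_hat, z = project_onto_line(y, u)
assert y_hat == [8.0, 4.0]
assert z == [-1.0, 2.0]
assert dot(z, u) == 0  # the residual is orthogonal to the line
```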


__Example # 7__: Let "U" and "V" be orthogonal matrices. Explain why "UV" is an orthogonal matrix. (That is, explain why "UV" is invertible and its inverse is $(UV)^T$.)

An orthogonal matrix, "U", is a square invertible matrix such that $U^{-1} = U^T$, or equivalently $U^T U = I$.

$$(UV)^T (UV) = V^T U^T U V = V^T I V = V^T V = I$$

But $(UV)^{-1} = V^{-1} U^{-1} = V^T U^T = (UV)^T$.

Therefore, "UV" is an orthogonal matrix.
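The claim can be verified numerically on a concrete pair, say a rotation and a reflection (both illustrative choices of orthogonal matrices, stored as nested lists):

```python
import math

def matmul(A, B):
    # Plain triple-loop matrix product for small nested-list matrices
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

c, s = math.cos(0.3), math.sin(0.3)
U = [[c, -s], [s, c]]   # rotation: U^T U = I
V = [[0, 1], [1, 0]]    # reflection: V^T V = I

UV = matmul(U, V)
I = matmul(transpose(UV), UV)  # should be the 2 x 2 identity
assert all(math.isclose(I[i][j], 1.0 if i == j else 0.0, abs_tol=1e-12)
           for i in range(2) for j in range(2))
```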

__Definition__: Let "W" be a subspace of $\mathbb{R}^n$. Then each $\vec{y}$ in $\mathbb{R}^n$ can be written uniquely in the form $\vec{y} = \hat{y} + \vec{z}$, where $\hat{y}$ is in "W" and $\vec{z}$ is orthogonal to the subspace "W". If $\{\vec{u}_1, \dots, \vec{u}_p\}$ is any orthogonal basis of "W", then the orthogonal decomposition of $\vec{y}$ is

$$\hat{y} = \frac{\vec{y} \cdot \vec{u}_1}{\vec{u}_1 \cdot \vec{u}_1}\,\vec{u}_1 + \cdots + \frac{\vec{y} \cdot \vec{u}_p}{\vec{u}_p \cdot \vec{u}_p}\,\vec{u}_p \qquad \text{and} \qquad \vec{z} = \vec{y} - \hat{y}$$
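The decomposition formula can be sketched directly (the `decompose` helper and the basis/target vectors are illustrative):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def decompose(y, basis):
    # y_hat = sum_j ((y.u_j)/(u_j.u_j)) u_j ; z = y - y_hat
    y_hat = [0.0] * len(y)
    for u in basis:
        c = dot(y, u) / dot(u, u)
        y_hat = [h + c * ui for h, ui in zip(y_hat, u)]
    z = [yi - hi for yi, hi in zip(y, y_hat)]
    return y_hat, z

# Orthogonal basis for a plane W in R^3
u1, u2 = [1, 1, 0], [1, -1, 0]
y = [2, 3, 5]
y_hat, z = decompose(y, [u1, u2])
assert y_hat == [2.0, 3.0, 0.0]   # the part of y in W
assert z == [0.0, 0.0, 5.0]       # orthogonal to both u1 and u2
assert dot(z, u1) == 0 and dot(z, u2) == 0
```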

__Example # 8__: Assume that { _{} } is an orthogonal basis for _{}. Write _{} as the sum of two vectors, one in Span { _{} } and the other in Span { _{} }.


_{ }

_{}

_{ }

_{ }


_{ }

_{}

_{}

_{}

_{}


_{}

_{}

__Example # 9__: Find the
orthogonal projection of _{} onto the subspace spanned by the orthogonal
vectors _{} .

_{ }

_{ }


_{ }

_{}

_{}

_{}

_{}


__Example # 10__: Let
"W" be the subspace spanned by the _{} and write _{} as the sum of a vector in "W" and a
vector orthogonal to "W".

_{ }

_{ }

_{ }

_{}


_{}

_{}

_{ }

_{}


__Example # 11__: Find the closest point to $\vec{y}$ in the subspace "W" spanned by _{} and _{}.

_{ }

The orthogonal projection of _{} onto "W" is the closest point in "W" to _{}.

_{ }

_{ }

_{}
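The best-approximation idea, that the orthogonal projection is the closest point of "W" to $\vec{y}$, can be sketched as follows (the `closest_point` helper and the vectors are illustrative):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def closest_point(y, basis):
    # The orthogonal projection of y onto W = Span(basis) is the
    # closest point in W to y (best approximation)
    y_hat = [0.0] * len(y)
    for u in basis:
        c = dot(y, u) / dot(u, u)
        y_hat = [h + c * ui for h, ui in zip(y_hat, u)]
    return y_hat

def dist(u, v):
    d = [a - b for a, b in zip(u, v)]
    return dot(d, d) ** 0.5

u1, u2 = [1, 0, 0], [0, 1, 0]
y = [3, -1, 4]
p = closest_point(y, [u1, u2])
assert p == [3.0, -1.0, 0.0]
# Any other point of W is farther from y than p is
other = [1.0, 1.0, 0.0]
assert dist(y, p) < dist(y, other)
```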
