Inner Product Spaces and Orthogonal Complements



Introduction


The following entries are drawn from a relatively large yet concise topic called Inner Product Spaces. I focus on two subtopics: inner product spaces themselves and orthogonal complements. Other essential subtopics are included as background knowledge to support a thorough understanding. Please also note that the Cross Products subtopic is not required in the context of MA 26500.


Part 1: Inner Product Spaces


Background Knowledge - The Basics of Vectors

Inner Product Spaces

As it appeared briefly in the background knowledge section for calculating the angle between two vectors, the standard scalar inner product is defined as:

$ \langle v,w\rangle = v_1w_1 + v_2w_2 + \cdots + v_nw_n $

for any vectors v and w in $ \mathbb{R}^n $.
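As a small illustration (the vectors below are arbitrary examples, not taken from the text), this definition can be evaluated directly in Python:

# A minimal sketch: the standard inner product on R^n is the sum of
# componentwise products (example vectors chosen arbitrarily).
v = [1.0, 2.0, 3.0]
w = [4.0, -1.0, 2.0]

inner = sum(vi * wi for vi, wi in zip(v, w))  # 1*4 + 2*(-1) + 3*2
print(inner)  # 8.0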


The properties of this standard inner product are:

  • $ \langle v,v\rangle \geq 0 $, and $ \langle v,v\rangle = 0 $ if and only if v = 0;
  • $ \langle v,w\rangle = \langle w,v\rangle $;
  • $ \langle v + u,w\rangle = \langle v,w\rangle + \langle u,w\rangle $;
  • $ \langle kv,w\rangle = \langle v,kw\rangle = k\langle v,w\rangle $.
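These properties can be spot-checked numerically. The sketch below uses arbitrary example vectors and an arbitrary scalar k of my own choosing, so it is an illustration rather than a proof:

# Spot-check of the listed properties for example vectors in R^3.
def inner(v, w):
    return sum(vi * wi for vi, wi in zip(v, w))

v = [1.0, 2.0, 3.0]
u = [0.0, -1.0, 4.0]
w = [2.0, 5.0, -2.0]
k = 3.0

assert inner(v, v) >= 0                                                        # positivity
assert inner(v, w) == inner(w, v)                                              # symmetry
assert inner([a + b for a, b in zip(v, u)], w) == inner(v, w) + inner(u, w)    # additivity
assert inner([k * a for a in v], w) == k * inner(v, w)                         # homogeneity
print("all properties hold for this example")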


Another type of inner product arises on spaces of continuous functions, for example in the form:

$ \langle f,g\rangle = \int_a^b \! f(x)g(x)\,dx \, $

where f and g are continuous functions on the interval [a, b].
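As a sketch (assuming the interval [a, b] = [0, 1] and arbitrarily chosen example functions f(x) = x and g(x) = x^2), this integral can be approximated with a simple midpoint Riemann sum:

# Approximate <f, g> = ∫_a^b f(x) g(x) dx with a midpoint Riemann sum.
def inner(f, g, a=0.0, b=1.0, n=100000):
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) * g(a + (i + 0.5) * dx) for i in range(n)) * dx

f = lambda x: x        # example: f(x) = x
g = lambda x: x ** 2   # example: g(x) = x^2

print(inner(f, g))     # approximately 0.25, since ∫_0^1 x^3 dx = 1/4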


Building on this, an inner product space is a vector space equipped with an inner product. In an inner product space, two vectors v and w are orthogonal when $ \langle v,w\rangle = 0 $; they are, moreover, orthonormal when all of the following hold:

  • $ \langle v,w\rangle = 0 $;
  • $ \langle v,v\rangle = 1 $;
  • $ \langle w,w\rangle = 1 $.
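For instance (an arbitrary example pair, not from the original text), the vectors $ (1/\sqrt{2},\, 1/\sqrt{2}) $ and $ (1/\sqrt{2},\, -1/\sqrt{2}) $ in $ \mathbb{R}^2 $ satisfy all three conditions, as the following check illustrates:

import math

def inner(v, w):
    return sum(vi * wi for vi, wi in zip(v, w))

v = [1 / math.sqrt(2),  1 / math.sqrt(2)]
w = [1 / math.sqrt(2), -1 / math.sqrt(2)]

print(inner(v, w))  # 0 (up to rounding): v and w are orthogonal
print(inner(v, v))  # 1 (up to rounding): v is a unit vector
print(inner(w, w))  # 1 (up to rounding): w is a unit vector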


Lastly, the Cauchy-Bunyakovsky-Schwarz (CBS) Inequality states the following for inner product spaces:

$ |\langle v,w\rangle| \leq \|v\| \|w\|\, $.

The CBS Inequality can be used to prove the Triangle Inequality, which states:

$ \displaystyle \|v + w\| \leq \|v\| + \|w\| $

where v and w can be viewed as two sides of a triangle whose third side is v + w.
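Both inequalities can be illustrated numerically; the vectors below are arbitrary examples in R^3, chosen only for this sketch:

import math

def inner(v, w):
    return sum(vi * wi for vi, wi in zip(v, w))

def norm(v):
    return math.sqrt(inner(v, v))

v = [1.0, -2.0, 2.0]
w = [3.0, 0.0, 4.0]

print(abs(inner(v, w)), "<=", norm(v) * norm(w))                       # CBS: 11 <= 15
print(norm([a + b for a, b in zip(v, w)]), "<=", norm(v) + norm(w))    # Triangle: ~7.48 <= 8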


Part 2: Orthogonal Complements


Background Knowledge - The Gram-Schmidt Algorithm

Orthogonal Complements

Let V be an inner product space and let W be a subspace of V. The orthogonal complement $ W^{\perp} $ (read as "W perp") is the set of all vectors in V that are orthogonal to every vector in W. For example, if W is the x-axis in $ \mathbb{R}^3 $, then $ W^{\perp} $ is the yz-plane.


In relation to the fundamental subspaces associated with a matrix, the following statements hold for any m x n matrix A:

  • The null space of A is the orthogonal complement of the row space of A.
  • The null space of $ A^T $ is the orthogonal complement of the column space of A.
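The first statement can be verified numerically for a concrete matrix. The sketch below (an illustration with an arbitrarily chosen rank-1 matrix, using NumPy) checks that every null space vector is orthogonal to every row of A:

import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])      # rank 1, so null(A) has dimension 2

# Null space basis from the SVD: right singular vectors with ~zero singular values.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]               # rows spanning the null space of A

# Each row of A (a row space generator) dotted with each null space vector gives 0.
print(np.allclose(A @ null_basis.T, 0))   # True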


Another prominent application is projections, which appeared briefly in the background knowledge section. In that case, the concept is derived as follows:

suppose a vector v in V is decomposed as

$ \mathbf{v}=\mathbf{w} + \mathbf{u}, $

where w lies in W and u lies in $ W^{\perp} $. Transforming a given basis of W into an orthonormal basis $ \{\mathbf{w}_1,\mathbf{w}_2,\ldots,\mathbf{w}_n\} $ with the Gram-Schmidt Algorithm, we can write

$ \mathbf{w}=\langle\mathbf{v},\mathbf{w}_1\rangle\mathbf{w}_1 + \langle\mathbf{v},\mathbf{w}_2\rangle\mathbf{w}_2 + \cdots + \langle\mathbf{v},\mathbf{w}_n\rangle\mathbf{w}_n. $

Therefore, since $ \langle\mathbf{w}_i,\mathbf{w}_i\rangle = 1 $ for each orthonormal basis vector, the projection can be written in the general form (valid for any orthogonal basis of W):

$ \mathrm{proj}_{\mathbf{w}}\,(\mathbf{v}) = {\langle \mathbf{v}, \mathbf{w}_1\rangle\over\langle \mathbf{w}_1, \mathbf{w}_1\rangle}\mathbf{w}_1 + {\langle \mathbf{v}, \mathbf{w}_2\rangle\over\langle \mathbf{w}_2, \mathbf{w}_2\rangle}\mathbf{w}_2 + \cdots + {\langle \mathbf{v}, \mathbf{w}_n\rangle\over\langle \mathbf{w}_n, \mathbf{w}_n\rangle}\mathbf{w}_n. $

NOTE:

The decomposition above requires that w be orthogonal to u; equivalently, u must lie in $ W^{\perp} $.
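A sketch of this projection (with an arbitrarily chosen subspace W of R^3 and an arbitrary vector v, using NumPy) is shown below; the Gram-Schmidt step and the projection formula match the derivation above:

import numpy as np

def gram_schmidt(vectors):
    # Turn a list of vectors into an orthonormal basis for their span.
    basis = []
    for v in vectors:
        for b in basis:
            v = v - np.dot(v, b) * b      # remove the component along b
        if np.linalg.norm(v) > 1e-12:
            basis.append(v / np.linalg.norm(v))
    return basis

# W = span{w1, w2} in R^3; v is the vector to project (all chosen arbitrarily).
w1 = np.array([1.0, 1.0, 0.0])
w2 = np.array([1.0, 0.0, 1.0])
v  = np.array([3.0, 2.0, 5.0])

onb = gram_schmidt([w1, w2])
proj = sum(np.dot(v, b) * b for b in onb)    # w = proj_W(v)
u = v - proj                                 # u lies in W-perp

print(proj)                                  # the projection of v onto W
print(np.dot(u, onb[0]), np.dot(u, onb[1]))  # both ~0: u is orthogonal to W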


Hint - The Least Squares Solution


Main Reference


Kolman, B., & Hill, D. (2007). Elementary linear algebra with applications (9th ed.). Prentice Hall.


Ryan Jason Tedjasukmana


