metamerist

Friday, March 16, 2007

A simple vector projection

This is post #3 of a series discussing the expression (X'X)^-1 X'Y used in multiple linear regression. (Here are links to earlier posts: #1, #2.)

In the last post, we showed that the dot product of two orthogonal vectors is zero. This time we'll leverage that fact to find the orthogonal projection of one vector onto another.

If you're following along and you're not sure what an orthogonal projection is, the following figure should be helpful.

[Figure: vectors A and B, with the projection point Av on A and a perpendicular segment from B down to A]

Given vectors A and B, we're going to find the point on vector A that's closest to point B. The shortest possible path from B to A is along the segment perpendicular to A that passes through B, hence "orthogonal projection."

Because the point lies somewhere on A, it can be expressed as A multiplied by some scalar 'v'. I've labeled it 'Av' above. We'll find the solution by determining the value of 'v'.

Since we know the dot product of orthogonal vectors is zero (last post), the dot product of vector A and the vector from Av to point B must be zero, which gives us:

A . (B-Av) = 0

(I'm using a period '.' to indicate dot product.)

Because dot product is distributive, we can change the equation to:

A . B - A . Av = 0

Adding A . Av to both sides gives us:

A . B = A . Av

And because a scalar can be factored out of a dot product (A . Av = (A . A)v):

A . B = (A . A)v

Solving for v by dividing both sides by the scalar (A . A):

(A . B) / (A . A) = v

We've found 'v'.
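As a quick sanity check of the derivation, here's a small Python sketch (the example vectors and helper names are mine, not from the post). With v computed as above, the residual B - Av should be orthogonal to A, so its dot product with A comes out zero:

```python
def dot(a, b):
    # Dot product of two vectors given as lists.
    return sum(x * y for x, y in zip(a, b))

A = [2.0, 1.0]
B = [1.0, 3.0]

# v = (A . B) / (A . A), as derived above.
v = dot(A, B) / dot(A, A)

# The residual B - Av should be orthogonal to A.
residual = dot(A, [b - a * v for a, b in zip(A, B)])
print(residual)  # 0.0
```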

Since the projection of B onto A is A*v, we multiply them for the solution:

projA(B) = A*v = ((A . B) / (A . A)) * A
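The whole formula fits in a few lines of Python (a minimal sketch; the function and vector names are mine):

```python
def dot(a, b):
    # Dot product of two vectors given as lists.
    return sum(x * y for x, y in zip(a, b))

def proj(A, B):
    # projA(B) = A * v, where v = (A . B) / (A . A)
    v = dot(A, B) / dot(A, A)
    return [x * v for x in A]

# Project B = (3, 4) onto A = (2, 0): the result lands on the x-axis.
print(proj([2.0, 0.0], [3.0, 4.0]))  # [3.0, 0.0]
```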

The utility of this goes beyond the context of this series of posts. You can use this technique to find the shortest distance from a point to a line and exactly where that nearest point on the line actually is.
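For instance, here's a sketch of that point-to-line use case (assuming a line through the origin along A, as in the figure; names are mine):

```python
import math

def dot(a, b):
    # Dot product of two vectors given as lists.
    return sum(x * y for x, y in zip(a, b))

def closest_point_on_line(A, B):
    # Nearest point to B on the line through the origin along A:
    # the orthogonal projection of B onto A.
    v = dot(A, B) / dot(A, A)
    return [x * v for x in A]

A = [1.0, 1.0]
B = [0.0, 2.0]
P = closest_point_on_line(A, B)

# Shortest distance from B to the line is the length of B - P.
diff = [b - p for b, p in zip(B, P)]
dist = math.sqrt(dot(diff, diff))
print(P, dist)  # [1.0, 1.0] and sqrt(2)
```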

Next, we'll extend the idea and project a vector onto a plane.
