metamerist

Tuesday, October 31, 2006

The Happy Gnome


There used to be a place in St. Paul called Chang O'Hara's (the obvious fusion of Chinese and Irish). Now the pub goes by the moniker of The Happy Gnome. I took this shot over the weekend. I find it intriguing with the distortion of my wife through amber and glass contrasted with blue highlights flowing from the window. It's an interesting shade of blue called "forgot to set the white balance" blue. Not sure why, but I find something about it a little "Blade Runner-ish."

Hitting the Free Notes

Whilst waiting for Trick-or-Treaters, I spent a little time screening tunes at music.download.com. In the process, I discovered a few free MP3s that are keepers. Enjoy.

Iron & Wine - Naked as We Came. I liked Sam Beam's work from the beginning, and it just keeps growing on me. The track of the hour, Naked as We Came, is available on Our Endless Numbered Days and the In Good Company soundtrack. And you can download it free here.

Greg Graffin - Talk About Suffering. Think Bad Religion, Neil Young. This track is a prime candidate for a sequel to O Brother, Where Art Thou? Download it here.

Patti Witten - Black Butterfly. Guitar a la Chris Isaak. Hints of Beth Orton, perhaps. A great track. Download it here.

Paso Fino - The Boy Next Door. Great marriage of snappy acoustic guitar riffs and latin percussion. You'll find it right here.

Silverman - Ctl Alt Del. Download.com editors note similarities with Portishead and Nick Cave, which is close enough to get you into the ball park. Link.

Pendulum - Sounds of Life. If I ever create a futuristic film in which the protagonist speeds away with the girl in a dystopian Jetsonville, this frenetic electronic song is a strong candidate for the backdrop. Link.

The History of Knowledge

If you've seen the film Quiz Show, you may remember Ralph Fiennes' portrayal of Charles Van Doren. The fixing of Twenty-One resulted in Van Doren's resignation from his professorship at Columbia. Many first learned of Van Doren through the movie, but I first encountered him after stumbling upon his book A History of Knowledge in the early 1990s.

Ever since reading Van Doren, the history of knowledge has been an area of personal interest for me. In the process of researching this post, I saw Amazon recommendations for other books in my library, books such as S.E. Frost's The Basic Teachings of the Great Philosophers and Will Durant's The Greatest Minds of All Time (on the subject of Durant, I would also include his The Story of Philosophy).

The history of knowledge fascinates me for a number of reasons. I find it remarkable to think of the times when everyone knew the world was flat, when everyone knew for certain the solar system was geocentric, when life was the result of spontaneous generation, or when it was widely assumed that space and time were absolutes.

What was it like to live in such times? Who and what did it take to move human knowledge beyond such obstacles and on to the next level? Where are we now? What's the next great scientific revolution over the horizon, and who will discover it? (Thomas Kuhn: The Structure of Scientific Revolutions.)

I once read an author (I forget who) claiming the burning of the library at Alexandria set human knowledge back 1,000 years. Who's to say if it's true, but it's a horrifying thought: imagine being propelled back another thousand years and left to live in the year 1006. If the claim is accurate, we're where the people of 1006 should have been, and heaven knows where humankind might be now (hopefully not back to "sticks and stones"). It's all speculation, but pondering such questions is the point.

In the process of googling my way around the Net this evening, I stumbled upon an interesting set of online lecture materials related to the history of knowledge compiled by Piero Scaruffi at Berkeley: Paradigm Shifts: The Ideas that Changed the Way We See the World (A brief history of knowledge). Unfortunately, the PowerPoint and Acrobat slides don't seem to work, but you may find the HTML outlines and timelines worthy of perusal, especially the summary here.

If you find the subject interesting too, I recommend picking up a copy of Van Doren's A History of Knowledge: Past, Present, Future, the book that sparked my interest in the subject.

Musing: Types as Constraints

Designing programming languages isn't my area of expertise, but that fact alone isn't enough to keep me from musing about it.

Seeing data types as constraints

A consideration with any programming language is the typing system it supports. In statically-typed languages, variables are strictly declared with immutable types such as integer, string, real, etc. These languages tend to be compiled, and the type of each variable is determined at compile time.

In dynamically-typed languages, variables assume a mutable type determined during program execution; in contrast to a statically-typed language, the type of a variable can change. Generally, one strategy or the other is employed; languages tend to be either statically-typed or dynamically-typed with little middle ground.

Unfortunately, neither strategy is generally superior. There are software development challenges where static typing is the preferable approach and there are other challenges where dynamic typing yields greater benefits.

The point of this post is to comment on a thought I've been mulling over for a long time: the consideration of types as constraints. Googling for "types as constraints" (and "types as sets") is enough to show me I'm not original in this line of thought. Not original, but possibly not crazy either.

I've always seen type presented as a fundamental and irreducible concept, but I think there is utility in considering type simply as a constraint on a variable rather than an essential property. From this perspective, the only truly essential property of a variable is its capacity to assume values from a set of possible values.

By viewing type as a mere constraint, I think it may be possible to build a sensible logical framework that offers a clean conceptual bridge between dynamic-typing and static-typing. (At this point, I've yet to investigate the newly open-sourced Strongtalk in any detail, but I hope to do so in the near future.)

Consider the union of integers + reals + strings. This set can be considered the set of all possible values for a variable in a dynamically-typed language. Given the perspective of type-as-a-constraint, a declaration of type in a statically-typed language merely constrains a variable to one of three subsets: integers, reals or strings.

It should be easy to see that booleans, shorts, longs, floats, doubles, enumerations, bit fields and ranged subtypes (as in Ada) can all be seen as subsets of some set of all possible values (e.g., integers + reals + strings). Likewise, regular expressions might be seen as constraints as well.

Musing my way along, here's a quick stab at a syntax proposal, with three declarations in a hypothetical language in which types are viewed as constraints.

variable A;
variable B : { integer };
variable C : { integer, range(1,20) };

In the first case, the variable A is completely unconstrained. It can assume any member of the set integers + reals + strings. In the second case, the variable B is restricted to the subset integers. In the final example, the range(1, 20) constraint further restricts variable C to a subset of integers, specifically the integers 1, 2, 3 ... 20.

Such a language could not only support the standard intrinsic constraints associated with basic types (i.e., integer, string, real, etc.), but also an extensible system of constraints. Newly defined constraints could come in the form of boolean functions, with the invocation of exception handling managed by the compiler or interpreter.

E.g.,

variable D : { positive_integer };

where positive_integer is a boolean function returning true when the constraint is satisfied (note: below, const-ness is also expressed in the form of a constraint):

bool positive_integer(v : { integer, constant })
{
    return (v > 0);
}

It is implied that the parametric constraints (integer, constant) are preconditions for the newly derived constraint.
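To make the idea concrete, here's a minimal sketch in Python of variables carrying sets of boolean constraint functions. This is my own illustration, not part of any real proposal; all the names (Var, integer, in_range, positive_integer) are hypothetical.

```python
# A variable is just a cell plus zero or more boolean constraint functions.
# Assignment succeeds only if every constraint accepts the candidate value.

def integer(v):
    return isinstance(v, int)

def in_range(lo, hi):
    # a parameterized constraint, like range(1, 20) in the sketch above
    def check(v):
        return integer(v) and lo <= v <= hi
    return check

def positive_integer(v):
    # a user-defined constraint built from a more basic one
    return integer(v) and v > 0

class Var:
    def __init__(self, *constraints):
        self.constraints = constraints
        self.value = None

    def set(self, v):
        if not all(c(v) for c in self.constraints):
            raise ValueError("constraint violation: %r" % (v,))
        self.value = v
        return v

A = Var()                          # unconstrained: any value at all
B = Var(integer)                   # constrained to the subset of integers
C = Var(integer, in_range(1, 20))  # further constrained to 1..20
D = Var(positive_integer)          # extensible, user-defined constraint
```

In this sketch, leaving the constraint list empty recovers dynamic typing, while stacking constraints approaches the rigor of static declarations; the "type check" is just running the predicates at assignment time rather than at compile time.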

There are many tradeoffs involved with various systems of typing. Generally, the more strictly typed a variable is, the greater the opportunities for compile-time error detection, performance optimization, resource optimization, etc. Although you lose many of those benefits with dynamic typing, you gain other benefits such as flexibility and fewer execution paths.

Given a language in which types are expressed as constraints, one could leave a variable unconstrained when the freedom of dynamic languages is desired (or the development task doesn't require highly constrained rigor). However, in cases where performance and rigor are essential, variables could be as strictly defined as in a language such as Ada. Conceptually, I think the approach could be a cohesive strategy for bridging the chasm between statically-typed and dynamically-typed languages.

How Adobe Launched a Revolution

A short history of Adobe...

"...Then John Warnock and Charles 'Chuck' Geschke came along and launched the desktop-publishing revolution..."

link

(San Jose Mercury News via Seattle Times)

Monday, October 30, 2006

Broken Mirrors: A Theory of Autism

SciAm publishes a piece theorizing a connection between mirror neurons and autism.

Broken Mirrors: A Theory of Autism
By Vilayanur S. Ramachandran and Lindsay M. Oberman

link

Monday, October 23, 2006

Standing

Stand

My son at the Getty Center. Los Angeles, California.

Man on Books

Today I realized Google's Froogle works with ISBN numbers. Cool!

In other bibliophile news, my Amazon Wish List continues to grow and grow. A couple of thoughts. If Amazon were to offer a "Do It Yourself" book club, I might opt for it. By "Do It Yourself" book club, I mean monthly (or periodic) shipments from my Wish List. The sum of my list is too expensive for a one-time purchase, and who wants that many books at once anyway, so let me make a commitment, give me an extra discount and sell me a book a month for the duration of the commitment, okay? (If this exists, I have yet to find it, but I'm surprised it's not an option.)

Also, IMHO, the Amazon Wish List should look more like the Netflix queue with items numbered in order of desirability. This would work better for gift shopping, and it would better accommodate something like a "Do It Yourself" book club.

Finally, I'm looking for an RSS-driven "Price Watcher" for books I'd like to purchase. There's a pretty long list of technical books I'd love to have, ranging from $60 to more than $200. It would be nice to have a watcher to alert me when an interesting deal becomes available.

Sunday, October 22, 2006

Linear Algebra for Graphics Geeks (SVD-X)

This was previously posted. After realizing it was redundant in light of an earlier post, I retracted it. On third thought, I'm reposting as it has additional thoughts on the subject of the SVD and eigen decomposition.

If a matrix A is symmetric, then

A = A^T

Expressed in terms of the SVD:

USV^T = (USV^T)^T
USV^T = VSU^T

In the case of a symmetric matrix

AA^T = A^TA = A^2

Since these are equal and U contains the eigenvectors of AA^T and V contains the eigenvectors of A^TA, U and V must be equal and both contain the eigenvectors of A^2.

Let's state that again.

In the case of the SVD of symmetric matrix A:

U = V = eigenvectors of A^2.
S = square roots of the eigenvalues of A^2.

And since U and V are equal...

USU^T = VSV^T

Given that realization, the next question crossing my mind was: what happens if you calculate the SVD of A^TA?

Considering the previous conclusions and A^TA being a symmetric matrix, it follows that:

U = V = eigenvectors of (A^TA)^2
S = square roots of the eigenvalues of (A^TA)^2

This led me to consider the relationship between the eigenvectors of A^TA and the eigenvectors of (A^TA)^2.

Consider the eigen decomposition of A^TA

A^TA = USU^T

Square both sides...

(A^TA)^2 = (USU^T)(USU^T) = US^2U^T

Bottom line: squaring the matrix amounts to a squaring of the eigenvalues, while the eigenvectors remain the same.
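That bottom line is easy to check numerically. Here's a quick pure-Python sketch (my own, not from the post) using a small symmetric 2x2 matrix, with the eigenvalues computed from the characteristic polynomial:

```python
import math

# For a symmetric 2x2 matrix M, squaring M should square the eigenvalues
# and leave the eigenvectors unchanged.

def matmul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def eigvals2(M):
    # roots of lambda^2 - tr(M)*lambda + det(M) = 0
    tr = M[0][0] + M[1][1]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    disc = math.sqrt(tr * tr - 4.0 * det)
    return [(tr - disc) / 2.0, (tr + disc) / 2.0]

M = [[2.0, 1.0], [1.0, 2.0]]   # symmetric; eigenvalues 1 and 3
M2 = matmul2(M, M)             # eigenvalues of M2 should be 1 and 9

# (1, 1)/sqrt(2) is the eigenvector of M for eigenvalue 3
v = [1.0 / math.sqrt(2.0), 1.0 / math.sqrt(2.0)]
Mv = [M[0][0] * v[0] + M[0][1] * v[1], M[1][0] * v[0] + M[1][1] * v[1]]
M2v = [M2[0][0] * v[0] + M2[0][1] * v[1], M2[1][0] * v[0] + M2[1][1] * v[1]]
```

Here Mv comes out as 3*v and M2v as 9*v: same direction, squared scaling.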

Pulling everything together to a conclusion, the SVD of symmetric matrix A results in the following.

U = V = eigenvectors of A.
S = absolute values of the eigenvalues of A.

That is, given a symmetric matrix A...

A = USV^T = USU^T = VSV^T
A^2 = US^2V^T = US^2U^T = VS^2V^T

Is there any practical value in this?

Calculating the SVD of a matrix with a great number of rows or a great number of columns can be pretty expensive in terms of memory consumption (although there are more efficient, abbreviated methods highlighted in the Wikipedia page). On the other hand, the dimensionality of A^TA might be insignificant by comparison.

If the objective is one of, say, calculating eigenvectors of A^TA, is it acceptable to do the SVD of A^TA instead? Some of the literature I've read says a benefit of the SVD is that it's more numerically stable than calculating A^TA, but all of the tests I've run in Scilab for cases interesting me seem to work very well. I don't know the answer to this question. Maybe I can find it.

Where's the Shoeshine Boy from Police Squad! when you need him?

I have a final thought that fits with the ideas in this post.

Given a diagonal matrix W and any old matrix A, consider the SVD of W^(1/2)A.

In this case, V contains the eigenvectors of

(W^(1/2)A)^T(W^(1/2)A) =
A^T W^(1/2) W^(1/2) A =
A^T W A

And U contains the eigenvectors of

(W^(1/2)A)(W^(1/2)A)^T =
W^(1/2) A A^T W^(1/2)

If we swap the order and calculate the SVD of AW^(1/2)

Then V contains the eigenvectors of

(AW^(1/2))^T(AW^(1/2)) =
W^(1/2) A^TA W^(1/2)

And U contains the eigenvectors of

(AW^(1/2))(AW^(1/2))^T =
(AW^(1/2))(W^(1/2)A^T) =
A W^(1/2) W^(1/2) A^T =
A W A^T

This is worthy of further consideration in a future post.

This is post #14 of Linear Algebra for Graphics Geeks. Here are links to posts #1, #2, #3, #4, #5, #6, #7, #8, #9, #10, #11, #12, #13.

Wednesday, October 18, 2006

The Road Goes Ever On...

Untitled

Tuesday, October 17, 2006

For Sale: "Former Library Book"

I am seeing this far too frequently on used book sites when it comes to valuable, out-of-print books. Call me cynical, but I fear a high proportion of these books are stolen from libraries. (It would be interesting to generate a distribution of used book listings containing "library book" at various price levels. A good Amazon coder could probably do this quickly and easily.)



Price information is essential for market efficiency. Unfortunately, there is probably a downside to widely available price information in this case: theft from institutions that may not be appropriately prepared to deal with increased attention from book thieves.

Surely, libraries have always been home to valuable out-of-print books, but now the market value of such books is easily discovered. I hope there's another explanation, but my guess is this is creating new problems. If they don't already, Amazon and other used book sites should put policies in place to discourage this.


Linear Algebra for Graphics Geeks (X'PDP'X)

After realizing I'd already covered much of yesterday's post, I decided to send it to the bone pile.

I feel I should be putting a disclaimer on many of these posts. The impetus behind my recent posts on linear algebra is the desire for a cerebral form of urban renewal. It's time to raze the crumbling buildings with broken windows and rebuild from the ground up as time permits.

And now today's experiment.

Given an angle theta, the following transformation rotates the vectors (cos theta, sin theta) and (-sin theta, cos theta) into alignment with the x and y axes.



Given constants A and B, scale the results of the previous transformation.



Next, reverse the first rotation, putting the x and y axes back into alignment with theta.



The net effect of these first three steps is scaling the world along the vector (cos theta, sin theta) and its normal vector (-sin theta, cos theta) by 1/A^2 and 1/B^2, respectively.

Now multiply everything by (x,y) as a row vector.



Rearrange the parens, which can be done because matrix multiplication is associative.



Impose the constraint that the entire product must equal 1.



Finally, it's time to ask the question:

What have we here?

The answer is we have a matrix equation for an ellipse. It takes the following form:

X^T Q X = 1

Where Q corresponds to the (rotation * scale * rotation) in the final equation.

If theta is zero, the rotations become identity matrices, and it's easy to see that the product sans rotation is x^2/A^2 + y^2/B^2 = 1.
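The general case can be checked numerically too. Here's a pure-Python sketch (mine, not from the post) that builds Q as rotation * diag(1/A^2, 1/B^2) * rotation-transposed and confirms that points on the rotated ellipse give a quadratic form of 1:

```python
import math

def rot(t):
    # 2x2 rotation taking (1, 0) to (cos t, sin t)
    return [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]

def matmul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def quad_form(Q, x):
    # x^T Q x for a 2-vector x
    return sum(x[i] * Q[i][j] * x[j] for i in range(2) for j in range(2))

A, B, theta = 3.0, 2.0, 0.7
R = rot(theta)
Rt = [[R[0][0], R[1][0]], [R[0][1], R[1][1]]]      # transpose of R
D = [[1.0 / A**2, 0.0], [0.0, 1.0 / B**2]]
Q = matmul2(matmul2(R, D), Rt)                     # rotation * scale * rotation

# a point on the rotated ellipse: rotate (A cos t, B sin t) by theta
t = 1.234
p0 = [A * math.cos(t), B * math.sin(t)]
p = [R[0][0] * p0[0] + R[0][1] * p0[1], R[1][0] * p0[0] + R[1][1] * p0[1]]
# quad_form(Q, p) evaluates to 1 (up to rounding) for any t
```

With theta = 0, Q collapses to diag(1/A^2, 1/B^2) and the quadratic form is just x^2/A^2 + y^2/B^2.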

The next point to make is that Q has an eigen decomposition.

Q = PDP^(-1)

And, there's a beautiful symmetry to it all.

X^T(PDP^(-1))X = 1

The question I'll leave to the reader is:

What's the transform that maps the unit circle into the ellipse described by this equation?

Collecting that idea and all the other ideas into a single example was a major objective here.

Please excuse any redundancies in these posts. When I think up examples I find insightful and I feel the inclination, those examples very well may show up here.

This is post #13* of Linear Algebra for Graphics Geeks. Here are links to posts #1, #2, #3, #4, #5, #6, #7, #8, #9, #10, #11, #12.

Next, more commentary on the subject of eigenvectors and the SVD.

Monday, October 16, 2006

Broadband Optimizer



Frustrations with my broadband connection finally reached the point where I decided to actually do something about it. So far, I've been happy with the results of an app I found at Speedguide.net called SG TCP Optimizer.

(I should have done this a long time ago.)

Sunday, October 15, 2006

On the Shortness of Life

Today, I heard Fred Edwords speak on the Stoic philosophers and their views of living the good life. Most memorable are his readings from the works of Seneca, which I found online as translated for the Loeb Classical Library.

"The majority of mortals, Paulinus, complain bitterly of the spitefulness of Nature, because we are born for a brief span of life, because even this space that has been granted to us rushes by so speedily and so swiftly that all save a very few find life at an end just when they are getting ready to live."

continued

IEEE Oddments

A while back, Ramesh Jain posted about an interesting development, IEEE's foray into video, IEEE.tv. Here's a blurb from the IEEE site announcement:

"IEEE.tv is intended to make broadcasting a vibrant and valuable component of the IEEE member's experience," says Pedro Ray, vice president of IEEE Regional Activities, the area that oversees the station. "And it will advance the IEEE's commitment to educating the public on important technology and engineering issues."

Overall, I think IEEE is doing a good job of embracing the Web. My chief complaint is how they segment their papers online. Papers in the IEEE library come up in many of the professional Google searches I make. Since you need an IEEE subscription to get to them, I question the ranking offered by Google.

I think there's a general problem with Google searches here--it seems what I'd call "teaser content" is rapidly evolving and rising to the top. It looks like what you want, it smells like what you want, but after you click through you feel like little Ralphie in A Christmas Story after he decoded Little Orphan Annie's message about drinking his Ovaltine.

This trend is degrading the quality of search results, and that means there's opportunity for anyone willing to buck the trend and come up with a search algorithm that offers more results and fewer teasers.

As far as IEEE and search goes, the greater frustration, though, is that I often can't get to many IEEE papers even though I have member access. A good proportion of the time, the account access available to me won't get me to the paper I need because it's only covered if you have this or that subscription. Why oh why can't we have one-stop shopping for members? Tip: If this happens to you, try searching Google for the title with "filetype:pdf" appended to your search. More often than not, you'll strike gold. >:->



A Duh! moment the other day reminded me that Jim Blinn's Corner might be available in the IEEE library. Until now, I never thought to look. Lo and behold, it's there and I even have access to it. Yey! If you've never read Blinn and you have any interest in graphics or you are the sort of person who can find pure joy in beautiful and insightful expositions of mathematics, Blinn is highly recommended. He has published three books derived from his columns. Good stuff, Maynard.

Saturday, October 14, 2006

The Queen

Interesting films always seem to come in streaks. Drought or flood, this weekend was flood, and we were forced to choose from five or six desirable candidates. In the end, overwhelming critical acclaim led us to The Queen.

The praise of Helen Mirren's performance as Queen Elizabeth II is deserved; she does a wonderful job in this subtle film focusing on the tensions between Tony Blair, the British public and the Monarchy after the death of Princess Diana.



Ebert: "The Queen is a spellbinding story of opposed passions -- of Elizabeth's icy resolve to keep the royal family separate and aloof from the death of the divorced Diana, who was legally no longer a royal, and of Blair's correct reading of the public mood."

(4/5)

Babble

Be forewarned this post is going to consist of random chaos.

After seeing his biography, I purchased Daniel Johnston's Discovered Covered. It's an interesting disc. Some of the songs are quite old, recorded on tape in the early '80s on a primitive old chord organ after Johnston moved to Austin and had no piano at his disposal. According to the film, he also lacked dubbing equipment, so when he sold a tape, he'd start the process anew, rerecording another performance and working on selling that tape. It's a two-disc set with covers (Death Cab, Tom Waits, et al.) on the first disc and the originals on the second. Best tracks include Devil Town, Walking the Cow and Story of an Artist.

Cyndi Lauper writes about St. Paul Minnesota, Cyndi Lauper and the Stray Catholics and other events I mentioned the other day.



Gnarls Barkley scores a double with a great remake of the Violent Femmes' Gone Daddy Gone. Cool video.

Joel Spolsky reviews beyond Java and posts a link to Frederick P. Brooks' classic, prescient essay No Silver Bullet, a must read for anyone in software development.

Lately, I have been going through a streak of high-impact kitchen science. As far as Thai goes, I believe I've found the nearest source of holy basil. Love the stuff. The thought crossed my mind it might be good in a mojito; as usual, Google confirmed my lack of original thought. This Gai Pad Bai Gaprow recipe at Epicurious is good.

Ordered three of these digital refrigerator thermometers from Taylor Precision Products via Amazon. Unfortunately, it took USPS a month to get them to me, but I'm happy with them. How close can I get to 32 degrees F without freezing things?



Since I'm on the subject of kitchen products, I must say this OXO peeler has to be the best peeler in the world.



Another recommended Epicurious recipe that I've made a number of times: Filet Mignon with Mushrooms and Madeira. Very good, with the only possible improvement being a thicker sauce (maybe that means I'm cutting reductions short). Looking for a cheaper meal, I made it with round steak last night, and it turned out fine.

Tuesday, October 10, 2006

Cyndi Lauper

Some of my favorite concerts over the past few years have included Pete Yorn at 1st Avenue's little annex, 7th St. Entry. There was a Loudon Wainwright concert that included an opening by Justin King that was absolutely incredible (if you ever have an opportunity to see King, take it). Imogen Heap put on a great show a few months ago. Suzanne Vega. Aimee Mann. Waterboys. John Mayer + Counting Crows + Graham Colton. Etc.

That said, there are few performances I've ever enjoyed more than Cyndi Lauper's last night at the College of St. Catherine. It was one of the best shows I've ever seen.



The concert began with Cyndi posing as a bishop, her face in the spotlight as she sang a quirky rendition of Mack the Knife. Singing at a Catholic college and dressed as a Catholic school girl, she joked, "I've been thrown out of two Catholic schools in my life, so I thought I'd see if I could go for 0 and 3!" She failed in that pursuit, leaving an enthusiastic crowd satisfied with a tremendous multi-song encore that included a ukulele performance by Esera Tuaolo in an Iko Iko + Bob Marley combo after she pulled the ex-NFL player from his seat.

Wandering through the audience, climbing a grand piano a number of times, dancing back and forth across the stage, she offered more energy than I've seen from performers twenty years younger. The band was wonderful. Thank you, and bravo! (5/5)

Monday, October 09, 2006

Linear Algebra for Graphics Geeks (SVD-IX)

I have been getting a surprising number of Google referrals from people wondering how to calculate the SVD of a 2x2 matrix by hand. The number of hits is surprising because I haven't answered this question, and now I'm feeling compelled to answer it.

Calculating the SVD of even a 2x2 by hand is sort of tedious, so maybe what I need to do is start writing about the subject, post that start and hope a sense of duty will compel me to finish it.

On this page, Todd Will calculates the SVD of this matrix by hand using other means. If you want a shorter way to do it, I recommend you go there. What I'm going to do here might be called "the long way," but I'm feeling like a glutton for punishment, so here goes...



In this post, I talked about U containing the eigenvectors of AA^T, V containing the eigenvectors of A^TA, and S containing the square roots of the eigenvalues.

If you want to calculate the SVD of a 2 x 2 matrix, one way to do it is to start by calculating the eigenvalues of A^TA and AA^T (they share the same eigenvalues).

To calculate the eigenvalues of a 2 x 2, you need to solve this equation:



Multiplying lambda by the identity matrix leaves you with



Matrix subtraction leaves you with



Calculation of the 2x2 determinant leaves you with



Expanding that leaves you with



This is a quadratic you can solve via the quadratic formula by substituting 1 for a, -(a+d) for b and (ad-bc) for c.
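Here's that recipe as a short pure-Python sketch (my own code, not part of the original post):

```python
import math

# Eigenvalues of a 2x2 matrix [[a, b], [c, d]] as the roots of the
# characteristic polynomial lambda^2 - (a + d)*lambda + (a*d - b*c) = 0,
# solved with the quadratic formula. Assumes real eigenvalues, which always
# holds for symmetric matrices like A*A^T and A^T*A.

def eigvals_2x2(a, b, c, d):
    tr = a + d                # -(coefficient on lambda)
    det = a * d - b * c       # constant term
    disc = math.sqrt(tr * tr - 4.0 * det)
    return [(tr - disc) / 2.0, (tr + disc) / 2.0]

# For example, on [[164, -108], [-108, 101]]:
# eigvals_2x2(164.0, -108.0, -108.0, 101.0) -> [20.0, 245.0]
```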

Consider the following 2 x 2 matrix.



Let's use this matrix and see if the method I'm using here brings us to the same results. If we succeed you'll have two means of calculating SVD by hand.



I'll do the work in Scilab:

-->A=[-10 8; 10 -1]
A =
-10. 8.
10. -1.

-->AA=A*A'
AA =
164. -108.
-108. 101.

-->z = [164*101-(-108)*(-108) -(164+101) 1]
z =
4900. - 265. 1.


-->q=poly(z,'x','coeff')
q =
4900 - 265x + x^2

-->roots(q)
ans =
20. 245.

The eigenvalues of AA^T are 20 and 245. These are also the eigenvalues of A^TA, but I'm not going to work that out here. Finally, the square roots of these values, 2*sqrt(5) and 7*sqrt(5), form the elements on the diagonal matrix S in our SVD of A, so they'll return later when we put the whole thing together.



Now that we have the eigenvalues of AA^T, we can calculate its eigenvectors. In quick review, an eigenvector is a vector whose direction doesn't change when it's transformed by its matrix. When matrix A transforms its eigenvector X, the length of X may change, but the effect of the multiplication is nothing more than a scaling of X. The scaling factor is the eigenvalue, and it's usually designated with the symbol lambda.



Subtraction gives us



Factoring out X gives us



Earlier (in Scilab) we calculated AA^T and its eigenvalues, 20 and 245.



First, we'll plug AA^T and 20 into the previous equation.



The product results in these equations.

144*x1 - 108*x2 = 0
-108*x1 + 81*x2 = 0

You might be habitually inclined to seek a unique solution to the system, but we're looking for a vector rather than a point. The two equations are equivalent, and we can find the relationship between x1 and x2 by looking at either. Let's simply look at the first equation.

144*x1 - 108*x2 = 0
144*x1 = 108*x2
x1 = 108/144*x2
x1 = (3/4)*x2

Any multiple of (3/4, 1) should satisfy the equation. Since we'd like the vectors to be unit length (see SVD and orthonormal bases), let's normalize the length of the vector.

Following the Pythagorean theorem, the length of (3/4, 1) is:

sqrt((3/4)^2 + 1^2) = sqrt(9/16 + 16/16) = sqrt(25/16) = 5/4

It's nice how that worked out.

Our unit length eigenvector is:

(1/(5/4))*(3/4, 1) = (4/5)*(3/4, 1) = (3/5, 4/5)

We have one of the four eigenvectors we need to calculate.
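The by-hand procedure above can be sketched in a few lines of pure Python (my own code): read the eigenvector's direction off the first row of (M - lambda*I) and normalize to unit length.

```python
import math

# Assumes the first row of (M - lam*I) is nonzero, which holds for the
# examples in this post.

def unit_eigvec_2x2(M, lam):
    a, b = M[0][0] - lam, M[0][1]
    # the first row says a*x1 + b*x2 = 0, so (x1, x2) is proportional to (-b, a)
    x1, x2 = -b, a
    n = math.hypot(x1, x2)
    return (x1 / n, x2 / n)

AAt = [[164.0, -108.0], [-108.0, 101.0]]
# unit_eigvec_2x2(AAt, 20.0) gives (0.6, 0.8), i.e. (3/5, 4/5)
```

Keep in mind an eigenvector is only determined up to sign: for the eigenvalue 245, this routine returns (4/5, -3/5), the negation of the (-4/5, 3/5) derived by hand, and both are equally valid.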



Next we return to the other eigenvalue, 245, and plug it into our eigenvector equation.



This time we get

-81*x1 -108*x2 = 0
-108*x1 -144*x2 = 0

Looking at the first equation again

-81*x1 -108*x2 = 0
-81*x1 = 108*x2
x1 = -108/81*x2
x1 = -4/3*x2

This time our eigenvector solution is (-4/3, 1), which we'll normalize below.

Find the length of the vector

sqrt((-4/3)*(-4/3)+1*1) = sqrt( 16/9 + 9/9) = sqrt(25/9) = 5/3

Again, scale by 1/length

(1/(5/3))*(-4/3,1) = (3/5)*(-4/3,1) = (-4/5, 3/5)

So now we have unit-length eigenvectors of AA^T:

(3/5, 4/5) and (-4/5, 3/5).

AA^T is symmetric and the eigenvectors of a symmetric matrix are orthogonal, which means their dot product is zero. This is clear by looking at them, but I'll multiply it out, because this post is all about being verbose:

(3/5, 4/5) dot (-4/5, 3/5) = (3/5)*(-4/5) + (4/5)*(3/5) = -12/25 + 12/25 = 0

Let's put the eigenvector corresponding to the largest eigenvalue in the first column of U; it corresponds to the top-left "singular value" in the S matrix.



Now that we've calculated U and S, we can solve for V. In doing this, I'm going to exploit some of the properties of U and S.

First, let's multiply both sides of the equation on the left by the inverse of U. If you recall, because U is an orthogonal matrix, its inverse is its transpose, so we simply need to multiply by the transpose of U to eliminate U on the right-hand side. In this case, U happens to be symmetric, so it's also its own transpose (that won't always be the case).

A = U*S*V^T
U^-1*A = U^-1*U*S*V^T
U^-1*A = S*V^T =



Multiplying both sides by the inverse of S will leave us with VT on the right side. Because S is a diagonal matrix, the inverse is calculated by replacing each element on the diagonal with its reciprocal.

U^-1*A = S*V^T
S^-1*U^-1*A = S^-1*S*V^T
S^-1*U^-1*A = V^T =



Multiplying everything out leaves us with the following:



At this point we know U, S and the transpose of V. Putting everything back together leaves us with:

A = U*S*V^T =
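To close the loop, here's a pure-Python sketch (mine, not from the post) that takes the U and S from the hand calculation, with U's columns ordered largest singular value first and signs matching the derivation above, recovers V^T as S^-1 * U^T * A, and confirms that U*S*V^T reproduces A:

```python
import math

def matmul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose2(X):
    return [[X[0][0], X[1][0]], [X[0][1], X[1][1]]]

A = [[-10.0, 8.0], [10.0, -1.0]]
U = [[-4/5, 3/5], [3/5, 4/5]]     # eigenvectors of A*A^T, largest eigenvalue first
S = [[math.sqrt(245.0), 0.0], [0.0, math.sqrt(20.0)]]
S_inv = [[1.0 / S[0][0], 0.0], [0.0, 1.0 / S[1][1]]]  # reciprocals on the diagonal

Vt = matmul2(S_inv, matmul2(transpose2(U), A))        # V^T = S^-1 * U^T * A
A_rebuilt = matmul2(U, matmul2(S, Vt))
# A_rebuilt matches A up to floating-point rounding
```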



Update (1/12/2007): If you're reading this, you may be interested in Jim Blinn's paper titled Consider the Lowly 2x2, which you can find on IEEE or in his book Notation, Notation, Notation.

If you want to write a program to calculate the SVD, don't infer the algorithms from this post; sound computational means are found in the classic text Matrix Computations by Golub & Van Loan.

This is post #12 of Linear Algebra for Graphics Geeks. Here are links to posts #1, #2, #3, #4, #5, #6, #7, #8, #9, #10, #11.

Next, some commentary on matrices, ellipses and eigen decomposition.

Tuesday, October 03, 2006

Sumac

Sumac

Money Momentum

Possibly the funniest thing Lorne Michaels ever produced is The Kids in the Hall (1988-1994). Today, I stumbled onto the classic KitH spoof of get-rich-quick-hucksterism, Money Momentum.



Hilarious!

link

Monday, October 02, 2006

Computational Photography



Catching up on things, I realized today that the August 2006 issue of IEEE Computer is focused on computational photography, with articles by Michael F. Cohen, Richard Szeliski, Paul Debevec, Marc Levoy, Fredo Durand and others. Follow the link for article descriptions.

Netflix $1,000,000 Programming Challenge

Here's a programming challenge I'd love to take a stab at. Netflix is offering $1M to anyone who can come up with a 10% improvement in their recommendation system. I don't feel I have enough free time to make a serious go of it, so chances are low I'll actually enter, but it's nonetheless very interesting to me.

via Machine Learning (Theory)

Sunday, October 01, 2006

The Devil and Daniel Johnston

Recommended: The Devil and Daniel Johnston



"No wonder Kurt Cobain was a fan. But it's the way Feuerzeig walks with him on the line between creativity and madness that digs this haunting and hypnotic film into your memory." - Peter Travers, Rolling Stone

"Jeff Feuerzeig, who won the best-director award at the 2005 Sundance festival, cobbles together a moving portrait of the artist as his own ghost, using a wealth of material provided by Mr. Johnston, from home movies to audiocassette diaries to dozens of original, and often heartbreakingly beautiful, songs." - Lawrence Van Gelder, NYT