Thursday, January 24, 2008

Reading for 1/28, 1.1-1.5

Whoa, expressing the standard dot products and cross products (and, later, div and curl) in terms of tensor rank-reductions is really cool!
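To make this concrete for myself, here's a quick NumPy sketch (not from the text) of both products as index contractions: the dot product contracts a_i b_i over the shared index, and the cross product contracts the rank-3 Levi-Civita symbol against two vectors, (a × b)_k = ε_ijk a_i b_j.

```python
import numpy as np

# Levi-Civita symbol epsilon_ijk as a rank-3 tensor
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Dot product: contract over the single shared index i
dot = np.einsum('i,i->', a, b)

# Cross product: contract epsilon_ijk against a_i and b_j, leaving index k
cross = np.einsum('ijk,i,j->k', eps, a, b)

print(dot)    # same as np.dot(a, b)
print(cross)  # same as np.cross(a, b)
```

Each contraction reduces the total rank by two, which is exactly the rank-reduction the text describes.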

I'm not sure what to make of the fact that the gradient of a scalar doesn't transform the same way as a 'normal' vector. What is the distinction, and what does contravariance actually mean? I don't get it, and the text doesn't go into detail.
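Trying to pin down the distinction with a toy example of my own (not from the text): under a linear change of coordinates x' = A x, a displacement vector's components transform with A itself (contravariant), while the gradient's components transform with the inverse transpose of A (covariant), precisely so that the scalar grad·v stays invariant.

```python
import numpy as np

# A made-up linear change of coordinates x' = A x
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])   # stretch x by 2, y by 3

v = np.array([1.0, 1.0])     # a displacement: contravariant components

# Contravariant components transform with A itself
v_prime = A @ v

# The scalar field f(x, y) = x + y has gradient (1, 1) in the old coordinates
grad = np.array([1.0, 1.0])

# Covariant (gradient) components transform with the inverse transpose
grad_prime = np.linalg.inv(A).T @ grad

# The contraction grad . v is a scalar, so it must come out the same either way
print(grad @ v, grad_prime @ v_prime)
```

If the gradient transformed like the displacement, that contraction would pick up factors of A twice and wouldn't be invariant, which I think is the whole point of the distinction.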
