Whoa, expressing the standard dot and cross products (and, later, div and curl) as tensor rank reductions is really cool!
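To make sure I was following, I tried the idea out in NumPy; the explicit Levi-Civita array and the einsum index strings below are my own construction, not anything from the post, so treat this as a rough sketch:

```python
import numpy as np

# Levi-Civita symbol epsilon_ijk in 3D, filled in by hand.
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Dot product: contract the rank-2 outer product a_i b_j against delta_ij,
# reducing rank 2 down to rank 0 (a scalar).
dot = np.einsum('i,i->', a, b)

# Cross product: contract a_j b_k against eps_ijk,
# reducing rank 2 down to rank 1 (a vector).
cross = np.einsum('ijk,j,k->i', eps, a, b)

print(dot, np.dot(a, b))      # 32.0 32.0
print(cross, np.cross(a, b))  # [-3.  6. -3.] [-3.  6. -3.]
```

It really does reproduce np.dot and np.cross, which makes the rank-reduction picture feel concrete.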
I'm not sure what to make of the fact that the gradient of a scalar doesn't transform the same way as a 'normal' vector. What is the distinction, and what does contravariance actually mean? I don't get it, and the text doesn't go into detail.
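For anyone else puzzling over the same thing, here is the contrast as I understand it from the chain rule (my notation, not the text's): under a coordinate change $x \to x'$, a 'normal' (contravariant) vector like velocity picks up the Jacobian, while the gradient of a scalar picks up the inverse Jacobian:

$$ v'^i = \frac{\partial x'^i}{\partial x^j}\, v^j \qquad \text{(contravariant)} $$

$$ \frac{\partial f}{\partial x'^i} = \frac{\partial x^j}{\partial x'^i}\, \frac{\partial f}{\partial x^j} \qquad \text{(covariant)} $$

The two laws only coincide for rotations, where the Jacobian equals its own inverse transpose, which is presumably why the distinction is so easy to miss in ordinary vector calculus.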