On tensors and their matrix representations

Here's something that caused me undue confusion over the years as I was learning about tensors.

Many textbooks, for instance, will tell you that the metric tensor of special relativity takes a form like

\[ \eta_{\mu\nu}=\begin{pmatrix}-1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1\end{pmatrix}, \]

and of course we all know what they mean, but this is technically incorrect. Why? Well, let us try to multiply a contravariant vector by this metric tensor. What should we get? Why, it's \(\eta_{\mu\nu}v^\nu=v_\mu\), i.e., a covariant vector. OK, so let's do this in matrix form, where contravariant vectors are represented as column vectors:

\[ \begin{pmatrix}-1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1\end{pmatrix}\cdot\begin{pmatrix}t\\x\\y\\z\end{pmatrix}=\begin{pmatrix}-t\\x\\y\\z\end{pmatrix}.\]

Something is wrong here. Instead of a row vector representing a covariant vector, we got another column vector. How can this be?
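The mismatch is easy to see numerically. A quick NumPy sketch (the component values are made up for illustration):

```python
import numpy as np

# Minkowski metric components, signature (-+++), as in the text
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

# a contravariant vector (t, x, y, z) as a 4x1 column
v = np.array([[2.0], [3.0], [5.0], [7.0]])

# the naive matrix product: column in, column out
col = eta @ v
print(col.shape)   # (4, 1) -- still a column, not a row

# to end up with a row (covariant) vector, we must multiply from the left
row = v.T @ eta
print(row)         # [[-2.  3.  5.  7.]]
```

Either way the components come out the same; the complaint is purely about which way the result is laid out.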

Let's take a step backward. What, exactly, does a 4×4 matrix represent? It has 4 rows and 4 columns: when it multiplies a column vector, its columns are contracted against the (contravariant) input, so the column index is covariant, while its rows index the resulting column vector, so the row index is contravariant. In other words, a matrix represents a tensor with one covariant and one contravariant index. I.e., what we wrote above makes no sense. What would make sense is this:

\[ \eta^\mu_\nu=\begin{pmatrix}-1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1\end{pmatrix}, \]

but of course this doesn't work very well either, since we know that the "mixed index metric tensor" is really just the identity matrix (e.g., \(\eta_{\mu\nu}\eta^{\nu\xi}=\delta_\mu^\xi\)). So how could we write the metric tensor correctly in matrix form? Well... it has two covariant indices. Meaning it has... a row of four row vectors? Let's try:
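This is easy to verify in a couple of lines of NumPy: build \(\eta^{\mu\nu}\) as the matrix inverse of \(\eta_{\mu\nu}\) and contract, using `einsum` for explicit index bookkeeping (a sketch):

```python
import numpy as np

eta_lo = np.diag([-1.0, 1.0, 1.0, 1.0])   # eta_{mu nu}
eta_hi = np.linalg.inv(eta_lo)            # eta^{mu nu}, the matrix inverse

# eta_{mu nu} eta^{nu xi} = delta_mu^xi: the mixed form is the identity
mixed = np.einsum('mn,nx->mx', eta_lo, eta_hi)
print(np.allclose(mixed, np.eye(4)))   # True
```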

\[\eta_{\mu\nu}=\left[(-1~0~0~0)~(0~1~0~0)~(0~0~1~0)~(0~0~0~1)\right].\]

For this to make any sense, we should be able to multiply a contravariant vector by it on the left and get a covariant vector. Let's give it a try:

\[\left[(-1~0~0~0)~(0~1~0~0)~(0~0~1~0)~(0~0~0~1)\right]\cdot\begin{pmatrix}t\\x\\y\\z\end{pmatrix}=?\]

Can this multiplication be carried out? We need to tweak the usual rules of matrix multiplication a little, but it still kind of makes sense. Let's take the first element of the object on the left, the row vector \((-1~0~0~0)\), and take its scalar product with the column vector on the right; we get \(-t\). Repeat this procedure for the second, third, and fourth elements: we get \(x\), \(y\), and \(z\), respectively. Since the four elements on the left were arranged in a row, we arrange the results in a row: \((-t~x~y~z)\). This looks a great deal more like a covariant vector!
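The tweaked rule described above can be spelled out in a few lines of plain Python (a sketch; the function name is my own):

```python
def lower_index(eta_rows, v):
    """Contract a 'row of row vectors' eta_{mu nu} with a column vector v^nu:
    dot each row vector with the column, and lay the results out in a row."""
    return [sum(e * c for e, c in zip(row, v)) for row in eta_rows]

# the metric as a row of four row vectors
eta_rows = [(-1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)]

t, x, y, z = 2, 3, 5, 7
print(lower_index(eta_rows, [t, x, y, z]))   # [-2, 3, 5, 7]
```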

Better yet, this way we can in fact represent objects of higher valence. For instance, the components of the Christoffel symbols \(\Gamma_{\mu\nu}^\xi\) (not themselves a tensor, but carrying indices all the same) in some coordinate representation could be arranged as

\[\begin{pmatrix} (\Gamma_{00}^0~\Gamma_{01}^0~\Gamma_{02}^0~\Gamma_{03}^0)& (\Gamma_{10}^0~\Gamma_{11}^0~\Gamma_{12}^0~\Gamma_{13}^0)& (\Gamma_{20}^0~\Gamma_{21}^0~\Gamma_{22}^0~\Gamma_{23}^0)& (\Gamma_{30}^0~\Gamma_{31}^0~\Gamma_{32}^0~\Gamma_{33}^0)\\ (\Gamma_{00}^1~\Gamma_{01}^1~\Gamma_{02}^1~\Gamma_{03}^1)& (\Gamma_{10}^1~\Gamma_{11}^1~\Gamma_{12}^1~\Gamma_{13}^1)& (\Gamma_{20}^1~\Gamma_{21}^1~\Gamma_{22}^1~\Gamma_{23}^1)& (\Gamma_{30}^1~\Gamma_{31}^1~\Gamma_{32}^1~\Gamma_{33}^1)\\ (\Gamma_{00}^2~\Gamma_{01}^2~\Gamma_{02}^2~\Gamma_{03}^2)& (\Gamma_{10}^2~\Gamma_{11}^2~\Gamma_{12}^2~\Gamma_{13}^2)& (\Gamma_{20}^2~\Gamma_{21}^2~\Gamma_{22}^2~\Gamma_{23}^2)& (\Gamma_{30}^2~\Gamma_{31}^2~\Gamma_{32}^2~\Gamma_{33}^2)\\ (\Gamma_{00}^3~\Gamma_{01}^3~\Gamma_{02}^3~\Gamma_{03}^3)& (\Gamma_{10}^3~\Gamma_{11}^3~\Gamma_{12}^3~\Gamma_{13}^3)& (\Gamma_{20}^3~\Gamma_{21}^3~\Gamma_{22}^3~\Gamma_{23}^3)& (\Gamma_{30}^3~\Gamma_{31}^3~\Gamma_{32}^3~\Gamma_{33}^3) \end{pmatrix}.\]
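In array terms this arrangement is just a 4×4×4 nested structure, and contractions follow the same dot-and-arrange rule. A NumPy sketch (the \(\Gamma\) values here are random placeholders, not a real connection):

```python
import numpy as np

rng = np.random.default_rng(0)
Gamma = rng.standard_normal((4, 4, 4))   # Gamma[xi, mu, nu], random placeholder
u = rng.standard_normal(4)               # u^mu
v = rng.standard_normal(4)               # v^nu

# contracting both covariant slots leaves one contravariant index:
# a^xi = Gamma^xi_{mu nu} u^mu v^nu
a = np.einsum('xmn,m,n->x', Gamma, u, v)

# the same contraction spelled out over the nested arrangement
b = np.array([sum(Gamma[xi, mu, nu] * u[mu] * v[nu]
                  for mu in range(4) for nu in range(4))
              for xi in range(4)])
print(np.allclose(a, b))   # True
```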

Or, when there are multiple contravariant indices, we get stacked columns. For instance, the inverse metric \(\eta^{\mu\nu}\), with its two contravariant indices, becomes a column of four column vectors:

\[\eta^{\mu\nu}=\begin{pmatrix} \begin{pmatrix}-1\\0\\0\\0\end{pmatrix}\\ \begin{pmatrix}0\\1\\0\\0\end{pmatrix}\\ \begin{pmatrix}0\\0\\1\\0\end{pmatrix}\\ \begin{pmatrix}0\\0\\0\\1\end{pmatrix} \end{pmatrix}.\]
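The stacked-columns object above is just the inverse metric \(\eta^{\mu\nu}\): contracting it with a covariant (row) vector should hand back a contravariant (column) vector, i.e., raise the index. A NumPy sketch with made-up components:

```python
import numpy as np

eta_hi = np.diag([-1.0, 1.0, 1.0, 1.0])   # eta^{mu nu}, the column of columns

w = np.array([-2.0, 3.0, 5.0, 7.0])       # covariant components w_mu
# raising the index: w^mu = eta^{mu nu} w_nu
w_up = np.einsum('mn,n->m', eta_hi, w)
print(w_up)   # [2. 3. 5. 7.]
```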

Of course it is not very convenient to write matrices like this, which is why we resort to the incorrect representation that is seen so often. Nevertheless, it is helpful to keep in mind what those representations really mean, in order to avoid confusion.