Is this tensor equation correct?

Discussion in 'Physics & Math' started by neelakash, Jul 11, 2009.

  1. neelakash Registered Senior Member

    Messages:
    491
    I am to prove \(\det[g_{ij}] = V^2\), where \([g_{ij}]\) is the metric tensor and \(V=\epsilon_i\cdot(\epsilon_j\times\epsilon_k)\)

    Now I was wondering if the identity

    \(\epsilon_{ijk}\epsilon_{lmn}= \begin{vmatrix} g_{11} & g_{21} & g_{31}\\ g_{12} & g_{22} & g_{32}\\ g_{13} & g_{23} & g_{33} \end{vmatrix}\)

    is correct, in which case the problem is solved easily. Note that both antisymmetric tensors here are covariant.

    Surely the best way to know whether it is correct is to prove it... but somehow I am not getting it. If it is correct, can anyone give me a hint?
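
    In case a numerical sanity check helps before attempting the proof, here is a minimal Python sketch of the claim \(\det[g_{ij}] = V^2\). The basis vectors are arbitrary made-up values and the helper functions are hand-rolled, not from any library:

```python
# Sanity check of det[g_ij] = V^2, where g_ij = e_i . e_j and
# V = e_1 . (e_2 x e_3).  The basis is an arbitrary non-orthogonal
# example; the helpers are hand-rolled, not from any library.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def det3(m):
    # Cofactor expansion along the first row of a 3x3 matrix.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

basis = [(1.0, 0.5, 0.2), (0.3, 2.0, 0.1), (0.4, 0.7, 1.5)]
g = [[dot(ei, ej) for ej in basis] for ei in basis]   # metric g_ij
V = dot(basis[0], cross(basis[1], basis[2]))          # triple product

print(det3(g), V ** 2)  # the two values agree up to rounding
```

    Swapping in any other basis gives the same agreement, since \(g = E E^T\) for the matrix \(E\) of basis rows, so \(\det g = (\det E)^2\).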
     
    Last edited: Jul 11, 2009
  3. AlphaNumeric Fully ionized Registered Senior Member

    Messages:
    6,702
    You mean to ask whether det(g) is equal to the triple product of its rows? If the rows of \(g\) are \(a_{1}, a_{2}, a_{3}\) then \(\det(g) = a_{1} \cdot ( a_{2} \times a_{3} )\), which you can check by hand just by expanding the determinant of g.

    \(\epsilon_{ijk}\epsilon_{lmn}\) is a rank 6 tensor, not a scalar, so it cannot be equal to det(g) unless you're abusing notation somehow. I think you're getting mixed up by confusing the antisymmetric rank 3 tensor \(\epsilon_{ijk}\) with the 3 vectors which make up the rows of your metric. Suppose you write \(g = ( u , v , w )\) where each row has entries \(u_{i}\) etc. Then det(g) = \(u \cdot (v \times w) = u_{i} \epsilon_{ijk}v_{j} w_{k}\). That is the tensor identity with correct indices.

    You can see that det(g) = \(u \cdot (v \times w) = u_{i} \epsilon_{ijk}v_{j} w_{k}\) by considering how you work out the determinant: you pick a row or column (in this case u), work out the corresponding cofactors, and dot those with the entries of u. This reduces the determinant to the triple product of the row or column vectors. In other dimensions, i.e. a general n x n metric, you have n rows or columns and you need n-1 of them to form a well defined vector product (in 3 dimensions you need 2), which you then dot with the remaining row or column to get a number.

    The triple product (and its generalisations) is the volume of the parallelepiped whose sides are the vectors forming the rows or columns of the matrix; hence you can relate it to a volume element, which is why det(g) appears in so many integrals when doing multi-dimensional calculus.
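
    The row-reading of the determinant described above can be sketched like this (the matrix entries are arbitrary illustrative values, and `det3` is a hand-rolled cofactor expansion rather than a library call):

```python
# For a 3x3 matrix with rows u, v, w, the determinant equals the
# triple product of the rows: det = u . (v x w).
# The entries below are arbitrary illustrative values.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def det3(m):
    # Cofactor expansion along the first row.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

u = (2.0, 1.0, 0.0)
v = (0.5, 3.0, 1.0)
w = (1.0, 0.0, 4.0)

print(det3([u, v, w]))       # determinant via cofactors: 23.0
print(dot(u, cross(v, w)))   # same number via the triple product: 23.0
```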

    /edit

    Bloody lack of \pmatrix latex support!
     
  5. neelakash Registered Senior Member

    Messages:
    491
    \(\epsilon_{ijk}= +V\) means the ijk-th component of the antisymmetric epsilon tensor is equal to V. By the way, if \(e_i,\ e_j,\ e_k\)
    are orthonormal, V = 1.

    So, can we write \(\epsilon_{ijk}\epsilon_{lmn}= V \cdot V = V^2\) to mean that as we multiply the ijk-th element of the \(\epsilon\) tensor by the lmn-th element of the \(\epsilon\) tensor, we get the squared volume of the parallelepiped spanned by the vectors \(e_i,\ e_j,\ e_k\)?
     
  7. AlphaNumeric Fully ionized Registered Senior Member

    Messages:
    6,702
    Except this isn't true in general, as \(\epsilon_{111} = 0\), \(\epsilon_{123} = +1\) and \(\epsilon_{132} = -1\). The entire point of it being antisymmetric is that it satisfies \(\epsilon_{ijk} = \epsilon_{[ijk]}\) which instantly means if any of the indices are equal then the entry is zero.

    I think I see what you're getting at but you're failing to grasp how tensors etc work.

    Let \(a_{1}\), \(a_{2}\) and \(a_{3}\) be an orthonormal basis. You can form the triple product \([a_{1},a_{2},a_{3}] = a_{1}\cdot (a_{2} \times a_{3})\). The triple product is antisymmetric in its labels, so if \(A_{ijk} = [a_{i},a_{j},a_{k}] = a_{i} \cdot (a_{j} \times a_{k}) \) then you find that \(A_{ijk} \propto \epsilon_{ijk}\), where the proportionality constant is 1 because the basis is normalised, so \(\epsilon_{ijk} = a_{i} \cdot (a_{j} \times a_{k})\).

    Now suppose it's not normalised. Then you have \(A_{ijk} \propto \epsilon_{ijk}\) with proportionality constant V, the volume, which is what you mean: the non-zero entries of the antisymmetric tensor are not +1 and -1 but +V and -V, i.e. \(A_{ijk} = [a_{i},a_{j},a_{k}] = a_{i} \cdot (a_{j} \times a_{k}) = V \epsilon_{ijk}\).

    If it's not orthogonal either, then \(A_{ijk}\) is not proportional to the isotropic antisymmetric tensor \(\epsilon_{ijk}\) but is a non-orthonormal generalisation of it. In orthogonal bases you have \((v \times w)_{i} = \epsilon_{ijk}v_{j}w_{k}\) for the isotropic tensor \(\epsilon\), but if it's not an orthonormal basis then \(\epsilon\) is modified.

    I'm going to stop there because otherwise I'll end up just giving you the solution. Write cross products in terms of some antisymmetric tensor, which is \(\epsilon_{ijk}\) in orthonormal bases.
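
    The orthogonal-but-not-normalised case above can be checked numerically. A sketch, assuming an orthogonal basis with made-up scale factors and a hand-rolled Levi-Civita symbol:

```python
# Checks A_ijk = e_i . (e_j x e_k) = V * eps_ijk for an orthogonal but
# non-normalised basis.  The scale factors are made-up illustrative values.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def eps(i, j, k):
    # Levi-Civita symbol for indices 0..2: +1/-1 for even/odd
    # permutations of (0, 1, 2), 0 if any index repeats.
    return (j - i) * (k - j) * (k - i) // 2

basis = [(2.0, 0.0, 0.0), (0.0, 3.0, 0.0), (0.0, 0.0, 0.5)]  # orthogonal
V = dot(basis[0], cross(basis[1], basis[2]))                  # volume = 3.0

for i in range(3):
    for j in range(3):
        for k in range(3):
            A = dot(basis[i], cross(basis[j], basis[k]))
            assert abs(A - V * eps(i, j, k)) < 1e-12
print("A_ijk = V * eps_ijk holds for all 27 index choices")
```

    Replacing the basis with a non-orthogonal one makes the assertion fail, which is exactly the point made above: \(\epsilon\) must then be generalised.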
     
    Last edited: Jul 12, 2009
  8. AlphaNumeric Fully ionized Registered Senior Member

    Messages:
    6,702
    Beware the three-i'd monster. When using suffixes, you should never have more than 2 of them being the same letter if you're summing over them, else it is ambiguous.

    What you mean to say is \((c^{i} e_{i} ) \cdot e^{j} = c^{i} (e_{i} \cdot e^{j}) = c^{i} \delta_{i}^{j} = c^{j}\)

    Yes, that might seem like "You're being pedantic", but should you ever progress to doing something like \(R^{abcd}R_{abcd}\) where \(R^{a}_{bcd} = \partial_{c}\Gamma^{a}_{bd} - \partial_{d}\Gamma^{a}_{bc} + \Gamma^{e}_{bc}\Gamma^{a}_{de} - \Gamma^{e}_{bd}\Gamma^{a}_{ce}\) you'll be really happy you developed good index habits early!
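
    The contraction \((c^{i} e_{i}) \cdot e^{j} = c^{j}\) can be made concrete in a few lines. A sketch, assuming a made-up non-orthogonal basis and using the standard dual-basis construction \(e^{1} = (e_{2} \times e_{3})/V\) etc. (nothing here is from the thread itself):

```python
# Illustrates the contraction (c^i e_i) . e^j = c^j in components.
# The dual basis is built as e^1 = (e_2 x e_3)/V etc., which gives
# e_i . e^j = delta_i^j.  Basis and components are made-up values.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

basis = [(1.0, 0.5, 0.2), (0.3, 2.0, 0.1), (0.4, 0.7, 1.5)]  # non-orthogonal
V = dot(basis[0], cross(basis[1], basis[2]))
dual = [tuple(x / V for x in cross(basis[(i + 1) % 3], basis[(i + 2) % 3]))
        for i in range(3)]

c = (2.0, -1.0, 0.5)                                   # components c^i
vec = tuple(sum(c[i] * basis[i][a] for i in range(3))  # the vector c^i e_i
            for a in range(3))

# Dotting with the dual basis picks out each component again.
recovered = [dot(vec, dual[j]) for j in range(3)]
print(recovered)  # approximately [2.0, -1.0, 0.5]
```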

    Other than that it looks good, though I don't know how much you got from your reference.
     
  9. neelakash Registered Senior Member

    Messages:
    491
    Basically I did the problem in some other way, but I wanted to know if it could be done in this way...

    ok,let me go through the concepts...
     
  10. neelakash Registered Senior Member

    Messages:
    491
    I am sorry I deleted my post after you had given a reply. This morning I could do it my way.

    Let \(e_i,\ e_j,\ e_k\) form a basis, not necessarily orthogonal. Then
    \(e_i\cdot(e_j\times e_k)\) is the volume |V| spanned by the basis vectors.
    In the deleted post I intended to prove
    \(e_i\cdot(e_j\times e_k)=\epsilon_{ijk}\)

    Thus it follows that for a non-orthogonal basis,
    \(e_i\cdot(e_j\times e_k)=\epsilon_{ijk}=+V\)
    so for an even permutation of ijk, the ijk-th element of the \(\epsilon\) tensor is +V, and

    \(e_i\cdot(e_j\times e_k)=\epsilon_{ijk}=-V\)
    so for an odd permutation of ijk, the ijk-th element of the \(\epsilon\) tensor is -V.

    However, if the basis is orthonormal, we would have |V| = 1.

    Thus \(V^2=\epsilon_{ijk}\epsilon_{lmn}=[e_i\cdot(e_j\times e_k)][e_l\cdot(e_m\times e_n)]\) for even permutations of {i,j,k} and {l,m,n}, where {i,j,k},{l,m,n}={1,2,3}

    Let \(e_i=i,\ e_j=j,\ e_k=k,\ e_l=l,\ e_m=m,\ e_n=n\) (just renaming the vectors for brevity).

    Earlier I was failing to evaluate the following product of determinants; it works out as:

    \(\begin{vmatrix} i_1 & i_2 & i_3\\ j_1 & j_2 & j_3\\ k_1 & k_2 & k_3 \end{vmatrix} \begin{vmatrix} l_1 & l_2 & l_3\\ m_1 & m_2 & m_3\\ n_1 & n_2 & n_3 \end{vmatrix} = \begin{vmatrix} i_1 l_1 + i_2 l_2 + i_3 l_3 & i_1 m_1 + i_2 m_2 + i_3 m_3 & i_1 n_1 + i_2 n_2 + i_3 n_3\\ j_1 l_1 + j_2 l_2 + j_3 l_3 & j_1 m_1 + j_2 m_2 + j_3 m_3 & j_1 n_1 + j_2 n_2 + j_3 n_3\\ k_1 l_1 + k_2 l_2 + k_3 l_3 & k_1 m_1 + k_2 m_2 + k_3 m_3 & k_1 n_1 + k_2 n_2 + k_3 n_3 \end{vmatrix} \\ = \begin{vmatrix} i\cdot l & i\cdot m & i\cdot n\\ j\cdot l & j\cdot m & j\cdot n\\ k\cdot l & k\cdot m & k\cdot n \end{vmatrix} \\ = \begin{vmatrix} e_i\cdot e_l & e_i\cdot e_m & e_i\cdot e_n\\ e_j\cdot e_l & e_j\cdot e_m & e_j\cdot e_n\\ e_k\cdot e_l & e_k\cdot e_m & e_k\cdot e_n \end{vmatrix} \\ = \begin{vmatrix} g_{il} & g_{im} & g_{in}\\ g_{jl} & g_{jm} & g_{jn}\\ g_{kl} & g_{km} & g_{kn} \end{vmatrix} \\ = \det[g] \)
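
    This determinant chain can also be verified numerically over all index choices. A sketch, with an arbitrary made-up non-orthogonal basis and hand-rolled helpers:

```python
# Numerical check of the determinant chain:
# [e_i.(e_j x e_k)] [e_l.(e_m x e_n)] equals the determinant of the
# 3x3 matrix with entries g_ab = e_a . e_b, rows picked by (i, j, k)
# and columns by (l, m, n).  The basis is an arbitrary non-orthogonal
# example.

import itertools

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def det3(m):
    # Cofactor expansion along the first row.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

basis = [(1.0, 0.5, 0.2), (0.3, 2.0, 0.1), (0.4, 0.7, 1.5)]
g = [[dot(ea, eb) for eb in basis] for ea in basis]  # metric g_ab

def triple(i, j, k):
    return dot(basis[i], cross(basis[j], basis[k]))

for ijk in itertools.product(range(3), repeat=3):
    for lmn in itertools.product(range(3), repeat=3):
        lhs = triple(*ijk) * triple(*lmn)
        rhs = det3([[g[a][b] for b in lmn] for a in ijk])
        assert abs(lhs - rhs) < 1e-9
print("identity holds for all 729 index combinations")
```

    In particular, choosing i,j,k = l,m,n = 1,2,3 recovers \(V^2 = \det[g]\), the original problem.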
     
    Last edited: Jul 15, 2009
