Partial matrix operations

Discussion in 'Physics & Math' started by arfa brane, Jun 23, 2015.

  1. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Yes, it's no big deal. Really as I mentioned I'm playing around with permutation matrices.

    Seriously, as a recent student of mathematics I recall the advice to do just that, change something and see what happens, look for patterns.

    This not-particularly-deep exercise does preserve multiplication in \( S_4 \): it takes the 4 x 4 multiplication operation to a 2 x 2 one. The latter seems a bit clunky at first, but the multiply-on-the-right rule isn't hard to learn. It changes the way you multiply, but not the result.
    And as demonstrated, you can do it in parts which you sum together.
    Yay.
     
  3. rpenner Fully Wired Valued Senior Member

    Messages:
    4,833
    I'm not sure that there is a faithful linear representation of Sym(4)'s 24 elements in 2x2 matrices. Here's a representation in 3x3 real matrices:

    \( \begin{array}{cccc} { \tiny \begin{pmatrix} 0 & 0 & -1 \\ 0 & -1 & 0 \\ 1 & -1 & 0 \end{pmatrix} } & { \tiny \begin{pmatrix} 0 & 0 & -1 \\ 0 & -1 & 0 \\ -1 & 0 & 0 \end{pmatrix} } & { \tiny \begin{pmatrix} 0 & 0 & -1 \\ 1 & 0 & -1 \\ 0 & 1 & -1 \end{pmatrix} } & { \tiny \begin{pmatrix} 0 & 0 & -1 \\ 1 & 0 & -1 \\ 1 & -1 & 0 \end{pmatrix} } \\ { \tiny \begin{pmatrix} 0 & 0 & -1 \\ -1 & 1 & -1 \\ 0 & 1 & -1 \end{pmatrix} } & { \tiny \begin{pmatrix} 0 & 0 & -1 \\ -1 & 1 & -1 \\ -1 & 0 & 0 \end{pmatrix} } & { \tiny \begin{pmatrix} 0 & -1 & 1 \\ 0 & -1 & 0 \\ 1 & -1 & 0 \end{pmatrix} } & { \tiny \begin{pmatrix} 0 & -1 & 1 \\ 0 & -1 & 0 \\ -1 & 0 & 0 \end{pmatrix} } \\ { \tiny \begin{pmatrix} 0 & -1 & 1 \\ 1 & -1 & 1 \\ 0 & 0 & 1 \end{pmatrix} } & { \tiny \begin{pmatrix} 0 & -1 & 1 \\ 1 & -1 & 1 \\ 1 & -1 & 0 \end{pmatrix} } & { \tiny \begin{pmatrix} 0 & -1 & 1 \\ -1 & 0 & 1 \\ 0 & 0 & 1 \end{pmatrix} } & { \tiny \begin{pmatrix} 0 & -1 & 1 \\ -1 & 0 & 1 \\ -1 & 0 & 0 \end{pmatrix} } \\ { \tiny \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 1 & -1 \end{pmatrix} } & { \tiny \begin{pmatrix} 1 & 0 & 0 \\ 1 & 0 & -1 \\ 0 & 1 & -1 \end{pmatrix} } & { \tiny \begin{pmatrix} 1 & 0 & 0 \\ 1 & 0 & -1 \\ 1 & -1 & 0 \end{pmatrix} } & { \tiny \begin{pmatrix} 1 & 0 & 0 \\ 1 & -1 & 1 \\ 0 & 0 & 1 \end{pmatrix} } \\ { \tiny \begin{pmatrix} 1 & 0 & 0 \\ 1 & -1 & 1 \\ 1 & -1 & 0 \end{pmatrix} } & { \tiny \begin{pmatrix} -1 & 1 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix} } & { \tiny \begin{pmatrix} -1 & 1 & 0 \\ 0 & 1 & 0 \\ 0 & 1 & -1 \end{pmatrix} } & { \tiny \begin{pmatrix} -1 & 1 & 0 \\ -1 & 0 & 1 \\ 0 & 0 & 1 \end{pmatrix} } \\ { \tiny \begin{pmatrix} -1 & 1 & 0 \\ -1 & 0 & 1 \\ -1 & 0 & 0 \end{pmatrix} } & { \tiny \begin{pmatrix} -1 & 1 & 0 \\ -1 & 1 & -1 \\ 0 & 1 & -1 \end{pmatrix} } & { \tiny \begin{pmatrix} -1 & 1 & 0 \\ -1 & 1 & -1 \\ -1 & 0 & 0 \end{pmatrix} } & { \tiny \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix} } \end{array} \)

    Just in case you want a relation between \(M_x\) and these 3x3 matrices:

    \( \begin{array}{llll} m_{17} = m_9 m_4^2 & m_{23} = m_4 m_9^2 m_4^2 & m_9 & m_{15} = m_4 m_9^2 \\ m_{11} = m_4 m_9^2 m_4 & m_{21} = m_9 m_4 & m_{16} = m_9^2 & m_{22} = m_4^2 m_9 m_4 \\ m_3 = m_4^2 & m_{13} = m_9^3 m_4 & m_5 = m_4 m_9^2 m_4^2 m_9 & m_{19} = m_9 m_4^2 m_9 \\ m_6 = m_4 m_9 & m_8 = m_9^2 m_4 & m_{14} = m_4 m_9^3 m_4 & m_2 = m_4 m_9^2 m_4 m_9^3 \\ m_{12} = m_4^2 m_9^2 & m_1 = m_4 m_9^2 m_4 m_9^3 m_4 & m_7 = m_4 m_9^2 m_4^2 m_9^2 & m_4 \\ m_{18} = m_9^3 & m_{10} = m_4^2 m_9 & m_{20} = m_9^2 m_4^2 & m_0 \end{array} \)

    \( \begin{pmatrix} ? & ? \\ ? & m_X \end{pmatrix} = { \tiny \begin{pmatrix} 1 & 1 & 1 & 1 \\ 0 & 1 & 1 & 1 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 1 \end{pmatrix} } M_X { \tiny \begin{pmatrix} 1 & -1 & 0 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 1 & -1 \\ 0 & 0 & 0 & 1 \end{pmatrix} } \)
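    For what it's worth, the claim that this conjugation yields a faithful 3 x 3 representation can be machine-checked. A pure-Python sketch (my own check, not code from the thread): conjugate every 4 x 4 permutation matrix by the upper-triangular matrix of ones and its inverse, take the lower-right 3 x 3 block, and test that the map is injective and multiplicative.

```python
from itertools import permutations

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

# U is the upper-triangular matrix of ones; Uinv is its inverse.
U    = [[1, 1, 1, 1], [0, 1, 1, 1], [0, 0, 1, 1], [0, 0, 0, 1]]
Uinv = [[1, -1, 0, 0], [0, 1, -1, 0], [0, 0, 1, -1], [0, 0, 0, 1]]

def perm_matrix(p):
    # row i has its 1 in column p[i]
    return [[1 if j == p[i] else 0 for j in range(4)] for i in range(4)]

def block(p):
    # lower-right 3x3 block of U M_p U^-1
    b = matmul(matmul(U, perm_matrix(p)), Uinv)
    return tuple(tuple(row[1:]) for row in b[1:])

perms = list(permutations(range(4)))

# Faithful: all 24 blocks are distinct.
assert len({block(p) for p in perms}) == 24

# Homomorphism: the block of a product is the product of the blocks.
for p in perms:
    for q in perms:
        pq = tuple(q[p[i]] for i in range(4))   # matmul(M_p, M_q) = M_pq
        prod = matmul(block(p), block(q))
        assert tuple(tuple(row) for row in prod) == block(pq)
```

The homomorphism works because every permutation matrix fixes the all-ones row vector, so each conjugated matrix has first row (1, 0, 0, 0) and the lower-right blocks multiply among themselves.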
     
    Last edited: Jul 7, 2015
  5. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    What I mean by 2 x 2 multiplication is perhaps better illustrated with an example:

    Take a pair of 4 x 4 permutation matrices from \( S_4 \) and multiply them together

    \( \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix} \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} \)

    But first rewrite both as block matrices:
    \( = \begin{pmatrix} \rho_1 & \rho_2 \\ \rho_2 & \rho_1 \end{pmatrix} \begin{pmatrix} \rho_1 & \overline{\rho_1} \\ \overline{\rho_2} & \rho_2 \end{pmatrix} \)

    Use: \( \{ \rho_i, \overline {\rho_i} \}\rho_i = \{ \rho_i, \overline {\rho_i} \} \), and \( \{ \rho_i, \overline {\rho_i} \}\overline {\rho_j} = \{ \overline {\rho_j}, \rho_j \} \) iff \( i \ne j \); otherwise write 0.

    I can do this in my head because I've learned the above rules, so I know the answer is: \( \begin{pmatrix} \rho_1 & \rho_2 \\ \overline{\rho_2} & \overline{\rho_1} \end{pmatrix} = M_8 \), which is \( M_{14}M_2 \).
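    This example can be checked numerically; a quick pure-Python sketch (my own check, not from the thread), with \( \rho_i = e_i e_i^{\intercal} \) and the overbarred versions written out as explicit 2 x 2 matrices:

```python
def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

# the two 4x4 permutation matrices being multiplied
A = [[1,0,0,0], [0,0,0,1], [0,0,1,0], [0,1,0,0]]
B = [[1,0,0,0], [0,0,1,0], [0,1,0,0], [0,0,0,1]]
AB = matmul(A, B)

# the four elements of P as explicit 2x2 matrices
r1, r2   = [[1,0],[0,0]], [[0,0],[0,1]]   # rho_1, rho_2
rb1, rb2 = [[0,0],[1,0]], [[0,1],[0,0]]   # rhobar_1, rhobar_2

def from_blocks(tl, tr, bl, br):
    # assemble a 4x4 matrix from four 2x2 blocks
    return [tl[0]+tr[0], tl[1]+tr[1], bl[0]+br[0], bl[1]+br[1]]

# the claimed answer: (rho_1  rho_2 / rhobar_2  rhobar_1)
assert AB == from_blocks(r1, r2, rb2, rb1)
```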

    It must be a faithful rep because it merely rewrites each 4 x 4 matrix as a 2 x 2 matrix of 2 x 2 blocks; nothing about the underlying matrix changes.
    Anyway, a question about your 3 x 3 rep above: are you using a vector basis, and if so, what is it?
     
    Last edited: Jul 8, 2015
  7. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Towards answering that question: it has a lot to do with the two 4 x 4 matrices on the left and right of \( M_X \) in your equation (the one on the right is the inverse of the upper triangular one on the left):

    The one on the right looks like it could be rewritten:\( { \begin{pmatrix} e_1 - e_2 \\ e_2 - e_3 \\ e_3 - e_4 \\ e_4 \end{pmatrix} } \)

    Now I just need to remember what triangular matrices do in linear algebra.

    But one other thing about vectors, you get the elements of the set P if you have, say: \( \nu_1 = {\tiny \begin{pmatrix} 1 \\ 0 \end{pmatrix}} ,\; \nu_2 = {\tiny \begin{pmatrix} 0 \\ 1 \end{pmatrix}} \), then

    \( \nu_1^{\intercal}\nu_2 = 0 \), but \( \nu_1\nu_2^{\intercal} = \overline{ \rho_2 } \), etc.
     
    Last edited: Jul 11, 2015
  8. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    So \( \{ \nu_1,\; \nu_2 \} \equiv \{ e_1,\; e_2 \} \), where \( \{ e_1,\; e_2 \} \) is the standard basis of \( \mathbb R^2 \).

    The "2 x 2" multiplication is also defined by :

    \( e_i e_i^{\intercal} = \rho_i,\; e_i e_j^{\intercal} = \overline{\rho_j}\; |\; i \ne j \)

    So we have \( \{ \rho_i,\; \overline{\rho_i} \} \rho_i \equiv \{ e_i e_i^{\intercal},\; e_j e_i^{\intercal} \} e_i e_i^{\intercal} \)

    \( = \{ e_i e_i^{\intercal} e_i e_i^{\intercal} ,\; e_j e_i^{\intercal}e_i e_i^{\intercal} \}\)

    \( = \{ e_i e_i^{\intercal},\; e_j e_i^{\intercal}\} \equiv \{ \rho_i,\; \overline{\rho_i} \} \), and similarly for \( \{ \rho_i,\; \overline{\rho_i} \} \overline{\rho_j} \).

    It's a lot easier to just do without the vectors, but it shows an underlying structure.
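    The identities in this post are easy to machine-check. A short pure-Python sketch (my own check, not from the thread), building P from outer products of the standard basis and verifying the multiplication rules:

```python
def outer(u, v):
    # outer product u v^T of two 2-vectors
    return [[x * y for y in v] for x in u]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

e = {1: [1, 0], 2: [0, 1]}                     # standard basis of R^2
rho  = {i: outer(e[i], e[i]) for i in (1, 2)}  # rho_i = e_i e_i^T
rbar = {1: outer(e[2], e[1]),                  # rhobar_j = e_i e_j^T, i != j
        2: outer(e[1], e[2])}

# {rho_i, rhobar_i} rho_i = {rho_i, rhobar_i}
for i in (1, 2):
    assert matmul(rho[i],  rho[i]) == rho[i]
    assert matmul(rbar[i], rho[i]) == rbar[i]

# {rho_i, rhobar_i} rhobar_j = {rhobar_j, rho_j} for i != j
for i, j in ((1, 2), (2, 1)):
    assert matmul(rho[i],  rbar[j]) == rbar[j]
    assert matmul(rbar[i], rbar[j]) == rho[j]
```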
     
    Last edited: Jul 15, 2015
  9. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Look for patterns:

    the set \(\{-1,\; 1\} \) is closed under multiplication, because

    \( \{-1,\; 1\}\cdot 1 = \{-1,\; 1\} \) and \( \{-1,\; 1\}\cdot (-1) = \{1,\; -1\} \).

    Looks something like the closure of P. But \( \{-1,\; 1\} \) under multiplication is isomorphic to \( S_2 \).
     
  10. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    The closure of P requires that we include the 2 x 2 zero matrix (I've left it out of P for convenience); that is, we take \( \mathcal{S} = P \cup \{0\} \), and then \( \mathcal{S} \) is a vector space.

    In this case, it's over the field {0, 1}, so we "stay in" the basis set if addition is modulo 2 (or restricted to finite sums of partial permutation matrices). When you start subtracting elements of \( \mathcal{S} \) from each other (adding negatives), you change the basis, and then you aren't using a semigroup action to multiply (you have a semigroup only under multiplication of the basis).
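    Closure of \( \mathcal{S} \) under multiplication is quick to verify; a pure-Python sketch (my own check, not from the thread), treating P as the four 2 x 2 matrix units plus the zero matrix:

```python
def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

# P = {rho_1, rho_2, rhobar_1, rhobar_2}
P = [[[1,0],[0,0]], [[0,0],[0,1]], [[0,0],[1,0]], [[0,1],[0,0]]]
zero = [[0,0],[0,0]]
S = P + [zero]

for a in S:
    for b in S:
        assert matmul(a, b) in S    # every product lands back in S

# an example where the zero matrix appears: rho_1 rho_2 = 0
assert matmul(P[0], P[1]) == zero
```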

    Going from a 4 x 4 to a 3 x 3 rep is exactly what this does, although then you're permuting dimensions in \( \mathbb R^4 \); it's hard (for me) to relate this to a braid group, though.

    Happily, {-1, 1, 0} is also closed under multiplication, and since \( -1 \equiv 1 \pmod 2 \), then \( \{-1, 1, 0\} \equiv \{1, 0\} \pmod 2 \). So now there is: \( \{\rho_i,\;\overline{\rho_i},\; 0\} \) compared to \( \{-1, 1, 0\} \).

    The modulo 2 equivalence between signs in the latter set goes to equal (column) indices in the former set.
     
    Last edited: Jul 18, 2015
  11. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    That says that there's a "row equivalence", or that \( \rho_i \) is equivalent "mod rows" to \( \overline{\rho_j} \) if i = j, or something like that.
     
  12. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    There's a lot more. Certain elements of \( S_4 \) display a symmetry under the action of the subset consisting of the eight block matrices with only diagonal or off-diagonal forms; another kind of equivalence emerges.

    That set of eight elements of \( S_4 \) again is:

    \( \Bigg\{\begin{pmatrix} I & 0 \\ 0 & I \end{pmatrix},\; \begin{pmatrix} I & 0 \\ 0 & X \end{pmatrix},\; \begin{pmatrix} X & 0 \\ 0 & I \end{pmatrix},\; \begin{pmatrix} X & 0 \\ 0 & X \end{pmatrix},\; \begin{pmatrix} 0 & I \\ I & 0 \end{pmatrix},\; \begin{pmatrix} 0 & I \\ X & 0 \end{pmatrix},\; \begin{pmatrix} 0 & X \\ I & 0 \end{pmatrix},\; \begin{pmatrix} 0 & X \\ X & 0 \end{pmatrix} \Bigg\} \), where \( I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix},\; X = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \).
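    These eight matrices can be checked to form a group; a pure-Python sketch (my own check, not from the thread) verifying closure, that the group is non-abelian of order 8, and that it has five involutions, i.e. it is dihedral rather than quaternion:

```python
def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def from_blocks(tl, tr, bl, br):
    # assemble a 4x4 matrix from four 2x2 blocks
    return [tl[0]+tr[0], tl[1]+tr[1], bl[0]+br[0], bl[1]+br[1]]

I = [[1,0],[0,1]]; X = [[0,1],[1,0]]; O = [[0,0],[0,0]]

G = [from_blocks(I,O,O,I), from_blocks(I,O,O,X),
     from_blocks(X,O,O,I), from_blocks(X,O,O,X),
     from_blocks(O,I,I,O), from_blocks(O,I,X,O),
     from_blocks(O,X,I,O), from_blocks(O,X,X,O)]
I4 = from_blocks(I,O,O,I)

for a in G:
    for b in G:
        assert matmul(a, b) in G                       # closure
assert any(matmul(a, b) != matmul(b, a)
           for a in G for b in G)                      # non-abelian
assert sum(matmul(g, g) == I4 for g in G) == 6         # identity + 5 involutions
```

A non-abelian group of order 8 with five involutions is the dihedral group of the square.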

    Now consider the matrix \( \begin{pmatrix} \rho_1 & \rho_2 \\ \rho_2 & \rho_1 \end{pmatrix}\) under the left action of the above set. Where X occurs, the \( \rho_i \) change signs in that row. If we take the two pairs: \( \Bigg\{\begin{pmatrix} I & 0 \\ 0 & I \end{pmatrix},\; \begin{pmatrix} X & 0 \\ 0 & X \end{pmatrix} \Bigg\} \) and \( \Bigg\{ \begin{pmatrix} 0 & I \\ I & 0 \end{pmatrix}, \; \begin{pmatrix} 0 & X \\ X & 0 \end{pmatrix} \Bigg\} \), the first pair does nothing or changes the overbar:

    \( \begin{pmatrix} X & 0 \\ 0 & X \end{pmatrix}\begin{pmatrix} \rho_1 & \rho_2 \\ \rho_2 & \rho_1 \end{pmatrix} = \begin{pmatrix} \overline{\rho_1} & \overline{\rho_2} \\ \overline{\rho_2} & \overline{\rho_1} \end{pmatrix} \).

    The second pair also swaps the rows, as well as either leaving the overbar sign unchanged or changing it:

    \( \begin{pmatrix} 0 & X \\ X & 0 \end{pmatrix}\begin{pmatrix} \overline{\rho_1} & \overline{\rho_2} \\ \overline{\rho_2} & \overline{\rho_1} \end{pmatrix} = \begin{pmatrix} \rho_2 & \rho_1 \\ \rho_1 & \rho_2 \end{pmatrix} \).

    Also \( \begin{pmatrix} \overline{\rho_1} & \overline{\rho_2} \\ \overline{\rho_2} & \overline{\rho_1} \end{pmatrix}^{ \intercal} = \begin{pmatrix} \overline{\rho_2} & \overline{\rho_1} \\ \overline{\rho_1} & \overline{\rho_2} \end{pmatrix} = \begin{pmatrix} 0 & I \\ I & 0 \end{pmatrix}\begin{pmatrix} \overline{\rho_1} & \overline{\rho_2} \\ \overline{\rho_2} & \overline{\rho_1} \end{pmatrix} = \begin{pmatrix} \overline{\rho_1}^{ \intercal} & \overline{\rho_2}^{ \intercal} \\ \overline{\rho_2}^{ \intercal} & \overline{\rho_1}^{ \intercal} \end{pmatrix} \)
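    The two worked products can be checked directly; a pure-Python sketch (my own check, not from the thread):

```python
def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def from_blocks(tl, tr, bl, br):
    return [tl[0]+tr[0], tl[1]+tr[1], bl[0]+br[0], bl[1]+br[1]]

I = [[1,0],[0,1]]; X = [[0,1],[1,0]]; O = [[0,0],[0,0]]
r1, r2   = [[1,0],[0,0]], [[0,0],[0,1]]   # rho_1, rho_2
rb1, rb2 = [[0,0],[1,0]], [[0,1],[0,0]]   # rhobar_1, rhobar_2

M  = from_blocks(r1, r2, r2, r1)          # (rho_1 rho_2 / rho_2 rho_1)
Mb = from_blocks(rb1, rb2, rb2, rb1)      # the all-overbarred version

# diag(X, X) toggles every overbar:
assert matmul(from_blocks(X, O, O, X), M) == Mb
# antidiag(X, X) also swaps the block rows:
assert matmul(from_blocks(O, X, X, O), Mb) == from_blocks(r2, r1, r1, r2)
```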
     
    Last edited: Jul 19, 2015
  13. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Now we make the observation that all eight of the I-X matrices, except for two, are symmetric, i.e. equal to their own transpose ("trivially", the partial transpose of I or X does nothing); the other two have unequal transposes, but a different symmetry: each is the transpose, and also the inverse, of the other.

    Note that the only elements of P in a block matrix that can contribute to the trace are \( \{\rho_1,\; \rho_2 \} \), and only when they are on the block diagonal; these are the "fixed strands", if you will. The last equation, and the ones above it in the previous post, then describe the generation of a trace-0 matrix (from one that corresponds to a single transposition, since two strands are fixed) and its transpose equivalence. This permutation corresponds to the cyclic permutation of 4 letters: abcd -> abdc -> adbc -> dabc, or its inverse; strand-wise it corresponds to one strand crossing three others (which isn't possible unless the strand is the first or the nth).
     
    Last edited: Jul 19, 2015
  14. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Ok, time to come clean. The elements of P are a vector basis; they are the set of outer products of the standard basis of \( \mathbb R^2 \), as shown above. Outer products aren't square matrices in general, but here we get exactly four 2 x 2 matrices because the vectors \( \{e_1, e_2 \} \) are unit vectors of the same dimension.

    The outer product is a special case of the tensor product; that's usually what the "\( \otimes \)" symbol means.

    \( \mathbb R^4 \) is related to \( S_4 \) in that the latter is all permutations of the four basis vectors.

    We have a relation between \( \{e_1,\;e_2,\;e_3,\;e_4\} \) and \( \{\rho_1,\;\overline{\rho_1},\;\rho_2,\;\overline{\rho_2}\} \) which (for the identity element) is:

    \( I_4 = e_1e_1^{\intercal} + e_2e_2^{\intercal} + e_3e_3^{\intercal} + e_4e_4^{\intercal} = \rho_1 \otimes \rho_1 + \rho_1 \otimes \rho_2 + \rho_2 \otimes \rho_1 + \rho_2 \otimes \rho_2\).

    So \( \sum_{k=1}^{4} e_k e_k^{\intercal} = \sum_{i,j=1}^{2} \rho_i \otimes \rho_j \)
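    Checked numerically (a pure-Python sketch, my own check, not from the thread), with \( \otimes \) as the Kronecker product:

```python
def kron(a, b):
    # Kronecker product of two square matrices
    n = len(a) * len(b)
    return [[a[i // len(b)][j // len(b)] * b[i % len(b)][j % len(b)]
             for j in range(n)] for i in range(n)]

r1, r2 = [[1,0],[0,0]], [[0,0],[0,1]]   # rho_1, rho_2

total = [[0] * 4 for _ in range(4)]
for a in (r1, r2):
    for b in (r1, r2):
        k = kron(a, b)
        total = [[total[i][j] + k[i][j] for j in range(4)] for i in range(4)]

# the four Kronecker products rho_i (x) rho_j sum to the 4x4 identity
assert total == [[1,0,0,0], [0,1,0,0], [0,0,1,0], [0,0,0,1]]
```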
     
    Last edited: Jul 20, 2015
  15. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    I should consider again this underlying structure, or simply what multiplying elements of P together means in terms of vectors (since the "vectors" in P are really outer products of \( \{e_1,\,e_2\} \), hence rank-1 tensors as well as rank-1 matrices). I should look at the nullspace of P, and where the (2 x 2) 0-matrix comes from.

    Clearly, \( \{ \rho_1,\; \overline{\rho_1}\} \) has the same nullspace: \( \{e_2\} \), likewise \( \{ \rho_2,\; \overline{\rho_2}\}\{e_1\} = 0 \).
    The 0 is the 2 x 2 zero matrix because, in each such product, an outer product gets multiplied by the numerical zero \( e_i^{\intercal} e_j \) for \( i \ne j \). Done.
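    A quick check of the nullspace claims (a pure-Python sketch, my own check, not from the thread):

```python
def matvec(a, v):
    # matrix-vector product
    return [sum(a[i][k] * v[k] for k in range(len(v))) for i in range(len(a))]

r1, r2   = [[1,0],[0,0]], [[0,0],[0,1]]   # rho_1, rho_2
rb1, rb2 = [[0,0],[1,0]], [[0,1],[0,0]]   # rhobar_1, rhobar_2
e1, e2 = [1, 0], [0, 1]

# {rho_1, rhobar_1} both annihilate e2; {rho_2, rhobar_2} both annihilate e1
assert matvec(r1, e2) == [0, 0] and matvec(rb1, e2) == [0, 0]
assert matvec(r2, e1) == [0, 0] and matvec(rb2, e1) == [0, 0]
```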
     
    Last edited: Jul 21, 2015
  16. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Sorry I should say, the 0 in the left/right cosets of \( P \).
     
  17. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    So there it is. Of course, the notation I use is just that: notation that uniquely identifies each outer product in \( \{e_1,\, e_2\} \times \{e_1,\, e_2\} \). The 2 x 2 blocks in any permutation representation of a symmetric group of even degree (the odd degrees come along for the ride) are there (they always were, remember).

    I just projected a sort of template over the permutation rep of \( S_4 \) and went from there. The notation is actually borrowed from some online course notes about quantum information.
     
  18. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    That borrowed notation is the definition of the set of outer products of a 2-dimensional vector basis, i.e. \( \{\rho_i, \,\overline{\rho_j}\; | i,j =1..2\} \).

    You could also define each outer product with a pair of indices, or with four different letters, or whatever you like. You could use the actual 2 x 2 matrices--these are after all just compositions of the 0-vector with \( \{e_1,\,e_2\} \) row-wise or column-wise.

    And to save having to use the \({}^{\intercal} \) symbol for the transpose of a column vector, just use the notion of a dual space: the rows and columns of a permutation matrix belong to dual vector spaces, and row (dual) vectors get an upper index.

    So we can write: \( e^i e_j = \delta^i_j \). (side-note: Kauffman's idea is to think of this object as a vertical strand with the upper end labeled i, the lower end j.)

    We then have: \( e_i e^j = \begin{cases} \rho_i, & \text{if }i = j \\ \overline{\rho_j}, & \text{if }i \ne j \end{cases} \).

    So instead of 4 x 4 row-column multiplication in the group, first "template" every matrix, or convert it into a tensor-like object and do 2 x 2 tensor (or template) multiplication, which is equivalent to simply multiplying two rows and columns "simultaneously". But pairs of rows correspond to pairs of bands, in the abstraction (so far, it only fits the subgroup of eight--the dihedral group).

    If nothing else is in it (the remaining 16 group elements, erm, remain outside the notion of a banded structure), it's at least a way to compare the symmetries of a square with those of a pair of twisted bands.

    One interesting topological quirk of twisting a flat band twice in the same direction, is that it's equivalent to looping the band over itself while keeping it flat--the number of crossings is the same.
     
    Last edited: Jul 22, 2015
  19. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    One more interesting detail and I'm done (maybe).

    Any symmetric group of even degree has 2 x 2 blocks in its permutation matrix representation. So, for example, take some element of \( S_6 \):

    \( \begin{pmatrix} 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 1 \end{pmatrix} \;\;\), which using my notation is equal to: \( \rho_1 \otimes ( \rho_1 \otimes I_2 + \rho_2 \otimes \rho_1) + \rho_2 \otimes ( \overline{\rho_1} \otimes\overline{\rho_2} + \overline{\rho_2} \otimes \overline{\rho_1} + \rho_2 \otimes \rho_2 ) \), which I can write as:

    \( \rho_1^{\otimes 2} \otimes I_2 + \rho_1 \otimes \rho_2 \otimes \rho_1 + \rho_2 \otimes( \overline{\rho_1} \otimes\overline{\rho_2} + \overline{\rho_2} \otimes \overline{\rho_1}) + \rho_2^{\otimes 3} = \rho_2^{\otimes 3} + \rho_1^{\otimes 2} \otimes I_2 + \rho_1 \otimes \rho_2 \otimes \rho_1 + \rho_2 \otimes( \overline{\rho_1} \otimes\overline{\rho_2} + \overline{\rho_2} \otimes \overline{\rho_1}) \), a polynomial.

     
    Last edited: Jul 24, 2015
  20. krash661 [MK6] transitioning scifi to reality Valued Senior Member

    Messages:
    2,973
    arfa,
    have you dabbled with ramanujan theorems ?
     
  21. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    No, I'm not familiar with them. And I've realised there's a mistake in my equation. But it's easy to fix.

    The problem is that I've got the overall dimensions wrong; what I should have done is first embed the element of \( S_6 \) in \( S_8 \), by adding two rows and columns to it,
    i.e. adding \( e_7 \) and \( e_8 \), so that I have a term \( \rho_2 \otimes \rho_2 \otimes I_2 \), which is a partial 8 x 8 matrix. I still get a polynomial in P over {0, 1}, which is the whole matrix.

    That's a bit better. I assumed I could just make the zeros the size I needed, but that makes a pair of 2 x 2 blocks overlap (in the first attempt), and they shouldn't be in two spaces, by golly.
     
  22. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    More playing around:

    If I redefine the notation for \( S_2 \) thusly: \( I = \mathbb{I},\; X = \mathbb{\overline{I}} \) (because I can't find a mathbb number font, so it's a doofer).

    Then there are four elements of \( S_4 \) with the following description:

    \( \{ (\mathbb{I}\otimes\mathbb{I}),\, (\mathbb{I}\otimes\mathbb{\overline{I}}),\, (\mathbb{\overline{I}}\otimes \mathbb{I}),\, (\mathbb{\overline{I}} \otimes \mathbb{\overline{I}}) \} \). The overbar sign tells you this is a Klein 4-group (a subgroup of \( S_4 \)).

    The notation "picks out" this subset, and others. But these can all be written as is.
    Generally, the next subgroup looks like: \( \{\rho_i,\, \rho_j\} \otimes \{\mathbb{I},\, \mathbb{\overline{I}}\} + \{\rho_j,\, \rho_i\} \otimes \{\mathbb{I},\, \mathbb{\overline{I}}\} \;| i \ne j \).

    Multiplication in the Klein 4-group (there are four subgroups of \( S_4 \) isomorphic to this fairly ubiquitous group), goes:

    \( (\mathbb{I}\otimes\mathbb{I})(\mathbb{I}\otimes\mathbb{I}) = \mathbb{I}^2\otimes\mathbb{I}^2 = \mathbb{I}\otimes\mathbb{I} \)​

    \( (\mathbb{\overline{I}} \otimes \mathbb{\overline{I}})^2 = \mathbb{\overline{I}}^2\otimes\mathbb{\overline{I}}^2 = \mathbb{I}\otimes\mathbb{I}\)
    So because: \( \mathbb{I}\mathbb{\overline{I}} = \mathbb{\overline{I}} \), the overbar looks like it behaves just like a "-" sign under multiplication, in this subgroup.
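    The Klein 4-group structure is easy to machine-check; a pure-Python sketch (my own check, not from the thread) verifying that the four elements are closed under multiplication, abelian, and each is its own inverse:

```python
def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def kron(a, b):
    # Kronecker product of two square matrices
    n = len(a) * len(b)
    return [[a[i // len(b)][j // len(b)] * b[i % len(b)][j % len(b)]
             for j in range(n)] for i in range(n)]

I = [[1,0],[0,1]]; X = [[0,1],[1,0]]   # I and I-bar in the new notation
V = [kron(a, b) for a in (I, X) for b in (I, X)]
I4 = kron(I, I)

for v in V:
    assert matmul(v, v) == I4                 # every element is an involution
    for w in V:
        assert matmul(v, w) in V              # closure
        assert matmul(v, w) == matmul(w, v)   # abelian
```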

    Math is notation, and how it's defined. A lot like writing a language along with an interpreter.
     
    Last edited: Jul 30, 2015
  23. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    The subgroup, I'll call it V because that's what everyone else calls it, is a normal subgroup of \( S_4 \) and of \( A_4 \), its subgroup of even permutations.

    That means that, with \( V = \{ (\mathbb{I}\otimes\mathbb{I}),\, (\mathbb{I}\otimes\mathbb{\overline{I}}),\, (\mathbb{\overline{I}}\otimes \mathbb{I}),\, (\mathbb{\overline{I}} \otimes \mathbb{\overline{I}}) \} \), if \( g \) is any element of \( S_4 \), then \( gV = Vg \).

    Let's look at the rest of that dihedral subgroup with the pair-of-bands abstraction; V is also a subgroup of this embedding of Dih(4):

    \( W = \{(\rho_1\otimes \mathbb{I} + \rho_2\otimes \mathbb{\overline{I}}),\, (\rho_1\otimes \mathbb{\overline{I}} + \rho_2\otimes \mathbb{I}),\, (\overline{\rho_1}\otimes\mathbb{I} + \overline{\rho_2}\otimes\mathbb{\overline{I}}),\, (\overline{\rho_1}\otimes\mathbb{\overline{I}} + \overline{\rho_2}\otimes\mathbb{I})\} \), so that \( V \cup W \) is isomorphic to Dih(4).

    So given how multiplication is defined, it's easy to show that V commutes with W setwise: \( vW = Wv \) for every \( v \) in V (that's just V being normal in Dih(4)), although individual elements of V and W need not commute.
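    To be precise, the commuting here is setwise rather than elementwise. A pure-Python sketch (my own check, not from the thread) verifying \( vW = Wv \) as sets for every \( v \in V \), plus a counterexample showing individual elements need not commute:

```python
def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def kron(a, b):
    n = len(a) * len(b)
    return [[a[i // len(b)][j // len(b)] * b[i % len(b)][j % len(b)]
             for j in range(n)] for i in range(n)]

def madd(a, b):
    # entrywise matrix sum
    return [[a[i][j] + b[i][j] for j in range(len(a[0]))] for i in range(len(a))]

I = [[1,0],[0,1]]; X = [[0,1],[1,0]]
r1, r2   = [[1,0],[0,0]], [[0,0],[0,1]]   # rho_1, rho_2
rb1, rb2 = [[0,0],[1,0]], [[0,1],[0,0]]   # rhobar_1, rhobar_2

V = [kron(a, b) for a in (I, X) for b in (I, X)]
W = [madd(kron(r1, I), kron(r2, X)), madd(kron(r1, X), kron(r2, I)),
     madd(kron(rb1, I), kron(rb2, X)), madd(kron(rb1, X), kron(rb2, I))]

key = lambda m: tuple(map(tuple, m))
for v in V:
    # vW and Wv agree as sets (V is normal, having index 2 in V ∪ W)
    assert {key(matmul(v, w)) for w in W} == {key(matmul(w, v)) for w in W}

# ...but not element by element:
assert any(matmul(v, w) != matmul(w, v) for v in V for w in W)
```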
     
    Last edited: Jul 30, 2015
