Cracovians: The Twisted Twins of Matrices

  • > However, the multiplication of a cracovian by another cracovian is defined differently: the result of multiplying an element from column i of the left cracovian by an element from column j of the right cracovian is a term of the sum in column i and row j of the result.

    Am I the only one for whom this crucial explanation didn’t click? Admittedly, I might be stupid.

    Wikipedia is a bit more understandable: "The Cracovian product of two matrices, say A and B, is defined by A ∧ B = B^T A."

  • I didn't get the explanation of the multiplication. After reading the Wikipedia article it made more sense:

    https://en.wikipedia.org/wiki/Cracovian

    The Cracovian product of two matrices, say A and B, is defined by

    A ∧ B = B^T A,

    where B^T and A are assumed compatible for the common (Cayley) type of matrix multiplication and B^T is the transpose of B.

    Since (AB)^T = B^T A^T, the products (A ∧ B) ∧ C and A ∧ (B ∧ C) will generally be different; thus, Cracovian multiplication is non-associative.

    A good reference on how to use them and why they are useful is here (PDF); a minimal numpy sketch of the definition follows the link:

    https://archive.computerhistory.org/resources/access/text/20...
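
    A minimal check of the definition and of the non-associativity claim (array names and sizes are my own):

      import numpy as np

      rng = np.random.default_rng(0)
      A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))

      def crac(X, Y):
          # Cracovian product X ∧ Y = Y^T X (Wikipedia's definition).
          return Y.T @ X

      # Entry (i, j) is column i of the right operand dotted with
      # column j of the left operand.
      print(np.allclose(crac(A, B)[1, 2], B[:, 1] @ A[:, 2]))  # True

      # Non-associative: (A ∧ B) ∧ C differs from A ∧ (B ∧ C) in general.
      print(np.allclose(crac(crac(A, B), C), crac(A, crac(B, C))))  # False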

  • > It turns out that multiplying cracovians by computers is not faster than multiplying matrices.

    That's very specific to Python. A few years ago we were multiplying a lot of matrices in Fortran and we tried transposing one of the matrices before the multiplication. With -O0 it made a huge difference because the calculation accessed contiguous memory and was more cache-friendly. Anyway, with -O3 the compiler pulled some trick that made the difference disappear, and I never tried to understand what it was doing.
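
    A numpy sketch of why the timings match (sizes are mine): B.T is a zero-copy stride view, and the BLAS routine behind @ accepts either memory layout, so the transpose costs essentially nothing.

      import numpy as np
      from timeit import timeit

      rng = np.random.default_rng(0)
      A = rng.standard_normal((2000, 2000))
      B = rng.standard_normal((2000, 2000))

      print(B.T.base is B)  # True: the transpose copies no data

      print(timeit(lambda: A @ B, number=10))    # ordinary product
      print(timeit(lambda: B.T @ A, number=10))  # cracovian product A ∧ B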

  • I guess I'm skeptical of using a non-associative algebra instead of something that can trivially be made into a ring or field (i.e. matrix algebra). What advantages does this give us?

  • In Einstein notation this operation is (A ∧ B)_ij = A_kj B_ki (the shared row index k is contracted), which incidentally shows why Einstein notation is so useful.
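
    A possible numpy rendering of that index expression (my own translation):

      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.standard_normal((4, 4))
      B = rng.standard_normal((4, 4))

      # (A ∧ B)_ij = A_kj B_ki: contract the shared row index k.
      crac = np.einsum('kj,ki->ij', A, B)
      print(np.allclose(crac, B.T @ A))  # True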

  • What an interesting little nook of matrix analysis history! Thanks for sharing. :)

  • Read to the end and... what was the point of that? Where's the payoff?

    There was a claim near the top that some things are easier to compute when viewed as cracovians, then some explanation, and then suddenly it switches to numpy and shows that the timings are the same.

    New title: "Cracovians are a Waste of (the Reader's) Time"?

  • Uhh so it's just matrices where the left slot of matmul is transposed?

  • Missed a chance to call it the twisted sister!