dot method, right?
Yes, I do know that, you smart internet person you. However, my problem was that I had a number of matrices for which I wanted to perform the same type of matrix multiplication. I had on the order of a hundred thousand five by two matrices that were to be transposed and multiplied with another hundred thousand five by two matrices.
Well, duh, you say. The dot method can handle more than two dimensions, you know. Yeah, I know that as well. However, it doesn't handle it the way I needed for this task. I wanted to end up with a hundred thousand two by two matrices. Had I used dot, I would have ended up with a hundred thousand by two by hundred thousand by two array.
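To see what goes wrong, here is a small sketch with made-up three-element stacks in place of the hundred thousand (the names and sizes are illustrative, not the actual data):

```python
import numpy as np

# Tiny stand-ins for the real data: 3 matrices instead of 100000
N = 3
A = np.random.rand(N, 2, 5)
B = np.random.rand(N, 2, 5)

# dot sums over the last axis of its first argument and the
# second-to-last axis of its second, pairing EVERY matrix in A
# with every matrix in B instead of matching them up pairwise:
out = np.dot(A, np.swapaxes(B, 1, 2))
print(out.shape)  # (3, 2, 3, 2) -- an N by 2 by N by 2 array
```

With the real data that would be a 100000 by 2 by 100000 by 2 array, of which only the "diagonal" slices are wanted.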
So, I had to improvise:
>>> A.shape
(100000, 2, 5)
>>> B.shape
(100000, 2, 5)
>>> result = np.sum(np.swapaxes(A, 1, 2)[:, np.newaxis, :, :] * B[:, :, :, np.newaxis], 2)
Kind of involved, but it worked. I got the initial idea from here, but the solution given there only works for symmetric matrices; for non-symmetric ones you have to shift the newaxis one step to the left, or it will violate NumPy's broadcasting rules.
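For what it's worth, on NumPy versions that have einsum or matmul, the same batched product can be written more directly. A sketch (with smaller stacks of the same shapes) checking that they agree with the broadcasting trick:

```python
import numpy as np

rng = np.random.RandomState(0)
A = rng.rand(1000, 2, 5)  # same shapes as in the post, fewer matrices
B = rng.rand(1000, 2, 5)

# The broadcasting trick from above:
result = np.sum(np.swapaxes(A, 1, 2)[:, np.newaxis, :, :]
                * B[:, :, :, np.newaxis], 2)

# einsum spells out the same contraction explicitly:
# for each n, result[n] = B[n] @ A[n].T
via_einsum = np.einsum('njm,nkm->njk', B, A)

# matmul broadcasts over the leading axis natively
via_matmul = np.matmul(B, np.swapaxes(A, 1, 2))

print(np.allclose(result, via_einsum))  # True
print(np.allclose(result, via_matmul))  # True
```

Both alternatives avoid materializing the intermediate four-dimensional product array, so they also use less memory than the sum-over-broadcast version.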