2
votes

I've got a rank four tensor A (say indices (a, b, i, j)) and a rank two tensor B (say indices (i, j)) and I want to compute a kind of Hadamard multiplication of them.

That is, if we call the product C, I want C[a,b,i,j] == A[a,b,i,j] * B[i,j]. There is a fairly straightforward way to do this with einsum, but I'm told there is a significant performance hit to using einsum compared to methods like tensordot, and I couldn't find a good alternative in the docs.

It's possible that I missed it; I'm new to tensors and not an expert on numpy.
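For reference, here is a minimal sketch of the einsum formulation described above (the shapes are illustrative; the question doesn't give concrete sizes):

```python
import numpy as np

# Illustrative shapes for the four index positions (a, b, i, j)
A = np.random.rand(2, 3, 4, 5)   # indices (a, b, i, j)
B = np.random.rand(4, 5)         # indices (i, j)

# The straightforward einsum formulation of C[a,b,i,j] = A[a,b,i,j] * B[i,j]:
# repeat 'ij' on both inputs and keep all four indices in the output.
C = np.einsum('abij,ij->abij', A, B)
```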


1 Answer

4
votes
C = A * B

Following the broadcasting rules, NumPy will line up the shapes of A and B starting from the last axes:

A: (a, b, i, j)
B:       (i, j)

and multiply corresponding elements of A and B together to create C:

   A[a, b, i, j]
*  B[      i, j]
== C[a, b, i, j]
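A runnable sketch of the broadcasting behavior, using arbitrary example shapes:

```python
import numpy as np

A = np.random.rand(2, 3, 4, 5)   # shape (a, b, i, j)
B = np.random.rand(4, 5)         # shape (i, j)

# Broadcasting lines up trailing axes, so B is effectively
# replicated over the leading (a, b) axes of A.
C = A * B

# Spot-check one element against C[a,b,i,j] = A[a,b,i,j] * B[i,j]
assert np.isclose(C[1, 2, 3, 4], A[1, 2, 3, 4] * B[3, 4])
```

This avoids einsum entirely: plain elementwise multiplication with broadcasting is the idiomatic NumPy way to express this product.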