I have a rank-four tensor A (with indices (a, b, i, j)) and a rank-two tensor B (with indices (i, j)), and I want to compute a kind of Hadamard product of them.
That is, if we call the product C, I want C[a,b,i,j] == A[a,b,i,j] * B[i,j]. There is a fairly straightforward way to do this with einsum, but I'm told there is a significant performance hit to using einsum compared to methods like tensordot, and I couldn't find a good way to avoid it in the docs.
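For concreteness, here is a minimal sketch of the einsum approach I mean (the shapes are arbitrary placeholders):

```python
import numpy as np

# Small example shapes; the actual sizes don't matter for the pattern.
A = np.random.rand(2, 3, 4, 5)  # indices (a, b, i, j)
B = np.random.rand(4, 5)        # indices (i, j)

# Repeating 'ij' in both inputs and the output keeps those axes
# unsummed, so C[a,b,i,j] = A[a,b,i,j] * B[i,j].
C = np.einsum('abij,ij->abij', A, B)
```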
It's possible that I missed it; I'm new to tensors and not an expert on numpy.