While searching for this problem I haven't found anything related, so please let me know if something similar already exists.
So here is the problem. Imagine you have a stencil tensor A, where each row contains the non-zero values of one column of a tensor B. I want to generate B with the values of A placed at the correct positions. A should be of dim (n,3) and B of dim (2n+1,n); the different dimensions arise from different finite element grids. For example, let n=3 and say we have A
A = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
and want to create a Tensor B out of A like
B[0,0]=A[0,0], B[1,0]=A[0,1], B[2,0]=A[0,2], B[2,1]=A[1,0], B[3,1]=A[1,1], B[4,1]=A[1,2], B[4,2]=A[2,0], B[5,2]=A[2,1] and B[6,2]=A[2,2].
So for the example B would be
B = [[1, 0, 0], [2, 0, 0], [3, 4, 0], [0, 5, 0], [0, 6, 7], [0, 0, 8], [0, 0, 9]].
Additionally, this needs to be differentiable, if that is even possible, since I want to train the tensor A towards some parameters but use B to calculate a matrix-vector product for the loss function. I won't use batch learning at this point.
I tried to use the tf.assign() function to assign every single value of the tensor B. I am looking for a better option, since this gets really messy for large tensors and you have to run the tf.assign() op before the value is even set.
In numpy I would simply do something like
import numpy as np

A = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
B = np.zeros((7, 3))
for i in range(3):
    # row i of A fills column i of B, starting at row 2*i
    B[2*i:2*i+3, i] = A[i, :]
Is there a similar or even easier way to do this in TensorFlow such that the operation is still differentiable?
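For illustration, here is a minimal sketch of the kind of construction I have in mind, based on tf.scatter_nd. This is only a sketch under assumptions: it assumes TensorFlow 1.x (session-based) and that tf.scatter_nd is differentiable with respect to its updates argument, which I am not certain about.

import numpy as np
import tensorflow as tf

n = 3
A = tf.Variable([[1., 2., 3.], [4., 5., 6.], [7., 8., 9.]])

# target position of every entry of A: A[i, j] should end up at B[2*i + j, i]
indices = np.array([[2 * i + j, i] for i in range(n) for j in range(3)])

# scatter the flattened A into a zero tensor of shape (2n+1, n)
B = tf.scatter_nd(indices, tf.reshape(A, [-1]), shape=(2 * n + 1, n))

# matrix-vector product that would go into the loss
x = tf.constant([[1.], [1.], [1.]])
y = tf.matmul(B, x)

# gradient of a dummy loss w.r.t. A, to check that everything stays differentiable
grads = tf.gradients(tf.reduce_sum(y), A)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(B))
    print(sess.run(grads))

The index array only depends on n, so it could be precomputed once in NumPy. But maybe there is a cleaner or more idiomatic way to express this?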