3
votes

I have been struggling with this for quite some time. All I want is a torch.diff() function. However, many familiar matrix operations don't seem to translate easily to PyTorch tensor operations.

I have tried an enormous number of PyTorch operation combinations, yet none of them work.

Since PyTorch hasn't implemented this basic feature, I started by simply trying to subtract element i+1 from element i along a specific axis.
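In NumPy terms, the operation I'm after is just np.diff:

import numpy as np

a = np.array([[1, 3, 6], [2, 4, 8]])
np.diff(a, axis=1)  # array([[2, 3], [2, 4]]) -- element i+1 minus element i along axis 1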

However, you can't simply do this element-wise (due to tensor indexing limitations), so I tried to construct another tensor with the elements shifted along one axis:

# build indices [0, 0, 1, ..., n-2] so the result is the input shifted right by one
ix_plus_one = [0] + list(range(0, prediction.size(1) - 1))
ix_differential_tensor = torch.LongTensor(ix_plus_one)
diff_one_tensor = prediction[:, ix_differential_tensor]

But now we have a different problem: indexing in PyTorch doesn't really mimic NumPy the way it advertises, so you can't index with a "list-like" tensor like this. I also tried the tensor scatter functions, with no luck.
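In case it helps clarify the intent, this is roughly the operation I am trying to express (a sketch using torch.index_select with the shifted indices from above; whether the plain fancy-indexing form works depends on the PyTorch version):

# shifted copy built with index_select instead of fancy indexing
shifted = torch.index_select(prediction, 1, ix_differential_tensor)
diff_one_tensor = prediction - shifted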

So I'm still stuck with this simple problem of trying to take a discrete gradient (difference) of a PyTorch tensor.

All of my searching leads to the marvelous capabilities of PyTorch's autograd package, which has nothing to do with this problem.


2 Answers

4
votes

A 1D convolution with a fixed filter should do the trick:

import numpy as np
import torch

# fixed [-1, 1] kernel, so that output[i] = x[i+1] - x[i]
filter = torch.nn.Conv1d(in_channels=1, out_channels=1, kernel_size=2, stride=1, padding=1, groups=1, bias=False)
kernel = np.array([-1.0, 1.0])
kernel = torch.from_numpy(kernel).view(1, 1, 2).float()  # match Conv1d's float32 weights
filter.weight.data = kernel
filter.weight.requires_grad = False

Then use filter like you would any other layer in torch.nn.

Also, you might want to change padding to suit your specific needs.
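As a rough usage sketch (assuming the filter defined above; the (batch, channels=1, length) input shape is what Conv1d expects):

x = torch.tensor([[[1.0, 3.0, 6.0, 10.0]]])  # shape (1, 1, 4)
filter(x)
# -> tensor([[[ 1.,  2.,  3.,  4., -10.]]])
# the interior values 2, 3, 4 are the adjacent differences; the first and last
# entries come from the zero padding, so padding=0 gives exactly np.diff's output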

3
votes

There appears to be a simpler solution to this (I needed something similar myself), referenced here: https://discuss.pytorch.org/t/equivalent-function-like-numpy-diff-in-pytorch/35327/2

diff = x[1:] - x[:-1]

which can be done along different dimensions such as

diff = polygon[:, 1:] - polygon[:, :-1]

I would recommend writing a unit test that verifies identical behavior to np.diff, though.
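Something along these lines should do (a rough sketch; the shapes are arbitrary):

import numpy as np
import torch

x = torch.randn(4, 7)
diff = x[:, 1:] - x[:, :-1]            # slicing-based diff along dim 1
expected = np.diff(x.numpy(), axis=1)  # reference behavior
assert np.allclose(diff.numpy(), expected)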