10 votes

I'm looking to back-propagate gradients through a singular value decomposition for regularisation purposes. PyTorch currently does not support backpropagation through a singular value decomposition.

I know that I could write my own custom function that operates on a Variable: take its .data tensor, apply torch.svd to it, wrap the resulting singular values in a Variable and return them in the forward pass; then, in the backward pass, apply the appropriate Jacobian matrix to the incoming gradients.
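A minimal sketch of such a custom function, using the modern static-method `torch.autograd.Function` API rather than the older Variable-based one (the class name `SingularValues` and the choice to return only the singular values are my assumptions). The backward pass uses the known Jacobian for singular values, dσᵢ = uᵢᵀ dA vᵢ, so the gradient with respect to the input is U diag(∂L/∂σ) Vᵀ:

```python
import torch

class SingularValues(torch.autograd.Function):
    """Hypothetical custom Function: forward returns only the singular
    values; backward applies the Jacobian grad_A = U diag(grad_S) V^T."""

    @staticmethod
    def forward(ctx, A):
        # full_matrices=False gives the reduced SVD, so shapes line up below.
        U, S, Vh = torch.linalg.svd(A, full_matrices=False)
        ctx.save_for_backward(U, Vh)
        return S

    @staticmethod
    def backward(ctx, grad_S):
        U, Vh = ctx.saved_tensors
        # Chain rule through dsigma_i = u_i^T dA v_i for each singular value.
        return U @ torch.diag(grad_S) @ Vh

# Example: the gradient of the sum of singular values (the nuclear norm)
# is U V^T for a generic full-rank matrix.
A = torch.randn(5, 4, dtype=torch.double, requires_grad=True)
s = SingularValues.apply(A)
s.sum().backward()
```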

However, I was wondering whether there is a more elegant (and potentially faster) solution, where I could override the "Type Variable doesn't implement stateless method svd" error directly, call LAPACK myself, etc.?

If someone could guide me through the appropriate steps and source files I need to look at, I'd be very grateful. I suppose these steps would similarly apply to other linear algebra operations which have no associated backward method currently.


2 Answers

2 votes

torch.svd with forward and backward passes is now available in the PyTorch master branch:

http://pytorch.org/docs/master/torch.html#torch.svd

You need to install PyTorch from source: https://github.com/pytorch/pytorch/#from-source

0 votes

PyTorch's torch.linalg.svd operation now supports gradient computation out of the box, but note the caveat from the documentation:

Gradients computed using U and Vh may be unstable if input is not full rank or has non-unique singular values.
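A short sketch of using torch.linalg.svd's built-in autograd support for regularisation (the choice of a nuclear-norm penalty here is illustrative, not from the original answer). Using double precision and a random matrix keeps the singular values generically distinct, avoiding the instability noted above:

```python
import torch

# torch.linalg.svd participates in autograd directly; no custom
# Function is needed.
A = torch.randn(4, 3, dtype=torch.double, requires_grad=True)
U, S, Vh = torch.linalg.svd(A, full_matrices=False)

# Example regulariser: penalise the sum of singular values
# (the nuclear norm of A).
loss = S.sum()
loss.backward()

# A.grad now holds the gradient of the nuclear norm w.r.t. A.
```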