
How to create n-dimensional sparse tensor? (pytorch)

I want to initialize a tensor as a sparse tensor. When the tensor is 2-dimensional, I can use torch.nn.init.sparse_(tensor, sparsity=0.1).


But when the tensor has more than 2 dimensions, this function doesn't work.

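For example (tensor sizes here are arbitrary; behavior as of recent PyTorch versions):

```python
import torch

# 2-D tensor: works, each column gets the requested fraction of zeros
w2 = torch.empty(3, 5)
torch.nn.init.sparse_(w2, sparsity=0.1)
print(w2)

# >2-D tensor: raises an error instead
try:
    torch.nn.init.sparse_(torch.empty(2, 3, 5), sparsity=0.1)
except ValueError as e:
    print(e)  # only tensors with 2 dimensions are supported
```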

I need this because I want to use it to initialize the convolution weights.

The definition of the torch.nn.init.sparse_() function is shown below.
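Paraphrased from torch/nn/init.py (details may vary slightly across PyTorch versions):

```python
import math
import torch

def sparse_(tensor, sparsity, std=0.01):
    """Fill the 2D input tensor as a sparse matrix, with non-zero
    elements drawn from N(0, std) (Martens, 2010)."""
    if tensor.ndimension() != 2:
        raise ValueError("Only tensors with 2 dimensions are supported")

    rows, cols = tensor.shape
    # Each column gets the same fixed number of zeroed entries
    num_zeros = int(math.ceil(sparsity * rows))

    with torch.no_grad():
        tensor.normal_(0, std)
        for col_idx in range(cols):
            row_indices = torch.randperm(rows)
            zero_indices = row_indices[:num_zeros]
            tensor[zero_indices, col_idx] = 0
    return tensor
```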

How can I create an n-dimensional sparse tensor?

Is there a way to do this in PyTorch, or can I achieve it another way?


Answer

This function is an implementation of the following method:

The best random initialization scheme we found was one of our own design, “sparse initialization”. In this scheme we hard limit the number of non-zero incoming connection weights to each unit (we used 15 in our experiments) and set the biases to 0 (or 0.5 for tanh units).

  • Deep learning via Hessian-free optimization – Martens, J. (2010).

The reason it is not supported for higher-order tensors is that it maintains the same proportion of zeros in each column, and it is not clear which [subset of] dimensions this condition should be maintained across for higher-order tensors.

You can implement this initialization strategy with dropout or an equivalent function, e.g.:

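A minimal sketch, assuming you only need a total proportion of zeros (the helper name sparse_init_ is made up here): torch.nn.functional.dropout zeros each element independently with probability p, so applying it once at init time works for a tensor of any rank. Note that dropout also rescales surviving elements by 1/(1-p), so the effective std of the non-zero weights is slightly larger than the one requested.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def sparse_init_(tensor, sparsity, std=0.01):
    # Hypothetical helper: works for tensors of any number of dimensions
    with torch.no_grad():
        tensor.normal_(0, std)
        # dropout zeros each element with probability `sparsity`
        # and rescales the survivors by 1 / (1 - sparsity)
        mask = F.dropout(torch.ones_like(tensor), p=sparsity, training=True)
        tensor.mul_(mask)
    return tensor

# e.g. initializing 4-D convolution weights
conv = nn.Conv2d(3, 16, kernel_size=3)
sparse_init_(conv.weight, sparsity=0.1)
```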

If you wish to enforce column-, channel-, etc.-wise proportions of zeros (as opposed to just a total proportion), you can implement logic similar to the original function.
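For instance, one hypothetical way to keep a per-column proportion for a higher-order weight is to flatten the trailing dimensions and reuse the same per-column logic (sparse_nd_ is an illustrative name, not a PyTorch function):

```python
import math
import torch

def sparse_nd_(tensor, sparsity, std=0.01):
    # Hypothetical: treat dim 0 as "rows" and the flattened remaining
    # dims as "columns", then zero a fixed count per column, mirroring
    # torch.nn.init.sparse_
    with torch.no_grad():
        flat = tensor.view(tensor.shape[0], -1)  # shares storage with `tensor`
        rows, cols = flat.shape
        num_zeros = int(math.ceil(sparsity * rows))
        flat.normal_(0, std)
        for col in range(cols):
            zero_rows = torch.randperm(rows)[:num_zeros]
            flat[zero_rows, col] = 0
    return tensor
```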

User contributions licensed under: CC BY-SA