This repository was archived by the owner on Nov 15, 2022. It is now read-only.
Adapt code to use NestedTensor #313
Open
Description
I have a model where I would love to use NestedTensor. There is a lot of padding going on, and nested tensors would save a lot of memory. The network where I would like to use them is composed of a linear layer followed by batch norm and ReLU; finally, a max operation is taken over the channels.
The forward pass looks like this:
def forward(self, inputs):
    x = self.linear(inputs)
    # BatchNorm1d expects (N, C, L), so move channels into position and back
    x = self.norm(x.permute(0, 2, 1).contiguous()).permute(0, 2, 1).contiguous()
    x = F.relu(x)
    # max over the sequence dimension
    x_max = torch.max(x, dim=1, keepdim=True)[0]
    return x_max
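As a rough illustration of the direction this could take, here is a sketch against the `torch.nested` prototype API in recent PyTorch (not this archived repo's API, and well beyond PyTorch 0.4.1). The sequence lengths and feature sizes below are made up for the example. As far as I know, `nn.Linear` and elementwise ops like `F.relu` accept nested tensors, but batch norm and `max` reductions generally do not, so the sketch converts back to a padded tensor for the reduction, padding with `-inf` so padding cannot win the max:

```python
import torch
import torch.nn.functional as F

# Hypothetical batch: three sequences of different lengths, 8 features each
tensors = [torch.randn(n, 8) for n in (5, 3, 7)]
nt = torch.nested.nested_tensor(tensors)

linear = torch.nn.Linear(8, 16)

x = linear(nt)   # nn.Linear supports nested-tensor inputs
x = F.relu(x)    # elementwise ops work on nested tensors

# Reductions such as max over the sequence dim are not generally
# supported on nested tensors, so convert to padded form first,
# masking the padding with -inf so it cannot affect the max.
padded = torch.nested.to_padded_tensor(x, float("-inf"))
x_max = padded.max(dim=1, keepdim=True)[0]
print(x_max.shape)  # torch.Size([3, 1, 16])
```

Note the batch-norm step from the original forward is omitted here; it would also need the padded (or unpadded per-sample) form.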
Is it possible to use NestedTensor here? The project supports Python 3.6+ and PyTorch 0.4.1+.
Thank you in advance