Enabling inplace relu
Summary: Pull Request resolved: pytorch#28710

Test Plan: Imported from OSS

Differential Revision: D18146120

Pulled By: z-a-f

fbshipit-source-id: d8f0982f5a2ae35f7deb34e67cdb64be700a9d6c
z-a-f authored and soumith committed Nov 4, 2019
1 parent f7f5385 commit ee77ccb
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions torch/nn/quantized/modules/activation.py
@@ -31,10 +31,10 @@ class ReLU(torch.nn.ReLU):
     """
     def __init__(self, inplace=False):
         super(ReLU, self).__init__(inplace)
-        assert not inplace, 'torch.nn.quantized.ReLU does not support inplace'
+        self.inplace = inplace

     def forward(self, input):
-        return torch.nn.quantized.functional.relu(input)
+        return torch.nn.quantized.functional.relu(input, inplace=self.inplace)

     def _get_name(self):
         return 'QuantizedReLU'
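The behavior change can be illustrated with a minimal, torch-free sketch (the names `relu` and `SketchReLU` below are stand-ins for illustration, not the actual PyTorch implementations): before this commit the quantized ReLU module asserted that `inplace` was false, whereas after it the flag is stored in `__init__` and forwarded to the functional relu call, so the input buffer can be clamped in place.

```python
# Sketch of the patched behavior, with plain Python lists standing in for
# quantized tensors. `relu` mimics a functional relu with an inplace flag;
# `SketchReLU` mirrors the patched __init__/forward pair.

def relu(values, inplace=False):
    """Clamp negatives to zero; mutate `values` in place when requested."""
    if inplace:
        for i, v in enumerate(values):
            if v < 0:
                values[i] = 0
        return values
    return [v if v >= 0 else 0 for v in values]

class SketchReLU:
    def __init__(self, inplace=False):
        # Previously: `assert not inplace`; now the flag is simply stored.
        self.inplace = inplace

    def forward(self, values):
        # The patch forwards the stored flag to the functional call.
        return relu(values, inplace=self.inplace)

data = [-1, 2, -3]
out = SketchReLU(inplace=True).forward(data)
print(out is data)  # True: the in-place path returns the same object
print(data)         # [0, 2, 0]: the input buffer itself was clamped
```

The design mirrors the non-quantized `torch.nn.ReLU`, where `inplace=True` avoids allocating a new output tensor at the cost of overwriting the input.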
