UPDATE self attention module
cbokpark committed Jun 14, 2018
1 parent 0064e27 · commit 8714a54
Showing 1 changed file with 1 addition and 1 deletion.
sagan_models.py (2 changes: 1 addition & 1 deletion)
@@ -33,7 +33,7 @@ def forward(self,x):
         attention = self.softmax(energy) # B X (N) X (N)
         proj_value = self.value_conv(x).view(m_batchsize,-1,width*height) # B X C X N
 
-        out = torch.bmm(proj_value,attention.permute(0,1,2) )
+        out = torch.bmm(proj_value,attention.permute(0,2,1) )
         out = out.view(m_batchsize,C,width,height)
 
         out = self.gamma*out + x
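Why the transpose matters (an editor's note, not part of the commit): assuming, as elsewhere in this module, that energy is torch.bmm(proj_query, proj_key) with proj_query of shape B x N x C' and the softmax taken over the last dimension, attention[b,i,j] is the weight that output position i assigns to source position j. The output at position i should therefore be sum over j of proj_value[:, :, j] * attention[:, i, j], which requires transposing the last two axes of attention before the batched matmul. Since permute(0,1,2) is the identity permutation, the old line was equivalent to torch.bmm(proj_value, attention) and aggregated values along the query axis instead of the key axis. A minimal shape-level sketch of the difference, with illustrative sizes (B, C, N here are placeholders, not values from the repo):

import torch

# Illustrative sizes: batch B, value channels C, N = width*height positions
B, C, N = 2, 8, 16
proj_value = torch.randn(B, C, N)                          # B x C x N
attention = torch.softmax(torch.randn(B, N, N), dim=-1)    # B x N x N, each row sums to 1

# Fixed line: out[b, c, i] = sum_j proj_value[b, c, j] * attention[b, i, j]
out = torch.bmm(proj_value, attention.permute(0, 2, 1))    # B x C x N

# Old line: permute(0, 1, 2) is a no-op, so this multiplies by the
# untransposed matrix and sums over the wrong (query) axis.
old = torch.bmm(proj_value, attention.permute(0, 1, 2))

print(out.shape, old.shape)       # both torch.Size([2, 8, 16])
print(torch.allclose(out, old))   # False: same shape, different aggregation

Both products have the same B x C x N shape, which is why the bug was silent: only the meaning of the weights changed, not any tensor dimension.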
