
Commit 1ea4259

Use the embedding dropout
I don't know whether the embedding dropout is defined by mistake (I haven't checked the TF implementation), but it is never used in the forward pass. This PR fixes that. If it shouldn't be there at all, then the dropout layer and the `embd_pdrop` entry in the default config should probably be removed instead.
1 parent d848a49 commit 1ea4259
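
For context, the issue is that the constructor builds a dropout layer from `embd_pdrop` but the forward pass never calls it. Below is a minimal sketch of that pattern, assuming `self.embed` and `self.drop` are wired up from the config as in the repository's `TransformerModel`; the transformer blocks and weight initialization are omitted, so this is an illustration, not the full class:

import torch.nn as nn

class TransformerModel(nn.Module):
    # Trimmed-down sketch: only the parts relevant to the embedding dropout.
    def __init__(self, cfg, vocab=40990, n_ctx=512):
        super().__init__()
        self.embed = nn.Embedding(vocab, cfg.n_embd)
        self.drop = nn.Dropout(cfg.embd_pdrop)  # defined, but unused before this commit
        # ... transformer blocks (self.h) and init omitted ...

    def forward(self, x):
        x = x.view(-1, x.size(-2), x.size(-1))
        e = self.drop(self.embed(x))  # the fix: apply dropout to the embeddings
        # Add the position information to the input embeddings
        h = e.sum(dim=2)
        # ... the blocks in self.h would process h here ...
        return h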

File tree

1 file changed (+1, -1)


Diff for: model_pytorch.py

@@ -166,7 +166,7 @@ def __init__(self, cfg, vocab=40990, n_ctx=512):
 
     def forward(self, x):
         x = x.view(-1, x.size(-2), x.size(-1))
-        e = self.embed(x)
+        e = self.drop(self.embed(x))
         # Add the position information to the input embeddings
         h = e.sum(dim=2)
         for block in self.h:
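
Note that `nn.Dropout` is only active in training mode (`model.train()`); under `model.eval()` it is the identity, so this change affects training behavior only.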
