Move trg_embedder inside decoder #560
Conversation
I can take a look at this, but would you mind adjusting the
This looks good and makes sense to me. Could you make sure the configs under the example/ folder (including the programmatic example) and the recipes/ folder are up to date? Also, the docs (experiment_config_files, translator_structure) would need some small updates. With those I think this will be good to merge!
LGTM, please feel free to merge!
See issue #556.
This PR moves the trg_embedder inside the decoder, and gives the decoder access to the raw sequence of tokens. This is particularly useful if the decoder is generating e.g. a series of RNNG actions, and not simply a linear sequence of words to be embedded.
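A minimal sketch of the idea (illustrative names only, not the actual xnmt API): before the change, the translator owned the target embedder and handed the decoder pre-embedded vectors; after, the decoder owns its embedder and receives raw token IDs, so a structured decoder (e.g. one emitting RNNG actions) can also branch on the tokens themselves.

```python
class SimpleWordEmbedder:
    """Toy stand-in for a real target-side embedder."""

    def __init__(self, vocab_size, emb_dim):
        self.vocab_size = vocab_size
        self.emb_dim = emb_dim

    def embed(self, token_id):
        # Toy deterministic lookup standing in for a learned embedding table.
        vec = [0.0] * self.emb_dim
        vec[token_id % self.emb_dim] = 1.0
        return vec


class Decoder:
    """Hypothetical decoder that now owns its target embedder."""

    def __init__(self, trg_embedder):
        # The embedder lives inside the decoder, instead of the
        # translator embedding tokens before calling the decoder.
        self.trg_embedder = trg_embedder

    def step(self, token_id):
        # The decoder sees the raw token id, not just its embedding,
        # so it can condition on the token itself (e.g. which RNNG
        # action it represents) in addition to the embedded vector.
        emb = self.trg_embedder.embed(token_id)
        return token_id, emb


decoder = Decoder(SimpleWordEmbedder(vocab_size=100, emb_dim=4))
raw, emb = decoder.step(3)
```

The design point is the constructor signature: because the embedder is a field of the decoder rather than a sibling component, the decoder alone decides how (and whether) each raw token is embedded.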