Pytorch-transformer and Allennlp Compatibility #26
Comments
Hi there, sorry about the delay. What sort of errors are you getting?
Same here; environment.yml is not working right now.
Late response, but in your environment.yml file, change the allennlp line to:
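For context, a dependency pin like the one this comment refers to normally lives in the pip section of a conda environment.yml. The exact line from the thread was not preserved, so the fragment below is purely illustrative; `<org>`, `<repo>`, and `<branch>` are placeholders, not the actual fix:

```yaml
# environment.yml (excerpt) -- illustrative sketch only.
# The real allennlp line from this thread is not shown here;
# replace <org>/<repo> and <branch> with the values that apply.
dependencies:
  - pip
  - pip:
      - git+https://github.com/<org>/<repo>.git@<branch>
```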
Hi, I'm facing this issue too. Can you @ALL share your thoughts on this? Many thanks.
@mikeleatila Did my comment not fix it for you...? You obviously have to reinstall the package and/or environment.
Thank you for your reply. I was actually following that same advice for the conda environment and the script execution, i.e. running `python -m scripts.download_model` first and then `python -m scripts.train`, but I always get an error from this command:

    /home/mikeleatila/anaconda3/envs/domains/bin/python /home/mikeleatila/dont_stop_pretraining_master/scripts/train.py --config training_config/classifier.jsonnet --serialization_dir model_logs/citation-intent-dapt-dapt --hyperparameters ROBERTA_CLASSIFIER_SMALL --dataset citation_intent --model /home/mikeleatila/dont_stop_pretraining_master/pretrained_models/dsp_roberta_base_dapt_cs_tapt_citation_intent_1688 --device 0 --perf +f1 --evaluate_on_test

    Process finished with exit code 1

Many thanks in advance!
Oh right - that's a different error. In (I'm recalling this from memory, but I believe these are the right files and steps to follow)
@nihirv Thanks a lot, that worked :)
As the authors mention in README.md, pytorch-transformers 1.2.0 is not compatible with the branch specified in environment.yml.
I tried:
However, I couldn't get this working.
Has anyone been able to run the basic (RoBERTa) model recently?
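When chasing compatibility errors like this, it helps to first confirm which versions are actually installed in the active environment. A minimal, stdlib-only sketch (the package names are taken from this thread; neither needs to be installed for the check to run):

```python
from importlib.metadata import version, PackageNotFoundError
from typing import Optional

def installed_version(pkg: str) -> Optional[str]:
    """Return the installed version string for pkg, or None if it is absent."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

# Package names as discussed in the thread; either may be missing locally.
for pkg in ("pytorch-transformers", "allennlp"):
    print(pkg, "->", installed_version(pkg) or "not installed")
```

Running this inside the conda environment quickly shows whether the environment.yml pins were actually picked up after a reinstall.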