v0.9.0
The initial work-in-progress release coinciding with the launch of SuperGLUE.
Highlights:
We currently support two-phase training (pretraining followed by target-task training) using a variety of shared sentence encoders, including:
- BERT
- OpenAI GPT
- Plain Transformer
- Ordered Neurons (ON-LSTM) Grammar Induction Model
- PRPN Grammar Induction Model
We also support SuperGLUE baselines, sentence-encoder probing experiments, and STILTs-style intermediate-task training.
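For illustration, a two-phase run is driven by a configuration file along the following lines. This is a sketch only: the field names (`pretrain_tasks`, `target_tasks`, `input_module`) are assumptions drawn from later jiant releases and may not match the v0.9.0 schema exactly; the example configs linked below are the authoritative reference.

```
// Hypothetical sketch of a two-phase (pretraining -> target-task) run.
// Field names are illustrative assumptions; see config/examples for the real schema.
include "defaults.conf"

pretrain_tasks = "mnli"             // phase 1: pretraining task(s)
target_tasks = "boolq"              // phase 2: target task(s)
input_module = "bert-base-uncased"  // shared encoder, e.g. BERT
```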
Examples
Example configurations can be found at https://github.com/nyu-mll/jiant/tree/master/config/examples