ModernBERT Fine-Tune: IMDB Example

This notebook contains a recipe for fine-tuning ModernBERT-base on a binary classification task. The IMDB reviews dataset is used as a toy example, but the code can be adapted for other workflows.
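A minimal sketch of that setup, assuming the Hugging Face `answerdotai/ModernBERT-base` checkpoint and the `datasets` IMDB split (the notebook itself may differ in details and requires a recent `transformers` release that includes ModernBERT):

```python
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed checkpoint name; the notebook may pin a specific revision.
model_id = "answerdotai/ModernBERT-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

# IMDB reviews: binary sentiment labels (0 = negative, 1 = positive).
dataset = load_dataset("imdb")

def tokenize(batch):
    # Truncate long reviews to the max length used in this example (1,024 tokens).
    return tokenizer(batch["text"], truncation=True, max_length=1024)

tokenized = dataset.map(tokenize, batched=True)
```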

Running the Code

  • The notebook can be run on free-tier Google Colab runtimes.
  • I used a train batch size of 4 and an eval batch size of 2 on a free T4 GPU runtime (see the training-arguments sketch after this list).
  • While ModernBERT supports a context length of up to 8,192 tokens, I capped this example at 1,024 tokens.
    • The goal was to test the model's performance with a moderately sized max length.
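A hedged sketch of the corresponding `TrainingArguments`, reusing the model and tokenized dataset from the sketch above. The batch sizes mirror the settings listed here; the output directory, epoch count, and mixed-precision flag are illustrative choices, not taken from the notebook:

```python
from transformers import DataCollatorWithPadding, Trainer, TrainingArguments

training_args = TrainingArguments(
    output_dir="modernbert-imdb",     # hypothetical output path
    per_device_train_batch_size=4,    # train batch size used on the free T4
    per_device_eval_batch_size=2,     # eval batch size used on the free T4
    num_train_epochs=1,               # illustrative; adjust for your workflow
    fp16=True,                        # mixed precision helps fit the T4's memory
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],
    data_collator=DataCollatorWithPadding(tokenizer),  # pad dynamically per batch
)
trainer.train()
```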