@DrishtiShrrrma (Contributor) commented Oct 30, 2025

Refs DrishtiShrrrma#3

Summary

  • This PR updates the transformers version recommendation so that Molmo-series models don't default to "latest" or another incompatible release. We pin the recommendation to tested, working versions, which prevents a reproducible crash.

What changed

  • Add a bullet noting that `transformers==latest` did not work for the Molmo-series model (molmo-7B-D-0924).
  • Recommend **`transformers==4.46.1`, `4.50.3`, `4.51`, or `4.53`** for Molmo-series models (e.g., allenai/Molmo-7B-D-0924).
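As a rough sketch, the recommendation can be checked programmatically. The helper below is illustrative and not part of the repo; only the version set mirrors what was tested in this PR:

```python
# Versions of transformers tested with Molmo-series models in this PR.
TESTED_MOLMO_VERSIONS = {"4.46.1", "4.50.3", "4.51", "4.53"}

def is_tested_for_molmo(version: str) -> bool:
    """Return True if a transformers version string matches a tested release.

    Entries pinned without a patch level (e.g. "4.51") also match their
    patch releases (e.g. "4.51.0").
    """
    major_minor = ".".join(version.split(".")[:2])
    return version in TESTED_MOLMO_VERSIONS or major_minor in TESTED_MOLMO_VERSIONS

print(is_tested_for_molmo("4.50.3"))      # True
print(is_tested_for_molmo("5.0.0.dev0"))  # False
```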

Tests

```
!python run.py --data CountBenchQA --model molmo-7B-D-0924 --verbose
!python run.py --data MMBench_DEV_EN MME SEEDBench_IMG --model molmo-7B-D-0924 --verbose
```

Observations (tested)

| transformers | Result | Error / Notes |
| --- | --- | --- |
| 4.50.3 | ✅ Works | Stable across tested benchmarks |
| 4.46.1 | ✅ Works | Stable across tested benchmarks |
| 4.51 | ✅ Works | Stable across tested benchmarks |
| 4.53 | ✅ Works | Stable across tested benchmarks |
| 5.0.0.dev0 | ❌ Fails | `AttributeError: 'NoneType' object has no attribute 'size'` |
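The 5.0.0.dev0 failure surfaces as an opaque `AttributeError` deep inside the model code. A fail-fast guard like the sketch below (hypothetical, not part of VLMEvalKit) would turn it into a clear message; the version is passed in explicitly for illustration, whereas in practice it would come from `importlib.metadata.version("transformers")`:

```python
# Transformers releases tested with Molmo in this PR.
TESTED = ("4.46.1", "4.50.3", "4.51", "4.53")

def assert_molmo_compatible(version: str, tested=TESTED) -> None:
    """Raise a clear error for untested transformers versions, instead of
    letting the Molmo forward pass crash later with
    AttributeError: 'NoneType' object has no attribute 'size'."""
    major_minor = ".".join(version.split(".")[:2])
    if version not in tested and major_minor not in tested:
        raise RuntimeError(
            f"transformers=={version} is untested with Molmo-series models; "
            f"install one of: {', '.join(tested)}"
        )

assert_molmo_compatible("4.50.3")  # passes silently
```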

Benchmarks tested

CountBenchQA, MMBench_DEV_EN, MME, SEEDBench_IMG

Model tested

allenai/Molmo-7B-D-0924 (key: `molmo-7B-D-0924`)

Checklist

  • ✅ Code compiles
  • ✅ Local smoke tests pass

docs(readme): pin transformers==4.48.0 (or 4.46.0) for LLaVA-Next
