v9.0.0: better learning rate schedules, integration of thinc-apple-ops
The main new feature of Thinc v9 is support for learning rate schedules that can take the training dynamics into account. For example, the new `plateau.v1` schedule scales the learning rate when no progress has been made after a given number of evaluation steps. Another visible change is that `AppleOps` is now part of Thinc, so it is no longer necessary to install `thinc-apple-ops` to use the AMX units on Apple Silicon.
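The sketch below shows roughly what this looks like in code: it wraps a constant rate in the new `plateau.v1` schedule and calls the resulting schedule with the current step and the last evaluation score. The argument names (`max_patience`, `scale`, `schedule`, `last_score`) and the `(step, score)` format of the score are assumptions here; check the schedule documentation for the exact signature.

```python
from thinc.api import constant, plateau

# Sketch only: the argument names and the last_score format are assumptions,
# see the plateau.v1 documentation for the exact signature.
learn_rate = plateau(max_patience=2, scale=0.5, schedule=constant(0.001))

# Schedules are now called with the training step plus optional keyword
# arguments, such as the last evaluation score as a (step, score) tuple.
# With scale=0.5, the rate is halved each time training is considered stagnant.
print(learn_rate(step=100, last_score=(100, 0.85)))
```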
✨ New features and improvements
- Learning rate schedules can now take the training step as well as an arbitrary set of keyword arguments. This makes it possible to pass information such as the parameter name and the last evaluation score to determine the learning rate (#804).
- Added the `plateau.v1` schedule (#842). This schedule scales the learning rate if training is found to be stagnant for a given period.
- The functionality of `thinc-apple-ops` is integrated into Thinc (#927). Starting with this version of Thinc, it is no longer necessary to install `thinc-apple-ops`.
🔴 Bug fixes
- Fix the use of thread-local storage (#917).
⚠️ Backwards incompatibilities
- Thinc v9.0.0 only supports Python 3.9 and later.
- Schedules are no longer generators, but implementations of the `Schedule` class (#804); see the migration sketch after this list.
- `thinc.backends.linalg` has been removed (#742). The same functionality is provided by BLAS implementations that are better tested and more performant.
- `thinc.extra.search` has been removed (#743). The beam search functionality in this module was strongly coupled to the spaCy transition parser and has therefore moved to spaCy in v4.
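For code that previously iterated over schedules as generators, the migration is mostly mechanical: instead of advancing the schedule with `next()`, call the `Schedule` object with the current training step (and, where relevant, extra keyword arguments). A minimal before/after sketch, assuming the built-in `decaying` schedule and a `step` keyword:

```python
from thinc.api import decaying

lr = decaying(0.005, 1e-4)

# Thinc v8: schedules were generators and were advanced with next().
# learn_rate = next(lr)

# Thinc v9: schedules are Schedule instances called with the training step
# (plus optional keyword arguments such as the last evaluation score).
learn_rate = lr(step=0)
```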
👥 Contributors
@adrianeboyd, @danieldk, @honnibal, @ines, @kadarakos, @shadeMe, @svlandeg