Add hyperparameter sweeps for trainers #226


Merged
47 commits merged into main on May 16, 2025

Conversation

deep1401
Contributor

@deep1401 deep1401 commented Apr 21, 2025

Sweeps added so far:

  • Llama Trainer
  • MLX LoRA Trainer
  • Multi GPU Llama Trainer
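The sweep support described above can be pictured as expanding a grid of hyperparameter values into one training run per combination. The sketch below is purely illustrative, assuming a grid-style sweep; the function and parameter names (`expand_sweep`, `sweep_params`) are hypothetical and not Transformer Lab's actual API.

```python
# Hypothetical sketch of expanding a hyperparameter sweep config into
# individual trainer runs. Names are illustrative, not the real API.
from itertools import product


def expand_sweep(base_config, sweep_params):
    """Return one full config dict per point in the hyperparameter grid."""
    keys = list(sweep_params)
    runs = []
    for values in product(*(sweep_params[k] for k in keys)):
        run = dict(base_config)          # start from the shared settings
        run.update(zip(keys, values))    # overlay this grid point's values
        runs.append(run)
    return runs


base = {"model": "llama", "epochs": 1}
grid = {"learning_rate": [1e-4, 3e-4], "lora_rank": [8, 16]}
runs = expand_sweep(base, grid)
print(len(runs))  # 2 x 2 = 4 runs
```

Each resulting config could then be handed to the corresponding trainer plugin as an ordinary single training job.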

@deep1401 deep1401 requested review from aliasaria and dadmobile April 21, 2025 22:34
@deep1401 deep1401 marked this pull request as draft April 22, 2025 17:24
@deep1401 deep1401 marked this pull request as ready for review April 23, 2025 17:35
@deep1401 deep1401 marked this pull request as draft April 25, 2025 19:28
@deep1401 deep1401 requested a review from aliasaria May 2, 2025 15:56
@deep1401 deep1401 marked this pull request as ready for review May 2, 2025 20:11

codecov bot commented May 12, 2025

Codecov Report

Attention: Patch coverage is 6.59898% with 184 lines in your changes missing coverage. Please review.

Files with missing lines                              Patch %   Missing lines
transformerlab/shared/shared.py                       3.94%     146 ⚠️
transformerlab/routers/train.py                       10.00%    18 ⚠️
transformerlab/routers/jobs.py                        16.66%    10 ⚠️
transformerlab/db.py                                  28.57%    5 ⚠️
transformerlab/plugin_sdk/transformerlab/plugin.py    16.66%    5 ⚠️


@deep1401 deep1401 merged commit 586f83c into main May 16, 2025
6 of 7 checks passed
3 participants