What happened?
When running https://dspy.ai/tutorials/classification_finetuning/ inside Google Colab, the line
classify_ft = optimizer.compile(student_classify, teacher=teacher_classify, trainset=unlabeled_trainset)
fails with the following error:
[BootstrapFinetune] Preparing the student and teacher programs...
[BootstrapFinetune] Bootstrapping data...
Average Metric: 500.00 / 500 (100.0%): 100%|██████████| 500/500 [00:04<00:00, 115.30it/s]
2025/02/05 21:40:32 INFO dspy.evaluate.evaluate: Average Metric: 500 / 500 (100.0%)
[BootstrapFinetune] Preparing the train data...
[BootstrapFinetune] Using 500 data points for fine-tuning the model: openai/local:meta-llama/Llama-3.2-1B-Instruct
[BootstrapFinetune] Starting LM fine-tuning...
[BootstrapFinetune] 1 fine-tuning job(s) to start
[BootstrapFinetune] Starting 1 fine-tuning job(s)...
[Local Provider] Data saved to /root/.dspy_cache/finetune/4bec8714ae43838b.jsonl
[Local Provider] Starting local training, will save to /root/.dspy_cache/finetune/4bec8714ae43838b__meta-llama__Llama-3.2-1B-Instruct
Using device: cuda
Adding pad token to tokenizer
Creating dataset
Map: 100% 500/500 [00:00<00:00, 2403.78 examples/s]
2025/02/05 21:40:57 ERROR dspy.clients.lm: name 'output_dir' is not defined
It seems to me that the problem is in dsp/dspy/clients/lm_local.py: the code at line 180 references output_dir, but the line that should define it is missing:
output_dir = train_kwargs.get("output_dir", None)
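For illustration, here is a minimal sketch of the pattern that the missing line restores. The function name and the fallback to a default directory are placeholders I am assuming for readability, not the actual contents of lm_local.py:

# Illustrative sketch only: the function name and fallback logic are assumptions,
# not copied from dspy/clients/lm_local.py.
def _resolve_output_dir(train_kwargs: dict, default_dir: str) -> str:
    # The line reported missing around line 180: read output_dir from
    # train_kwargs so that later references to it do not raise
    # NameError: name 'output_dir' is not defined.
    output_dir = train_kwargs.get("output_dir", None)
    # Fall back to a default (e.g. the ~/.dspy_cache/finetune/... path shown
    # in the log) when the caller did not pass one.
    return output_dir if output_dir is not None else default_dir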
Steps to reproduce
Run the notebook at https://dspy.ai/tutorials/classification_finetuning/ in Google Colab; the error occurs at the optimizer.compile(...) call shown above. A hedged sketch of that step follows below.
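Only the student model id in this sketch is taken from the log above; the import path for LocalProvider, the toy program, the teacher model, and the optimizer arguments are assumptions standing in for the tutorial's actual setup:

import dspy
from dspy.clients.lm_local import LocalProvider  # assumed import path, matching the file named above

# Stand-in classifier; the tutorial's real signature, teacher, and trainset differ.
student_lm = dspy.LM("openai/local:meta-llama/Llama-3.2-1B-Instruct", provider=LocalProvider())
teacher_lm = dspy.LM("openai/gpt-4o-mini")

student_classify = dspy.ChainOfThought("text -> label")
student_classify.set_lm(student_lm)
teacher_classify = student_classify.deepcopy()
teacher_classify.set_lm(teacher_lm)

unlabeled_trainset = [dspy.Example(text="I lost my card.").with_inputs("text")]

dspy.settings.experimental = True  # the tutorial enables experimental finetuning support
optimizer = dspy.BootstrapFinetune(num_threads=16)  # constructor arguments assumed
classify_ft = optimizer.compile(student_classify, teacher=teacher_classify, trainset=unlabeled_trainset)
# Fails during local training with: name 'output_dir' is not defined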
DSPy version
2.6.2