Conversation

djpetti commented Oct 17, 2025

LoRA layers are currently not pickleable due to the use of a lambda function. This breaks certain DDP configurations when training with PyTorch, since those configurations use pickle to pass the model between training processes.

I resolved this by replacing the lambda function with nn.Identity.
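A minimal sketch of the issue and the fix, assuming the lambda was used as a no-op stand-in (for example, in place of dropout when the dropout probability is zero). The class and attribute names below are hypothetical illustrations, not the repository's actual code:

```python
import pickle

import torch.nn as nn


class LoRALayer(nn.Module):
    """Hypothetical LoRA layer illustrating the pickling issue."""

    def __init__(self, in_features, out_features, rank=4, dropout=0.0,
                 use_lambda=False):
        super().__init__()
        self.lora_a = nn.Linear(in_features, rank, bias=False)
        self.lora_b = nn.Linear(rank, out_features, bias=False)
        if use_lambda:
            # Problematic pattern: a lambda as a no-op. Lambdas defined like
            # this cannot be pickled, so any model containing this attribute
            # fails when it is serialized to be passed between processes.
            self.dropout = nn.Dropout(dropout) if dropout > 0 else (lambda x: x)
        else:
            # The fix: nn.Identity is a regular module and pickles cleanly.
            self.dropout = nn.Dropout(dropout) if dropout > 0 else nn.Identity()

    def forward(self, x):
        return self.lora_b(self.lora_a(self.dropout(x)))


if __name__ == "__main__":
    broken = LoRALayer(16, 16, use_lambda=True)
    try:
        pickle.dumps(broken)
    except (AttributeError, pickle.PicklingError) as e:
        print(f"lambda version fails to pickle: {e}")

    fixed = LoRALayer(16, 16, use_lambda=False)
    pickle.dumps(fixed)  # Succeeds: nn.Identity is pickleable.
    print("nn.Identity version pickles successfully.")
```

Since `nn.Identity` forwards its input unchanged, the swap is behavior-preserving while making the module serializable.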
