-
Are you doing this from the Kohya web UI? I tried it today, and while it did extract a LoRA (about 600 MB at the default rank of 128), applying that LoRA in the A1111 web UI under Extra Networks made practically no difference compared to the base model. I don't really understand why.
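  One thing worth checking when that happens: load the extracted .safetensors file and confirm the low-rank matrices aren't essentially zero. If the two checkpoints being diffed were effectively the same model (e.g. the wrong base was selected), the extracted deltas will be negligible and the LoRA will do nothing. A minimal sketch, assuming a kohya-style LoRA file (the path is a placeholder; key names follow the usual lora_up/lora_down convention):

  ```python
  import torch
  from safetensors.torch import load_file

  state = load_file("extracted_lora.safetensors")  # placeholder path

  # Kohya-style LoRAs store low-rank matrix pairs per layer; if the norms
  # of the "up" matrices are all ~0, the extraction captured no real
  # difference between the two checkpoints.
  for key, tensor in state.items():
      if "lora_up" in key:
          print(f"{key}: norm = {tensor.float().norm().item():.4f}")
  ```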
-
I recently discovered that a LoRA can be extracted from a DreamBooth-trained model without doing any LoRA training, and the quality of the extracted LoRA is quite good, almost matching the original DreamBooth model. I don't think I've seen this mentioned anywhere, though maybe I didn't search well. Are there any instructions for this, or has someone else already done it? I've settled on dim 300 for the time being; a higher value might make sense, but even at 300 the LoRA file already weighs about 700 MB.
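  For what it's worth, the reason this works without any training: LoRA assumes fine-tuning only changes each weight matrix by a low-rank delta, so you can diff the DreamBooth weights against the base weights and factor that difference with a truncated SVD, where dim is simply the number of singular values kept. A minimal sketch of the idea for a single weight matrix (function and variable names are illustrative, not the actual checkpoint keys):

  ```python
  import torch

  def extract_lora_pair(w_base: torch.Tensor, w_tuned: torch.Tensor, dim: int):
      """Factor a weight delta into LoRA up/down matrices via truncated SVD."""
      delta = (w_tuned - w_base).float()           # what fine-tuning changed
      u, s, vh = torch.linalg.svd(delta, full_matrices=False)
      u, s, vh = u[:, :dim], s[:dim], vh[:dim, :]  # keep top-`dim` components
      lora_up = u * s.sqrt()                       # (out_features, dim)
      lora_down = s.sqrt().unsqueeze(1) * vh       # (dim, in_features)
      return lora_up, lora_down                    # lora_up @ lora_down ~= delta
  ```

  This also explains the size/quality trade-off: each layer stores an (out × dim) and a (dim × in) matrix, so the file grows roughly linearly with dim, while a larger dim keeps more singular values and approximates the DreamBooth model more closely.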