-
Any reason as to why you're using […]?

In my app I can do this: […]

The only LoRA that gets used in that image is the […].

In short, I manage a list of LoRAs; whether they are loaded into VRAM or actually used depends on whether the user adds, enables, or adjusts their weights:

```python
if self.lora:
    if isinstance(self.lora, list):
        # self.lora is a list of (adapter_name, weight) pairs; unzip it
        # into the two parallel lists that set_adapters expects.
        unzipped_list = zip(*self.lora)
        reordered_list = [list(item) for item in unzipped_list]
        self.model.set_adapters(*reordered_list)
    else:
        # A single (adapter_name, weight) pair.
        self.model.set_adapters([self.lora[0]], [self.lora[1]])
```

This is for my code, which is a little more complex than just a list, and I don't use pipelines, but with the […].

Maybe I'm missing something in what you're trying to do, but all of this is in the documentation: https://huggingface.co/docs/diffusers/tutorials/using_peft_for_inference

If you need help on how to maintain a list of LoRAs to use, the […].

Edit: if you're using the […]
-
Cc: @BenjaminBossan in case you have anything more to add.
-
I have an RTX 4090 GPU with 24GB of memory, and I'm using SDXL with SDXL LoRAs. I want to achieve the following effect:

Within the limited 24GB of memory, suppose I have 26 LoRAs named A to Z. I want to implement pseudocode such that when a request passes the name of a LoRA, say Z, I load Z, set its weight to 0.3, run inference, and then unload the LoRA. But CUDA does not release the memory, and this approach also wastes a lot of loading time.

So I thought of using an LFU mechanism to keep 5 to 10 LoRAs loaded at once. On each request, I check whether the requested LoRA name hits the currently loaded list: if it does, I set its weight to 0.3 and all others to 0; if it doesn't, I load it and set the weights of the others to 0. I referred to the official diffusers documentation.
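The eviction bookkeeping described above could be sketched as a small cache class. This is a hypothetical sketch, not the author's code: it only tracks names and frequencies, and the actual diffusers calls (`load_lora_weights`, `delete_adapters`, `set_adapters`) are indicated in comments where they would go.

```python
from collections import Counter

class LoraCache:
    """Keep at most `capacity` LoRA names resident; evict the least
    frequently used one when a new name must be loaded. Bookkeeping
    only -- the real load/unload calls are marked in comments."""

    def __init__(self, capacity=8):
        self.capacity = capacity
        self.loaded = set()    # names currently resident in VRAM
        self.freq = Counter()  # request counts, for LFU eviction

    def request(self, name):
        """Return (to_evict, to_load) so that `name` ends up resident."""
        self.freq[name] += 1
        if name in self.loaded:
            return None, None                       # cache hit
        evict = None
        if len(self.loaded) >= self.capacity:
            # Least frequently used among the resident adapters.
            evict = min(self.loaded, key=lambda n: self.freq[n])
            self.loaded.discard(evict)              # pipe.delete_adapters(evict)
        self.loaded.add(name)                       # pipe.load_lora_weights(..., adapter_name=name)
        return evict, name

    def weights(self, active, weight=0.3):
        """Active adapter gets `weight`; all other resident ones get 0."""
        names = sorted(self.loaded)
        return names, [weight if n == active else 0.0 for n in names]

cache = LoraCache(capacity=2)
cache.request("A")
cache.request("B")
cache.request("A")
evict, load = cache.request("C")   # "B" is least frequently used
print(evict, load)                 # B C
print(cache.weights("C"))          # (['A', 'C'], [0.0, 0.3])
```

The returned name pairs tell the caller exactly which `delete_adapters` / `load_lora_weights` calls to make before calling `set_adapters` with the weight vector.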
Below is my code: […]
My implementation is flawed because load_lora_weights fails when called twice with the same adapter name. Is there a better way to implement this?
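One common workaround for the duplicate-name error is to track which adapter names have already been loaded and skip `load_lora_weights` for those, only switching weights instead. A hedged sketch: the `pipe` call is commented out so only the bookkeeping runs, and the checkpoint path is a made-up placeholder.

```python
loaded_names = set()

def ensure_lora(name, path):
    """Load a LoRA only the first time its name is seen; afterwards
    just reactivate it via set_adapters. `path` is a hypothetical
    checkpoint location -- substitute your own."""
    if name not in loaded_names:
        # pipe.load_lora_weights(path, adapter_name=name)  # real diffusers call
        loaded_names.add(name)
        return "loaded"
    return "reused"

print(ensure_lora("Z", "loras/Z.safetensors"))  # loaded
print(ensure_lora("Z", "loras/Z.safetensors"))  # reused
```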
Thanks in advance.