Load both lora and ip-adapter in StableDiffusionPipeline shows error: 'UNet2DConditionModel' object has no attribute 'attn_processors' #7280
-
I want to load multiple LoRAs and an IP-Adapter into StableDiffusionPipeline, and I want to set the LoRA weights and the adapter weights each time I call the API. Here is my code:
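Roughly, the setup looks like this (a minimal sketch; the LoRA repo ids and adapter names are placeholders, not my actual checkpoints):

import torch
from diffusers import StableDiffusionPipeline

pipeline = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Load several LoRAs, each under its own adapter name (placeholder repo ids)
pipeline.load_lora_weights("my-org/style-lora", adapter_name="style")
pipeline.load_lora_weights("my-org/character-lora", adapter_name="character")

# Load one IP-Adapter on top
pipeline.load_ip_adapter("h94/IP-Adapter", subfolder="models", weight_name="ip-adapter_sd15.bin")

# Per API call, choose the LoRA mix and the IP-Adapter strength
pipeline.set_adapters(["style", "character"], adapter_weights=[0.8, 0.5])
pipeline.set_ip_adapter_scale(0.6)  # this is the call that fails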
However, the set_ip_adapter_scale method throws an error: 'UNet2DConditionModel' object has no attribute 'attn_processors'.
I traced the problem to the get_processor method of the Attention class in attention_processor.py (https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py), where it tries to look up a LoRAIPAdapterAttnProcessor2_0 class that does not exist there. I then found an implementation of LoRAIPAdapterAttnProcessor2_0 in the community example (https://github.com/huggingface/diffusers/blob/main/examples/community/ip_adapter_face_id.py).
Since I'm new to this, I have no idea which state_dict I should pass to to_k_ip.load_state_dict and to_v_ip.load_state_dict.
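From reading the community example, my rough guess is that these weights come from the "ip_adapter" section of the checkpoint, something like the sketch below. This assumes the classic IP-Adapter checkpoint layout ("image_proj" plus "ip_adapter" with keys such as "1.to_k_ip.weight"), a plain nn.Linear to_k_ip, and the pipeline from the snippet above; the local file name is a placeholder.

import torch

# Placeholder local path to an IP-Adapter checkpoint
state_dict = torch.load("ip-adapter_sd15.bin", map_location="cpu")
ip_layers = state_dict["ip_adapter"]

# Cross-attention processors are numbered 1, 3, 5, ... in the checkpoint,
# so each IP-Adapter-aware processor gets its own pair of projection weights
key_id = 1
for name, proc in pipeline.unet.attn_processors.items():
    if hasattr(proc, "to_k_ip"):
        proc.to_k_ip.load_state_dict({"weight": ip_layers[f"{key_id}.to_k_ip.weight"]})
        proc.to_v_ip.load_state_dict({"weight": ip_layers[f"{key_id}.to_v_ip.weight"]})
        key_id += 2

Is this the right idea, or is there a helper in diffusers that does it for me?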
Replies: 2 comments 6 replies
-
What version of diffusers are you using?
-
Hi all, thanks Sayak for adding me here. Unfortunately, I am not able to reproduce your problem as I don't have your checkpoints. Here is a working code snippet:
import torch
from diffusers import StableDiffusionPipeline
base_model_path = "runwayml/stable-diffusion-v1-5/v1-5-pruned.safetensors"
pipeline = StableDiffusionPipeline.from_single_file(
base_model_path,
torch_dtype=torch.float16,
use_safetensors=True
)
pipeline.to("cuda")
pipeline.load_lora_weights("sayakpaul/sd-model-finetuned-lora-t4", weight_name="pytorch_lora_weights.bin")
pipeline.load_ip_adapter("h94/IP-Adapter", subfolder="models", weight_name="ip-adapter-plus-face_sd15.safetensors")
pipeline.set_ip_adapter_scale(0.7)
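To actually generate with the IP-Adapter loaded this way, you would also pass an ip_adapter_image when calling the pipeline, e.g. (continuing from the snippet above; the image path and prompt are just placeholders):

from diffusers.utils import load_image

ip_image = load_image("face.png")  # placeholder: your reference face image
image = pipeline(
    prompt="a portrait photo of a person",
    ip_adapter_image=ip_image,
    num_inference_steps=30,
).images[0]
image.save("output.png")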
I see, upgrading to the current diffusers will solve the issue.
Please run: pip install git+https://github.com/huggingface/diffusers
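After installing, you can confirm which version is actually picked up:

import diffusers
print(diffusers.__version__)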