```python
import torch
from diffusers import PixArtAlphaPipeline

pipe = PixArtAlphaPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe.load_lora_weights("xxx")
```

When I try to load a LoRA in `PixArtAlphaPipeline`, it throws this error:

```
AttributeError: 'PixArtAlphaPipeline' object has no attribute 'load_lora_weights'
```
Could LoRA support be added to this pipeline? Using the PEFT method like this is not convenient, and the results are not good...
```python
import torch
from diffusers import Transformer2DModel
from peft import PeftModel

transformer = Transformer2DModel.from_pretrained(
    "PixArt-alpha/PixArt-LCM-XL-2-1024-MS",
    subfolder="transformer",
    torch_dtype=torch.float16,
)
transformer = PeftModel.from_pretrained(transformer, "Your-LoRA-Model-Path")
```
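For context on what loading (and merging) a LoRA actually does to a weight matrix, here is a minimal self-contained sketch of the LoRA update in plain NumPy. The hidden size, rank, and `alpha` scaling are illustrative assumptions, not PixArt's actual values:

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2  # hidden size and LoRA rank (illustrative values)
W = rng.standard_normal((d, d))          # frozen base weight
A = rng.standard_normal((r, d)) * 0.01   # LoRA down-projection
B = rng.standard_normal((d, r)) * 0.01   # LoRA up-projection
alpha = 4.0                              # LoRA scaling numerator

x = rng.standard_normal(d)

# Un-merged forward pass: base path plus scaled low-rank path
y_lora = W @ x + (alpha / r) * (B @ (A @ x))

# Merged weight: fold the low-rank update into W once, up front
W_merged = W + (alpha / r) * (B @ A)
y_merged = W_merged @ x

print(np.allclose(y_lora, y_merged))  # prints True
```

A built-in `load_lora_weights` would essentially apply this update to the pipeline's transformer modules, instead of requiring the user to wrap the transformer with `PeftModel` by hand.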