
Lack of support for loading LoRA weights in PixArtAlphaPipeline #9887

Open
@DaaadShot

Description


```python
pipe = PixArtAlphaPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe.load_lora_weights("xxx")
```

When I try to load LoRA weights in PixArtAlphaPipeline, it throws this error:

```
AttributeError: 'PixArtAlphaPipeline' object has no attribute 'load_lora_weights'
```

Maybe LoRA support could be added to this pipeline? Using the PEFT workaround below is inconvenient, and the results are not good:

```python
transformer = Transformer2DModel.from_pretrained("PixArt-alpha/PixArt-LCM-XL-2-1024-MS", subfolder="transformer", torch_dtype=torch.float16)
transformer = PeftModel.from_pretrained(transformer, "Your-LoRA-Model-Path")
```
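For context, what a `load_lora_weights` method would do for each targeted linear layer is fold the low-rank adapter matrices into the base weight: W' = W + (alpha / r) · B · A. A minimal plain-Python sketch of that merge (illustrative only; the function names and shapes here are my own, not the diffusers or PEFT implementation):

```python
# Illustrative sketch (not diffusers' actual code) of merging a LoRA update
# into a base linear weight: W' = W + (alpha / r) * B @ A.

def matmul(X, Y):
    """Plain-Python matrix multiply for small illustrative matrices."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

def merge_lora(W, A, B, alpha):
    """Fold the scaled low-rank update (alpha / r) * B @ A into W."""
    r = len(A)  # LoRA rank = number of rows in the down-projection A
    scale = alpha / r
    BA = matmul(B, A)
    return [[w + scale * d for w, d in zip(w_row, d_row)] for w_row, d_row in zip(W, BA)]

# 2x2 base weight, rank-1 LoRA adapter.
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[1.0, 2.0]]             # down-projection, shape (r=1, d_in=2)
B = [[3.0], [4.0]]           # up-projection, shape (d_out=2, r=1)

W_merged = merge_lora(W, A, B, alpha=1.0)
# B @ A = [[3, 6], [4, 8]], scale = 1, so W' = [[4.0, 6.0], [4.0, 9.0]]
print(W_merged)  # [[4.0, 6.0], [4.0, 9.0]]
```

The `PeftModel` workaround keeps the adapter as separate wrapped modules instead of merging, which is part of why a built-in loader on the pipeline would be more convenient.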
