
[consistency distillation] LoRA scripts omit EMA update #6505

Closed
@jon-chuang

Description


Describe the bug

Consistency distillation scripts omit the EMA update of the teacher unet.

I think this will limit quality in the limit of long training, since the teacher is never updated to produce better one-step denoising.

I think the idea of EMA is that it mimics something like progressive distillation, which itself has been shown to improve few-step diffusion over the baseline. A sketch of the update in question follows.
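For reference, here is a minimal sketch of the kind of EMA (target-network) update being described, assuming PyTorch and hypothetical `target_unet` / `online_unet` modules with a hypothetical decay rate; this is illustrative only, not the actual diffusers implementation:

```python
import torch

@torch.no_grad()
def ema_update(target_unet: torch.nn.Module,
               online_unet: torch.nn.Module,
               decay: float = 0.95) -> None:
    """Blend the online (student) weights into the target (teacher) weights:

        theta_target <- decay * theta_target + (1 - decay) * theta_online

    The decay value here is an assumed placeholder, not taken from the scripts.
    """
    for tgt, src in zip(target_unet.parameters(), online_unet.parameters()):
        tgt.mul_(decay).add_(src, alpha=1.0 - decay)
    # Note: any buffers (e.g. running statistics) would also need to be
    # averaged or copied if the unet carries them.
```

Called once per optimizer step, an update like this would let the teacher track the student and, over the course of training, provide progressively better one-step denoising targets.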

Reproduction

Check the scripts

Logs

No response

System Info

main

Who can help?

cc @shuminghu regarding whether EMA was necessary to achieve high-quality results for LCM.
