Confusion between class and instance attributes #10744

Closed
christopher5106 opened this issue Feb 7, 2025 · 3 comments

Comments

@christopher5106

It might not be a bug, but it is still a problem.

The method load_lora_weights() uses the instance attributes self.transformer_name and self._control_lora_supported_norm_keys, but later in the same flow it calls the class method self._load_norm_into_transformer, which, being a class method, will use the class attributes cls.transformer_name and cls._control_lora_supported_norm_keys.

Since they are in the same flow, it would be better to use the same value everywhere, either the instance attributes or the class attributes. There are a lot of subclasses of FluxLoaderMixin, and if an instance modifies either transformer_name or _control_lora_supported_norm_keys, this will lead to inconsistencies.
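
To make the divergence concrete, here is a minimal, self-contained sketch (not the actual diffusers code; the class and attribute names only mirror the ones mentioned above):

```python
class LoraLoaderMixin:
    # class-level defaults, mirroring the attributes mentioned above
    transformer_name = "transformer"
    _control_lora_supported_norm_keys = ("norm_q", "norm_k")

    def load_lora_weights(self):
        # instance lookup: picks up an instance-level override if one exists
        print("instance sees:", self.transformer_name)
        self._load_norm_into_transformer()

    @classmethod
    def _load_norm_into_transformer(cls):
        # class lookup: always reads the class attribute, even when called
        # through an instance that has overridden it
        print("classmethod sees:", cls.transformer_name)


pipe = LoraLoaderMixin()
pipe.transformer_name = "my_transformer"  # instance-level override
pipe.load_lora_weights()
# instance sees: my_transformer
# classmethod sees: transformer   <- the two lookups diverge in the same flow
```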

@christopher5106
Author

christopher5106 commented Feb 7, 2025

It would be better to convert the class methods into static methods. For example, in _load_norm_into_transformer you could use the attributes transformer.transformer_name and transformer._control_lora_supported_norm_keys, since transformer is already among the parameters.
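
A rough sketch of that suggestion, showing the lookup pattern only (hypothetical signatures and return value, not the actual diffusers implementation, which loads the norm weights rather than just filtering keys):

```python
from typing import Dict, Tuple


class LoraLoaderMixin:
    transformer_name = "transformer"
    _control_lora_supported_norm_keys: Tuple[str, ...] = ("norm_q", "norm_k")

    def load_lora_weights(self, state_dict: Dict, transformer) -> Dict:
        # pass the model straight through; nothing downstream reads cls attributes
        return self._load_norm_into_transformer(state_dict, transformer)

    @staticmethod
    def _load_norm_into_transformer(state_dict: Dict, transformer) -> Dict:
        # read the supported norm keys from the transformer argument itself,
        # so any per-instance override is respected consistently
        norm_keys = getattr(transformer, "_control_lora_supported_norm_keys", ())
        return {
            key: value
            for key, value in state_dict.items()
            if any(norm_key in key for norm_key in norm_keys)
        }
```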

@DN6
Collaborator

DN6 commented Feb 12, 2025

cc: @sayakpaul

@sayakpaul
Member

Thanks for providing your inputs.

Since they are in the same flow, it would be better to use the same value everywhere, either the instance attributes or the class attributes. There are a lot of subclasses of FluxLoaderMixin, and if an instance modifies either transformer_name or _control_lora_supported_norm_keys, this will lead to inconsistencies.

I am going to assume this modification will be done by the developer with everything in mind.

transformer.transformer_name and transformer._control_lora_supported_norm_keys

I don't think we want this, as transformer_name and _control_lora_supported_norm_keys only become relevant when loading LoRAs.
