Doesn't work with Power Lora Loader (standard ComfyUI node from rgthree), crashes #96


Open
jnpatrick99 opened this issue Jan 16, 2025 · 2 comments

Comments


jnpatrick99 commented Jan 16, 2025

The script generates the following code, which black cannot parse:

power_lora_loader_rgthree_631 = power_lora_loader_rgthree.load_loras(PowerLoraLoaderHeaderWidget={'type': 'PowerLoraLoaderHeaderWidget'}, lora_1={'on': False, 'lora': 'lora.safetensors', 'strength': 1.2}, lora_2={'on': False, 'lora': 'lora2.safetensors', 'strength': 0.7}, ➕ Add Lora="", model=get_value_at_index(any_switch_rgthree_633, 0), clip=get_value_at_index(dualcliploader_11, 0))

This code is assembled in assemble_python_code.py, inside def assemble_python_code, and the crash happens at:

final_code = black.format_str(final_code, mode=black.Mode())
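For reference, a minimal reproduction of the crash, assuming black rejects the invalid keyword argument with its InvalidInput error (the variable and node names below are made up, not taken from the actual generated file):

import black

# The generated call contains "➕ Add Lora=..." as a keyword argument name.
# That is not a valid Python identifier, so black cannot parse the code and
# format_str raises black.InvalidInput (a subclass of ValueError).
snippet = 'result = loader.load_loras(lora_1={"on": False}, ➕ Add Lora="", model=model)'

try:
    black.format_str(snippet, mode=black.Mode())
except black.InvalidInput as exc:
    print("black refused to format the generated code:", exc)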

Please notice the "➕ Add Lora" keyword argument: neither ➕ nor the space is legal in a Python identifier, so the generated code is not valid syntax and black cannot parse it.
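One possible direction for a fix (just a sketch, not necessarily what #114 or the extension actually does) is to route any widget name that is not a valid identifier through a **{...} mapping, which is still valid call syntax that black can format. render_kwargs below is a hypothetical helper; the real generated values are expressions such as get_value_at_index(...), so this only illustrates the idea with literal values:

def render_kwargs(kwargs: dict) -> str:
    # Hypothetical helper: render keyword arguments for the generated call.
    # Names that are valid Python identifiers are emitted as normal keyword
    # arguments; anything else (such as "➕ Add Lora") is passed through a
    # **{...} mapping, which CPython accepts for nodes that take **kwargs.
    plain, mapped = [], {}
    for name, value in kwargs.items():
        if name.isidentifier():
            plain.append(f"{name}={value!r}")
        else:
            mapped[name] = value
    parts = plain
    if mapped:
        parts.append("**" + repr(mapped))
    return ", ".join(parts)

# Example: produces "lora_1={'on': False}, **{'➕ Add Lora': ''}",
# which is valid syntax that black.format_str can process.
print(render_kwargs({"lora_1": {"on": False}, "➕ Add Lora": ""}))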


Tuich commented Jan 20, 2025

Facing exactly the same problem.


mbylstra commented Apr 4, 2025

You can try #114
