torch.OutOfMemoryError: Allocation on device #211
I think it's now better to use the GGUF diffusion model loader together with the native ComfyUI CLIP loader (fp8 or fp16).
The following CLIP Loader nodes exist in Comfy Core.
However, none of them can select a GGUF model (t5-v1_1-xxl-encoder-f16.gguf). I would appreciate it if you could tell me which node you are referring to.
GGUF CLIP models are not supported in newer ComfyUI versions.
I don't understand; I just tried to create an image a moment ago using the DualCLIPLoader (GGUF) node from ComfyUI-GGUF, and it worked fine.
Try this trick to bypass the negative prompt, or use this workflow where a negative prompt is not needed (drop the image into ComfyUI): https://comfyanonymous.github.io/ComfyUI_examples/flux/flux_dev_example.png
Thank you very much. With ConditioningZeroOut, the execution time is reduced, although the OutOfMemoryError now appears at the positive prompt instead. :)
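For context, the idea behind ConditioningZeroOut is to zero the existing conditioning tensors instead of running the text encoder on an empty negative prompt, so one encoder pass is skipped. The sketch below illustrates that idea in plain Python; the data layout and names are illustrative stand-ins, not ComfyUI's actual internals:

```python
# Sketch of the ConditioningZeroOut idea: rather than encoding an empty
# negative prompt, replace every value in the existing conditioning with
# zeros. Names and data shapes here are illustrative, not ComfyUI's real API.

def zero_out_conditioning(conditioning):
    """Return a copy of the conditioning with all values zeroed.

    `conditioning` is modeled as a list of (embedding, extras) pairs, where
    `embedding` is a nested list of floats (a stand-in for a tensor) and
    `extras` holds auxiliary outputs such as a pooled embedding.
    """
    def zeros_like(x):
        # Recursively replace every float with 0.0, preserving shape.
        if isinstance(x, list):
            return [zeros_like(v) for v in x]
        return 0.0

    return [
        (zeros_like(emb), {k: zeros_like(v) for k, v in extras.items()})
        for emb, extras in conditioning
    ]

# Example: one 2x3 "token embedding" plus a pooled output vector.
cond = [([[0.5, -1.2, 3.0], [0.1, 0.0, 2.2]], {"pooled_output": [1.0, 2.0]})]
zeroed = zero_out_conditioning(cond)
```

Because the zeroed tensors keep their shape, downstream sampling nodes accept them exactly like a real encoded prompt.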
My PC uses a GTX 3070.
When I start running the GGUF workflow, I get an OutOfMemoryError at the negative prompt, but what's even stranger is that if I press the "Queue" button again, it goes through and generates the image just fine.
This happens at least 5 times out of 10, and I think the maintainer needs to find out why this is happening.
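The "fails once, works on retry" pattern often points to cached allocations from the first attempt (or a previously loaded model) still occupying VRAM. A common workaround, offered here as an assumption rather than anything the maintainer has confirmed, is to catch the OOM, free cached memory, and retry once. The sketch below uses a stand-in exception class so it runs without a GPU; in a real ComfyUI/torch setup you would catch `torch.cuda.OutOfMemoryError` and call `torch.cuda.empty_cache()`:

```python
class FakeOutOfMemoryError(RuntimeError):
    """Stand-in for torch.cuda.OutOfMemoryError so this sketch runs CPU-only."""

def run_with_oom_retry(fn, free_memory, retries=1, oom_exc=FakeOutOfMemoryError):
    """Call fn(); on OOM, free cached memory and retry up to `retries` times."""
    for attempt in range(retries + 1):
        try:
            return fn()
        except oom_exc:
            if attempt == retries:
                raise  # out of retries: surface the original OOM
            free_memory()  # e.g. torch.cuda.empty_cache() on a real GPU

# Demo: the first call fails, the second succeeds -- mirroring the behavior
# above where pressing "Queue" a second time generates the image fine.
calls = {"n": 0}
def flaky_sample():
    calls["n"] += 1
    if calls["n"] == 1:
        raise FakeOutOfMemoryError("Allocation on device")
    return "image"

result = run_with_oom_retry(flaky_sample, free_memory=lambda: None)
```

If freeing the cache is not enough, lowering VRAM pressure (smaller quant, fp8 CLIP, or `--lowvram`) is the more reliable fix, since a retry only papers over fragmentation.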
ComfyUI Error Report
Error Details
Stack Trace