Incredibly High VRAM Usage #43

Open
setothegreat opened this issue Apr 3, 2023 · 0 comments

When running 2 ControlNet models on a 768x768 SD 1.5 image generation through WebUI, my VRAM usage sits at around 6 GB, and then goes up to 8 GB when using the hires fix to continue generating at 1420x1420.

With the same setup but with PWW enabled, my VRAM usage spikes to 21 GB during the 768x768 generation, and then crashes with an OOM error when trying to use the hires fix at 1420x1420, reporting that it tried to allocate 29 GB of VRAM (!!!), which is more than my 3090 has.

I don't recall having this issue when generating at 512x512 and then hires fixing to 1024x1024 around a month ago. I'm going from memory, but I'm fairly sure it only used 12 GB and 16 GB respectively. I don't currently have a 512 model to test with, but running a 768 model at 512x512 resulted in 14 GB of VRAM usage, and a crash with an OOM error trying to allocate 2 GB when hires fixed to 1024x1024.
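For context on why the jump is so steep: if the extension materializes full self-attention maps over the latent tokens (an assumption on my part about how PWW injects its per-word weights, not something I've verified in the code), memory for those maps grows with the fourth power of image resolution. A rough back-of-envelope sketch, in fp32 with an 8x VAE downscale:

```python
# Back-of-envelope estimate of one full self-attention map over SD latent
# tokens. Assumes an 8x VAE downscale and fp32 (4 bytes per element); this
# is illustrative only, not a claim about PWW's actual implementation.

def latent_tokens(px: int) -> int:
    """Number of latent tokens for a square px-by-px image (8x downscale)."""
    side = px // 8
    return side * side

def attn_map_bytes(px: int) -> int:
    """Bytes for one dense tokens-by-tokens attention map in fp32."""
    n = latent_tokens(px)
    return n * n * 4

for px in (512, 768, 1024, 1420):
    gib = attn_map_bytes(px) / 2**30
    print(f"{px}x{px}: {latent_tokens(px)} tokens, ~{gib:.2f} GiB per map")
```

Even a single such map at 1420x1420 is several GiB, and the cost scales as roughly (1420/768)^4 ≈ 11.7x over the 768 case, which is in the same ballpark as the 6 GB → 29 GB allocation jump reported above. Stock cross-attention avoids this blow-up because the second axis is the short text-token sequence rather than the latent tokens.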

Not sure if this is intentional behavior, but if it is, it would make the extension pretty much unusable on any consumer-grade GPU, even the highest-end ones available. If it's not intentional, I thought I'd bring it up.
