IP Adapter functionality for StableDiffusionXLControlNetUnionPipeline #55

Open
Sagnik2810Sarkar opened this issue Aug 9, 2024 · 0 comments

Hey, I am trying to add the code necessary for IP Adapter support in StableDiffusionXLControlNetUnionPipeline and keep running into a bug I can't seem to get to the root of.
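
For context, the flow I am trying to replicate is the standard diffusers IP-Adapter setup, sketched here against the stock StableDiffusionXLControlNetPipeline (the model IDs, adapter weights and image paths below are only illustrative placeholders, not my actual setup):

import torch
from diffusers import ControlNetModel, StableDiffusionXLControlNetPipeline
from diffusers.utils import load_image

# Stock SDXL ControlNet pipeline, which already ships with IP-Adapter support;
# the goal is to replicate this wiring in StableDiffusionXLControlNetUnionPipeline.
controlnet = ControlNetModel.from_pretrained(
    "diffusers/controlnet-canny-sdxl-1.0", torch_dtype=torch.float16
)
pipe = StableDiffusionXLControlNetPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

# IPAdapterMixin: loads the image encoder and swaps the UNet's attention
# processors for their IP-Adapter counterparts.
pipe.load_ip_adapter(
    "h94/IP-Adapter", subfolder="sdxl_models", weight_name="ip-adapter_sdxl.bin"
)
pipe.set_ip_adapter_scale(0.6)

canny_image = load_image("canny.png")   # placeholder control image
style_image = load_image("style.png")   # placeholder IP-Adapter reference image

image = pipe(
    prompt="a photo of a cat",
    image=canny_image,
    ip_adapter_image=style_image,
    num_inference_steps=30,
).images[0]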

Here's the stack trace for the bug I am encountering:
Traceback (most recent call last):
  File "predict.py", line 221, in predict
    output = pipe(**common_args, **cn_kwargs, **self.ip_adapter_kwargs)
  File "/root/.pyenv/versions/3.9.19/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/src/ControlNetPlus/pipeline/pipeline_controlnet_union_sd_xl.py", line 1244, in __call__
    down_block_res_samples, mid_block_res_sample = self.controlnet(
  File "/root/.pyenv/versions/3.9.19/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/src/ControlNetPlus/models/controlnet_union.py", line 890, in forward
    sample, res_samples = downsample_block(
  File "/root/.pyenv/versions/3.9.19/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/.pyenv/versions/3.9.19/lib/python3.9/site-packages/diffusers/models/unets/unet_2d_blocks.py", line 1279, in forward
    hidden_states = attn(
  File "/root/.pyenv/versions/3.9.19/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/.pyenv/versions/3.9.19/lib/python3.9/site-packages/diffusers/models/transformers/transformer_2d.py", line 397, in forward
    hidden_states = block(
  File "/root/.pyenv/versions/3.9.19/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/.pyenv/versions/3.9.19/lib/python3.9/site-packages/diffusers/models/attention.py", line 366, in forward
    attn_output = self.attn2(
  File "/root/.pyenv/versions/3.9.19/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/.pyenv/versions/3.9.19/lib/python3.9/site-packages/diffusers/models/attention_processor.py", line 522, in forward
    return self.processor(
  File "/root/.pyenv/versions/3.9.19/lib/python3.9/site-packages/diffusers/models/attention_processor.py", line 1272, in __call__
    query = query.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2)
RuntimeError: shape '[3, -1, 10, 64]' is invalid for input of size 5242880

Any help in debugging this would be appreciated, even a pointer as to what kind of inputs might be causing a shape mismatch like this. Feel free to ask which modules you need to look at to get a better idea of what might be causing it.
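
In case it helps narrow things down, below is the kind of probe I plan to drop in right before the self.controlnet(...) call at line 1244 of pipeline_controlnet_union_sd_xl.py to see which tensor carries the odd batch dimension (the local variable names are my guess at what that scope uses, so treat them as assumptions):

# Debug probe, to be pasted just before the self.controlnet(...) call.
# latent_model_input / prompt_embeds / added_cond_kwargs are assumed names
# for the locals at that point in the pipeline.
print("latent_model_input:", tuple(latent_model_input.shape))
print("prompt_embeds:", tuple(prompt_embeds.shape))
image_embeds = added_cond_kwargs.get("image_embeds")
if image_embeds is not None:
    if isinstance(image_embeds, (list, tuple)):
        print("ip_adapter image_embeds:", [tuple(e.shape) for e in image_embeds])
    else:
        print("ip_adapter image_embeds:", tuple(image_embeds.shape))

If I am reading the error right, 5242880 = 2 x 4096 x 640, so the query built from the latents has a batch of 2 (the classifier-free-guidance pair), while the batch_size of 3 in the failing view is, as far as I can tell, taken from encoder_hidden_states, which makes me suspect the prompt/IP-Adapter embeddings reaching the ControlNet's cross-attention carry one extra batch entry.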
