Model Merge Not Working Again #3847

Open
rking2981 opened this issue Jun 24, 2024 · 1 comment
Labels
Potential Bug

Comments

@rking2981

The model merge is producing the WARNING SHAPE MISMATCH error again. I looked into the merge_patcher file that came up in similar reports back in March and can see the typo there was corrected, so there seems to be another issue somewhere.

Please look into this. Thanks.

WARNING SHAPE MISMATCH diffusion_model.output_blocks.8.0.out_layers.3.weight WEIGHT NOT MERGED torch.Size([320, 320, 3, 3]) != torch.Size([640, 640, 3, 3])
WARNING SHAPE MISMATCH diffusion_model.output_blocks.8.0.out_layers.3.bias WEIGHT NOT MERGED torch.Size([320]) != torch.Size([640])
WARNING SHAPE MISMATCH diffusion_model.output_blocks.8.0.skip_connection.weight WEIGHT NOT MERGED torch.Size([320, 640, 1, 1]) != torch.Size([640, 960, 1, 1])
WARNING SHAPE MISMATCH diffusion_model.output_blocks.8.0.skip_connection.bias WEIGHT NOT MERGED torch.Size([320]) != torch.Size([640])
  0%|          | 0/20 [00:00<?, ?it/s]
D:\ComfyUI\ComfyUI\comfy\ldm\modules\attention.py:407: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:455.)
  out = torch.nn.functional.scaled_dot_product_attention(q, k, v, attn_mask=mask, dropout_p=0.0, is_causal=False)
C:\Users\ryank\AppData\Local\Programs\Python\Python310\lib\site-packages\torchsde\_brownian\brownian_interval.py:608: UserWarning: Should have tb<=t1 but got tb=14.614644050598145 and t1=14.614643.
  warnings.warn(f"Should have {tb_name}<=t1 but got {tb_name}={tb} and t1={self._end}.")
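For reference, the shape guard behind warnings like these conceptually amounts to the sketch below: a linear blend of two checkpoints that leaves a weight unmerged whenever the two models disagree on its shape. This is a minimal illustration under assumed names (merge_state_dicts, a plain weighted average), not ComfyUI's actual ModelPatcher code. A 320-vs-640 channel mismatch at the same layer usually means the two checkpoints come from UNets of different widths, i.e., different base architectures.

import torch

# Minimal sketch (an illustration, not ComfyUI's real merge code) of the
# shape-guard logic behind warnings like the ones above: linearly blend two
# checkpoints, skipping any tensor whose shapes disagree.
def merge_state_dicts(sd_a, sd_b, ratio=0.5):
    merged = {}
    for key, a in sd_a.items():
        b = sd_b.get(key)
        if b is None:
            merged[key] = a  # key absent from the second model; keep as-is
            continue
        if a.shape != b.shape:
            # Same situation as in the log: the layer widths differ between
            # the two models, so this weight cannot be merged.
            print(f"WARNING SHAPE MISMATCH {key} WEIGHT NOT MERGED "
                  f"{tuple(a.shape)} != {tuple(b.shape)}")
            merged[key] = a
            continue
        merged[key] = a * ratio + b * (1.0 - ratio)
    return merged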

@mcmonkey4eva
Collaborator

Can you post a workflow that replicates this bug?

@mcmonkey4eva added the Potential Bug label Jun 26, 2024