The model merge is producing the WARNING SHAPE MISMATCH error again. I looked into the merge_patcher file mentioned in similar reports from back in March and confirmed the typo there was already corrected, so the problem must be somewhere else.
Please look into this. Thanks.
WARNING SHAPE MISMATCH diffusion_model.output_blocks.8.0.out_layers.3.weight WEIGHT NOT MERGED torch.Size([320, 320, 3, 3]) != torch.Size([640, 640, 3, 3])
WARNING SHAPE MISMATCH diffusion_model.output_blocks.8.0.out_layers.3.bias WEIGHT NOT MERGED torch.Size([320]) != torch.Size([640])
WARNING SHAPE MISMATCH diffusion_model.output_blocks.8.0.skip_connection.weight WEIGHT NOT MERGED torch.Size([320, 640, 1, 1]) != torch.Size([640, 960, 1, 1])
WARNING SHAPE MISMATCH diffusion_model.output_blocks.8.0.skip_connection.bias WEIGHT NOT MERGED torch.Size([320]) != torch.Size([640])
  0%| | 0/20 [00:00<?, ?it/s]
D:\ComfyUI\ComfyUI\comfy\ldm\modules\attention.py:407: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:455.)
  out = torch.nn.functional.scaled_dot_product_attention(q, k, v, attn_mask=mask, dropout_p=0.0, is_causal=False)
C:\Users\ryank\AppData\Local\Programs\Python\Python310\lib\site-packages\torchsde\_brownian\brownian_interval.py:608: UserWarning: Should have tb<=t1 but got tb=14.614644050598145 and t1=14.614643.
  warnings.warn(f"Should have {tb_name}<=t1 but got {tb_name}={tb} and t1={self._end}.")
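For context, this warning is typically emitted when the two checkpoints being merged have different architectures (note the channel widths disagree, e.g. 320 vs 640 in output_blocks.8): the merge code compares each parameter's shape and skips any pair that disagrees. A minimal sketch of that guard, using plain Python (shape, value) pairs in place of torch tensors; the function name and dict layout here are illustrative and not ComfyUI's actual code:

```python
# Hypothetical sketch of a weighted state-dict merge with a shape guard.
# Parameters are stored as (shape, value) pairs for brevity; real merge
# code operates on torch tensors and compares tensor.shape the same way.

def merge_state_dicts(base, other, ratio=0.5):
    merged = {}
    for name, (shape_a, val_a) in base.items():
        if name not in other:
            merged[name] = (shape_a, val_a)
            continue
        shape_b, val_b = other[name]
        if shape_a != shape_b:
            # Mismatched shapes cannot be interpolated; keep the base
            # weight and warn, mirroring the log output above.
            print(f"WARNING SHAPE MISMATCH {name} WEIGHT NOT MERGED "
                  f"{shape_a} != {shape_b}")
            merged[name] = (shape_a, val_a)
        else:
            merged[name] = (shape_a, (1 - ratio) * val_a + ratio * val_b)
    return merged

base = {
    "output_blocks.8.0.out_layers.3.bias": ((320,), 1.0),
    "input_blocks.0.0.weight": ((16, 3, 3, 3), 2.0),
}
other = {
    "output_blocks.8.0.out_layers.3.bias": ((640,), 3.0),  # width differs
    "input_blocks.0.0.weight": ((16, 3, 3, 3), 4.0),       # width matches
}
merged = merge_state_dicts(base, other)
```

Under this guard the mismatched bias keeps its base value unmerged, while the matching weight is interpolated; that matches the "WEIGHT NOT MERGED" wording in the log, which suggests checking that both models in the merge node are the same architecture (e.g. both SD1.5 or both SDXL).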