Forge was updated today - Errors on inference when Model Mixer isn't enabled #161

@CCpt5

Description

Hey!

lllyasviel kindly provided a much-needed update to Forge today (including an upgrade to Gradio 4.0). I installed Model Mixer from the extension installer, and without even enabling it I noticed errors in the console:

https://github.com/lllyasviel/stable-diffusion-webui-forge/commits/main/

```
To load target model SDXL
Begin to load 1 model
Reuse 1 loaded models
[Memory Management] Current Free GPU Memory (MB) =  15968.17822265625
[Memory Management] Model Memory (MB) =  0.0
[Memory Management] Minimal Inference Memory (MB) =  1024.0
[Memory Management] Estimated Remaining GPU Memory (MB) =  14944.17822265625
Moving model(s) has taken 0.02 seconds
100%|██████████| 22/22 [00:05<00:00,  3.97it/s]
*** Error running before_process: D:\stable-diffusion-webui-forge\extensions\sd-webui-model-mixer\scripts\model_mixer.py
    Traceback (most recent call last):
      File "D:\stable-diffusion-webui-forge\modules\scripts.py", line 836, in before_process
        script.before_process(p, *script_args)
    TypeError: ModelMixerScript.before_process() missing 6 required positional arguments: 'enabled', 'model_a', 'base_model', 'mm_max_models', 'mm_finetune', and 'mm_states'
```
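For anyone digging into this: the traceback suggests the webui's collected argument slice for this script is empty, so unpacking `*script_args` supplies none of the six expected values. A minimal sketch that reproduces the same `TypeError` outside Forge (the `run_before_process` helper and the parameter-less call are illustrative assumptions, not Forge's actual code):

```python
# Sketch of the failure mode: a script whose before_process() declares
# required positional parameters, invoked with an empty args slice.
class ModelMixerScript:
    def before_process(self, p, enabled, model_a, base_model,
                       mm_max_models, mm_finetune, mm_states):
        if not enabled:
            return
        # ... merge logic would go here ...

def run_before_process(script, p, script_args):
    # Mirrors the call site in modules/scripts.py:
    #   script.before_process(p, *script_args)
    script.before_process(p, *script_args)

# With no UI args collected (as appears to happen after the Gradio 4
# update), the call fails before the script can check `enabled`:
try:
    run_before_process(ModelMixerScript(), object(), [])
except TypeError as e:
    print(e)  # ... missing 6 required positional arguments ...
```

If the root cause is really an empty args slice, a stopgap on the extension side would be giving those parameters defaults (e.g. `enabled=False`) so the early `return` is reached instead of a crash, though that only hides whatever broke the argument wiring.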

Labels

bug (Something isn't working)
