forked from AUTOMATIC1111/stable-diffusion-webui
Commit d23a172, 1 parent (2085e77). Showing 2 changed files with 0 additions and 6 deletions.
Comment on d23a172:
Hi, how do I go back to the previous version? onnxruntime gives an error, Instant-ID doesn't work, and ADetailer doesn't work. What changed for the better here? I don't see it.
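The commit header lists the parent commit (2085e77), so one way to go back is a plain git checkout of that commit. Below is a minimal sketch using a throwaway repository to stand in for the real install; in the actual checkout the equivalent single command would be `git checkout 2085e77` run inside C:\AI\stable-diffusion-webui-directml (assumption: no uncommitted local changes; stash or commit them first otherwise).

```shell
# Create a throwaway repo with two commits, then step back to the parent,
# mirroring `git checkout 2085e77` in the real webui checkout.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.email=u@example.com -c user.name=u \
    commit -q --allow-empty -m "parent (stands in for 2085e77)"
git -c user.email=u@example.com -c user.name=u \
    commit -q --allow-empty -m "broken update (stands in for d23a172)"
git checkout -q HEAD~1      # detached HEAD at the previous, working commit
git log -1 --format=%s      # shows the parent commit's subject line
```

Note this leaves the repository in a detached-HEAD state; returning to the newest commit later is just `git checkout master` (or whatever the branch is named).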
From https://github.com/lshqqytiger/stable-diffusion-webui-directml
Already up to date.
venv "C:\AI\stable-diffusion-webui-directml\venv\Scripts\Python.exe"
WARNING: ZLUDA works best with SD.Next. Please consider migrating to SD.Next.
fatal: No names found, cannot describe anything.
Python 3.10.6 (tags/v3.10.6:9c7b4bd, Aug 1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)]
Version: 1.9.4
Commit hash: d23a172
Using ZLUDA in C:\AI\stable-diffusion-webui-directml.zluda
no module 'xformers'. Processing without...
no module 'xformers'. Processing without...
No module 'xformers'. Proceeding without it.
C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\pytorch_lightning\utilities\distributed.py:258: LightningDeprecationWarning: pytorch_lightning.utilities.distributed.rank_zero_only has been deprecated in v1.8.1 and will be removed in v2.0.0. You can import it from pytorch_lightning.utilities instead.
  rank_zero_deprecation(
Launching Web UI with arguments: --autolaunch --theme dark --use-zluda
C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\diffusers\utils\outputs.py:63: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
torch.utils._pytree._register_pytree_node(
C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\diffusers\utils\outputs.py:63: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
torch.utils._pytree._register_pytree_node(
ONNX: version=1.18.0 provider=AzureExecutionProvider, available=['AzureExecutionProvider', 'CPUExecutionProvider']
[-] ADetailer initialized. version: 24.5.1, num models: 10
ControlNet preprocessor location: C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\annotator\downloads
2024-06-01 18:18:24,233 - ControlNet - INFO - ControlNet v1.1.449
Loading weights [1718b5bb2d] from C:\AI\stable-diffusion-webui-directml\models\Stable-diffusion\albedobaseXL_v21.safetensors
2024-06-01 18:18:24,762 - ControlNet - INFO - ControlNet UI callback registered.
Creating model from config: C:\AI\stable-diffusion-webui-directml\repositories\generative-models\configs\inference\sd_xl_base.yaml
Running on local URL: http://127.0.0.1:7860
To create a public link, set share=True in launch().
Startup time: 21.6s (prepare environment: 18.7s, initialize shared: 5.4s, load scripts: 2.2s, create ui: 1.4s, gradio launch: 2.7s).
Applying attention optimization: Doggettx... done.
Model loaded in 9.2s (load weights from disk: 0.6s, create model: 1.2s, apply weights to model: 6.0s, move model to device: 0.2s, load textual inversion embeddings: 0.3s, calculate empty prompt: 0.8s).
2024-06-01 18:20:45,891 - ControlNet - INFO - unit_separate = False, style_align = False
2024-06-01 18:20:46,126 - ControlNet - INFO - Loading model: ip-adapter_instant_id_sdxl [eb2d3ec0]
2024-06-01 18:20:47,015 - ControlNet - INFO - Loaded state_dict from [C:\AI\stable-diffusion-webui-directml\models\ControlNet\ip-adapter_instant_id_sdxl.bin]
2024-06-01 18:20:49,975 - ControlNet - INFO - ControlNet model ip-adapter_instant_id_sdxl eb2d3ec0 loaded.
2024-06-01 18:20:49,988 - ControlNet - INFO - Using preprocessor: instant_id_face_embedding
2024-06-01 18:20:49,988 - ControlNet - INFO - preprocessor resolution = 512
*************** EP Error ***************
EP Error maximum recursion depth exceeded when using ['CUDAExecutionProvider', 'CPUExecutionProvider']
*** Error running process: C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\controlnet.py
Traceback (most recent call last):
File "C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 437, in _create_inference_session
available_providers = C.get_available_providers()
File "C:\AI\stable-diffusion-webui-directml\modules\zluda.py", line 63, in <lambda>
ort.capi._pybind_state.get_available_providers = lambda: [v for v in ort.get_available_providers() if v != 'CUDAExecutionProvider'] # pylint: disable=protected-access
File "C:\AI\stable-diffusion-webui-directml\modules\zluda.py", line 63, in <lambda>
ort.capi._pybind_state.get_available_providers = lambda: [v for v in ort.get_available_providers() if v != 'CUDAExecutionProvider'] # pylint: disable=protected-access
File "C:\AI\stable-diffusion-webui-directml\modules\zluda.py", line 63, in <lambda>
ort.capi._pybind_state.get_available_providers = lambda: [v for v in ort.get_available_providers() if v != 'CUDAExecutionProvider'] # pylint: disable=protected-access
[Previous line repeated 961 more times]
RecursionError: maximum recursion depth exceeded
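The RecursionError above comes from a self-referential monkey-patch: modules/zluda.py replaces ort.capi._pybind_state.get_available_providers with a lambda that calls ort.get_available_providers(), but that public wrapper delegates right back to the (already patched) pybind function, so the lambda ends up calling itself. A minimal sketch of the bug and one possible repair follows; the class and function names are simplified stand-ins for the onnxruntime internals, and this is illustrative only, not the actual fix from 2c29feb.

```python
# Minimal reproduction of the self-referential monkey-patch behind
# "RecursionError: maximum recursion depth exceeded".

class _PybindState:
    """Stand-in for ort.capi._pybind_state."""
    get_available_providers = staticmethod(
        lambda: ["CUDAExecutionProvider", "CPUExecutionProvider"]
    )

def get_available_providers():
    """Stand-in for ort.get_available_providers(): delegates to the pybind layer."""
    return _PybindState.get_available_providers()

# Buggy patch, mirroring zluda.py line 63: the lambda calls the public
# wrapper, which calls the patched pybind function again, forever.
_PybindState.get_available_providers = staticmethod(
    lambda: [v for v in get_available_providers()
             if v != "CUDAExecutionProvider"]
)
try:
    get_available_providers()
    recursed = False
except RecursionError:
    recursed = True  # the patch recurses into itself

# One possible repair: capture the ORIGINAL callable once, before patching,
# and filter its result instead of calling back through the wrapper.
_original = lambda: ["CUDAExecutionProvider", "CPUExecutionProvider"]
_PybindState.get_available_providers = staticmethod(
    lambda: [v for v in _original() if v != "CUDAExecutionProvider"]
)
print(recursed)                     # True for the buggy patch
print(get_available_providers())    # ['CPUExecutionProvider']
```

The general lesson is that a patched function must never reach its replacement through the same indirection it was installed on; saving a reference to the original before assignment breaks the cycle.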
Reply on d23a172:
Fixed in 2c29feb