remove dnn
lshqqytiger committed May 30, 2024
1 parent 2085e77 commit d23a172
Showing 2 changed files with 0 additions and 6 deletions.
1 change: 0 additions & 1 deletion modules/cmd_args.py
@@ -75,7 +75,6 @@
 parser.add_argument("--use-cpu-torch", action="store_true", help="use torch built with cpu")
 parser.add_argument("--use-directml", action="store_true", help="use DirectML device as torch device")
 parser.add_argument("--use-zluda", action="store_true", help="use ZLUDA device as torch device")
-parser.add_argument("--use-zluda-dnn", action="store_true", help="enable ZLUDA DNN")
 parser.add_argument("--use-ipex", action="store_true", help="use Intel XPU as torch device")
 parser.add_argument("--override-torch", type=str, help="override torch version", default=None)
 parser.add_argument("--disable-model-loading-ram-optimization", action='store_true', help="disable an optimization that reduces RAM use when loading a model")
5 changes: 0 additions & 5 deletions modules/launch_utils.py
@@ -567,11 +567,6 @@ def prepare_environment():
     error = None
     from modules import zluda_installer
     try:
-        if args.use_zluda_dnn:
-            if zluda_installer.check_dnn_dependency():
-                zluda_installer.enable_dnn()
-            else:
-                print("Couldn't find the required dependency of ZLUDA DNN.")
         zluda_path = zluda_installer.get_path()
         zluda_installer.install(zluda_path)
         zluda_installer.make_copy(zluda_path)

2 comments on commit d23a172

@Roninos commented on d23a172 Jun 1, 2024
Hi, how do I go back to the previous version? onnxruntime raises an error, instant-id doesn't work, and adetailer doesn't work. I don't see what has changed for the better.
From https://github.com/lshqqytiger/stable-diffusion-webui-directml

  • branch master -> FETCH_HEAD
    Already up to date.
    venv "C:\AI\stable-diffusion-webui-directml\venv\Scripts\Python.exe"
    WARNING: ZLUDA works best with SD.Next. Please consider migrating to SD.Next.
    fatal: No names found, cannot describe anything.
    Python 3.10.6 (tags/v3.10.6:9c7b4bd, Aug 1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)]
    Version: 1.9.4
    Commit hash: d23a172
    Using ZLUDA in C:\AI\stable-diffusion-webui-directml.zluda
    no module 'xformers'. Processing without...
    no module 'xformers'. Processing without...
    No module 'xformers'. Proceeding without it.
    C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\pytorch_lightning\utilities\distributed.py:258: LightningDeprecationWarning: pytorch_lightning.utilities.distributed.rank_zero_only has been deprecated in v1.8.1 and will be removed in v2.0.0. You can import it from pytorch_lightning.utilities instead.
    rank_zero_deprecation(
    Launching Web UI with arguments: --autolaunch --theme dark --use-zluda
    C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\diffusers\utils\outputs.py:63: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
    torch.utils._pytree._register_pytree_node(
    C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\diffusers\utils\outputs.py:63: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
    torch.utils._pytree._register_pytree_node(
    ONNX: version=1.18.0 provider=AzureExecutionProvider, available=['AzureExecutionProvider', 'CPUExecutionProvider']
    [-] ADetailer initialized. version: 24.5.1, num models: 10
    ControlNet preprocessor location: C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\annotator\downloads
    2024-06-01 18:18:24,233 - ControlNet - INFO - ControlNet v1.1.449
    Loading weights [1718b5bb2d] from C:\AI\stable-diffusion-webui-directml\models\Stable-diffusion\albedobaseXL_v21.safetensors
    2024-06-01 18:18:24,762 - ControlNet - INFO - ControlNet UI callback registered.
    Creating model from config: C:\AI\stable-diffusion-webui-directml\repositories\generative-models\configs\inference\sd_xl_base.yaml
    Running on local URL: http://127.0.0.1:7860

To create a public link, set share=True in launch().
Startup time: 21.6s (prepare environment: 18.7s, initialize shared: 5.4s, load scripts: 2.2s, create ui: 1.4s, gradio launch: 2.7s).
Applying attention optimization: Doggettx... done.
Model loaded in 9.2s (load weights from disk: 0.6s, create model: 1.2s, apply weights to model: 6.0s, move model to device: 0.2s, load textual inversion embeddings: 0.3s, calculate empty prompt: 0.8s).
2024-06-01 18:20:45,891 - ControlNet - INFO - unit_separate = False, style_align = False
2024-06-01 18:20:46,126 - ControlNet - INFO - Loading model: ip-adapter_instant_id_sdxl [eb2d3ec0]
2024-06-01 18:20:47,015 - ControlNet - INFO - Loaded state_dict from [C:\AI\stable-diffusion-webui-directml\models\ControlNet\ip-adapter_instant_id_sdxl.bin]
2024-06-01 18:20:49,975 - ControlNet - INFO - ControlNet model ip-adapter_instant_id_sdxl eb2d3ec0 loaded.
2024-06-01 18:20:49,988 - ControlNet - INFO - Using preprocessor: instant_id_face_embedding
2024-06-01 18:20:49,988 - ControlNet - INFO - preprocessor resolution = 512
*************** EP Error ***************
EP Error maximum recursion depth exceeded when using ['CUDAExecutionProvider', 'CPUExecutionProvider']
*** Error running process: C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\controlnet.py
Traceback (most recent call last):
File "C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 437, in _create_inference_session
available_providers = C.get_available_providers()
File "C:\AI\stable-diffusion-webui-directml\modules\zluda.py", line 63, in <lambda>
ort.capi._pybind_state.get_available_providers = lambda: [v for v in ort.get_available_providers() if v != 'CUDAExecutionProvider'] # pylint: disable=protected-access
File "C:\AI\stable-diffusion-webui-directml\modules\zluda.py", line 63, in <lambda>
ort.capi._pybind_state.get_available_providers = lambda: [v for v in ort.get_available_providers() if v != 'CUDAExecutionProvider'] # pylint: disable=protected-access
File "C:\AI\stable-diffusion-webui-directml\modules\zluda.py", line 63, in <lambda>
ort.capi._pybind_state.get_available_providers = lambda: [v for v in ort.get_available_providers() if v != 'CUDAExecutionProvider'] # pylint: disable=protected-access
[Previous line repeated 961 more times]
RecursionError: maximum recursion depth exceeded

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\AI\stable-diffusion-webui-directml\modules\scripts.py", line 825, in process
    script.process(p, *script_args)
  File "C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1222, in process
    self.controlnet_hack(p)
  File "C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1207, in controlnet_hack
    self.controlnet_main_entry(p)
  File "C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\controlnet.py", line 941, in controlnet_main_entry
    controls, hr_controls, additional_maps = get_control(
  File "C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\controlnet.py", line 290, in get_control
    controls, hr_controls = list(zip(*[preprocess_input_image(img) for img in optional_tqdm(input_images)]))
  File "C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\controlnet.py", line 290, in <listcomp>
    controls, hr_controls = list(zip(*[preprocess_input_image(img) for img in optional_tqdm(input_images)]))
  File "C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\controlnet.py", line 242, in preprocess_input_image
    result = preprocessor.cached_call(
  File "C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\supported_preprocessor.py", line 196, in cached_call
    result = self._cached_call(input_image, *args, **kwargs)
  File "C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\utils.py", line 82, in decorated_func
    return cached_func(*args, **kwargs)
  File "C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\utils.py", line 66, in cached_func
    return func(*args, **kwargs)
  File "C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\supported_preprocessor.py", line 209, in _cached_call
    return self(*args, **kwargs)
  File "C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\preprocessor\legacy\legacy_preprocessors.py", line 105, in __call__
    result, is_image = self.call_function(
  File "C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\preprocessor\legacy\processor.py", line 725, in run_model_instant_id
    self.load_model()
  File "C:\AI\stable-diffusion-webui-directml\extensions\sd-webui-controlnet\scripts\preprocessor\legacy\processor.py", line 669, in load_model
    self.model = FaceAnalysis(
  File "C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\insightface\app\face_analysis.py", line 31, in __init__
    model = model_zoo.get_model(onnx_file, **kwargs)
  File "C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 96, in get_model
    model = router.get_model(providers=providers, provider_options=provider_options)
  File "C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 40, in get_model
    session = PickableInferenceSession(self.onnx_file, **kwargs)
  File "C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 25, in __init__
    super().__init__(model_path, **kwargs)
  File "C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 432, in __init__
    raise fallback_error from e
  File "C:\AI\stable-diffusion-webui-directml\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 425, in __init__
    print(f"Falling back to {self._fallback_providers} and retrying.")
AttributeError: 'PickableInferenceSession' object has no attribute '_fallback_providers'
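The RecursionError in the traceback above comes from a monkeypatch whose replacement lambda calls the very function it replaces: `ort.get_available_providers()` delegates back to the patched attribute, so each call re-enters the lambda. A minimal sketch of the bug and the usual fix, using an illustrative stand-in class rather than the real onnxruntime module (this is not the actual 2c29feb change):

```python
class ProviderApi:
    """Illustrative stand-in for onnxruntime's provider-listing API."""

    def get_available_providers(self):
        return ["CUDAExecutionProvider", "CPUExecutionProvider"]


api = ProviderApi()

# Buggy pattern (as in the traceback): the lambda calls the patched
# attribute, so every call re-enters itself until the recursion limit.
# api.get_available_providers = lambda: [
#     v for v in api.get_available_providers() if v != "CUDAExecutionProvider"
# ]

# Fix: capture the *original* callable before patching, and have the
# replacement call that captured reference instead of the patched name.
original = api.get_available_providers
api.get_available_providers = lambda: [
    v for v in original() if v != "CUDAExecutionProvider"
]

print(api.get_available_providers())  # ['CPUExecutionProvider']
```

The captured `original` is bound before the attribute is reassigned, so the filtered list is computed exactly once per call with no self-reference.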

@lshqqytiger (Owner, Author) commented
Fixed in 2c29feb
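For anyone who, like the commenter above, wants to go back to the previous version while waiting for a fix: the mechanics are plain git, sketched here on a scratch repository rather than the real checkout. In the actual webui directory you would substitute the real hashes from the commit header above (parent 2085e77, this commit d23a172).

```shell
# Demonstrate "going back one commit" on a throwaway repo.
set -e
repo=$(mktemp -d) && cd "$repo" && git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "parent"          # stands in for 2085e77
parent=$(git rev-parse HEAD)
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "remove dnn"      # stands in for d23a172
# Option 1: detach HEAD at the parent commit (read-only inspection):
git checkout -q "$parent"
git log --oneline -1
# Option 2 (run from the later commit instead): `git revert <hash>`
# undoes the commit with a new commit, keeping history intact.
```

Option 1 leaves the working tree in a detached-HEAD state; `git checkout master` (or the branch name) returns to the latest commit afterwards.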
