Replies: 13 comments 49 replies
-
Don't look forward to it too much.
-
I made it work. It is much faster than I expected.
-
Very interesting, and thank you very much for investigating! I will take a stab at it with my rudimentary skills and update the thread.
-
AMD RX 580, "rocBLAS error: Could not initialize Tensile host": a guide for newbies (if I've got it right).
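For context, the fix reported around this error on cards rocBLAS doesn't officially support usually amounts to swapping in Tensile library files built for your GPU architecture. A rough sketch on Windows (the HIP SDK 5.7 install path and the community "ROCmLibs" download location are assumptions, not confirmed in this thread):

```shell
# Sketch of the commonly reported workaround (Windows cmd), assuming a
# default HIP SDK 5.7 install and a community ROCmLibs build that
# includes your architecture (e.g. gfx803 for the RX 580).
# 1. Back up the stock rocBLAS library folder:
move "C:\Program Files\AMD\ROCm\5.7\bin\rocblas\library" ^
     "C:\Program Files\AMD\ROCm\5.7\bin\rocblas\library.bak"
# 2. Put the downloaded library files in its place (source path is
#    an assumption; use wherever you extracted the archive):
xcopy /E /I "%USERPROFILE%\Downloads\ROCmLibs\library" ^
      "C:\Program Files\AMD\ROCm\5.7\bin\rocblas\library"
```

After this, rocBLAS should find a TensileLibrary for the card's reported gfx arch instead of aborting at startup.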
-
Go to the SD.Next Discord (search Google or Reddit for it) and download the kobald zip file for the RX 6700. Sorry for my English.
Fri, 16 Feb 2024, 05:19, Nikos ***@***.***:
… Followed every step from your reply for SD.Next and @lshqqytiger
<https://github.com/lshqqytiger> 's reply which resulted in the second
error below and i get the same error too with my 6700XT:
rocBLAS error: Cannot read C:\Program Files\AMD\ROCm\5.7\bin\/rocblas/library/TensileLibrary.dat: No such file or directory for GPU arch : gfx1031
rocBLAS error: Could not initialize Tensile host:
regex_error(error_backref): The expression contained an invalid back reference.
Also, only SD.Next seems to work while this webui just stops with this
error:
venv "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\venv\Scripts\Python.exe"
fatal: No names found, cannot describe anything.
Python 3.10.11 (tags/v3.10.11:7d4cc5a, Apr 5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)]
Version: 1.7.0
Commit hash: 835ee20
no module 'xformers'. Processing without...
no module 'xformers'. Processing without...
No module 'xformers'. Proceeding without it.
D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\venv\lib\site-packages\pytorch_lightning\utilities\distributed.py:258: LightningDeprecationWarning: `pytorch_lightning.utilities.distributed.rank_zero_only` has been deprecated in v1.8.1 and will be removed in v2.0.0. You can import it from `pytorch_lightning.utilities` instead.
rank_zero_deprecation(
2024-02-16 02:16:36.015885: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2024-02-16 02:16:36.529858: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
Olive: Failed to load olive-ai: Failed to import transformers.models.gpt2.modeling_tf_gpt2 because of the following error (look up to see its traceback):
No module named 'keras.__internal__'
Launching Web UI with arguments:
Style database not found: D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\styles.csv
D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\venv\lib\site-packages\diffusers\utils\outputs.py:63: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
torch.utils._pytree._register_pytree_node(
Traceback (most recent call last):
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\venv\lib\site-packages\transformers\utils\import_utils.py", line 1086, in _get_module
return importlib.import_module("." + module_name, self.__name__)
File "C:\Users\Nikos\AppData\Local\Programs\Python\Python310\lib\importlib\__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\venv\lib\site-packages\transformers\modeling_tf_utils.py", line 76, in <module>
from keras.__internal__ import KerasTensor
ModuleNotFoundError: No module named 'keras.__internal__'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\venv\lib\site-packages\transformers\utils\import_utils.py", line 1086, in _get_module
return importlib.import_module("." + module_name, self.__name__)
File "C:\Users\Nikos\AppData\Local\Programs\Python\Python310\lib\importlib\__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\venv\lib\site-packages\optimum\onnxruntime\modeling_diffusion.py", line 41, in <module>
from ..exporters.onnx import main_export
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\venv\lib\site-packages\optimum\exporters\__init__.py", line 16, in <module>
from .tasks import TasksManager # noqa
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\venv\lib\site-packages\optimum\exporters\tasks.py", line 52, in <module>
from transformers import TFPreTrainedModel
File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\venv\lib\site-packages\transformers\utils\import_utils.py", line 1076, in __getattr__
module = self._get_module(self._class_to_module[name])
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\venv\lib\site-packages\transformers\utils\import_utils.py", line 1088, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.modeling_tf_utils because of the following error (look up to see its traceback):
No module named 'keras.__internal__'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\launch.py", line 48, in <module>
main()
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\launch.py", line 44, in main
start()
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\modules\launch_utils.py", line 677, in start
import webui
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\webui.py", line 13, in <module>
initialize.imports()
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\modules\initialize.py", line 34, in imports
shared_init.initialize()
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\modules\shared_init.py", line 59, in initialize
initialize_onnx()
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\modules\onnx_impl\__init__.py", line 236, in initialize
from .pipelines.onnx_stable_diffusion_xl_pipeline import OnnxStableDiffusionXLPipeline
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\modules\onnx_impl\pipelines\onnx_stable_diffusion_xl_pipeline.py", line 8, in <module>
class OnnxStableDiffusionXLPipeline(CallablePipelineBase, optimum.onnxruntime.ORTStableDiffusionXLPipeline):
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\venv\lib\site-packages\transformers\utils\import_utils.py", line 1076, in __getattr__
module = self._get_module(self._class_to_module[name])
File "D:\AI\Applications\Stable Diffusion (ZLUDA)\stable-diffusion-webui-directml\venv\lib\site-packages\transformers\utils\import_utils.py", line 1088, in _get_module
raise RuntimeError(
RuntimeError: Failed to import optimum.onnxruntime.modeling_diffusion because of the following error (look up to see its traceback):
Failed to import transformers.modeling_tf_utils because of the following error (look up to see its traceback):
No module named 'keras.__internal__'
Press any key to continue . . .
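The `No module named 'keras.__internal__'` failure in the quoted log is a known symptom of Keras 3.x being installed where transformers' TensorFlow code path still expects the Keras 2.x layout (Keras 3 removed `keras.__internal__`). A commonly reported workaround, run inside the webui's venv (the exact version pins are an assumption; match them to your TensorFlow release):

```shell
# Keras 3 removed keras.__internal__, which transformers'
# TF backend (and thus optimum) still imports. Pinning the
# TF/Keras pair to the 2.15 line restores the module.
pip install "tensorflow==2.15.*" "keras==2.15.*"
# Alternatively, if TensorFlow is only pulled in for the optional
# Olive path and you don't use it, removing it sidesteps the
# import entirely:
# pip uninstall -y tensorflow keras
```

Note that TensorFlow 2.16+ depends on Keras 3, so upgrading TensorFlow without pinning Keras reintroduces the error.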
-
Hey guys, I've successfully used ZLUDA (running with a 7900 XT on Windows).
-
Unfortunately my GPU is not compatible with ZLUDA, so I cannot contribute to the discussion any further.
-
Weird thing for me on the RX 6600: generation goes fast (faster than DirectML) but gets stuck at 95% progress for a minute or two. The only way I found to prevent this is to change the VAE decoder to TAESD, but the results look really bad with that option.
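An end-of-generation stall like this often coincides with the full VAE decode spiking VRAM usage on 8 GB cards. A sketch of launch flags people commonly try for it, set in `webui-user.bat` (the flags are standard webui options; whether they help under ZLUDA specifically is an assumption, not confirmed in this thread):

```shell
rem webui-user.bat sketch: flags commonly tried for slow or stalling
rem VAE decodes on 8 GB cards. --medvram lowers peak VRAM use and
rem --no-half-vae avoids half-precision VAE problems; both keep the
rem full-quality VAE, unlike switching the decoder to TAESD.
set COMMANDLINE_ARGS=--medvram --no-half-vae
```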
-
This is super important. The RX 6600 and RX 6700 must use this instead.
-
I'm just here to highly recommend ZLUDA; it simply cut the image generation time in half. AMD RX 5600 XT 6G.
-
Hey, I made install guides for AMD on Windows for every common webui, covering both ZLUDA and DirectML. Got it working on Auto1111, ComfyUI, and Fooocus, with Forge to follow. To the install guides:
-
Same issue as bendybox; I have already set up the environment variables. This guide helped: https://github.com/CS1o/Stable-Diffusion-Info/wiki/Installation-Guides#amd-automatic1111-with-zluda. Stable Diffusion launches the webui, but I am getting an error and can't generate.
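Since environment variables keep coming up: the ZLUDA guides for Windows generally boil down to having the HIP SDK and the ZLUDA folder on `PATH`. A rough sketch of what that looks like in a cmd session (both paths are assumptions for a default HIP SDK 5.7 install and an arbitrary ZLUDA unzip location; adjust to your machine):

```shell
rem Sketch of the environment variables the ZLUDA install guides
rem set on Windows. HIP_PATH is normally created by the HIP SDK
rem installer; the ZLUDA directory is wherever you unzipped it.
set HIP_PATH=C:\Program Files\AMD\ROCm\5.7\
set PATH=C:\ZLUDA;%HIP_PATH%bin;%PATH%
```

If these are set system-wide rather than per-session, a reboot (or at least a fresh terminal) is needed before the webui sees them.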
-
Heads up @lshqqytiger: it seems the Radeon 25.3.1 driver has problems with ZLUDA. I'm seeing issues on PatientX's ZLUDA ComfyUI, as well as your A1111 repo, when trying to generate images that worked on previous AMD driver versions. I'm planning to submit a bug report, but I need to test a few things first!
-
https://github.com/vosen/ZLUDA
Might we finally be able to dump DML?