Error occurred when executing SUPIR_sample #134

Open
huayuangaung opened this issue May 31, 2024 · 2 comments

@huayuangaung

Error occurred when executing SUPIR_sample:

No operator found for memory_efficient_attention_forward with inputs:
query : shape=(40, 1024, 1, 64) (torch.float16)
key : shape=(40, 1024, 1, 64) (torch.float16)
value : shape=(40, 1024, 1, 64) (torch.float16)
attn_bias :
p : 0.0
decoderF is not supported because:
xFormers wasn't build with CUDA support
attn_bias type is
operator wasn't built - see python -m xformers.info for more info
[email protected] is not supported because:
xFormers wasn't build with CUDA support
operator wasn't built - see python -m xformers.info for more info
tritonflashattF is not supported because:
xFormers wasn't build with CUDA support
operator wasn't built - see python -m xformers.info for more info
triton is not available
cutlassF is not supported because:
xFormers wasn't build with CUDA support
operator wasn't built - see python -m xformers.info for more info
smallkF is not supported because:
max(query.shape[-1] != value.shape[-1]) > 32
xFormers wasn't build with CUDA support
dtype=torch.float16 (supported: {torch.float32})
operator wasn't built - see python -m xformers.info for more info
unsupported embed per head: 64

File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-SUPIR\nodes_v2.py", line 494, in sample
_samples = self.sampler(denoiser, noised_z, cond=positive[i], uc=negative[i], x_center=sample.unsqueeze(0), control_scale=control_scale_end,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-SUPIR\sgm\modules\diffusionmodules\sampling.py", line 441, in call
x = self.sampler_step(
^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-SUPIR\sgm\modules\diffusionmodules\sampling.py", line 418, in sampler_step
denoised = self.denoise(x, denoiser, sigma_hat, cond, uc, control_scale=control_scale)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-SUPIR\sgm\modules\diffusionmodules\sampling.py", line 400, in denoise
denoised = denoiser(*self.guider.prepare_inputs(x, sigma, cond, uc), control_scale)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-SUPIR\nodes_v2.py", line 468, in
denoiser = lambda input, sigma, c, control_scale: SUPIR_model.denoiser(SUPIR_model.model, input, sigma, c, control_scale)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-SUPIR\sgm\modules\diffusionmodules\denoiser.py", line 73, in call
return network(input * c_in, c_noise, cond, control_scale) * c_out + input * c_skip
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-SUPIR\sgm\modules\diffusionmodules\wrappers.py", line 96, in forward
out = self.diffusion_model(
^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-SUPIR\SUPIR\modules\SUPIR_v0.py", line 654, in forward
h = self.project_modules[adapter_idx](control[control_idx], h, control_scale=control_scale)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-SUPIR\SUPIR\modules\SUPIR_v0.py", line 147, in forward
x = self.attn(x, context)
^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1541, in call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-SUPIR\sgm\modules\attention.py", line 365, in forward
out = xformers.ops.memory_efficient_attention(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\xformers\ops\fmha_init
.py", line 223, in memory_efficient_attention
return memory_efficient_attention(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\xformers\ops\fmha_init
.py", line 321, in _memory_efficient_attention
return memory_efficient_attention_forward(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\xformers\ops\fmha_init
.py", line 337, in _memory_efficient_attention_forward
op = _dispatch_fw(inp, False)
^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\xformers\ops\fmha\dispatch.py", line 120, in _dispatch_fw
return _run_priority_list(
^^^^^^^^^^^^^^^^^^^
File "E:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\xformers\ops\fmha\dispatch.py", line 63, in _run_priority_list
raise NotImplementedError(msg)
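
Every operator the dispatcher tries is rejected for the same reason ("xFormers wasn't build with CUDA support"), i.e. the installed xFormers wheel has no CUDA kernels. Below is a minimal diagnostic sketch, assuming it is run with the portable build's own interpreter (`python_embeded\python.exe`) on a machine with a CUDA GPU; it prints the versions and reproduces the failing call with the exact shapes from the report:

```python
# Diagnostic sketch (assumption: run with ComfyUI portable's embedded interpreter,
# e.g. E:\ComfyUI\ComfyUI_windows_portable\python_embeded\python.exe,
# so it sees the same torch/xformers packages as ComfyUI).
import torch
import xformers
import xformers.ops as xops

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("xformers:", xformers.__version__)  # `python -m xformers.info` prints the full operator/build list

# Reproduce the failing call with the shapes from the error: (40, 1024, 1, 64) in fp16.
q = torch.randn(40, 1024, 1, 64, dtype=torch.float16, device="cuda")
out = xops.memory_efficient_attention(q, q, q)  # raises NotImplementedError if no CUDA kernels were built
print("memory_efficient_attention OK:", out.shape)
```

If `python -m xformers.info` lists the forward operators (cutlassF, flshattF, etc.) as unavailable, the installed wheel was built without CUDA support and needs to be replaced with one that matches the installed torch and CUDA version, for example via `python_embeded\python.exe -m pip install -U xformers` (note that this may also pull in the torch version that wheel was built against).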

@xiao772677

I ran into this problem too. It looks like xFormers doesn't support CUDA. Do we need to upgrade it?

@huayuangaung
Author

> I ran into this problem too. It looks like xFormers doesn't support CUDA. Do we need to upgrade it?

I don't really know; I haven't solved this problem yet.
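
For reference, the failing call in sgm/modules/attention.py is an ordinary scaled dot-product attention, and PyTorch's built-in `torch.nn.functional.scaled_dot_product_attention` computes the same thing without xFormers. The sketch below only illustrates that equivalence (it is not a patch shipped by ComfyUI-SUPIR); the one thing to watch is the layout, since xFormers takes (batch, seq, heads, head_dim) while torch's SDPA takes (batch, heads, seq, head_dim):

```python
# Hedged sketch: torch's built-in SDPA as a stand-in for
# xformers.ops.memory_efficient_attention. The transposes convert between the
# xFormers layout (batch, seq, heads, head_dim) and the torch SDPA layout
# (batch, heads, seq, head_dim).
import torch
import torch.nn.functional as F

def sdpa_fallback(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    q, k, v = (t.transpose(1, 2) for t in (q, k, v))
    return F.scaled_dot_product_attention(q, k, v).transpose(1, 2)

# Same shapes as in the error report above.
q = torch.randn(40, 1024, 1, 64)
print(sdpa_fallback(q, q, q).shape)  # torch.Size([40, 1024, 1, 64])
```

That said, the actual fix is still installing an xFormers build with CUDA kernels that matches the torch version in python_embeded.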
