FizzleDorf/AIT

Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!

CyberTimon opened this issue · 9 comments

Hello

When I use AIT with ComfyUI (both on the latest branch), I get this error.
I start ComfyUI with this command: python3 main.py --listen --port 21128 --gpu-only, and I'm on Ubuntu 22.04.

Please help! :)

Error occurred when executing KSampler:

Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu! (when checking argument for argument mat1 in method wrapper_CUDA_addmm)

File "/home/cybertimon/Repositories/ComfyUI/execution.py", line 152, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "/home/cybertimon/Repositories/ComfyUI/execution.py", line 82, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "/home/cybertimon/Repositories/ComfyUI/execution.py", line 75, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "/home/cybertimon/Repositories/ComfyUI/nodes.py", line 1236, in sample
return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
File "/home/cybertimon/Repositories/ComfyUI/custom_nodes/AIT/AITemplate/AITemplate.py", line 176, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
File "/home/cybertimon/Repositories/ComfyUI/custom_nodes/AIT/AITemplate/AITemplate.py", line 310, in sample
samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "/home/cybertimon/Repositories/ComfyUI/comfy/samplers.py", line 785, in sample
return sample(self.model, noise, positive, negative, cfg, self.device, sampler(), sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "/home/cybertimon/Repositories/ComfyUI/comfy/samplers.py", line 690, in sample
samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
File "/home/cybertimon/Repositories/ComfyUI/comfy/samplers.py", line 630, in sample
samples = getattr(k_diffusion_sampling, "sample_{}".format(sampler_name))(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **extra_options)
File "/home/cybertimon/.local/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/home/cybertimon/Repositories/ComfyUI/comfy/k_diffusion/sampling.py", line 137, in sample_euler
denoised = model(x, sigma_hat * s_in, **extra_args)
File "/home/cybertimon/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "/home/cybertimon/Repositories/ComfyUI/comfy/samplers.py", line 323, in forward
out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, cond_concat=cond_concat, model_options=model_options, seed=seed)
File "/home/cybertimon/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "/home/cybertimon/Repositories/ComfyUI/comfy/k_diffusion/external.py", line 125, in forward
eps = self.get_eps(input * c_in, self.sigma_to_t(sigma), **kwargs)
File "/home/cybertimon/Repositories/ComfyUI/comfy/k_diffusion/external.py", line 151, in get_eps
return self.inner_model.apply_model(*args, **kwargs)
File "/home/cybertimon/Repositories/ComfyUI/comfy/samplers.py", line 311, in apply_model
out = sampling_function(self.inner_model.apply_model, x, timestep, uncond, cond, cond_scale, cond_concat, model_options=model_options, seed=seed)
File "/home/cybertimon/Repositories/ComfyUI/comfy/samplers.py", line 289, in sampling_function
cond, uncond = calc_cond_uncond_batch(model_function, cond, uncond, x, timestep, max_total_area, cond_concat, model_options)
File "/home/cybertimon/Repositories/ComfyUI/comfy/samplers.py", line 265, in calc_cond_uncond_batch
output = model_function(input_x, timestep_, **c).chunk(batch_chunks)
File "/home/cybertimon/Repositories/ComfyUI/comfy/model_base.py", line 63, in apply_model
return self.diffusion_model(xc, t, context=context, y=c_adm, control=control, transformer_options=transformer_options).float()
File "/home/cybertimon/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "/home/cybertimon/Repositories/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 622, in forward
emb = emb + self.label_emb(y)
File "/home/cybertimon/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "/home/cybertimon/.local/lib/python3.10/site-packages/torch/nn/modules/container.py", line 217, in forward
input = module(input)
File "/home/cybertimon/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "/home/cybertimon/.local/lib/python3.10/site-packages/torch/nn/modules/container.py", line 217, in forward
input = module(input)
File "/home/cybertimon/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "/home/cybertimon/Repositories/ComfyUI/comfy/ops.py", line 18, in forward
return torch.nn.functional.linear(input, self.weight, self.bias)
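
The last few frames narrow the problem down: the error is raised from the label_emb call (emb = emb + self.label_emb(y) in openaimodel.py), where one side of the final linear operation is on cuda:0 and the other is still on the CPU; with --gpu-only the likely culprit is a conditioning tensor left behind on the CPU. A minimal sketch of that failure mode (arbitrary layer sizes, not taken from the AIT code):

```python
import torch

# A Linear layer on the GPU, standing in for the UNet's label_emb module.
label_emb = torch.nn.Linear(2816, 1280).to("cuda")

# A conditioning tensor that was (incorrectly) left on the CPU.
y = torch.randn(2, 2816)

try:
    label_emb(y)  # dispatches to wrapper_CUDA_addmm and fails the device check
except RuntimeError as e:
    print(e)  # Expected all tensors to be on the same device ... cuda:0 and cpu!

# Keeping the conditioning on the same device as the weights avoids the error.
print(label_emb(y.to("cuda")).shape)  # torch.Size([2, 1280])
```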
asagi4 commented

If you want, you can try my branch: https://github.com/asagi4/AIT/tree/hacks

The fixes in it aren't ready for publication, but the branch restores compatibility with at least my installation of ComfyUI. It's still very much a work in progress.
I also have some local patches in ComfyUI that deal with tensor type issues, so that version of AIT might not be enough on its own.

asagi4 commented

EDIT: removed the ComfyUI patch

I updated the hacks branch with a better fix. AIT no longer forces tensors on the CPU, which makes things work properly.
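
For reference, the general pattern behind that kind of fix is to stop hard-coding torch.device("cpu") when preparing conditioning and embedding tensors, and instead keep them on whatever device the loaded model is using. A rough illustration of the idea (hypothetical helper, not the actual diff from the hacks branch):

```python
import torch

def to_model_device(cond: dict, device: torch.device) -> dict:
    """Move every tensor in a conditioning dict onto the model's device."""
    return {k: (v.to(device) if isinstance(v, torch.Tensor) else v)
            for k, v in cond.items()}

# Instead of forcing everything onto the CPU:
#     cond = {k: v.to("cpu") for k, v in cond.items()}
# keep the conditioning on the same device as the UNet weights:
#     cond = to_model_device(cond, next(unet.parameters()).device)
```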

Thank you very much. Will try it tomorrow!

@asagi4 Thank you very much for your fix. Everything works, including ControlNet! :)

@asagi4 I caught this kind of late. Thanks for putting this together. You can open the PR early if you like for some more visibility. If you need help with anything, or want me to roll back some recent changes so the merge goes more gracefully, let me know.

asagi4 commented

@FizzleDorf I'll make a PR once I have time to put some thought into it. There's probably more refactoring and cleanup that can be done with the recent updates to ComfyUI, and I'd rather not leave too much useless code lying around.

asagi4 commented

Looking at https://github.com/gameltb/AIT/, it seems possible to refactor AIT to get rid of the sampler override and a large chunk of the other code in the repository, which would simplify things greatly; the catch is that that fork appears to completely remove downloading of precompiled modules.

Rebasing AIT on top of that work and adding the module downloading back in is what I'd like to see happen; it would remove a lot of confusing code that's really difficult to keep compatible with base ComfyUI.

asagi4 commented

I don't know if I'll have the motivation to do that, though; refactoring Python code is painful.

Yeah, it would be cool if you did that - if you have time and motivation, of course. It's quite sad to see how little attention AIT gets.