Traceback (most recent call last):
  File "/home/anaconda3/envs/mgm/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/anaconda3/envs/mgm/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/Projects/miniGemini/MGM/mgm/serve/cli.py", line 239, in <module>
    main(args)
  File "/home/Projects/miniGemini/MGM/mgm/serve/cli.py", line 215, in main
    output_img = pipe(prompt, negative_prompt=common_neg_prompt).images[0]
  File "/home/anaconda3/envs/mgm/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/anaconda3/envs/mgm/lib/python3.10/site-packages/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl.py", line 1138, in __call__
    ) = self.encode_prompt(
  File "/home/anaconda3/envs/mgm/lib/python3.10/site-packages/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl.py", line 406, in encode_prompt
    prompt = self.maybe_convert_prompt(prompt, tokenizer)
  File "/home/anaconda3/envs/mgm/lib/python3.10/site-packages/diffusers/loaders/textual_inversion.py", line 137, in maybe_convert_prompt
    prompts = [self._maybe_convert_prompt(p, tokenizer) for p in prompts]
  File "/home/anaconda3/envs/mgm/lib/python3.10/site-packages/diffusers/loaders/textual_inversion.py", line 137, in <listcomp>
    prompts = [self._maybe_convert_prompt(p, tokenizer) for p in prompts]
  File "/home/anaconda3/envs/mgm/lib/python3.10/site-packages/diffusers/loaders/textual_inversion.py", line 161, in _maybe_convert_prompt
    tokens = tokenizer.tokenize(prompt)
AttributeError: 'NoneType' object has no attribute 'tokenize'
I'm running torch==2.0.1 and diffusers==0.26.3, but my CUDA is 12.2, which doesn't match torch==2.0.1+cu117. Could that mismatch be the cause?
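For reference, a quick way to tell whether the CUDA mismatch matters here, or whether the pipeline simply loaded without a tokenizer (which is what the traceback actually points at), is a small check like the sketch below. The model id is a placeholder, not necessarily the checkpoint MGM loads in cli.py:

```python
# Diagnostic sketch: print the relevant versions and check whether the SDXL
# pipeline's tokenizers were actually loaded from the checkpoint folder.
import torch
import diffusers
from diffusers import StableDiffusionXLPipeline

print(torch.__version__, torch.version.cuda)  # torch's build CUDA (e.g. 11.7), not the driver's 12.2
print(diffusers.__version__)

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",  # placeholder: substitute the path MGM points at
    torch_dtype=torch.float16,
)
# If either of these prints None, the checkpoint is missing its tokenizer
# subfolders, and encode_prompt() fails with exactly the AttributeError above.
print(pipe.tokenizer, pipe.tokenizer_2)
```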
I followed the command in the diffusers README ("pip install --upgrade diffusers[torch]") to make sure my versions match, but it didn't help. I found similar questions on https://github.com/huggingface/diffusers/issues, but no good solutions. In the end I replaced StableDiffusionXLPipeline with StableDiffusionPipeline, and it works now.
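For anyone hitting the same error, the workaround amounts to something like the sketch below in mgm/serve/cli.py. This is a rough illustration: the checkpoint name, dtype, and prompt variables are placeholders, and MGM's actual config may differ:

```python
import torch
from diffusers import StableDiffusionPipeline  # instead of StableDiffusionXLPipeline

# Load a non-XL Stable Diffusion checkpoint so encode_prompt() never touches
# the missing SDXL tokenizer. The model id here is just an example.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# In cli.py these come from the model's generation output; placeholders here.
prompt = "a photo of a cat sitting on a windowsill"
common_neg_prompt = "blurry, low quality"

output_img = pipe(prompt, negative_prompt=common_neg_prompt).images[0]
output_img.save("output.png")
```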