No main folder found #1

Open
PurpleHycinth opened this issue Nov 18, 2023 · 2 comments

@PurpleHycinth

Entry Not Found for url: https://huggingface.co/meta-llama/Llama-2-7b-chat/resolve/main/config.json.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\HP\Dropbox\BDS 5th sem\legal chatbot\Law-GPT\app.py", line 4, in <module>
    chain = qa_pipeline()
  File "C:\Users\HP\Dropbox\BDS 5th sem\legal chatbot\Law-GPT\utils.py", line 95, in qa_pipeline
    llm = load_llm()
  File "C:\Users\HP\Dropbox\BDS 5th sem\legal chatbot\Law-GPT\utils.py", line 42, in load_llm
    model = AutoModelForCausalLM.from_pretrained(
  File "C:\Users\HP\Dropbox\BDS 5th sem\legal chatbot\newvenv\lib\site-packages\transformers\models\auto\auto_factory.py", line 527, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "C:\Users\HP\Dropbox\BDS 5th sem\legal chatbot\newvenv\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1023, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\HP\Dropbox\BDS 5th sem\legal chatbot\newvenv\lib\site-packages\transformers\configuration_utils.py", line 620, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\HP\Dropbox\BDS 5th sem\legal chatbot\newvenv\lib\site-packages\transformers\configuration_utils.py", line 675, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\HP\Dropbox\BDS 5th sem\legal chatbot\newvenv\lib\site-packages\transformers\utils\hub.py", line 480, in cached_file
    raise EnvironmentError(
OSError: meta-llama/Llama-2-7b-chat does not appear to have a file named config.json. Checkout 'https://huggingface.co/meta-llama/Llama-2-7b-chat/main' for available files.

I'm getting the above error. I'm working in VS Code with a virtual environment on Windows 11.
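
A likely cause (an assumption, not confirmed in this thread): the `meta-llama/Llama-2-7b-chat` repo only hosts the original PyTorch checkpoint and has no `config.json`, while `meta-llama/Llama-2-7b-chat-hf` ships the Transformers-format files. A minimal sketch of loading the `-hf` repo instead, assuming access to the gated model has been granted and you are logged in via `huggingface-cli login`:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical fix: point load_llm() at the Transformers-format repo
# (note the "-hf" suffix) instead of the raw-checkpoint repo.
model_id = "meta-llama/Llama-2-7b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
```

If the error persists, the file listing linked in the error message should confirm whether a `config.json` is actually present in the repo being loaded.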

@suryanshgupta9933
Owner

suryanshgupta9933 commented Jan 30, 2024

This repo is actually outdated. Do you want me to update it and add support for more LLMs, or integrate new features?

@PurpleHycinth
Author

PurpleHycinth commented Jan 30, 2024 via email
