
Latest models of Together AI - Llama-3-70b-chat-hf #1289

Open
MorphSeur opened this issue Jun 3, 2024 · 2 comments

@MorphSeur

Is your feature request related to a problem? Please describe.

Hello!

The issue concerns the use of Together AI models such as CodeLlama-34b and Llama-3-70b-chat-hf.
Although CodeLlama-34b is listed in the LiteLLM documentation, I get the following error:

NotFoundError: Together_aiException - Error code: 404 - {'error': {'message': 'Unable to access model togethercomputer/CodeLlama-34b. Please visit https://api.together.xyz to see the list of supported models or contact the owner to request access.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

Here is the corresponding error for Llama-3-70b-chat-hf:

NotFoundError: Together_aiException - Error code: 404 - {'error': {'message': 'Unable to access model togethercomputer/Llama-3-70b-chat-hf. Please visit https://api.together.xyz to see the list of supported models or contact the owner to request access.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

My question: is Llama-3-70b-chat-hf supported in Open Interpreter?
If so, could you please provide the function call needed to use it?

Describe the solution you'd like

Integrate the latest Llama models from Together AI into Open Interpreter.

Describe alternatives you've considered

No response

Additional context

No response


nanowell commented Jun 19, 2024

I think this can be handled by defining a custom API endpoint URL together with an API key.
The Together API endpoint is OpenAI-compatible, so it should not cause any problems.
The same goes for the Groq API.
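For example, something along these lines (an untested sketch; it assumes Open Interpreter exposes the `interpreter.llm.api_base` / `api_key` / `model` settings and that Together currently lists the model as `meta-llama/Llama-3-70b-chat-hf`; verify the ID against https://api.together.xyz):

```python
# Untested sketch: point Open Interpreter at Together's OpenAI-compatible endpoint.
import os
from interpreter import interpreter

interpreter.llm.api_base = "https://api.together.xyz/v1"  # OpenAI-compatible endpoint
interpreter.llm.api_key = os.environ["TOGETHER_API_KEY"]  # your Together AI key
# The "openai/" prefix tells the underlying LiteLLM client to speak the OpenAI
# protocol; the rest is Together's model ID (meta-llama/, not togethercomputer/).
interpreter.llm.model = "openai/meta-llama/Llama-3-70b-chat-hf"

interpreter.chat("Write a one-line Python hello world.")
```

For Groq, the same pattern should apply with `api_base = "https://api.groq.com/openai/v1"` and a Groq API key.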

@MorphSeur (Author)

Hello,

Thanks for your reply!

Could you please provide a Python code snippet so I can try what you have proposed?

The problem is that even when following the current documentation, I get the following error for CodeLlama-34b, which is listed in the LiteLLM documentation:

NotFoundError: Together_aiException - Error code: 404 - {'error': {'message': 'Unable to access model togethercomputer/CodeLlama-34b. Please visit https://api.together.xyz to see the list of supported models or contact the owner to request access.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
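
The 404 above suggests the model ID itself may be stale: Together appears to list this family under the `meta-llama/` namespace rather than `togethercomputer/`. A minimal LiteLLM-only sketch for testing access outside Open Interpreter (untested; assumes LiteLLM's `together_ai/` provider prefix and that model ID):

```python
# Untested sketch: call Together directly through LiteLLM, bypassing Open Interpreter,
# to check whether the API key can reach the model at all.
import os
import litellm

response = litellm.completion(
    model="together_ai/meta-llama/Llama-3-70b-chat-hf",  # assumed current model ID
    messages=[{"role": "user", "content": "Hello!"}],
    api_key=os.environ["TOGETHER_API_KEY"],  # pass the key explicitly
)
print(response.choices[0].message.content)
```

If this call also returns a 404, the account likely lacks access to the model; if it succeeds, the problem is the `togethercomputer/` prefix used above.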
