
Add a parameter to use LLM as a Judge Descriptor with privately hosted endpoint #1341

Open
syadavlinklaters opened this issue Oct 10, 2024 · 2 comments
Labels
enhancement New feature or request

Comments

@syadavlinklaters

Description

Hi team,

I am exploring Evidently AI for LLM evaluation and came across the custom LLM-as-a-Judge Descriptor, which I am particularly interested in. The current API only supports OpenAI models via the openai or litellm wrappers. Could you please add a parameter that lets us pass the endpoint of our privately hosted LLMs?
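For context, here is a minimal sketch of the kind of call such a parameter would enable, assuming the privately hosted endpoint speaks the OpenAI-compatible `/v1/chat/completions` protocol. All function and parameter names below are illustrative, not part of the Evidently API:

```python
import json
import urllib.request

def build_judge_request(base_url, model, judge_prompt, text):
    """Build an OpenAI-style chat-completions request for a judge call.

    `base_url` would point at the privately hosted endpoint, e.g. an
    internal vLLM or Azure deployment (hypothetical example URL).
    """
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": judge_prompt},
            {"role": "user", "content": text},
        ],
    }
    url = base_url.rstrip("/") + "/v1/chat/completions"
    return url, json.dumps(payload).encode("utf-8")

def call_judge(base_url, model, judge_prompt, text, api_key=""):
    """Send the judge request to the private endpoint and return the verdict text."""
    url, body = build_judge_request(base_url, model, judge_prompt, text)
    req = urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

Since litellm itself already accepts an `api_base` argument for OpenAI-compatible endpoints, exposing that through the descriptor could be one possible route.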

I have attached the API screenshot for your reference.

Thanks and regards,
Satendra

@elenasamuylova
Collaborator

Hi @syadavlinklaters! This functionality is not yet available, but it's definitely on the roadmap.

@elenasamuylova elenasamuylova added the enhancement New feature or request label Oct 11, 2024