This repository has been archived by the owner on Nov 13, 2024. It is now read-only.

[Bug] I cant find out how to configure a new LLM #349

Open
2 tasks done
augmentedstartups opened this issue Oct 8, 2024 · 0 comments
Labels
bug Something isn't working

Comments

@augmentedstartups

Is this a new bug?

  • I believe this is a new bug
  • I have searched the existing issues, and I could not find an existing issue for this bug

Current Behavior

Right now it's using the default LLM. I want to switch between the GPT-4o and GPT-4o mini models, and also test some open-source alternatives.

I need a way to configure this.
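Roughly what I'm hoping to do (a minimal sketch only; the class and parameter names below are my guesses, since I couldn't find this documented anywhere):

```python
# Sketch of swapping the chat LLM in Canopy.
# Assumptions (not verified against the Canopy docs): OpenAILLM takes the
# OpenAI model id via `model_name`, and ChatEngine accepts an `llm` argument.
from canopy.knowledge_base import KnowledgeBase
from canopy.context_engine import ContextEngine
from canopy.chat_engine import ChatEngine
from canopy.llm import OpenAILLM

kb = KnowledgeBase(index_name="my-canopy-index")  # hypothetical index name
kb.connect()

llm = OpenAILLM(model_name="gpt-4o-mini")  # swap to "gpt-4o" or an OSS model here
chat_engine = ChatEngine(context_engine=ContextEngine(knowledge_base=kb), llm=llm)
```

If the intended way is instead a YAML config file passed to `canopy start --config`, a pointer to the right config key for the model name would also solve this.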

Expected Behavior

0

Steps To Reproduce

0

Relevant log output

s

Environment

- **OS**:
- **Language version**:
- **Canopy version**:
s

Additional Context

s

@augmentedstartups augmentedstartups added the bug Something isn't working label Oct 8, 2024