This repository was archived by the owner on Mar 13, 2025. It is now read-only.

Serve a new model without restarting RayLLM #130

Closed
k6l3 opened this issue Feb 2, 2024 · 1 comment

Comments

k6l3 commented Feb 2, 2024

Is it possible to start serving a new model when an event happens, without restarting RayLLM?
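One avenue for this (not confirmed by the RayLLM maintainers in this thread) is Ray Serve's declarative deployment flow: `serve deploy` takes a config file describing all running applications, and re-deploying an updated config adds or updates applications in place while leaving unchanged ones running. A minimal sketch, assuming models are exposed as separate Serve applications; the application name, route prefix, import path, and model file below are hypothetical:

```yaml
# Hypothetical multi-application Ray Serve config.
# The import_path and model YAML path are illustrative only,
# not RayLLM's actual module layout.
applications:
  - name: llama-2-7b
    route_prefix: /llama-2-7b
    import_path: rayllm.backend:router_application   # hypothetical
    args:
      models:
        - ./models/llama-2-7b.yaml                   # hypothetical
```

When the triggering event fires, a controller process could append a new entry under `applications` and re-run `serve deploy config.yaml`; Ray Serve applies the diff, so applications whose config is unchanged are not restarted.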

samarth-contextual commented

Bumping this.

kouroshHakha closed this as not planned on Mar 13, 2025

3 participants