Replies: 2 comments
-
Try this: line 130 in 82369c5. Also, what kind of performance are you getting on the MI100s?
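A minimal sketch of what changing that line likely amounts to, assuming (as with most Python web servers) that the host string passed at startup decides which interfaces the server listens on. The `make_server` helper is hypothetical, not part of exllama:

```python
# Sketch: how the bind address controls LAN visibility.
# Assumption: the server accepts a (host, port) pair like Python's stdlib servers.
from http.server import HTTPServer, SimpleHTTPRequestHandler

def make_server(host: str, port: int = 0) -> HTTPServer:
    """Bind an HTTP server; port 0 lets the OS pick a free port."""
    return HTTPServer((host, port), SimpleHTTPRequestHandler)

# "127.0.0.1" is reachable only from the machine itself;
# "0.0.0.0" listens on all interfaces, so other hosts on the LAN can connect.
local_only = make_server("127.0.0.1")
lan_visible = make_server("0.0.0.0")
print(local_only.server_address[0])   # 127.0.0.1
print(lan_visible.server_address[0])  # 0.0.0.0
local_only.server_close()
lan_visible.server_close()
```

In other words, editing the referenced line to bind to `0.0.0.0` (or the machine's LAN IP) should make the server reachable from your desktop PC, much like Oobabooga's `--listen` flag.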
-
Yep, it defaults to hosting the server on …
-
I run LLMs via a server, and I am testing exllama on Ubuntu 22.04 on a dual-Xeon server with 2 AMD MI100s. Installing exllama was very simple, and it works great from the console, but I'd like to use it from my desktop PC. Is there an option like Oobabooga's "--listen" to allow it to be accessed over the local network?
Thanks!