Browser LLMs #444
louis030195
started this conversation in
Ideas & Feedback
-
I don't think this is something we're particularly interested in supporting with the SDK any time soon. That said, the field is rapidly evolving, so I don't want to shut anything down.
-
Hey, have you considered client-side LLMs? The 'ai' library seems very server-oriented. I've been waiting a while for this to arrive, and it seems some people are running Llama 2 in the browser now.
I assume Llama 2 will never run on the edge due to its size?
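For rough intuition on the size question: Llama 2 7B is roughly 13 GB in fp16 and on the order of 4 GB at 4-bit quantization, while edge function runtimes typically cap memory in the low hundreds of megabytes. A minimal TypeScript sketch of that back-of-the-envelope comparison (the `feasibleRuntimes` helper and its thresholds are illustrative assumptions, not limits documented by any particular runtime):

```typescript
// Hypothetical helper: guess where a model of a given weight size could
// plausibly run. Thresholds are rough illustrative assumptions.
type Runtime = "server" | "browser" | "edge";

function feasibleRuntimes(modelSizeMB: number): Runtime[] {
  const runtimes: Runtime[] = ["server"]; // servers can hold large weights
  // A 4-bit 7B model (~3.9 GB) can fit in some browsers via WebGPU/WASM.
  if (modelSizeMB <= 4096) runtimes.push("browser");
  // Edge functions commonly cap memory around ~128 MB, far below LLM weights.
  if (modelSizeMB <= 128) runtimes.push("edge");
  return runtimes;
}

console.log(feasibleRuntimes(3900)); // roughly a 4-bit Llama 2 7B
console.log(feasibleRuntimes(50));   // a small distilled model
```

So under these assumptions a quantized 7B model is browser-plausible but far too large for a typical edge function, which matches the intuition in the question.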