
Add llama-cpp-python #686

Closed
wants to merge 1 commit into from
Conversation


@acon96 acon96 commented Mar 18, 2024

Adds wheels for llama-cpp-python to allow running Llama AI models as part of custom integrations.

@home-assistant home-assistant bot left a comment

Hi @acon96

It seems you haven't yet signed a CLA. Please do so here.

Once you do that, we will be able to review and accept this pull request.

Thanks!

@home-assistant

Please take a look at the requested changes, and use the Ready for review button when you are done, thanks 👍

Learn more about our pull request process.

@home-assistant home-assistant bot marked this pull request as draft March 18, 2024 21:50
@acon96 acon96 marked this pull request as ready for review March 18, 2024 21:52
@@ -5,3 +5,4 @@ Shapely
smbus_cffi
spidev
aiortc
llama-cpp-python
Member


It seems that upstream already publishes musllinux wheels; why does it need to be added here as well?

Author


Can you provide a link to the upstream musl wheels? PyPI still only provides a source tarball and no binaries at all.

Member


Hmm, it seems I was mistaken; they are indeed not on PyPI. Maybe I looked at the wrong package.

Have you contacted the author to ask if they are willing to publish wheels?

Author


Looks like they're still working on it and don't plan to publish them to PyPI. I'm not sure if that is compatible with how Home Assistant installs packages, so I'll figure something else out. Thanks.

@home-assistant home-assistant bot marked this pull request as draft March 19, 2024 07:59

acon96 commented Mar 19, 2024

Hmm, that definitely didn't exist a month ago. Thanks for the heads up.

@acon96 acon96 closed this Mar 19, 2024

abetlen commented Apr 4, 2024

Hey @acon96 @frenck, with abetlen/llama-cpp-python#1247 now closed, it's possible to install pre-built binary llama-cpp-python wheels using --extra-index-url, similar to a PyTorch installation. This method is compatible with requirements.txt files, e.g.:

# requirements.txt
--extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu
llama-cpp-python

This pulls in the basic CPU wheels for llama-cpp-python and lets you install it without a compiler toolchain.
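For a one-off install outside a requirements file, the equivalent direct command would be something like the following (a sketch based on the index URL quoted above; whether a matching wheel exists depends on your platform and Python version, and pip falls back to the source build if none is found):

```shell
# Prefer the pre-built CPU wheel from the extra index over compiling from source.
pip install llama-cpp-python \
  --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu
```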
