I have read the paper PowerInfer-2: Fast Large Language Model Inference on a Smartphone. Will the related code be open-sourced?
By the way, the core innovation of the work is using the heterogeneous compute system on a mobile phone to run a large model. Are there any tutorials on how to use the NPU/GPU of the Snapdragon 8 Gen 3?
I found this: https://x.com/hodlenx/status/1800788808272937297. They said they were working on open-sourcing it.

Edit: after reading the PowerInfer paper, I believe the fastest way to open-source it would be to create a separate repository, since from what I can see, PowerInfer-2 appears to be a big departure from the original PowerInfer.