Replies: 7 comments 2 replies
- Just found the way to run it on Intel: `import whisper` …
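
  The rest of that snippet did not survive formatting. Below is a minimal sketch of how such a recipe usually looks, assuming an XPU build of intel-extension-for-pytorch is installed; the audio file name is a placeholder, and depending on the Whisper version the patches and the Intel GPU support PR mentioned later in this thread (#1362) may also be needed:

  ```python
  import whisper
  import intel_extension_for_pytorch as ipex  # importing this registers the "xpu" device with PyTorch

  # Stock Whisper only auto-selects CUDA or CPU, so load on the CPU first
  # and move the model to the Arc GPU ("xpu") explicitly.
  model = whisper.load_model("medium", device="cpu")
  model = model.to("xpu")

  result = model.transcribe("audio.mp3")  # "audio.mp3" is a placeholder
  print(result["text"])
  ```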
- The large model works on an Arc A770 16GB with the latest xpu-master branch of intel-extension-for-pytorch. Some patching is required; see intel/intel-extension-for-pytorch#302.
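
  As a quick sanity check that the XPU stack is actually usable from PyTorch, something like the following can help. The `torch.xpu` calls are only present with the XPU builds of intel-extension-for-pytorch, so treat their availability as an assumption about your install:

  ```python
  import torch
  import intel_extension_for_pytorch as ipex  # noqa: F401  (import registers torch.xpu)

  print(torch.xpu.is_available())      # expect True on a working Arc setup
  print(torch.xpu.device_count())      # number of visible XPU devices
  print(torch.xpu.get_device_name(0))  # the A770 should show up here
  ```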
- Memory usage on an Intel Arc A770, measured with lsgpu:
  - baseline usage
  - tiny.en
  - tiny
  - medium
  - large
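
  The measured numbers did not survive formatting. As a complement to lsgpu, the per-model footprint can also be read from inside PyTorch; a rough sketch, assuming the same XPU setup as above (`torch.xpu.memory_allocated` and `torch.xpu.empty_cache` are assumed to mirror their `torch.cuda` counterparts in the IPEX XPU builds, which is worth verifying against your version):

  ```python
  import torch
  import whisper
  import intel_extension_for_pytorch as ipex  # noqa: F401  (import registers torch.xpu)

  # Load each model, move it to the Arc GPU, and report the allocated XPU memory.
  for name in ["tiny.en", "tiny", "medium", "large"]:
      model = whisper.load_model(name, device="cpu").to("xpu")
      used_mib = torch.xpu.memory_allocated() / 1024**2
      print(f"{name}: ~{used_mib:.0f} MiB allocated")
      del model
      torch.xpu.empty_cache()
  ```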
- How is the performance?
- PR for Intel GPU support: #1362
- The A770 16GB is basically the same price as a 3060 12GB. For the purposes of this topic I'll use the latter, since the large model works well on it with far less setup headache.
- Does this mean that Whisper currently does NOT work with the Arc A770 as-is? I am using Arch Linux with an A770. I installed a local Python 3.11.9 with the packages below, and I could run ComfyUI and generate images with that Python. However, when I ran Whisper with the same Python, it seemed to be using the CPU.
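
  One likely explanation, offered as a guess rather than a confirmed diagnosis: stock `whisper.load_model` only auto-selects CUDA or CPU, so on an Arc card it silently falls back to the CPU unless the model is moved to the XPU device by hand. A quick check, assuming the XPU build of intel-extension-for-pytorch is installed:

  ```python
  import torch
  import whisper
  import intel_extension_for_pytorch as ipex  # noqa: F401  (import registers torch.xpu)

  print(torch.cuda.is_available())  # False on Arc, which is why Whisper falls back to CPU
  print(torch.xpu.is_available())   # should be True if the XPU stack is working

  model = whisper.load_model("small", device="cpu").to("xpu")
  print(next(model.parameters()).device)  # should report an xpu device, not cpu
  ```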
- Has anyone got Whisper accelerated on an Intel Arc GPU? I'm looking at ways to build several smaller, affordable, dedicated Whisper workstations. Could this be more cost-effective than buying one expensive, fast Nvidia 4000-series system?
  re:
  https://medium.com/intel-analytics-software/running-tensorflow-stable-diffusion-on-intel-arc-gpus-e6ff0d2b7549
  https://www.pcgamer.com/intels-new-dollar249-gpu-price-wipes-out-nvidia-at-the-entry-level/