It would be cool if AMD GPUs were supported as well, e.g. as described here: https://community.amd.com/t5/ai/running-llms-locally-on-amd-gpus-with-ollama/ba-p/713266