Support for RKNPU Execution Provider on RK3562 Platform and On-Device Training Capabilities #21059
Leo5050xvjf asked this question in EP Q&A · Unanswered · 0 replies
Hi,
I have been reading the ONNX Runtime documentation and came across the RKNPU Execution Provider (EP). I would like to confirm my understanding:
The RKNPU EP appears to support only the RK1808 Linux platform. Does this mean I cannot use the RKNPU EP on my RK3562?
If so, can I still run ONNX model inference on the RK3562's CPU?
To further clarify my requirements:
I intend to use ONNX Runtime with the RKNPU EP for model inference on the RK3562 platform.
I also plan to use ONNX Runtime's On-Device Training feature to train models on the RK3562, and then use the RKNPU EP to run inference with the trained models.
Does the current version of ONNX Runtime support these requirements? Thank you for your clarification!