Inference on an embedded MCU (RP2040 / Raspberry Pi Pico) #1067
Comments
There is one unresolved issue relating to atomic locking, mainly with the NdArray backend; see #302. It might be possible with the Candle CPU backend, which I haven't tried yet. I am willing to assist if someone can work on this.
But generally we do support no_std, and our CI builds with it.
If everything is single-threaded, then atomics are a non-issue.
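For context on the single-threaded case: the RP2040's Cortex-M0+ cores have no atomic compare-and-swap instructions, and a common workaround in the embedded Rust ecosystem is to polyfill atomics with critical sections. The snippet below is a hedged sketch of that approach, not something prescribed in this thread; the crate versions and the `critical-section-impl` feature name are assumptions that should be checked against your HAL release.

```toml
# Hypothetical Cargo.toml excerpt for a thumbv6m (RP2040) target.
[dependencies]
# Provides AtomicU32 and friends backed by critical sections
# instead of the CAS instructions the core lacks.
portable-atomic = { version = "1", features = ["critical-section"] }
# Supplies the critical-section implementation for the RP2040.
rp2040-hal = { version = "0.10", features = ["critical-section-impl"] }
```

Note that dependencies which use `core::sync::atomic` directly won't benefit from this; they have to go through `portable-atomic`, which is part of why the NdArray issue in #302 isn't automatically solved by it.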
Oh wow, thanks all for the comments. If there's some path to success here, I'll gladly assist in getting this up and running.
@mvniekerk Have you tried running a model with the Candle CPU backend and/or the NdArray backend? If there are errors, it would help to know what is missing.
@nathanielsimard I'm trying to do the same thing as @mvniekerk and I get the same error messages. I tried both candle and ndarray, and both failed. Using `burn = { version = "0.13.0", default-features = false, features = ["ndarray"] }` gave one set of errors, and using `burn = { version = "0.13.0", default-features = false, features = ["candle"] }` gave another, along with hundreds more, too many to include here.

I can create a repo to reproduce the errors if wanted.
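In case it helps anyone reproducing this, a minimal crate along the following lines should surface the same errors when checked against a no_std target. This is a sketch under assumptions: that burn 0.13's `ndarray` feature re-exports the backend as shown, and that `from_floats` is the right constructor in that release.

```rust
#![no_std]

// Minimal reproduction sketch, intended for
// `cargo check --target thumbv6m-none-eabi`.
use burn::backend::ndarray::{NdArray, NdArrayDevice};
use burn::tensor::Tensor;

/// One small tensor op; merely compiling this for an embedded
/// target should be enough to surface any missing-std errors.
pub fn smoke_test() -> Tensor<NdArray, 2> {
    let device = NdArrayDevice::Cpu;
    let a = Tensor::<NdArray, 2>::from_floats([[1.0, 2.0], [3.0, 4.0]], &device);
    let b = Tensor::<NdArray, 2>::from_floats([[0.5, 0.5], [0.5, 0.5]], &device);
    a.matmul(b)
}
```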
This should be fixed with the latest merge.
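For completeness, an actual binary on the RP2040 would also need a global allocator, since Burn's tensors allocate on the heap. Below is a rough sketch of an entry point; the `embedded-alloc` and `rp-pico` crates, their APIs as used here, and the 64 KiB heap size are my assumptions rather than anything confirmed in this thread.

```rust
#![no_std]
#![no_main]

use core::mem::MaybeUninit;

use embedded_alloc::Heap;
use panic_halt as _;
use rp_pico::entry;

// Burn allocates tensor storage on the heap, so a no_std
// binary must install a global allocator.
#[global_allocator]
static HEAP: Heap = Heap::empty();

#[entry]
fn main() -> ! {
    // Static heap region; 64 KiB is a guess, tune it to the
    // model's actual memory footprint.
    const HEAP_SIZE: usize = 64 * 1024;
    static mut HEAP_MEM: [MaybeUninit<u8>; HEAP_SIZE] = [MaybeUninit::uninit(); HEAP_SIZE];
    unsafe { HEAP.init(HEAP_MEM.as_ptr() as usize, HEAP_SIZE) }

    // Hypothetical call into the reproduction sketch above;
    // replace with real model inference once it compiles.
    let _out = reproduction::smoke_test();

    loop {}
}
```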
Feature description
I'd like to run an inference model on an embedded MCU, specifically an RP2040, similar to TensorFlow Micro (https://lib.rs/crates/tfmicro).
Feature motivation
To do event sourcing remotely on electronic devices.
(Optional) Suggest a Solution