[web] cannot run simple scikit-learn classifier in browser #9688
-
Hi, I am trying to experiment with ORT Web, but I get error messages whenever I run inference. I am trying to run predictions in the browser with a modification of this example: https://github.com/microsoft/onnxruntime-inference-examples/tree/main/js/quick-start_onnxruntime-web-script-tag

```js
// prepare inputs. a tensor needs its corresponding TypedArray as data
const dataA = Float32Array.from([1, 2, 3, 4]);
const tensorA = new ort.Tensor('float32', dataA, [1, 4]);
// prepare feeds. use model input names as keys.
const feeds = { float_input: tensorA };
```

However, with any model (except LinearRegression) I get cryptic error messages. Thanks in advance.
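For reference, a minimal sketch of how a snippet like the one above is typically wired into a session with ORT Web; the model file name (`./model.onnx`) and the `float_input` input name are assumptions carried over from the example:

```js
// minimal sketch, assuming the exported model is served as ./model.onnx
// and that its input is named float_input, as in the snippet above
async function main() {
  // load the ONNX model into an inference session
  const session = await ort.InferenceSession.create('./model.onnx');

  // prepare inputs: a tensor needs its corresponding TypedArray as data
  const dataA = Float32Array.from([1, 2, 3, 4]);
  const tensorA = new ort.Tensor('float32', dataA, [1, 4]);

  // prepare feeds: use the model's input names as keys
  const feeds = { float_input: tensorA };

  // run inference and log the results object (keyed by output name)
  const results = await session.run(feeds);
  console.log(results);
}

main().catch(console.error);
```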
Replies: 1 comment
-
I managed to solve my problem...

First, I tried to run the pipeline with `onnxruntime-node`, where I got a readable error message: `failed to inference ONNX model: Error: Non tensor type is temporarily not supported.`

After searching around issues and documentation, I understood that the `zipmap` option needs to be deactivated when converting scikit-learn classifiers with `sklearn-onnx`, otherwise none of the classifiers will work with onnxruntime in JavaScript.
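To make the effect of that fix concrete: in sklearn-onnx the option is typically passed to the converter as `options={id(clf): {'zipmap': False}}`, so the classifier exports its label and probabilities as plain tensors rather than a sequence of maps (the "non tensor type" ORT complains about). The sketch below shows how such a re-exported classifier could then be consumed from ORT Web; the model URL is an assumption, and the output names are looked up at runtime since they vary between converter versions:

```js
// sketch of consuming a classifier converted with zipmap disabled;
// output names differ between converter versions, so they are read from
// session.outputNames instead of being hard-coded
async function classify(modelUrl, features) {
  const session = await ort.InferenceSession.create(modelUrl);

  // build the feed using the model's actual input name
  const data = Float32Array.from(features);
  const input = new ort.Tensor('float32', data, [1, features.length]);
  const feeds = { [session.inputNames[0]]: input };

  const results = await session.run(feeds);

  // with zipmap disabled both outputs are plain tensors, so ORT Web can return them;
  // typically the first output is the predicted label, the second the class probabilities
  const label = results[session.outputNames[0]];
  const probabilities = results[session.outputNames[1]];
  console.log('label:', label.data, 'probabilities:', probabilities.data);
  return { label, probabilities };
}

// usage (model URL is an assumption):
// classify('./classifier.onnx', [1, 2, 3, 4]);
```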