Replies: 1 comment
- I have the same issue for LayerNorm and GroupNorm. Can something be done?
- I am trying to convert a PyTorch model and run inference on an M1 device from C++ using the CoreML EP. The model has an InstanceNorm2d layer, which the CoreML EP doesn't support according to the docs, and
onnxruntime.tools.check_onnx_model_mobile_usability
confirmed this: Unsupported ops: ai.onnx:InstanceNormalization
I tried converting the InstanceNormalization layers to BatchNormalization, but the mobile usability checker still flags the result as unsupported:
ai.onnx:15:BatchNormalization
Here are the logs from attempting to load the model.
Here is a small part of the model exhibiting the problem, exported to ONNX for reproduction: models.zip
Using prebuilt onnxruntime v1.18.0
python -m onnxruntime.tools.check_onnx_model_mobile_usability model_batchnorm.onnx
I converted the same model, without changes, using coremltools; InstanceNorm seems to be supported there, and I verified that all layers can execute on the M1's NPU.
I don't know how to proceed at this point. Any tips would be appreciated.
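One avenue that might be worth exploring (a sketch, not something I have verified against the CoreML EP): InstanceNormalization is just per-(sample, channel) normalization over the spatial dimensions, so it can be decomposed into primitive ops (ReduceMean, Sub, Mul, Sqrt, Div, Add). If those primitives are on the CoreML EP's supported-operator list, rewriting the exported graph — or swapping in a custom PyTorch module that computes the normalization explicitly before export — would avoid the unsupported op entirely. A NumPy reference for the math (`instance_norm_ref` is a hypothetical helper name, and the `(N, C, H, W)` layout is assumed):

```python
import numpy as np

def instance_norm_ref(x, scale, bias, eps=1e-5):
    """InstanceNormalization expressed only via primitive ops.

    x: (N, C, H, W); scale, bias: (C,).  Each op below maps to a basic
    ONNX operator, so an exported graph in this form would not contain
    ai.onnx:InstanceNormalization.
    """
    mean = x.mean(axis=(2, 3), keepdims=True)                  # ReduceMean
    var = ((x - mean) ** 2).mean(axis=(2, 3), keepdims=True)   # Sub, Mul, ReduceMean
    x_hat = (x - mean) / np.sqrt(var + eps)                    # Sub, Sqrt, Div
    return scale[None, :, None, None] * x_hat + bias[None, :, None, None]  # Mul, Add

# Sanity check: with scale=1, bias=0 every (sample, channel) plane comes
# out with roughly zero mean and unit variance.
x = np.random.default_rng(0).normal(size=(2, 3, 8, 8))
y = instance_norm_ref(x, np.ones(3), np.zeros(3))
print(np.allclose(y.mean(axis=(2, 3)), 0.0, atol=1e-6))
```

Whether this actually runs on the NPU still depends on which primitives the CoreML EP assigns to CoreML rather than falling back to CPU, so it would need checking with the usability tool again.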