
NotImplementedError: Relay Parametric ReLU to XLayer not implemented #61

Open

nhphuong91 opened this issue Aug 6, 2021 · 2 comments

@nhphuong91
I'm trying to compile an MXNet model to an XIR model but I'm running into this error:
NotImplementedError: Relay Parametric ReLU to XLayer not implemented
The model can be downloaded from here.
Here is the script I use to import the model, adapted from the existing Vitis AI TVM tutorial here:

import mxnet as mx
from tvm import relay

sym, arg_params, aux_params = mx.model.load_checkpoint('model', 0)
input_shape = (1, 3, 112, 112)
shape_dict = {'data': input_shape}  # 'data' is the model's input name
mod, params = relay.frontend.from_mxnet(sym, shape_dict, arg_params=arg_params, aux_params=aux_params)

Everything runs fine until partition_for_vitis_ai(), where the error occurs (the call I use is sketched below).
I tried to compile it for both the ZCU104 and the KV260 (SOM, Vitis AI 1.3), but the same error occurs on both.
A quick search led me to pyxir/python/pyxir/frontend/tvm/relay_tools/relay_l3_math_and_transform.py, where the error is raised, while pyxir/python/pyxir/graph/ops suggests that PReLU is supported.
Can anyone help me with this? Thanks so much!
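
For reference, a minimal sketch of the partition step I call, roughly following the tutorial (the import path and the DPU target string 'DPUCZDX8G-zcu104' are from the TVM Vitis AI flow and may differ between TVM versions and boards):

from tvm.relay.op.contrib.vitis_ai import partition_for_vitis_ai

# Annotate and partition the Relay module for the Vitis AI DPU;
# this is the step where the PReLU conversion error is raised.
mod = partition_for_vitis_ai(mod, params, dpu='DPUCZDX8G-zcu104')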

P.S.: I know that PReLU is not supported by vai_c and the DPU according to the Vitis AI user guide.

@jtuyls

jtuyls commented Aug 18, 2021

@nhphuong91 I created a PR to accept the Relay PReLU operation: #64. You could give it a go to check whether it works for your purposes. As you mentioned, PReLU isn't supported by the DPUs in general; however, in the case where all alpha values are the same and equal to 0.1, it reduces to a Relay LeakyReLU with alpha=0.1, which is supported on some DPUs (a quick numerical check of that equivalence is below). That functionality isn't supported in this PR yet though. Do you expect to need that?
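
A plain NumPy check of that equivalence, independent of TVM/pyxir (the shapes and the 0.1 slope are just illustrative):

import numpy as np

def prelu(x, alpha):
    # Elementwise: x where x > 0, otherwise alpha * x (alpha may vary per channel)
    return np.where(x > 0, x, alpha * x)

x = np.random.randn(1, 3, 4, 4).astype("float32")
uniform_alpha = np.full((3, 1, 1), 0.1, dtype="float32")  # same slope for every channel
leaky = np.where(x > 0, x, 0.1 * x)                        # LeakyReLU with alpha=0.1
assert np.allclose(prelu(x, uniform_alpha), leaky)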

@nhphuong91
Author

@jtuyls I managed to replace the PReLU ops with LeakyReLU and, luckily, the results don't change. Even better, the model can now run entirely on the DPU without being partitioned into subgraphs. A sketch of this kind of rewrite is below.
Anyway, it would be great if you could add PReLU support, but that can wait until PReLU is fully supported on the DPU.
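
For anyone hitting the same issue, one way such a replacement can be done at the Relay level is a small ExprMutator pass along these lines (illustrative only, not necessarily exactly what I did; API details such as alpha.data.numpy() may need adjusting for older TVM versions, which use asnumpy()):

import tvm
from tvm import relay
from tvm.relay.expr_functor import ExprMutator

class PReluToLeakyRelu(ExprMutator):
    """Rewrite nn.prelu calls whose alpha is a uniform constant into nn.leaky_relu."""

    def visit_call(self, call):
        new_args = [self.visit(arg) for arg in call.args]
        if isinstance(call.op, tvm.ir.Op) and call.op.name == "nn.prelu":
            alpha = new_args[1]
            if isinstance(alpha, relay.Constant):
                vals = alpha.data.numpy()
                if vals.min() == vals.max():
                    # All slopes are identical, so PReLU is equivalent to LeakyReLU
                    return relay.nn.leaky_relu(new_args[0], alpha=float(vals.flat[0]))
        return relay.Call(call.op, new_args, call.attrs, call.type_args)

# Usage: mod["main"] = PReluToLeakyRelu().visit(mod["main"])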
