After figuring out how to compile TFRT on Windows, my next question is how to use it with real pretrained models. I have several in TFJS and TF-Lite formats.

The Bazel build generated two executables: tfrt_translate and bef_executor. The first takes an MLIR script; the second takes a BEF file. So I cannot run inference on my models directly.
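For context, here is a sketch of the two-step flow those binaries implement, modeled on the hand-written hello-world example in the TFRT README. The dialect op names and the bazel-bin paths are assumptions and may differ between TFRT versions:

```shell
#!/bin/sh
# Sketch: lower a hand-written MLIR script to BEF, then execute it.
# Op names (tfrt.new.chain, tfrt.constant.i32, tfrt.print.i32) follow
# the TFRT host-runtime examples and may vary across versions.

# 1. Write a minimal MLIR script using TFRT's host dialect.
cat > hello.mlir <<'EOF'
func @hello() {
  %ch = tfrt.new.chain
  %x  = tfrt.constant.i32 42
  tfrt.print.i32 %x, %ch
  tfrt.return
}
EOF

# 2. Translate MLIR to BEF and run it. These paths assume the
#    binaries sit in the default bazel build tree; adjust as needed.
TRANSLATE=bazel-bin/tools/tfrt_translate
EXECUTOR=bazel-bin/tools/bef_executor
if [ -x "$TRANSLATE" ] && [ -x "$EXECUTOR" ]; then
  "$TRANSLATE" -mlir-to-bef hello.mlir > hello.bef
  "$EXECUTOR" hello.bef
else
  echo "TFRT binaries not found; commands shown for reference" >&2
fi
```

This only demonstrates the MLIR-to-BEF-to-execution pipeline with a toy script, not a real model, which is exactly the gap described below.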
But how can I convert my models to MLIR? Googling did not turn up a clear answer, and without that first step I cannot use bef_executor at all.

Any suggestions, please?