diff --git a/docs/extensions/custom-build.md b/docs/extensions/custom-build.md
index 5a28181c79165..fdff39098003d 100644
--- a/docs/extensions/custom-build.md
+++ b/docs/extensions/custom-build.md
@@ -4,6 +4,7 @@ description: Instructions for building ONNX Runtime with onnxruntime-extensions
 parent: Development
 nav_order: 2
 ---
+
 # Build ONNX Runtime with onnxruntime-extensions for Java package
 
 *The following step are demonstrated for Windows Platform only, the others like Linux and MacOS can be done similarly.*
diff --git a/docs/extensions/development.md b/docs/extensions/development.md
index ca60afb9b7fe2..886f1c2c9a02c 100644
--- a/docs/extensions/development.md
+++ b/docs/extensions/development.md
@@ -4,6 +4,7 @@ description: Instructions for building and developing ORT Extensions.
 parent: Extensions
 nav_order: 2
 ---
+
 # Build and Development
 
 This project supports Python and can be built from source easily, or a simple cmake build without Python dependency.
@@ -21,7 +22,9 @@ Test:
 For a complete list of verified build configurations see [here](./development.md#dependencies)
 
 ## Java package
-`bash ./build.sh -DOCOS_BUILD_JAVA=ON` to build jar package in out//Release folder
+For instructions on building ONNX Runtime with onnxruntime-extensions for the Java package, see [here](./custom-build.md).
+
+Run `bash ./build.sh -DOCOS_BUILD_JAVA=ON` to build jar package in out//Release folder
 
 ## Android package
 - pre-requisites: [Android Studio](https://developer.android.com/studio)
@@ -32,7 +35,7 @@ Use `./tools/android/build_aar.py` to build an Android AAR package.
 Use `./tools/ios/build_xcframework.py` to build an iOS xcframework package.
 
 ## Web-Assembly
-ONNXRuntime-Extensions will be built as a static library and linked with ONNXRuntime due to the lack of a good dynamic linking mechanism in WASM. Here are two additional arguments [–-use_extensions and --extensions_overridden_path](https://github.com/microsoft/onnxruntime/blob/860ba8820b72d13a61f0d08b915cd433b738ffdc/tools/ci_build/build.py#L416) on building onnxruntime to include ONNXRuntime-Extensions footprint in the ONNXRuntime package.
+ONNXRuntime-Extensions will be built as a static library and linked with ONNXRuntime due to the lack of a good dynamic linking mechanism in WASM. Here are two additional arguments [--use_extensions and --extensions_overridden_path](https://github.com/microsoft/onnxruntime/blob/860ba8820b72d13a61f0d08b915cd433b738ffdc/tools/ci_build/build.py#L416) on building onnxruntime to include the ONNXRuntime-Extensions footprint in the ONNXRuntime package.
 
 ## The C++ shared library
 for any other cases, please run `build.bat` or `bash ./build.sh` to build the library. By default, the DLL or the library will be generated in the directory `out//`. There is a unit test to help verify the build.
@@ -45,9 +48,8 @@ If you want to build the binary with VC Runtime static linkage, please add a par
 check this link https://docs.opensource.microsoft.com/releasing/general-guidance/copyright-headers/ for source file copyright header.
 
 ## Dependencies
-
-The matrix below lists the versions of individual dependencies of onnxruntime-extensions. These are the configurations that are routinely and extensively verified by our CI.
+The matrix below lists the versions of individual dependencies of onnxruntime-extensions. These are the configurations that are routinely and extensively verified by our CI.
 Python | 3.8 | 3.9 | 3.10 | 3.11 |
 ---|---|---|---|---
-Onnxruntime |1.12.1 (Aug 4, 2022) |1.13.1(Oct 24, 2022) |1.14.1 (Mar 2, 2023) |1.15.0 (May 24, 2023) |
\ No newline at end of file
+Onnxruntime |1.12.1 (Aug 4, 2022) |1.13.1(Oct 24, 2022) |1.14.1 (Mar 2, 2023) |1.15.0 (May 24, 2023) |
diff --git a/docs/extensions/index.md b/docs/extensions/index.md
index d00c1f6542743..9ff4e315d5785 100644
--- a/docs/extensions/index.md
+++ b/docs/extensions/index.md
@@ -8,10 +8,10 @@ nav_order: 7
 
 [![Build Status](https://dev.azure.com/onnxruntime/onnxruntime/_apis/build/status%2Fmicrosoft.onnxruntime-extensions?branchName=main)](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=209&branchName=main)
 
-
 ## What's ONNXRuntime-Extensions
 
-Introduction: ONNXRuntime-Extensions is a library that extends the capability of the ONNX models and inference with ONNX Runtime, via ONNX Runtime Custom Operator ABIs. It includes a set of [ONNX Runtime Custom Operators](https://onnxruntime.ai/docs/reference/operators/add-custom-op.html) to support the common pre- and post-processing operators for vision, text, and nlp models. And it supports multiple languages and platforms, like Python on Windows/Linux/macOS, some mobile platforms like Android and iOS, and Web-Assembly etc. The basic workflow is to enhance a ONNX model firstly and then do the model inference with ONNX Runtime and ONNXRuntime-Extensions package.
+Introduction: ONNXRuntime-Extensions is a library that extends the capability of ONNX models and inference with ONNX Runtime, via ONNX Runtime Custom Operator ABIs. It includes a set of [ONNX Runtime Custom Operators](https://onnxruntime.ai/docs/reference/operators/add-custom-op.html) to support the common pre- and post-processing operators for vision, text, and NLP models. It supports multiple languages and platforms, such as Python on Windows/Linux/macOS, mobile platforms like Android and iOS, and Web-Assembly. The basic workflow is to first enhance an ONNX model and then run the model inference with ONNX Runtime and the ONNXRuntime-Extensions package.
+
 Pre and post-processing custom operators for vision, text, and NLP models
 This image was created using Combine.AI, which is powered by Bing Chat, Bing Image Creator, and EdgeGPT.
@@ -24,7 +24,7 @@ pip install onnxruntime-extensions
 ````
 
 
-### **nightly build**
+### **Nightly Build**
 
 #### on Windows
 ```cmd
@@ -32,7 +32,7 @@ pip install --index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_pa
 ```
 Please ensure that you have met the prerequisites of onnxruntime-extensions (e.g., onnx and onnxruntime) in your Python environment.
 #### on Linux/macOS
-the packages are not ready yet, so it could be installed from source. Please make sure the compiler toolkit like gcc(later than g++ 8.0) or clang, and the tool cmake are installed before the following command
+Please make sure a compiler toolkit like gcc (later than g++ 8.0) or clang is installed before running the following command
 ```bash
 python -m pip install git+https://github.com/microsoft/onnxruntime-extensions.git
 ```
@@ -40,12 +44,16 @@ python -m pip install git+https://github.com/microsoft/onnxruntime-extensions.git
 
 ## Usage
 
-## 1. Augment an ONNX model with a pre- and post-processing pipeline
-Check [tutorial](https://github.com/microsoft/onnxruntime-extensions/tree/main/tutorials) for a couple of examples on how to do it.
+## 1. Generate the pre-/post-processing ONNX model
+With the onnxruntime-extensions Python package, you can easily get the ONNX processing graph by converting it from Huggingface transformers data processing classes; check the following API for details.
+```python
+help(onnxruntime_extensions.gen_processing_models)
+```
+### NOTE: These data processing models can be merged into another model with [onnx.compose](https://onnx.ai/onnx/api/compose.html) if needed.
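+
+As a minimal sketch of `gen_processing_models` (this assumes the Hugging Face `transformers` package is installed; which graphs are returned depends on the `pre_kwargs`/`post_kwargs` arguments):
+
+```python
+import onnx
+from transformers import GPT2Tokenizer
+from onnxruntime_extensions import gen_processing_models
+
+tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
+# Convert the tokenizer into ONNX pre-/post-processing graphs
+pre_m, post_m = gen_processing_models(tokenizer, pre_kwargs={}, post_kwargs={})
+onnx.save(pre_m, "gpt2_tokenizer.onnx")
+```
+The saved graph can then be attached to an inference model with `onnx.compose.merge_models`, mapping the tokenizer outputs to the model inputs.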
 
 ## 2. Using Extensions for ONNX Runtime inference
 
 ### Python
-
+There are individual packages for the following languages; please install the appropriate one for your build.
 ```python
 import onnxruntime as _ort
 from onnxruntime_extensions import get_library_path as _lib_path
@@ -76,34 +80,13 @@ var sess_opt = new OrtSession.SessionOptions();
 sess_opt.registerCustomOpLibrary(OrtxPackage.getLibraryPath());
 ```
 
-## Use exporters to generate graphs with custom operators
-
-The PyTorch and TensorFlow converters support custom operator generation if the operation from the original framework cannot be interpreted as a standard ONNX operators. Check the following two examples on how to do this.
-
-1. [CustomOp conversion by pytorch.onnx.exporter](https://github.com/microsoft/onnxruntime-extensions/blob/main/tutorials/pytorch_custom_ops_tutorial.ipynb)
-2. [CustomOp conversion by tf2onnx](https://github.com/microsoft/onnxruntime-extensions/blob/main/tutorials/tf2onnx_custom_ops_tutorial.ipynb)
-
-
-## Add a new custom operator to onnxruntime-extensions
-
-You can contribute customop C++ implementations directly in this repository if they have general applicability to other users. In addition, if you want to quickly verify the ONNX model with Python, you can wrap the custom operator with **[PyOp](pyop.md)**.
-
-```python
-import numpy
-from onnxruntime_extensions import PyOp, onnx_op
-
-# Implement the CustomOp by decorating a function with onnx_op
-@onnx_op(op_type="Inverse", inputs=[PyOp.dt_float])
-def inverse(x):
-    # the user custom op implementation here:
-    return numpy.linalg.inv(x)
-
-# Run the model with this custom op
-# model_func = PyOrtFunction(model_path)
-# outputs = model_func(inputs)
-# ...
+### C#
+```C#
+SessionOptions options = new SessionOptions();
+options.RegisterOrtExtensions();
+var session = new InferenceSession(model, options);
 ```
-Check [development.md](./development.md) for build and test
+
 
 ## Contributing
 
@@ -121,4 +104,4 @@ contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additio
 
 ## License
 
-[MIT License](https://github.com/microsoft/onnxruntime-extensions/blob/main/LICENSE)
+[MIT License](https://github.com/microsoft/onnxruntime-extensions/blob/main/LICENSE)
\ No newline at end of file
diff --git a/docs/extensions/pyop.md b/docs/extensions/pyop.md
index 482c8325d9d3a..b2c32690b7976 100644
--- a/docs/extensions/pyop.md
+++ b/docs/extensions/pyop.md
@@ -4,6 +4,7 @@ description: Instructions to create a custom operator using Python functions and
 parent: Extensions
 nav_order: 4
 ---
+
 # Creating custom operators using Python functions
 
 Custom operators are a powerful feature in ONNX Runtime that allows users to extend the functionality of the runtime by implementing their own operators to perform specific operations not available in the standard ONNX operator set.
 
@@ -30,4 +31,4 @@ Because ONNXRuntimme needs the custom operator schema on loading a model, please
 ## Step 2: Create an ONNX model with the custom operator
 
 Now that the custom operator is registered with ONNX Runtime, you can create an ONNX model that utilizes it. You can either modify an existing ONNX model to include the custom operator or create a new one from scratch.
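+
+For instance, here is a sketch using the ONNX helper API (it assumes a custom operator named `Inverse` has been registered under `ai.onnx.contrib`, the default onnxruntime-extensions domain; both names are illustrative):
+
+```python
+import onnx
+from onnx import helper, TensorProto
+
+# A graph with a single node that calls the custom operator; the node's
+# domain must match the domain the operator was registered with.
+node = helper.make_node("Inverse", inputs=["x"], outputs=["y"], domain="ai.onnx.contrib")
+graph = helper.make_graph(
+    [node], "inverse_graph",
+    [helper.make_tensor_value_info("x", TensorProto.FLOAT, [3, 3])],
+    [helper.make_tensor_value_info("y", TensorProto.FLOAT, [3, 3])],
+)
+# Declare both the standard ONNX opset and the custom domain's opset.
+model = helper.make_model(graph, opset_imports=[
+    helper.make_opsetid("", 16),
+    helper.make_opsetid("ai.onnx.contrib", 1),
+])
+onnx.save(model, "inverse.onnx")
+```
+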
-To create a new ONNX model with the custom operator, you can use the ONNX Python API. Here is an example: [test_pyops.py](https://github.com/microsoft/onnxruntime-extensions/blob/main/test/test_pyops.py)
+To create a new ONNX model with the custom operator, you can use the ONNX Python API. Here is an example: [test_pyops.py](https://github.com/microsoft/onnxruntime-extensions/blob/main/test/test_pyops.py)
\ No newline at end of file
diff --git a/tools/automate-docs.py b/tools/automate-docs.py
new file mode 100644
index 0000000000000..43ec5bc285e30
--- /dev/null
+++ b/tools/automate-docs.py
@@ -0,0 +1,65 @@
+import requests
+
+# Note: generation of the different markdown files is not modularized into separate
+# Python functions because the indentation of the triple-quoted front-matter strings matters.
+# Run this script from the repository root; the pages are written under ./docs/extensions/.
+
+with open('./docs/extensions/index.md', 'w') as f:
+    index_link = "https://raw.githubusercontent.com/microsoft/onnxruntime-extensions/main/README.md"
+    docs = requests.get(index_link)
+    intro = """---
+title: Extensions
+has_children: true
+nav_order: 7
+---
+
+"""
+    img = """Pre and post-processing custom operators for vision, text, and NLP models
+This image was created using Combine.AI, which is powered by Bing Chat, Bing Image Creator, and EdgeGPT.
+
+"""
+    # Splice the front matter and the image caption into the upstream README,
+    # and rewrite the relative LICENSE link into an absolute URL.
+    md = intro + docs.text[:docs.text.index("## Quickstart")] + img + docs.text[docs.text.index("## Quickstart"):docs.text.index("(LICENSE)")] + "(https://github.com/microsoft/onnxruntime-extensions/blob/main/LICENSE)"
+    f.write(md)
+
+with open('./docs/extensions/development.md', 'w') as f:
+    development_link = "https://raw.githubusercontent.com/microsoft/onnxruntime-extensions/main/docs/development.md"
+    ci_matrix_link = "https://raw.githubusercontent.com/microsoft/onnxruntime-extensions/main/docs/ci_matrix.md"
+    docs = requests.get(development_link)
+    ci_matrix = requests.get(ci_matrix_link)
+    intro = """---
+title: Development
+description: Instructions for building and developing ORT Extensions.
+parent: Extensions
+nav_order: 2
+---
+
+"""
+    custom_build_intro = "For instructions on building ONNX Runtime with onnxruntime-extensions for the Java package, see [here](./custom-build.md).\n\nRun "
+    # Point the build-configurations link at the local Dependencies section, insert the
+    # custom-build pointer into the Java package section, and append the CI matrix from
+    # ci_matrix.md as the Dependencies section. The +19 offset is
+    # len("(<./ci_matrix.md>)") + 1: it resumes after the old link target and the
+    # character that follows it.
+    md = intro + docs.text[:docs.text.index("(<./ci_matrix.md>)")] + "(./development.md#dependencies)\n" + docs.text[docs.text.index("(<./ci_matrix.md>)")+19:docs.text.index("## Java package")+16] + custom_build_intro + docs.text[docs.text.index("`bash ./build.sh -DOCOS_BUILD_JAVA=ON`"):] + "\n## Dependencies\n" + ci_matrix.text[ci_matrix.text.index("The matrix"):]
+    f.write(md)
+
+with open('./docs/extensions/custom-build.md', 'w') as f:
+    custom_build_link = "https://raw.githubusercontent.com/microsoft/onnxruntime-extensions/main/docs/custom_build.md"
+    docs = requests.get(custom_build_link)
+    intro = """---
+title: Custom Build
+description: Instructions for building ONNX Runtime with onnxruntime-extensions for Java package.
+parent: Development
+nav_order: 2
+---
+
+"""
+    md = intro + docs.text
+    f.write(md)
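+
+# For pyop.md, re-add the page title, keep the upstream body from "Custom operators"
+# onward, and swap the trailing example reference for an absolute link to test_pyops.py.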
+with open('./docs/extensions/pyop.md', 'w') as f:
+    pyop_link = "https://raw.githubusercontent.com/microsoft/onnxruntime-extensions/main/docs/pyop.md"
+    docs = requests.get(pyop_link)
+    intro = """---
+title: Python Operators
+description: Instructions to create a custom operator using Python functions and ORT inference integration.
+parent: Extensions
+nav_order: 4
+---
+
+"""
+    md = intro + "# Creating custom operators using Python functions\n\n" + docs.text[docs.text.index("Custom operators"):docs.text.index("Here is an example")] + "Here is an example: [test_pyops.py](https://github.com/microsoft/onnxruntime-extensions/blob/main/test/test_pyops.py)"
+    f.write(md)
\ No newline at end of file