
Commit

Merge pull request #16 from sayanshaw24/sayanshaw/automate-docs
Add extensions docs auto-generation script
sayanshaw24 authored Aug 4, 2023
2 parents cc431a7 + f5c06f6 commit a2de5c6
Showing 5 changed files with 93 additions and 41 deletions.
1 change: 1 addition & 0 deletions docs/extensions/custom-build.md
@@ -4,6 +4,7 @@ description: Instructions for building ONNX Runtime with onnxruntime-extensions
parent: Development
nav_order: 2
---

# Build ONNX Runtime with onnxruntime-extensions for Java package

*The following steps are demonstrated for the Windows platform only; other platforms such as Linux and macOS can be handled similarly.*
12 changes: 7 additions & 5 deletions docs/extensions/development.md
@@ -4,6 +4,7 @@ description: Instructions for building and developing ORT Extensions.
parent: Extensions
nav_order: 2
---

# Build and Development

This project supports Python and can easily be built from source, or built as a simple CMake project without the Python dependency.
@@ -21,7 +22,9 @@ Test:
For a complete list of verified build configurations, see [here](./development.md#dependencies).

## Java package
For instructions on building ONNX Runtime with onnxruntime-extensions for the Java package, see [here](./custom-build.md).

Run `bash ./build.sh -DOCOS_BUILD_JAVA=ON` to build the JAR package in the `out/<OS>/Release` folder.

## Android package
- pre-requisites: [Android Studio](https://developer.android.com/studio)
@@ -32,7 +35,7 @@ Use `./tools/android/build_aar.py` to build an Android AAR package.
Use `./tools/ios/build_xcframework.py` to build an iOS xcframework package.

## Web-Assembly
ONNXRuntime-Extensions will be built as a static library and linked with ONNXRuntime due to the lack of a good dynamic linking mechanism in WASM. There are two additional arguments, [--use_extensions and --extensions_overridden_path](https://github.com/microsoft/onnxruntime/blob/860ba8820b72d13a61f0d08b915cd433b738ffdc/tools/ci_build/build.py#L416), for building onnxruntime to include the ONNXRuntime-Extensions footprint in the ONNXRuntime package.

## The C++ shared library
For any other case, please run `build.bat` or `bash ./build.sh` to build the library. By default, the DLL or the library will be generated in the directory `out/<OS>/<FLAVOR>`. There is a unit test to help verify the build.
@@ -45,9 +48,8 @@ If you want to build the binary with VC Runtime static linkage, please add a par
Check this link https://docs.opensource.microsoft.com/releasing/general-guidance/copyright-headers/ for the source file copyright header.

## Dependencies

The matrix below lists the versions of individual dependencies of onnxruntime-extensions. These are the configurations that are routinely and extensively verified by our CI.

Python | 3.8 | 3.9 | 3.10 | 3.11 |
---|---|---|---|---
Onnxruntime | 1.12.1 (Aug 4, 2022) | 1.13.1 (Oct 24, 2022) | 1.14.1 (Mar 2, 2023) | 1.15.0 (May 24, 2023) |
53 changes: 18 additions & 35 deletions docs/extensions/index.md
@@ -8,10 +8,10 @@ nav_order: 7

[![Build Status](https://dev.azure.com/onnxruntime/onnxruntime/_apis/build/status%2Fmicrosoft.onnxruntime-extensions?branchName=main)](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=209&branchName=main)


## What's ONNXRuntime-Extensions

Introduction: ONNXRuntime-Extensions is a library that extends the capability of ONNX models and inference with ONNX Runtime, via the ONNX Runtime Custom Operator ABIs. It includes a set of [ONNX Runtime Custom Operators](https://onnxruntime.ai/docs/reference/operators/add-custom-op.html) to support common pre- and post-processing operators for vision, text, and NLP models. It supports multiple languages and platforms, such as Python on Windows/Linux/macOS, mobile platforms like Android and iOS, and Web-Assembly. The basic workflow is to first enhance an ONNX model and then run inference with ONNX Runtime and the ONNXRuntime-Extensions package.


<img src="../../images/combine-ai-extensions-img.png" alt="Pre and post-processing custom operators for vision, text, and NLP models" width="100%"/>
<sub>This image was created using <a href="https://github.com/sayanshaw24/combine" target="_blank">Combine.AI</a>, which is powered by Bing Chat, Bing Image Creator, and EdgeGPT.</sub>
@@ -24,28 +24,32 @@ pip install onnxruntime-extensions
```


### **Nightly Build**

#### <strong>On Windows</strong>
```cmd
pip install --index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/ORT-Nightly/pypi/simple/ onnxruntime-extensions
```
Please ensure that you have met the prerequisites of onnxruntime-extensions (e.g., onnx and onnxruntime) in your Python environment.
#### <strong>On Linux/macOS</strong>
Please make sure a compiler toolkit like gcc (later than g++ 8.0) or clang is installed before running the following command:
```bash
python -m pip install git+https://github.com/microsoft/onnxruntime-extensions.git
```


## Usage

## 1. Generate the pre-/post-processing ONNX model
With the onnxruntime-extensions Python package, you can easily get the ONNX processing graph by converting it from Huggingface transformer data processing classes; check the following API for details.
```python
help(onnxruntime_extensions.gen_processing_models)
```
### NOTE: These data processing models can be merged into another model with [onnx.compose](https://onnx.ai/onnx/api/compose.html) if needed.
## 2. Using Extensions for ONNX Runtime inference

### Python

There are individual packages for the following languages; please install the appropriate one for your build.
```python
import onnxruntime as _ort
from onnxruntime_extensions import get_library_path as _lib_path
@@ -76,34 +80,13 @@ var sess_opt = new OrtSession.SessionOptions();
sess_opt.registerCustomOpLibrary(OrtxPackage.getLibraryPath());
```

## Use exporters to generate graphs with custom operators

The PyTorch and TensorFlow converters support custom operator generation if the operation from the original framework cannot be interpreted as standard ONNX operators. See the following two examples of how to do this.

1. [CustomOp conversion by pytorch.onnx.exporter](https://github.com/microsoft/onnxruntime-extensions/blob/main/tutorials/pytorch_custom_ops_tutorial.ipynb)
2. [CustomOp conversion by tf2onnx](https://github.com/microsoft/onnxruntime-extensions/blob/main/tutorials/tf2onnx_custom_ops_tutorial.ipynb)


## Add a new custom operator to onnxruntime-extensions

You can contribute custom op C++ implementations directly to this repository if they have general applicability to other users. In addition, if you want to quickly verify the ONNX model with Python, you can wrap the custom operator with **[PyOp](pyop.md)**.

```python
import numpy
from onnxruntime_extensions import PyOp, onnx_op

# Implement the CustomOp by decorating a function with onnx_op
@onnx_op(op_type="Inverse", inputs=[PyOp.dt_float])
def inverse(x):
    # the user custom op implementation here:
    return numpy.linalg.inv(x)

# Run the model with this custom op
# model_func = PyOrtFunction(model_path)
# outputs = model_func(inputs)
# ...
```
### C#
```C#
SessionOptions options = new SessionOptions();
options.RegisterOrtExtensions();
var session = new InferenceSession(model, options);
```
Check [development.md](./development.md) for build and test instructions.


## Contributing

@@ -121,4 +104,4 @@ contact [[email protected]](mailto:[email protected]) with any additio

## License

[MIT License](https://github.com/microsoft/onnxruntime-extensions/blob/main/LICENSE)
3 changes: 2 additions & 1 deletion docs/extensions/pyop.md
@@ -4,6 +4,7 @@ description: Instructions to create a custom operator using Python functions and
parent: Extensions
nav_order: 4
---

# Creating custom operators using Python functions

Custom operators are a powerful feature in ONNX Runtime that allows users to extend the functionality of the runtime by implementing their own operators to perform specific operations not available in the standard ONNX operator set.
@@ -30,4 +31,4 @@ Because ONNX Runtime needs the custom operator schema when loading a model, please
## Step 2: Create an ONNX model with the custom operator
Now that the custom operator is registered with ONNX Runtime, you can create an ONNX model that utilizes it. You can either modify an existing ONNX model to include the custom operator or create a new one from scratch.

To create a new ONNX model with the custom operator, you can use the ONNX Python API. Here is an example: [test_pyops.py](https://github.com/microsoft/onnxruntime-extensions/blob/main/test/test_pyops.py)
65 changes: 65 additions & 0 deletions tools/automate-docs.py
@@ -0,0 +1,65 @@
import requests

# Note - generation of the different markdown files cannot be modularized into separate Python methods as indentation matters.

with open('./docs/extensions/index.md', 'w') as f:
    index_link = "https://raw.githubusercontent.com/microsoft/onnxruntime-extensions/main/README.md"
    docs = requests.get(index_link)
    intro = """---
title: Extensions
has_children: true
nav_order: 7
---
"""
    img = """<img src="../../images/combine-ai-extensions-img.png" alt="Pre and post-processing custom operators for vision, text, and NLP models" width="100%"/>
<sub>This image was created using <a href="https://github.com/sayanshaw24/combine" target="_blank">Combine.AI</a>, which is powered by Bing Chat, Bing Image Creator, and EdgeGPT.</sub>
"""
    md = intro + docs.text[:docs.text.index("## Quickstart")] + img + docs.text[docs.text.index("## Quickstart"):docs.text.index("(LICENSE)")] + "(https://github.com/microsoft/onnxruntime-extensions/blob/main/LICENSE)"
    f.write(md)

with open('./docs/extensions/development.md', 'w') as f:
    development_link = "https://raw.githubusercontent.com/microsoft/onnxruntime-extensions/main/docs/development.md"
    ci_matrix_link = "https://raw.githubusercontent.com/microsoft/onnxruntime-extensions/main/docs/ci_matrix.md"
    docs = requests.get(development_link)
    ci_matrix = requests.get(ci_matrix_link)
    intro = """---
title: Development
description: Instructions for building and developing ORT Extensions.
parent: Extensions
nav_order: 2
---
"""
    custom_build_intro = "For instructions on building ONNX Runtime with onnxruntime-extensions for Java package, see [here](./custom-build.md)\n\nRun "
    md = intro + docs.text[:docs.text.index("(<./ci_matrix.md>)")] + "(./development.md#dependencies)\n" + docs.text[docs.text.index("(<./ci_matrix.md>)")+19:docs.text.index("## Java package")+16] + custom_build_intro + docs.text[docs.text.index("`bash ./build.sh -DOCOS_BUILD_JAVA=ON`"):] + "\n## Dependencies\n" + ci_matrix.text[ci_matrix.text.index("The matrix"):]
    f.write(md)

with open('./docs/extensions/custom-build.md', 'w') as f:
    custom_build_link = "https://raw.githubusercontent.com/microsoft/onnxruntime-extensions/main/docs/custom_build.md"
    docs = requests.get(custom_build_link)
    intro = """---
title: Custom Build
description: Instructions for building ONNX Runtime with onnxruntime-extensions for Java package.
parent: Development
nav_order: 2
---
"""
    md = intro + docs.text
    f.write(md)

with open('./docs/extensions/pyop.md', 'w') as f:
    pyop_link = "https://raw.githubusercontent.com/microsoft/onnxruntime-extensions/main/docs/pyop.md"
    docs = requests.get(pyop_link)
    intro = """---
title: Python Operators
description: Instructions to create a custom operator using Python functions and ORT inference integration.
parent: Extensions
nav_order: 4
---
"""
    md = intro + "# Creating custom operators using Python functions\n\n" + docs.text[docs.text.index("Custom operators"):docs.text.index("Here is an example")] + "Here is an example: [test_pyops.py](https://github.com/microsoft/onnxruntime-extensions/blob/main/test/test_pyops.py)"
    f.write(md)
