
Commit

Merge pull request #14 from sayanshaw24/sayanshaw/img
add image to index.md
sayanshaw24 authored Aug 3, 2023
2 parents 011bf58 + 6898238 commit 793c4db
Showing 2 changed files with 3 additions and 1 deletion.
4 changes: 3 additions & 1 deletion docs/extensions/index.md
@@ -8,9 +8,11 @@ nav_order: 7

[![Build Status](https://dev.azure.com/onnxruntime/onnxruntime/_apis/build/status%2Fmicrosoft.onnxruntime-extensions?branchName=main)](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=209&branchName=main)

![Pre and post-processing custom operators for vision, text, and NLP models](../../images/combine-ai-extensions-img.png "This image was created using Combine.AI, which is powered by Bing Chat and Bing Image Creator.")

## What's ONNXRuntime-Extensions

Introduction: ONNXRuntime-Extensions is a library that extends the capabilities of ONNX models and inference with ONNX Runtime via the ONNX Runtime Custom Operator ABIs. It includes a set of [ONNX Runtime Custom Operator](https://onnxruntime.ai/docs/reference/operators/add-custom-op.html) to support common pre- and post-processing operators for vision, text, and NLP models. It also supports multiple languages and platforms, such as Python on Windows/Linux/macOS, mobile platforms like Android and iOS, and WebAssembly. The basic workflow is to enhance an ONNX model first and then run inference with ONNX Runtime and the ONNXRuntime-Extensions package.
Introduction: ONNXRuntime-Extensions is a library that extends the capabilities of ONNX models and inference with ONNX Runtime via the ONNX Runtime Custom Operator ABIs. It includes a set of [ONNX Runtime Custom Operators](https://onnxruntime.ai/docs/reference/operators/add-custom-op.html) to support common pre- and post-processing operators for vision, text, and NLP models. It also supports multiple languages and platforms, such as Python on Windows/Linux/macOS, mobile platforms like Android and iOS, and WebAssembly. The basic workflow is to enhance an ONNX model first and then run inference with ONNX Runtime and the ONNXRuntime-Extensions package.


## Quickstart
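The Introduction changed above describes the basic workflow: augment an ONNX model with pre/post-processing custom operators, then run it with ONNX Runtime plus the extensions library. Below is a minimal sketch of the inference step in Python; the model path `augmented_model.onnx`, the input name, and the input value are hypothetical placeholders, while `onnxruntime_extensions.get_library_path()` and `SessionOptions.register_custom_ops_library()` are the calls the library uses to register its custom operators with ONNX Runtime.

```python
# Minimal sketch: run an ONNX model whose graph uses onnxruntime-extensions
# custom operators. The model path and input name below are placeholders.
import numpy as np
import onnxruntime as ort
from onnxruntime_extensions import get_library_path

# Register the extensions' custom-op library with ONNX Runtime.
so = ort.SessionOptions()
so.register_custom_ops_library(get_library_path())

# Create the session with those options and run inference as usual.
sess = ort.InferenceSession("augmented_model.onnx", so)
outputs = sess.run(None, {"input_text": np.array(["hello world"])})
print(outputs)
```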
Binary file added images/combine-ai-extensions-img.png

0 comments on commit 793c4db
