
Releases: Brainchip-Inc/MetaONNX

MetaONNX 0.7.0

20 Feb 11:05


Update Onnx2Akida to version 0.7.0

Aligned with MetaTF 2.19.1

New features

  • [API change] Removed initial_num_nodes parameter from convert (now transparently handled)
  • [API change] It is now possible to pass calibration samples to convert for more accurate quantization
  • Internal convert refactoring targeting both extended model support and faster processing
  • Reduced convert verbosity for a better user experience
  • Introduced HybridModel.summary, which prints the layers and data exchanges
  • It is now possible to perform pure software inference on a HybridModel to obtain the quantized accuracy. This is done with a simple call: hybrid_model(input)
  • [API change] Changed the order of the arguments in print_report to match convert output order
  • Improved print_report formatting and colors
  • Replaced custom inference ops with a new AkidaORT package for inference. This is transparent to users
  • Removed the onnx2akida-device CLI, which is now accessible in Akida
  • Removed the custom experimental pattern feature, since all patterns have been integrated into the baseline algorithm
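The convert changes above could be combined as follows. This is a minimal sketch, not the documented API: the `onnx2akida` import path and the `samples` keyword name for the calibration data are assumptions, since these notes do not show the exact signature.

```python
import numpy as np
import onnx
from onnx2akida import convert  # import path assumed

model = onnx.load("model.onnx")

# New in 0.7.0: pass calibration samples to convert for more accurate
# quantization (the keyword name `samples` is an assumption).
samples = np.load("calibration_samples.npy")
hybrid_model = convert(model, samples=samples)

# New in 0.7.0: print the layers and the Akida/CPU data exchanges.
hybrid_model.summary()

# New in 0.7.0: pure software inference to check the quantized accuracy.
outputs = hybrid_model(samples[:1])
```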

Bug fixes

  • HybridModel will now properly handle multi-output models and preserve output order
  • Fixed an issue where the compatibility report would report a 'None' reason

Introduce AkidaORT version 1.0.0

Initial release of custom ONNX Runtime operators for BrainChip Akida neural network hardware acceleration.
This library enables seamless integration of Akida FPGA v2 hardware with ONNX Runtime, allowing inference of neural network models with hardware acceleration.

Features

  • Custom ONNX Operators: Two custom operators in the com.brainchip domain: AkidaOpInt8 for INT8 output inference, AkidaOpInt32 for INT32 output inference
  • Flexible Input Types: Accepts both uint8 and int8 input tensors dynamically
  • Hardware Integration: Direct interface with Akida FPGA v2 devices
  • Based on ONNX Runtime 1.23.0 and released as a Python wheel package
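Since AkidaORT provides its operators in the com.brainchip domain as standard ONNX Runtime custom ops, a model using them would typically be loaded by registering the custom-ops library on the session options. A minimal sketch, where the shared-library name, model file, input name, and input shape are all assumptions:

```python
import numpy as np
import onnxruntime as ort

# Register the AkidaORT custom-ops library (file name assumed) so that
# com.brainchip operators such as AkidaOpInt8 / AkidaOpInt32 resolve.
so = ort.SessionOptions()
so.register_custom_ops_library("libakida_ort.so")

# Load a model containing Akida custom nodes and run inference.
sess = ort.InferenceSession("model_with_akida_ops.onnx", sess_options=so)

# Input tensors may be uint8 or int8 (shape and input name assumed).
input_tensor = np.zeros((1, 3, 224, 224), dtype=np.uint8)
outputs = sess.run(None, {"input": input_tensor})
```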

Documentation update

  • Introduced a new tutorial for advanced convert usage
  • Added version 0.6.0 archive in the changelog page

MetaONNX 0.6.0

16 Dec 17:04


Introducing the first MetaONNX public release!

Aligned with MetaTF 2.18 and onnx2akida 0.6.0

Features

  • The MetaONNX framework is a dedicated toolchain designed specifically for deploying ONNX models on the 2nd-generation Akida accelerator
  • MetaONNX lets you quantize, convert, and map models with a single API call
  • MetaONNX introduces inference models supported by ONNX Runtime to deploy hybrid execution between the Akida backend and the CPU when fallback is required
  • This beta version of the documentation comes with an overview, an installation guide, a user guide associated with a notebook tutorial, and API references