Releases: Brainchip-Inc/MetaONNX
MetaONNX 0.7.0
Update Onnx2Akida to version 0.7.0
Aligned with MetaTF 2.19.1
New features
- [API change] Removed the `initial_num_nodes` parameter from `convert` (now transparently handled)
- [API change] It is now possible to pass calibration samples to `convert` for accurate quantization
- Internal `convert` refactoring targeting both extended model support and faster processing
- Reduced `convert` verbosity for a better user experience
- Introduced `HybridModel.summary`, which prints out the layers and data exchanges
- It is now possible to perform pure software inference on a `HybridModel` to get the quantized accuracy, with a simple call: `hybrid_model(input)`
- [API change] Changed the order of the arguments in `print_report` to match the `convert` output order
- Improved `print_report` formatting and colors
- Replaced custom inference ops with the new AkidaORT package for inference; this is transparent to users
- Removed the onnx2akida-device CLI, which is now accessible in Akida
- Removed the custom experimental pattern feature, since all patterns have been integrated into the baseline algorithm
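The new `convert` workflow above can be sketched as follows. This is a minimal, illustrative example: the real `convert`/`HybridModel` signatures are not documented in these notes, so a stand-in `convert` is defined here to keep the sketch self-contained.

```python
import numpy as np

# Stand-in for the onnx2akida convert call (hypothetical signature inferred
# from the release notes; the real API may differ).
def convert(model_path, calibration_samples=None):
    """Returns a callable 'hybrid model' (placeholder for HybridModel)."""
    def hybrid_model(x):
        # Placeholder for the pure software inference path that yields
        # quantized (int8-style) outputs; real compute runs on Akida/CPU.
        return np.clip(x.astype(np.int32) - 128, -128, 127).astype(np.int8)
    return hybrid_model

# Calibration samples can now be passed to convert for accurate quantization.
samples = np.random.randint(0, 256, (8, 3, 32, 32), dtype=np.uint8)
hybrid_model = convert("model.onnx", calibration_samples=samples)

# Pure software inference with a simple call, as described in the notes:
out = hybrid_model(samples)
print(out.dtype)  # int8
```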
Bug fixes
- HybridModel will now properly handle multi-output models and preserve output order
- Fixed an issue where the compatibility report would report a 'None' reason
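The multi-output fix can be illustrated with a small sketch. The `HybridModel` internals are not shown in the notes, so the helper below is hypothetical; it only demonstrates the behavior now guaranteed, i.e. results are returned in the order the ONNX model declares its outputs.

```python
# Hypothetical helper: map results produced per output name back to the
# model's declared output order, regardless of backend production order.
def ordered_outputs(declared_order, results_by_name):
    """Return results in the order the ONNX model declares its outputs."""
    return [results_by_name[name] for name in declared_order]

declared = ["boxes", "scores", "labels"]
# A backend may produce named results in an arbitrary order:
produced = {"scores": [0.9], "labels": [1], "boxes": [[0, 0, 10, 10]]}
print(ordered_outputs(declared, produced))
# [[[0, 0, 10, 10]], [0.9], [1]]
```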
Introducing AkidaORT version 1.0.0
Initial release of custom ONNX Runtime operators for BrainChip Akida neural network hardware acceleration.
This library enables seamless integration of Akida FPGA v2 hardware with ONNX Runtime, allowing inference of neural network models with hardware acceleration.
Features
- Custom ONNX Operators: Two custom operators in the `com.brainchip` domain: `AkidaOpInt8` for INT8 output inference, `AkidaOpInt32` for INT32 output inference
- Flexible Input Types: Accepts both uint8 and int8 input tensors dynamically
- Hardware Integration: Direct interface with Akida FPGA v2 devices
- Based on ONNX Runtime 1.23.0 and released as a Python wheel package
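The flexible input typing can be sketched with the dispatch below. This is not the AkidaORT implementation (the real operators run on the FPGA through ONNX Runtime); it is a hypothetical stand-in showing the interface contract: uint8 or int8 inputs are accepted, and the output dtype depends on which operator variant (`AkidaOpInt8` vs `AkidaOpInt32`) is in the graph.

```python
import numpy as np

# Hypothetical stand-in for the custom operator's input/output contract.
def akida_op(x, output_dtype=np.int8):
    """Accepts uint8 or int8 inputs; output_dtype mimics the AkidaOpInt8
    (int8) vs AkidaOpInt32 (int32) operator variants."""
    if x.dtype not in (np.uint8, np.int8):
        raise TypeError(f"expected uint8 or int8 input, got {x.dtype}")
    # Placeholder compute: the real operator dispatches to Akida hardware.
    info = np.iinfo(output_dtype)
    return np.clip(x.astype(np.int32), info.min, info.max).astype(output_dtype)

print(akida_op(np.array([200], dtype=np.uint8)))  # [127] (saturated to int8)
print(akida_op(np.array([-5], dtype=np.int8), output_dtype=np.int32))  # [-5]
```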
Documentation update
- Introduced a new tutorial for advanced `convert` usage
- Added the version 0.6.0 archive to the changelog page
MetaONNX 0.6.0
Introducing the first MetaONNX public release!
Aligned with MetaTF 2.18 and onnx2akida 0.6.0
Features
- The MetaONNX framework is a dedicated toolchain designed specifically for deploying ONNX models on the 2nd-generation Akida accelerator
- MetaONNX allows you to quantize, convert and map models with a single API call
- MetaONNX introduces inference models backed by ONNX Runtime to deploy hybrid execution between the Akida backend and the CPU when fallback is required
- This beta version of the documentation comes with an overview, an installation guide, a user guide with an associated notebook tutorial, and API references