Releases: RidgeRun/gst-inference

v0.12.0

05 Aug 19:14

Release v0.12.0 includes:

  • Fix for TFLite runtime linking
  • Support for the Rosetta architecture

v0.11.0

05 Aug 18:28

Release v0.11.0 includes:

  • Benchmark extensions
  • Rename edgetpu to coral backend
  • Support for models with multiple output tensors
  • New inference string signal
  • Support for MobileNetV2 + SSD architecture
  • Remove autotools support
  • Remove deprecated overlay elements and meta

v0.10.1

28 Jul 00:17

Bug fixes:

  • Fix prediction memory leak if bypass and model pads are used

v0.10.0

02 Jul 22:49
d746c6d

Release v0.10.0 includes:

  • Add continuous integration workflow
  • Add benchmark script
  • Add meson support
  • Support for dynamic backend loading (automatic support for all backends available in R2Inference)

v0.9.0

13 Mar 17:02

Release v0.9.0 includes:

  • Fix Yolo probabilities in inferencemeta
  • Support for OpenCV 4
  • Support for doubles in backend properties
  • New inferencebin helper element

v0.8.0

28 Feb 22:03

Release v0.8.0 includes:

  • Add new inferenceutils plugin to absorb inferencecrop, inferencefilter and inferencedebug
  • Show root prediction on inferenceoverlay
  • Fix prediction size test
  • Fix tinyyolo3 postprocess to use new meta

v0.7.1

10 Feb 22:22

Bug fixes:

  • Revert hotfix in dev-0.6 that modified the number of predictions

v0.7.0

07 Feb 02:06

Release v0.7.0 includes:

  • Pkg-config support
  • License update to LGPL
  • New inference meta hierarchical approach
  • TFLite backend support
  • New elements using new meta:
    • detectioncrop
    • inferencefilter
    • inferencedebug
    • inferenceoverlay
  • Bug fixes

v0.6.1

06 Feb 17:34

Bug fixes:

  • Wrong number of predictions obtained from R2I

v0.6.0

27 Jun 22:01

Introduced features:

  • Improved ResNet50V1 inference result
  • Preprocess and postprocess factorized into general files
  • Debug information factorized into a general file
  • Tests added
  • Improved internal caps negotiation on the bypass pad
  • LGPL copyright license added

Supported platforms:

  • Intel Movidius Neural Compute Stick (version 1)
  • NVIDIA Jetson AGX Xavier
  • NVIDIA TX2
  • NVIDIA Nano
  • x86 systems
  • i.MX8

Known issues:

  • NCSDK does not support multiple calls to the inference engine from the same thread. This causes the NCSDK backend to fail after the second start.