After int8 quantization, the test result is not correct #85

Open
hx2009302823 opened this issue Nov 23, 2020 · 2 comments

@hx2009302823

Hello:
First of all, thanks to the author for open-sourcing this code. I trained a DLA model without DCN myself and converted it to an int8 engine. The whole process ran without any errors, but at inference time the model does not detect any boxes. Converting to fp16 works fine. The calib.table generated by the calibrator is as follows:
TRT-7000-EntropyCalibration
input.1: 3c5ac7b7
329: 3cd4e429
332: 3cc89d77
335: 3d09e62d
336: 3d64f1f1
341: 3d129680
343: 3d3cec1f
345: 3d0203c2
348: 3d240730
352: 3da317bb
356: 3d32ce6c
357: 3d31114b
358: 3d31114b
363: 3d26b22f
365: 3d566a4f
367: 3d7b78a4
370: 3d4abf4e
374: 3db31532
378: 3d3d2e55
381: 3d39ccca
385: 3d967946
388: 3d3ebd65
392: 3dfb9835
396: 3d99a608
397: 3da4b1a3
398: 3da4b1a3
403: 3d9385c8
405: 3da83625
407: 3d9dc1fe
410: 3d506e0f
414: 3dbf5ee7
418: 3d5686e7
421: 3d4bda23
425: 3d908db7
428: 3d42eb15
432: 3dc39d53
436: 3dd2a53e
437: 3dd3a6f4
442: 3da13010
444: 3d788a67
446: 3da68c9c
449: 3d79ba99
453: 3de44c07
457: 3e6b37d2
460: 3de3a3e2
461: 3e441ce7
465: 3e1bf8a2
468: 3dd1a4e6
469: 3d89d39f
472: 3de446c3
473: 3eaee9d9
477: 3e124981
481: 3e0bbc9f
484: 3d348f83
485: 3d1d73d0
488: 3db7466f
489: 3ecf9618
492: 3e32719f
493: 402998be
497: 3db67d50
501: 3e050e2a
505: 3d1b21c7
507: 423adadd
508: 46a17cc4
510: 3f115811
511: 3dbb2a1b
513: 3fcc0ebf
514: 40a39ee3
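
For reference, a calib.table like the one above is produced by an IInt8EntropyCalibrator2 that TensorRT calls while building the engine. The sketch below only illustrates that wiring (the class name, cache file name, and batch handling are my assumptions, not this repository's actual code). Note that the calibration batches must go through the same preprocessing as inference; a mismatch there is a common reason for an int8 engine that detects nothing while fp16 still works.

```python
import os

import numpy as np
import pycuda.driver as cuda
import pycuda.autoinit  # noqa: F401 -- creates a CUDA context on import
import tensorrt as trt


class EntropyCalibrator(trt.IInt8EntropyCalibrator2):
    """Feeds preprocessed batches to TensorRT and caches the scales in calib.table."""

    def __init__(self, image_batches, cache_file="calib.table"):
        super().__init__()
        self.batches = iter(image_batches)      # iterable of NCHW float32 arrays
        self.cache_file = cache_file
        first = image_batches[0]
        self.batch_size = first.shape[0]
        self.device_input = cuda.mem_alloc(first.nbytes)

    def get_batch_size(self):
        return self.batch_size

    def get_batch(self, names):
        # Returning None tells TensorRT that the calibration data is exhausted.
        try:
            batch = next(self.batches)
        except StopIteration:
            return None
        # The batch must be preprocessed exactly as at inference time
        # (resize, mean/std normalization, channel order).
        cuda.memcpy_htod(self.device_input, np.ascontiguousarray(batch))
        return [int(self.device_input)]

    def read_calibration_cache(self):
        # Reuse an existing calib.table so a rebuild can skip calibration.
        if os.path.exists(self.cache_file):
            with open(self.cache_file, "rb") as f:
                return f.read()
        return None

    def write_calibration_cache(self, cache):
        with open(self.cache_file, "wb") as f:
            f.write(cache)


# During engine building, int8 mode is enabled on the builder config, e.g.:
#   config.set_flag(trt.BuilderFlag.INT8)
#   config.int8_calibrator = EntropyCalibrator(batches)
```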

@zhanggd001

Hi, have you solved this problem yet? I ran into the same issue: the int8 inference results are completely wrong.

@Lenan22

Lenan22 commented Nov 2, 2022

> Hi, have you solved this problem yet? I ran into the same issue: the int8 inference results are completely wrong.

TensorRT's built-in quantization algorithm is fairly basic and often cannot fix the accuracy drop seen on some models. Please take a look at our open-source quantization tool ppq, which can help with TensorRT quantization problems. If you run into quantization accuracy or inference issues, feel free to ask in our community and we will help you solve them. https://github.com/openppl-public/ppq/blob/master/md_doc/deploy_trt_by_OnnxParser.md
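
For reference, the workflow in the linked document looks roughly like the sketch below. It is based on ppq's public examples and is not authoritative: the model file names and input shape are assumptions, and argument names of quantize_onnx_model and export_ppq_graph may differ between ppq versions.

```python
import torch
from torch.utils.data import DataLoader

from ppq import QuantizationSettingFactory, TargetPlatform
from ppq.api import export_ppq_graph, quantize_onnx_model

INPUT_SHAPE = [1, 3, 512, 512]   # assumed CenterNet-style input resolution
CALIB_STEPS = 32

# Dummy calibration tensors for illustration only; replace them with real
# images preprocessed exactly as at inference time.
calibration_images = [torch.randn(3, 512, 512) for _ in range(CALIB_STEPS)]
calib_loader = DataLoader(calibration_images, batch_size=1)

setting = QuantizationSettingFactory.default_setting()

# Quantize the ONNX model with ppq's calibration and optimization passes.
quantized = quantize_onnx_model(
    onnx_import_file="dla34_nodcn.onnx",        # hypothetical model file
    calib_dataloader=calib_loader,
    calib_steps=CALIB_STEPS,
    input_shape=INPUT_SHAPE,
    setting=setting,
    collate_fn=lambda x: x.to("cuda"),
    platform=TargetPlatform.TRT_INT8,
)

# Export the quantized graph plus per-layer quantization parameters; the
# linked document describes how to load these when building the TensorRT engine.
export_ppq_graph(
    graph=quantized,
    platform=TargetPlatform.TRT_INT8,
    graph_save_to="dla34_nodcn_int8.onnx",
    config_save_to="dla34_nodcn_int8.json",
)
```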
