Hi, I noticed there are two Usage.md files under docs/usage/training_time_compression, one for QAT and one for the other algorithms.
Hello @EmonLu,

First, I would like to note that the `create_compressed_model(...)` API will be deprecated in the next release, and `nncf.quantize` is the recommended API for QAT. QAT uses the model loss that is already used for training and does not require any additional compression losses. PTQ provides a good initialization for QAT, which leads to better performance.
`PTInitialDataloader` will be deprecated in the next release as well. Please take a look at Jupyt…
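A minimal sketch of that PTQ-then-QAT flow with the PyTorch backend is shown below. The torchvision ResNet-18 and the random tensors are placeholders for your own model and dataset, and the optimizer settings are only illustrative:

```python
import nncf
import torch
import torchvision

# Placeholder model and data; replace with your own.
model = torchvision.models.resnet18(num_classes=10)
images = torch.randn(64, 3, 224, 224)
labels = torch.randint(0, 10, (64,))
dataset = torch.utils.data.TensorDataset(images, labels)
train_loader = torch.utils.data.DataLoader(dataset, batch_size=8)

# transform_fn extracts the model input from one data-loader batch.
def transform_fn(batch):
    inputs, _ = batch
    return inputs

calibration_dataset = nncf.Dataset(train_loader, transform_fn)

# Step 1: PTQ -- nncf.quantize inserts fake-quantize operations and
# calibrates their ranges, giving QAT a good starting point.
quantized_model = nncf.quantize(model, calibration_dataset)

# Step 2: QAT -- fine-tune the quantized model with the ordinary task
# loss; no extra compression loss is required.
optimizer = torch.optim.Adam(quantized_model.parameters(), lr=1e-5)
loss_fn = torch.nn.CrossEntropyLoss()
quantized_model.train()
for inputs, targets in train_loader:
    optimizer.zero_grad()
    loss = loss_fn(quantized_model(inputs), targets)
    loss.backward()
    optimizer.step()
```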