Student Hyperparameter Tuning in ASR Knowledge Distillation

Code for the knowledge distillation hyperparameter tuning task from the AINL AutoML Workshop 2024 paper "Size Matters: About Optimal Amount of Speech Data for Student Hyperparameter Tuning in ASR Knowledge Distillation".

Abstract:

Knowledge distillation (KD) methods for end-to-end automatic speech recognition (ASR) models are increasingly in demand: they are used to improve model performance and to make deployment possible on devices with modest hardware capabilities. However, it is still hard to find the right balance between the size, the performance and the inference quality of the compressed model. In this work we use an automated machine learning framework for the knowledge distillation hyperparameter tuning task to discover how much prepared audio data is necessary to obtain the best hyperparameters for efficient KD of a Wav2Vec2 model. The proposed approach can also be used to tune the compressed model's parameters, depending on the desired performance and speech recognition error rate targets.
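To illustrate what AutoML-driven KD hyperparameter tuning can look like in practice, below is a minimal sketch using Optuna. The search space (student depth, hidden size, distillation temperature, loss weight), the fixed data budget and the `train_and_evaluate_student` stub are illustrative assumptions, not the repository's actual pipeline.

```python
# Minimal sketch of KD hyperparameter search with Optuna (illustrative, not the repo's code).
import optuna


def train_and_evaluate_student(num_layers, hidden_size, temperature, alpha, data_hours):
    """Placeholder: distill a Wav2Vec2 student with these hyperparameters on
    `data_hours` of speech and return its word error rate (WER) on a held-out set.
    Here it only returns a dummy score so the sketch runs end to end."""
    return 1.0


def objective(trial: optuna.Trial) -> float:
    num_layers = trial.suggest_int("num_layers", 2, 12)                        # student transformer depth
    hidden_size = trial.suggest_categorical("hidden_size", [256, 384, 512, 768])
    temperature = trial.suggest_float("temperature", 1.0, 8.0)                 # softening of teacher logits
    alpha = trial.suggest_float("alpha", 0.1, 0.9)                             # KD vs. supervised loss weight
    data_hours = 10                                                            # amount of speech data under study
    return train_and_evaluate_student(num_layers, hidden_size, temperature, alpha, data_hours)


study = optuna.create_study(direction="minimize")  # minimize WER
study.optimize(objective, n_trials=50)
print(study.best_params)
```

In the paper's setting, the same search would be repeated with different amounts of prepared audio data to see how small the tuning set can be while still yielding good distillation hyperparameters.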

Conference: https://ainlconf.ru/

Workshop: https://ainlconf.ru/2024/automl
