Project based on the paper "Distilling the Knowledge in a Neural Network" by Geoffrey Hinton, Oriol Vinyals, and Jeff Dean. The idea is to train a large neural network and distill its knowledge into a smaller one. This knowledge transfer improves the performance of small models and enables efficient use in production, where computation power is limited.
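The core of the distillation recipe in the paper is a temperature-scaled softmax: the teacher's logits are softened with a temperature T, and the student is trained to match those soft targets. A minimal, dependency-free sketch of that loss (function names `softmax` and `distillation_loss` are illustrative, not from this repository):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher temperature yields a softer,
    # more uniform distribution that exposes the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    # Cross-entropy between the teacher's soft targets and the student's
    # soft predictions, scaled by T^2 as suggested by Hinton et al. so the
    # gradient magnitude stays comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q)) * temperature ** 2
```

In practice this soft-target loss is combined with the ordinary cross-entropy on the true labels, weighted by a mixing coefficient.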
ouioui199/knowledge_distillation