ouioui199/knowledge_distillation
KNOWLEDGE DISTILLATION PROJECT

Project based on the paper Distilling the Knowledge in a Neural Network by Geoffrey Hinton, Oriol Vinyals, and Jeff Dean. The idea is to train a large artificial neural network and distill its knowledge into a smaller one. This knowledge transfer improves the performance of the small model and enables efficient use in production, where computation power is limited.
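The core mechanism from Hinton et al. can be sketched as a combined loss: the student matches the teacher's temperature-softened output distribution while also fitting the hard labels. A minimal NumPy sketch follows; the function and parameter names (`distillation_loss`, `T`, `alpha`) are illustrative assumptions, not this repository's actual API.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Softmax with temperature T; higher T gives a softer distribution."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, true_label, T=4.0, alpha=0.5):
    """Weighted sum of soft-target and hard-target cross-entropy losses."""
    # Soft targets: teacher's distribution softened by temperature T.
    p_teacher = softmax(teacher_logits, T)
    p_student_soft = softmax(student_logits, T)
    # Cross-entropy against soft targets, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures (as in the paper).
    soft_loss = -np.sum(p_teacher * np.log(p_student_soft + 1e-12)) * T**2
    # Standard cross-entropy against the hard (ground-truth) label.
    p_student = softmax(student_logits)
    hard_loss = -np.log(p_student[true_label] + 1e-12)
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

In practice, `alpha` and `T` are tuned on a validation set; a higher temperature exposes more of the teacher's "dark knowledge" about class similarities.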

QUICK START

DESCRIPTION
