arXiv:2006.05525v7 [cs.LG] 20 May 2021
or compressing the convolutional filters (Zhai et al., 2016).
• Knowledge distillation (KD): These methods distill the knowledge from a larger deep neural network into a small network (Hinton et al., 2015).
A comprehensive review of model compression and acceleration is outside the scope of this paper. The
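As a minimal sketch of the distillation objective popularized by Hinton et al. (2015): the student is trained to match the teacher's temperature-softened output distribution via a KL-divergence term, scaled by T² so gradients stay comparable across temperatures. This is an illustrative NumPy version, not the paper's implementation; the function names and the choice of temperature are assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: larger T produces a softer distribution,
    # exposing the teacher's relative confidences over wrong classes.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened outputs,
    # multiplied by T**2 as suggested by Hinton et al. (2015).
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened predictions
    return (T ** 2) * float(np.sum(p * (np.log(p) - np.log(q))))

# When the student's logits match the teacher's, the loss is zero;
# any mismatch yields a positive penalty.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))
print(distillation_loss([0.5, 1.5, 0.1], [2.0, 1.0, 0.1]))
```

In practice this term is combined with the ordinary cross-entropy loss on the hard labels; the sketch shows only the soft-target component.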