Hung-yi Lee (李宏毅) 2020 Machine Learning / Deep Learning, Lectures 43-44 — Network Compression: Network Pruning

Why network compression: some devices, such as wearables, have limited resources (memory and compute), so we need to compress networks to fit them onto these devices.

[Figures 1-5: lecture slides]

Q: Why is a larger network easier to optimize?

A plausible explanation (the lottery ticket hypothesis) is that a large network contains many small sub-networks, and every set of initial parameters is a lottery ticket. If you train a small network, you hold few tickets, so the chance of drawing a winning initialization is small. If you train a large network, you hold many tickets, so the probability that some sub-network optimizes well is large.
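The lottery-ticket experiment can be sketched in three steps: record the random initialization, train the large network, then prune the small-magnitude trained weights and rewind the survivors to their initial values. The sketch below is a hypothetical toy illustration with numpy (the "trained" matrix is a stand-in; a real experiment would use SGD):

```python
import numpy as np

rng = np.random.default_rng(0)

# 1) Record the random initialization of a hypothetical 8x8 layer.
W_init = rng.normal(size=(8, 8))

# 2) Train the large network (stand-in: perturb the init;
#    in a real experiment this matrix comes from SGD training).
W_trained = W_init + rng.normal(scale=0.5, size=W_init.shape)

# 3) Keep only the top-20% weights by trained magnitude, then
#    REWIND the survivors to their original initial values --
#    this masked-init sub-network is the "winning ticket".
keep = 0.2
threshold = np.quantile(np.abs(W_trained), 1 - keep)
mask = (np.abs(W_trained) >= threshold).astype(float)
W_ticket = W_init * mask   # retrain from this, not from W_trained

print(f"kept {mask.mean():.0%} of the weights")
```

The key point of the hypothesis is step 3: the winning ticket is retrained from the *initial* values, not the trained ones; re-randomizing the surviving weights reportedly destroys the effect.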

[Figure 6: lecture slide]

Another hypothesis: a small network can also reach good results when trained on its own, which is diametrically opposed to the lottery ticket hypothesis.

[Figure 7: lecture slide]

GPUs only accelerate regular matrix computation, and the irregular structure left by weight pruning is hard to compute efficiently. So in practice, weight pruning is implemented by simply setting the pruned weights to zero. But with this approach, the actual network is still just as large as before.
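A minimal numpy sketch of this point (the matrix sizes and threshold are made up for illustration): masking weights to zero does not shrink the matrix, so the matrix-vector product still runs over the full dense shape.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical layer: weight pruning implemented as a zero mask.
W = rng.normal(size=(4, 6))
mask = np.abs(W) > 0.8    # prune small-magnitude weights
W_pruned = W * mask       # zeros appear, but the shape is unchanged

x = rng.normal(size=6)
y = W_pruned @ x          # still a full 4x6 dense matmul

print(W_pruned.shape)     # (4, 6) -- same size as before pruning
```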

[Figure 8: lecture slide]

So in practice, this style of pruning brings no GPU speed-up.
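By contrast, pruning at the level of whole neurons keeps the computation dense and regular: removing hidden neuron j deletes row j of the first weight matrix and column j of the second, so both matrices genuinely shrink. A hypothetical sketch for a 2-layer MLP (the L2-norm scoring rule here is one common choice, not the lecture's specific criterion):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 2-layer MLP: 8 inputs -> 16 hidden -> 4 outputs.
W1 = rng.normal(size=(16, 8))   # hidden x input
W2 = rng.normal(size=(4, 16))   # output x hidden

# Score each hidden neuron by the L2 norm of its incoming
# and outgoing weights (an assumed importance criterion).
score = np.linalg.norm(W1, axis=1) + np.linalg.norm(W2, axis=0)
keep = np.argsort(score)[-8:]   # keep the 8 strongest neurons

# Removing neuron j drops row j of W1 AND column j of W2,
# so the matrices actually get smaller -- dense kernels stay fast.
W1_small = W1[keep, :]          # (8, 8)
W2_small = W2[:, keep]          # (4, 8)
print(W1_small.shape, W2_small.shape)
```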

[Figures 9-10: lecture slides]
