Compression of DNNs using magnitude pruning and nonlinear information bottleneck training

Morten Østergaard Nielsen, Jan Østergaard, Jesper Jensen, Zheng-Hua Tan

Publication: Contribution to book/anthology/report/conference proceeding › Conference article in proceeding › Research › peer review

Abstract

As Deep Neural Networks (DNNs) have achieved state-of-the-art performance in various scientific fields and applications, the memory and computational complexity of DNNs have increased concurrently. The increased complexity required by DNNs prohibits them from running on platforms with limited computational resources. This has sparked a renewed interest in parameter pruning. We propose to replace the standard cross-entropy objective – typically used in classification problems – with the Nonlinear Information Bottleneck (NIB) objective to improve the accuracy of a pruned network. We demonstrate that our proposal outperforms cross-entropy combined with global magnitude pruning for high compression rates on VGG-nets trained on CIFAR10. With approximately 97% of the parameters pruned, we obtain an accuracy of 87.63% and 88.22% for VGG-16 and VGG-19, respectively, where the baseline accuracy is 91.5% for the unpruned networks. We observe that the majority of biases are pruned completely, and pruning parameters globally outperforms layer-wise pruning.
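The global magnitude pruning baseline mentioned in the abstract ranks all weights across the network by absolute value and zeroes out the smallest ones jointly, rather than per layer. A minimal NumPy sketch of this idea follows; it is an illustration, not the authors' code, and the function name and signature are hypothetical:

```python
import numpy as np

def global_magnitude_prune(params, sparsity):
    """Zero out the smallest-magnitude entries across all tensors jointly.

    params:   dict mapping parameter names to NumPy arrays
    sparsity: fraction of all entries to prune, e.g. 0.97
    """
    # Pool magnitudes from every tensor to find one global threshold,
    # as opposed to layer-wise pruning with a per-tensor threshold.
    all_mags = np.concatenate([np.abs(p).ravel() for p in params.values()])
    k = int(sparsity * all_mags.size)  # number of entries to prune
    if k == 0:
        return {name: p.copy() for name, p in params.items()}
    # The k-th smallest magnitude becomes the pruning threshold.
    threshold = np.partition(all_mags, k - 1)[k - 1]
    # Keep only entries strictly above the threshold; ties at the
    # threshold are pruned as well, so slightly more than k may go.
    return {name: np.where(np.abs(p) > threshold, p, 0.0)
            for name, p in params.items()}
```

For example, pruning 50% of six weights spread over two tensors removes the three smallest magnitudes regardless of which tensor they live in, which is why global pruning can empty out low-magnitude tensors (such as biases) entirely.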
Original language: English
Title: 2021 IEEE 31st International Workshop on Machine Learning for Signal Processing (MLSP)
Number of pages: 6
Publisher: IEEE
Publication date: Oct. 2021
Pages: 1-6
Article number: 9596128
ISBN (Print): 978-1-6654-1184-4
ISBN (Electronic): 978-1-7281-6338-3
DOI
Status: Published - Oct. 2021
Event: 2021 IEEE 31st International Workshop on Machine Learning for Signal Processing (MLSP) - Gold Coast, Australia
Duration: 25 Oct. 2021 - 28 Oct. 2021

Conference

Conference: 2021 IEEE 31st International Workshop on Machine Learning for Signal Processing (MLSP)
Country/Territory: Australia
City: Gold Coast
Period: 25/10/2021 - 28/10/2021
Name: IEEE Workshop on Machine Learning for Signal Processing
ISSN: 1551-2541
