Status | Published
Title | Super sparse convolutional neural networks
Authors | Lu, Yao1; Lu, Guangming1; Zhang, Bob2; Xu, Yuanrong1; Li, Jinxing3
Date Issued | 2019-02
Conference Name | 33rd AAAI Conference on Artificial Intelligence |
Source Publication | 33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019 |
Pages | 4440-4447 |
Conference Date | JAN 27-FEB 01, 2019 |
Conference Place | Honolulu, HI |
Abstract | To construct small mobile networks without performance loss and to address the over-fitting caused by less abundant training datasets, this paper proposes a novel super sparse convolutional (SSC) kernel; the corresponding network is called SSC-Net. In an SSC kernel, every spatial kernel has only one non-zero parameter, and these non-zero spatial positions are all different. The SSC kernel can effectively select pixels from the feature maps according to its non-zero positions and operate on them. SSC can therefore preserve the general geometric characteristics and the differences between channels, preserving the quality of the retrieved features and meeting general accuracy requirements. Furthermore, SSC can be implemented entirely with "shift" and "group point-wise" convolutional operations, without any spatial kernels (e.g., "3 × 3"). SSC is thus the first method to remove parameter redundancy from both the spatial and the channel extents, greatly decreasing parameters and FLOPs and further reducing the img2col and col2img operations performed by low-level libraries. Meanwhile, SSC-Net can improve sparsity and overcome over-fitting more effectively than other mobile networks. Comparative experiments were performed on the less abundant CIFAR and low-resolution ImageNet datasets. The results show that SSC-Nets can significantly decrease parameters and computational FLOPs without any performance loss. SSC-Net also improves the ability to address over-fitting on the more challenging, less abundant datasets.
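The abstract's key claim is that an SSC kernel (one non-zero weight per spatial kernel) is equivalent to shifting each channel by that weight's spatial offset and then applying a grouped 1 × 1 convolution. A minimal NumPy sketch of that decomposition is shown below; the per-channel offsets and group sizes here are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def shift(x, offsets):
    """Shift each channel of x (shape C, H, W) by its (dy, dx) offset, zero-padding.

    This plays the role of the single non-zero spatial position in an SSC kernel:
    channel c reads the pixel at offset (dy, dx) instead of applying a dense 3x3 kernel.
    """
    C, H, W = x.shape
    out = np.zeros_like(x)
    for c, (dy, dx) in enumerate(offsets):
        dst_y = slice(max(dy, 0), H + min(dy, 0))
        dst_x = slice(max(dx, 0), W + min(dx, 0))
        src_y = slice(max(-dy, 0), H + min(-dy, 0))
        src_x = slice(max(-dx, 0), W + min(-dx, 0))
        out[c, dst_y, dst_x] = x[c, src_y, src_x]
    return out

def group_pointwise(x, weights, groups):
    """Grouped 1x1 convolution: weights[g] has shape (C_out_g, C_in_g)."""
    C, H, W = x.shape
    cg = C // groups
    outs = []
    for g in range(groups):
        xg = x[g * cg:(g + 1) * cg].reshape(cg, -1)        # (C_in_g, H*W)
        outs.append((weights[g] @ xg).reshape(-1, H, W))   # mix channels within the group
    return np.concatenate(outs, axis=0)

# Illustrative SSC layer: shift, then grouped point-wise mixing.
x = np.random.rand(4, 8, 8)
offsets = [(0, 0), (1, 0), (0, 1), (-1, -1)]   # one non-zero position per channel
w = [np.random.rand(2, 2), np.random.rand(2, 2)]
y = group_pointwise(shift(x, offsets), w, groups=2)
print(y.shape)  # (4, 8, 8)
```

Note that the shift itself contributes no parameters or multiplications; all learnable weights live in the 1 × 1 grouped convolution, which is consistent with the abstract's claim of removing redundancy in both the spatial and channel extents.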
Indexed By | CPCI-S |
Language | English
WOS Research Area | Computer Science ; Engineering |
WOS Subject | Computer Science, Artificial Intelligence ; Computer Science, Theory & Methods ; Engineering, Electrical & Electronic |
WOS ID | WOS:000485292604057 |
Scopus ID | 2-s2.0-85086994344 |
Document Type | Conference paper |
Collection | DEPARTMENT OF COMPUTER AND INFORMATION SCIENCE |
Corresponding Author | Lu,Yao; Lu,Guangming; Zhang,Bob; Xu,Yuanrong; Li,Jinxing |
Affiliation | 1. Department of Computer Science and Technology, Harbin Institute of Technology (Shenzhen), Shenzhen, China; 2. Department of Computer and Information Science, University of Macau, Macao; 3. Department of Computing, Hong Kong Polytechnic University, Hung Hom, Kowloon, Hong Kong
Corresponding Author Affiliation | University of Macau
Recommended Citation GB/T 7714 | Lu,Yao,Lu,Guangming,Zhang,Bob,et al. Super sparse convolutional neural networks[C], 2019, 4440-4447. |
APA | Lu, Y., Lu, G., Zhang, B., Xu, Y., & Li, J. (2019). Super sparse convolutional neural networks. 33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019, 4440-4447.