Classification of Air-Cured Tobacco Leaf Pests Using Pruning Convolutional Neural Networks and Transfer Learning

Dwiretno Istiyadi Swasono, Handayani Tjandrasa, Chastine Fatichah

Abstract


A Convolutional Neural Network (CNN) is usually trained on a large image dataset and has many parameters, whereas a small dataset requires only a small number of parameters. Existing pre-trained models such as AlexNet, VGG, Inception, and ResNet achieve high accuracy but contain many parameters. For a small dataset, so many parameters are inefficient and increase the computational cost, making these models unsuitable for computers with limited resources such as embedded devices and mobile phones. This research proposes pruning the depth of the ResNet50 architecture and adding a dimensionality reduction layer after the pruning point. The approach does not require a complex pruning-criteria algorithm, so it is easy to implement. ResNet50 was chosen for its good performance, batch normalization, and skip connections, and its pre-trained weights are used through transfer learning. Pruning is carried out along the depth of the network by cutting at an activation-function layer, and several pruning points were selected to produce several models with different numbers of parameters; the more layers are pruned, the fewer parameters remain. A channel-reduction layer is added after the pruned network to reduce the number of feature maps before the fully connected (FC) classifier. The new networks were retrained on a 2000-image tobacco leaf pest dataset with four classes, split into 1600 training and 400 validation images. The results show that accuracy equal to that of the unpruned network (100%) can be maintained while reducing the number of parameters by 74.38%, and a higher parameter reduction of up to 90.62% still yields a validation accuracy of around 99.3%. These results demonstrate that the proposed method effectively maintains accuracy while reducing the number of parameters.
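
As a rough illustration of the approach described above, the sketch below truncates a pre-trained ResNet50 at one of its activation layers, appends a 1x1 convolution as a channel-reduction layer, and attaches a four-class FC classifier for retraining. The cut point (`conv3_block4_out`), the number of reduced channels, the input size, and the optimizer settings are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch (assumed settings, not the authors' exact configuration):
# prune a pre-trained ResNet50 at an intermediate activation layer, add a
# channel-reduction layer, and retrain a 4-class classifier via transfer learning.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50

NUM_CLASSES = 4
CUT_LAYER = "conv3_block4_out"   # assumed pruning point; other cuts give other model sizes
REDUCED_CHANNELS = 64            # assumed width of the channel-reduction layer

# Pre-trained ImageNet weights supply the transferred features.
base = ResNet50(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = True  # the pruned backbone is retrained on the tobacco leaf pest data

# Prune the network depth: keep everything up to the chosen activation layer.
pruned = models.Model(inputs=base.input, outputs=base.get_layer(CUT_LAYER).output)

# Channel reduction (1x1 convolution) shrinks the feature maps before the FC classifier.
x = layers.Conv2D(REDUCED_CHANNELS, kernel_size=1, activation="relu")(pruned.output)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(inputs=pruned.input, outputs=outputs)
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Comparing `model.count_params()` against the full ResNet50 classifier gives a parameter-reduction rate of the kind reported above; moving the cut point shallower or deeper trades parameters against accuracy.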

Keywords


Convolutional Neural Network; Transfer learning; Pruning; Tobacco leaf pest.


References


C. S. Marzan and C. R. Ruiz, “Automated tobacco grading using image processing techniques and a convolutional neural network,” Int. J. Mach. Learn. Comput., vol. 9, no. 6, pp. 807–813, 2019, doi: 10.18178/ijmlc.2019.9.6.877.

A. Harjoko, A. Prahara, T. W. Supardi, I. Candradewi, R. Pulungan, and S. Hartati, “Image processing approach for grading tobacco leaf based on color and quality,” Int. J. Smart Sens. Intell. Syst., vol. 12, no. 1, pp. 1–10, 2019, doi: 10.21307/ijssis-2019-010.

D. S. Guru, P. B. Mallikarjuna, S. Manjunath, and M. M. Shenoi, “Machine Vision Based Classification of Tobacco Leaves for Automatic Harvesting,” Intell. Autom. Soft Comput., vol. 18, no. 5, pp. 581–590, 2012, doi: 10.1080/10798587.2012.10643267.

Y. Sun, H. Q. Wang, Z. Y. Xia, J. H. Ma, and M. Z. Lv, “Tobacco-disease Image Recognition via Multiple-Attention Classification Network,” J. Phys. Conf. Ser., vol. 1584, no. 1, 2020, doi: 10.1088/1742-6596/1584/1/012008.

D. I. Swasono, H. Tjandrasa, and C. Fatichah, “Classification of tobacco leaf pests using VGG16 transfer learning,” in Proc. 2019 Int. Conf. Inf. Commun. Technol. Syst. (ICTS), 2019, pp. 176–181, doi: 10.1109/ICTS.2019.8850946.

Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, “Gradient-based learning applied to document recognition,” Proc. IEEE, vol. 86, no. 11, pp. 2278–2324, 1998.

C. Szegedy, W. Liu, Y. Jia, P. Sermanet, S. Reed, D. Anguelov, D. Erhan, V. Vanhoucke, and A. Rabinovich, “Going deeper with convolutions,” in Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), 2015, pp. 1–9.

K. He, X. Zhang, S. Ren, and J. Sun, “Deep Residual Learning for Image Recognition,” in Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), 2016, pp. 770–778, doi: 10.1109/CVPR.2016.90.

J. Zou, T. Rui, Y. Zhou, C. Yang, and S. Zhang, “Convolutional neural network simplification via feature map pruning,” Comput. Electr. Eng., vol. 70, pp. 950–958, 2018, doi: 10.1016/j.compeleceng.2018.01.036.

G. Li, J. Wang, H. W. Shen, K. Chen, G. Shan, and Z. Lu, “CNNPruner: Pruning convolutional neural networks with visual analytics,” IEEE Trans. Vis. Comput. Graph., vol. 27, no. 2, pp. 1364–1373, 2021, doi: 10.1109/TVCG.2020.3030461.

B. O. Ayinde, T. Inanc, and J. M. Zurada, “Redundant feature pruning for accelerated inference in deep neural networks,” Neural Networks, vol. 118, pp. 148–158, 2019, doi: 10.1016/j.neunet.2019.04.021.

C. Yang et al., “Structured Pruning of Convolutional Neural Networks via L1 Regularization,” IEEE Access, vol. 7, pp. 106385–106394, 2019, doi: 10.1109/ACCESS.2019.2933032.

P. Singh, V. K. Verma, P. Rai, and V. P. Namboodiri, “Acceleration of Deep Convolutional Neural Networks Using Adaptive Filter Pruning,” IEEE J. Sel. Top. Signal Process., vol. 14, no. 4, pp. 838–847, 2020, doi: 10.1109/JSTSP.2020.2992390.

C. Liu and H. Wu, “Channel pruning based on mean gradient for accelerating Convolutional Neural Networks,” Signal Processing, vol. 156, pp. 84–91, 2019, doi: 10.1016/j.sigpro.2018.10.019.

A. H. Ashouri, T. S. Abdelrahman, and A. Dos Remedios, “Retraining-free methods for fast on-the-fly pruning of convolutional neural networks,” Neurocomputing, vol. 370, pp. 56–69, 2019, doi: 10.1016/j.neucom.2019.08.063.

F. E. Fernandes and G. G. Yen, “Pruning Deep Convolutional Neural Networks Architectures with Evolution Strategy,” Inf. Sci., vol. 552, pp. 29–47, 2021, doi: 10.1016/j.ins.2020.11.009.

S. K. Yeom et al., “Pruning by explaining: A novel criterion for deep neural network pruning,” Pattern Recognit., vol. 115, 2021, doi: 10.1016/j.patcog.2021.107899.

A. Jordao, M. Lie, and W. R. Schwartz, “Discriminative Layer Pruning for Convolutional Neural Networks,” IEEE J. Sel. Top. Signal Process., vol. 14, no. 4, pp. 828–837, 2020, doi: 10.1109/JSTSP.2020.2975987.

F. Tung and G. Mori, “Deep Neural Network Compression by In-Parallel Pruning-Quantization,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 42, no. 3, pp. 568–579, 2020, doi: 10.1109/TPAMI.2018.2886192.

Y. Liang, W. Liu, S. Yi, H. Yang, and Z. He, “Filter pruning-based two-step feature map reconstruction,” Signal Image Video Process., vol. 15, no. 7, pp. 1555–1563, 2021, doi: 10.1007/s11760-021-01888-4.

C. Qi et al., “An efficient pruning scheme of deep neural networks for Internet of Things applications,” EURASIP J. Adv. Signal Process., vol. 2021, no. 1, 2021, doi: 10.1186/s13634-021-00744-4.

E. Jeczmionek and P. A. Kowalski, “Flattening layer pruning in convolutional neural networks,” Symmetry, vol. 13, no. 7, pp. 1–13, 2021, doi: 10.3390/sym13071147.

S. Zhang, G. Wu, J. Gu, and J. Han, “Pruning convolutional neural networks with an attention mechanism for remote sensing image classification,” Electronics, vol. 9, no. 8, pp. 1–19, 2020, doi: 10.3390/electronics9081209.




DOI: http://dx.doi.org/10.18517/ijaseit.12.3.15950



