Novel method for identifying wheat leaf disease images based on differential amplification convolutional neural network
Keywords: convolutional neural network, differential amplification, wheat leaf diseases, image identification

Abstract
In this study, a differential amplification convolutional neural network (DACNN) was proposed and applied to the identification of wheat leaf disease images with high accuracy. Branches added between the deep convolutional layers amplify small differences between the actual output and the expected output, which makes the weight updates more sensitive to the small errors returned in the backpropagation pass and significantly improves the fitting capability. First, because no large-scale wheat leaf disease image dataset is currently available, a wheat leaf disease dataset containing eight kinds of wheat leaf images was constructed, and five kinds of data augmentation methods were used to expand it. Second, DACNN was combined with four classifiers, Softmax, support vector machine (SVM), K-nearest neighbor (KNN), and random forest, to evaluate the wheat leaf disease dataset. Finally, DACNN was compared with LeNet-5, AlexNet, ZFNet, and Inception V3. The extensive experimental results demonstrate that DACNN outperforms these models, achieving an average recognition accuracy of 95.18% on the wheat leaf disease dataset.

DOI: 10.25165/j.ijabe.20201304.4826

Citation: Dong M P, Mu S M, Shi A J, Mu W Q, Sun W J. Novel method for identifying wheat leaf disease images based on differential amplification convolutional neural network. Int J Agric & Biol Eng, 2020; 13(4): 205–210.
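To make the amplification mechanism described above concrete, the following is a minimal, hypothetical PyTorch sketch of a branch inserted between two deep convolutional layers. The abstract does not give the exact layer configuration, so the block structure, the learnable gain `alpha`, and all layer sizes here are illustrative assumptions rather than the authors' published architecture; the only idea carried over is that an amplified branch signal enlarges the error gradient flowing back through the block.

```python
# Hypothetical illustration of a differential-amplification-style branch.
# NOT the authors' exact DACNN block: the gain `alpha`, layer widths and
# normalization choices are assumptions made for this sketch.
import torch
import torch.nn as nn

class DiffAmpBlock(nn.Module):
    def __init__(self, channels: int, init_gain: float = 2.0):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)
        # Learnable gain on the branch: scaling the branch output also scales
        # the gradient that flows back through it, so small errors produce
        # proportionally larger weight updates in the earlier layers.
        self.alpha = nn.Parameter(torch.tensor(init_gain))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Amplify the difference between the transformed features and the
        # block input, then recombine with the input.
        return self.relu(x + self.alpha * (out - x))

if __name__ == "__main__":
    block = DiffAmpBlock(channels=64)
    y = block(torch.randn(1, 64, 56, 56))
    print(y.shape)  # torch.Size([1, 64, 56, 56])
```

Likewise, evaluating the network with SVM, KNN, or random forest classifiers, as the abstract describes, typically amounts to training those classifiers on feature vectors exported from the last pooled layer. The sketch below shows that step with scikit-learn; the feature matrix, label array, and all hyperparameters are placeholders, not values taken from the paper.

```python
# Sketch of evaluating exported CNN features with the four classifiers named
# in the abstract. `X` and `y` are random placeholders standing in for
# features from the trained network and the eight wheat leaf classes.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression   # multinomial ~ Softmax
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier

X = np.random.rand(800, 512)            # placeholder 512-D feature vectors
y = np.random.randint(0, 8, size=800)   # placeholder labels for 8 classes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

classifiers = {
    "Softmax": LogisticRegression(max_iter=1000),
    "SVM": SVC(kernel="rbf"),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    print(f"{name}: test accuracy = {clf.score(X_test, y_test):.4f}")
```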
Published: 2020-08-07
Section: Information Technology, Sensors and Control Systems
License
IJABE is an international peer reviewed open access journal, adopting Creative Commons Copyright Notices as follows.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).