PERTANIKA JOURNAL OF SCIENCE AND TECHNOLOGY

 

e-ISSN 2231-8526
ISSN 0128-7680


Sago Palm Detection and its Maturity Identification Based on Improved Convolution Neural Network

Zulhakim Wahed, Annie Joseph, Hushairi Zen and Kuryati Kipli

Pertanika Journal of Science & Technology, Volume 30, Issue 2, April 2022

DOI: https://doi.org/10.47836/pjst.30.2.20

Keywords: Convolution neural network (CNN), deep learning, sago palm

Published on: 1 April 2022

Sago palms are cultivated mainly in Sarawak, especially in the Mukah and Betong divisions, for consumption and export. The starch produced from sago is used mostly in food products such as noodles, traditional foods such as tebaloi, and animal feed. At present, sago palm detection and maturity identification are carried out manually, yet both are crucial for ensuring starch productivity, and the existing manual methods are laborious and time-consuming because the plantation areas are vast. In this paper, an improved CNN model is developed to detect the maturity of sago palms from drone photographs based on the shape of the palm canopy. The model combines architectural elements of three existing CNN models: AlexNet, Xception, and ResNet. The proposed model, CraunNet, achieves 85.7% accuracy with 11 minutes of learning time under five-fold cross-validation, and its training time is almost two times faster than that of the existing ResNet and Xception models, showing that the computational cost of CraunNet is much lower than that of the established models.
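The abstract describes a hybrid architecture that borrows from AlexNet, Xception, and ResNet and is evaluated with five-fold cross-validation. The sketch below is a minimal illustration of that general idea, not the authors' CraunNet: an AlexNet-style strided stem, Xception-style depthwise separable convolutions, and ResNet-style skip connections, with a five-fold evaluation loop. The layer widths, the 224x224 RGB input, the two maturity classes, and the helper names build_hybrid_cnn and five_fold_accuracy are illustrative assumptions.

```python
# Minimal sketch of a hybrid CNN for canopy-based maturity classification.
# All sizes and names are assumptions; this is not the published CraunNet.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
from sklearn.model_selection import KFold

def build_hybrid_cnn(input_shape=(224, 224, 3), num_classes=2):
    inputs = layers.Input(shape=input_shape)

    # AlexNet-style stem: large strided convolution followed by max pooling.
    x = layers.Conv2D(64, 7, strides=2, padding="same", activation="relu")(inputs)
    x = layers.MaxPooling2D(3, strides=2, padding="same")(x)

    # Xception-style separable-convolution blocks with ResNet-style shortcuts.
    for filters in (128, 256):
        shortcut = layers.Conv2D(filters, 1, strides=2, padding="same")(x)
        x = layers.SeparableConv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.SeparableConv2D(filters, 3, padding="same")(x)
        x = layers.MaxPooling2D(3, strides=2, padding="same")(x)
        x = layers.Add()([x, shortcut])  # residual connection
        x = layers.Activation("relu")(x)

    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def five_fold_accuracy(images, labels, epochs=10):
    """Average test accuracy over five folds, mirroring the paper's protocol."""
    kfold = KFold(n_splits=5, shuffle=True, random_state=42)
    scores = []
    for train_idx, test_idx in kfold.split(images):
        model = build_hybrid_cnn()
        model.fit(images[train_idx], labels[train_idx],
                  epochs=epochs, batch_size=32, verbose=0)
        _, acc = model.evaluate(images[test_idx], labels[test_idx], verbose=0)
        scores.append(acc)
    return float(np.mean(scores))
```

In a design of this kind, the separable convolutions keep the parameter count low (as in Xception) while the skip connections ease optimization (as in ResNet), which is consistent with the abstract's observation that CraunNet trains roughly twice as fast as ResNet and Xception.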

  • Browne, M. W. (2000). Cross-validation methods. Journal of Mathematical Psychology, 44(1), 108-132. https://doi.org/10.1006/jmps.1999.1279

  • Chollet, F. (2017). Xception: Deep learning with depthwise separable convolutions. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 1251-1258). IEEE Publishing. https://doi.org/10.1109/cvpr.2017.195

  • DJI. (2016). Phantom 4 - Product information. DJI Official. https://www.dji.com/phantom-4/info

  • Ehara, H., Toyoda, Y., & Johnson, D. V. (2018). Sago palm: Multiple contributions to food security and sustainable livelihoods. Springer Nature. https://doi.org/10.1007/978-981-10-5269-9

  • Farooq, A., Jia, X., Hu, J., & Zhou, J. (2019). Knowledge transfer via convolution neural networks for multi-resolution lawn weed classification. In 2019 10th Workshop on Hyperspectral Imaging and Signal Processing: Evolution in Remote Sensing (WHISPERS) (pp. 01-05). IEEE Publishing. https://doi.org/10.1109/whispers.2019.8920832

  • Flach, M. (1997). Sago palm: Metroxylon sagu Rottb. (Promoting the Conservation and Use of Underutilized and Neglected Crops, 13). International Plant Genetic Resources Institute.

  • Habaragamuwa, H., Ogawa, Y., Suzuki, T., Shiigi, T., Ono, M., & Kondo, N. (2018). Detecting greenhouse strawberries (mature and immature), using deep convolutional neural network. Engineering in Agriculture, Environment and Food, 11(3), 127-138. https://doi.org/10.1016/j.eaef.2018.03.001

  • He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 770-778). IEEE Publishing. https://doi.org/10.1109/cvpr.2016.90

  • Hidayat, S., Matsuoka, M., Baja, S., & Rampisela, D. A. (2018). Object-based image analysis for sago palm classification: The most important features from high-resolution satellite imagery. Remote Sensing, 10(8), Article 1319. https://doi.org/10.3390/rs10081319

  • Howell, P. S. A. (2017). Effect of sucker pruning on sago palm (Metroxylon sagu Rottb.) growth performance (Master's thesis). Universiti Putra Malaysia, Malaysia. http://psasir.upm.edu.my/id/eprint/83269/1/t%20FSPM%202017%205%20%281800001036%29.pdf

  • Kavukcuoglu, K., Ranzato, M. A., Fergus, R., & LeCun, Y. (2009). Learning invariant features through topographic filter maps. In 2009 IEEE Conference on Computer Vision and Pattern Recognition (pp. 1605-1612). IEEE Publishing. https://doi.org/10.1109/cvpr.2009.5206545

  • Khvostikov, A., Aderghal, K., Benois-Pineau, J., Krylov, A., & Catheline, G. (2018). 3D CNN-based classification using sMRI and MD-DTI images for Alzheimer disease studies. arXiv Preprint. https://doi.org/10.1109/cbms.2018.00067

  • Kohavi, R. (1995). A study of cross-validation and bootstrap for accuracy estimation and model selection. In International Joint Conference on Artificial Intelligence (IJCAI, 1995) (Vol. 14, No. 2, pp. 1137-1145). ACM Publishing.

  • Krizhevsky, A., Sutskever, I., & Hinton, G. E. (2012). ImageNet classification with deep convolutional neural networks. In F. Pereira, C. J. C. Burges, L. Bottou, & K. Q. Weinberger (Eds.), Advances in Neural Information Processing Systems 25 (pp. 1-9). NeurIPS Proceedings.

  • Krizhevsky, A., Sutskever, I., & Hinton, G. E. (2017). ImageNet classification with deep convolutional neural networks. Communications of the ACM, 60(6), 84-90. https://doi.org/10.1145/3065386

  • Lawrence, S., Giles, C. L., Tsoi, A. C., & Back, A. D. (1997). Face recognition: A convolutional neural-network approach. IEEE Transactions on Neural Networks, 8(1), 98-113. https://doi.org/10.1109/72.554195

  • LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436-444. https://doi.org/10.1038/nature14539

  • LeCun, Y., Bottou, L., Bengio, Y., & Haffner, P. (1998). Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11), 2278-2324. https://doi.org/10.1109/5.726791

  • Li, M., & Jin, Y. (2020). An hybrid parallel network structure for image classification. In Journal of Physics: Conference Series (Vol. 1624, No. 5, Article 052005). IOP Publishing.

  • Mubin, N. A., Nadarajoo, E., Shafri, H. Z. M., & Hamedianfar, A. (2019). Young and mature oil palm tree detection and counting using convolutional neural network deep learning method. International Journal of Remote Sensing, 40(19), 7500-7515. https://doi.org/10.1080/01431161.2019.1569282

  • Samala, R. K., Chan, H. P., Hadjiiski, L. M., Helvie, M. A., Cha, K. H., & Richter, C. D. (2017). Multi-task transfer learning deep convolutional neural network: Application to computer-aided diagnosis of breast cancer on mammograms. Physics in Medicine & Biology, 62, Article 8894. https://doi.org/10.1088/1361-6560/aa93d4

  • Yu, J., Schumann, A. W., Cao, Z., Sharpe, S. M., & Boyd, N. S. (2019). Weed detection in perennial ryegrass with deep learning convolutional neural network. Frontiers in Plant Science, 10, Article 1422. https://doi.org/10.3389/fpls.2019.01422

  • Zhang, M., Li, L., Wang, H., Liu, Y., Qin, H., & Zhao, W. (2019). Optimized compression for implementing convolutional neural networks on FPGA. Electronics, 8(3), Article 295. https://doi.org/10.3390/electronics8030295

Article ID

JST-3037-2021
