Identification of Indian Butterflies and Moths with Deep Convolutional Neural Networks
This paper reports our efforts to use artificial intelligence, in the form of deep convolutional neural networks (CNNs), as a tool to identify Indian butterflies and moths. We compiled a dataset of over 170,000 images covering 800 Indian butterfly species and 500 Indian moth species from diverse sources. We adopted the EfficientNet-B6 architecture for our CNN model, with about 44 million learnable parameters. We trained an ensemble of five such models on different subsets of the images in our dataset, employing artificial image augmentation and transfer learning. This ensemble achieved a balanced top-1 accuracy of 86.5%, a top-3 accuracy of 94.7% and a top-5 accuracy of 96.4% on the 1300 species, with a mean F1 score of 0.867. These results demonstrate that artificial intelligence can be used effectively to identify these species, which would substantially enhance the efficiency of field-level biologists across several spheres of investigation.
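The balanced top-k accuracy reported above can be understood as the per-class recall of the event "true label among the k highest-scoring classes", averaged over all classes, so that rare species count as much as common ones. The following sketch (our own illustration with toy data, not code or data from the paper) shows one way to compute it from a matrix of class scores:

```python
import numpy as np

def balanced_top_k_accuracy(scores, labels, k):
    """Per-class recall of 'true label in the top-k predictions', averaged over classes."""
    # indices of the k highest-scoring classes for each sample
    topk = np.argsort(scores, axis=1)[:, -k:]
    hit = np.array([labels[i] in topk[i] for i in range(len(labels))])
    # average the hit rate within each class, then across classes
    classes = np.unique(labels)
    per_class_recall = [hit[labels == c].mean() for c in classes]
    return float(np.mean(per_class_recall))

# Toy example: 6 samples, 3 classes
scores = np.array([
    [0.70, 0.20, 0.10],   # label 0, correct at top-1
    [0.15, 0.80, 0.05],   # label 1, correct at top-1
    [0.50, 0.30, 0.20],   # label 1, wrong at top-1, correct at top-2
    [0.20, 0.10, 0.70],   # label 2, correct at top-1
    [0.60, 0.30, 0.10],   # label 0, correct at top-1
    [0.30, 0.10, 0.60],   # label 2, correct at top-1
])
labels = np.array([0, 1, 1, 2, 0, 2])
print(balanced_top_k_accuracy(scores, labels, k=1))  # (1.0 + 0.5 + 1.0) / 3 ≈ 0.833
print(balanced_top_k_accuracy(scores, labels, k=2))  # 1.0
```

For k = 1 this reduces to the balanced accuracy of scikit-learn's `balanced_accuracy_score`; averaging the softmax outputs of the five ensemble members before applying such a metric would be one natural way to score the ensemble described above.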
Keywords
Artificial Intelligence, Butterfly Identification, Convolutional Neural Network, Moth Identification.