Chandrashekara, K.
- Occurrence of the New Invasive Pest, Fall Armyworm, Spodoptera frugiperda (J.E. Smith) (Lepidoptera: Noctuidae), in the Maize Fields of Karnataka, India
Authors
P. C. Ganiger 1, H. M. Yeshwanth 2, K. Muralimohan 2, N. Vinay 2, A. R. V. Kumar 2, K. Chandrashekara 2
Affiliations
1 AICRP on Small Millets, IN
2 Department of Entomology, University of Agricultural Sciences, GKVK, Bengaluru - 560 065, IN
Source
Current Science, Vol 115, No 4 (2018), Pagination: 621–623

Abstract
We report here the occurrence of the fall armyworm, Spodoptera frugiperda (J.E. Smith) (Lepidoptera: Noctuidae), in the maize fields of Karnataka, India. A devastating pest of several crops in the Americas1, S. frugiperda is polyphagous and causes significant losses to agricultural crops.

References
- Sparks, A. N., Florida Entomol., 1986, 69(3), 603–614.
- Pogue, M. G., Mem. Am. Entomol. Soc., 2002, 43, 1–202.
- CABI, Data sheet. Spodoptera frugiperda (fall army worm). Invasive species compendium, 2016; http://www.cabi.org/isc/datasheet/29810
- Lima, M. S., Silva, P. S. L., Oliveira, O. F., Silva, K. M. B. and Freitas, F. C. L., Planta Daninha, 2010, 28(1), 103–111.
- Figueiredo, M. L. C., Penteado-Dias, A. M. and Cruz, I., Danos provocados por Spodoptera frugiperda na producao de materia seca e nos rendimentos de graos, na cultura do milho (Comunicado Tecnico, 130). Embrapa/CNPMS, Sete Lagoas, Brazil, 2005, p. 6.
- Young, J. R., Florida Entomol., 1979, 62, 130–133.
- Todd, E. L. and Poole, R. W., Ann. Entomol. Soc. Am., 1980, 73, 722–738.
- Goergen, G., Kumar, P. L., Sankung, S. B., Togola, A. and Tamò, M., PLoS ONE, 2016, 11(10), e0165632; doi:10.1371/journal.pone.0165632.
- Cock, M. J. W., Beseh, P. K., Buddie, A. G., Cafà, G. and Crozier, J., Sci. Rep., 2017, 7, 4103; doi:10.1038/s41598-017-04238-y.
- Bulletin OEPP/EPPO Bulletin, PM 7/124, EPPO Bull., 2015, 45, 410–444; doi:10.1111/epp.12258.
- Passoa, S., Insecta Mundi., 1991, 5(3–4), 185–195.
- Gilligan, T. M. and Passoa, S. C., LepIntercept – An Identification Resource for Intercepted Lepidoptera Larvae. Identification Technology Program (ITP), USDA-APHIS-PPQ-S&T, Fort Collins (US), 2014; www.lepintercept.org (accessed on 1 September 2014).
- Peterson, A., Larvae of Insects, Edwards Brothers Inc., Ann Arbor, Michigan, 1962, p. 732.
- Levy, R. and Habeck, D. H., Ann. Entomol. Soc. Am., 1976, 69(4), 585–588.
- Conservation of Species, All, Big and Small
Authors
Affiliations
1 Department of Entomology, UAS, GKVK, Bengaluru 560 065, IN
Source
Current Science, Vol 118, No 1 (2020), Pagination: 7–8

Abstract
Six-Legged Science by Brian Hocking is a little gem of a book that should adorn the bookshelf of every biologist. I was completely unaware of this book until a colleague hunting for it found it hidden with yet another colleague! The book of 200 pages is a passionate eulogy, offered in eighteen charming essays, on all matters concerning insects. The essay 'Gall enough in thy ink' beautifully dramatizes two important inventions of mankind that have had a huge impact on human civilization for over two thousand years – ink and paper. It dramatizes the discovery of paper through a Chinese character observing a wasp as it repeatedly visits a tree to collect fibre to build its paper envelope nest. In the fantasy spun by Hocking, the character Leng fu painstakingly observes the nest construction by the wasp and proceeds to make paper from plant fibre.

- Identification of Indian Butterflies and Moths with Deep Convolutional Neural Networks
Authors
Affiliations
1 Independent Researcher, Bengaluru 560 024, IN
2 Department of Forestry and Environment Science, University of Agricultural Sciences, GKVK Campus, Bengaluru 560 065, IN
3 Department of Entomology, University of Agricultural Sciences, GKVK Campus, Bengaluru 560 065, IN
4 School of Ecology and Conservation, IN
Source
Current Science, Vol 118, No 9 (2020), Pagination: 1456–1462

Abstract
This paper reports our efforts to use artificial intelligence based on deep convolutional neural networks (CNNs) as a tool to identify Indian butterflies and moths. We compiled a dataset of over 170,000 images covering 800 Indian butterfly species and 500 Indian moth species from diverse sources. We adopted the EfficientNet-B6 architecture for our CNN model, with about 44 million learnable parameters. We trained an ensemble of five such models on different subsets of the images in our data, employing artificial image augmentation techniques and transfer learning. This ensemble achieved a balanced top-1 accuracy of 86.5%, a top-3 accuracy of 94.7% and a top-5 accuracy of 96.4% on the 1300 species, with a mean F1 score of 0.867. Our efforts thus demonstrate that artificial intelligence can be used effectively to identify these biological species, which would substantially enhance the work efficiency of field-level biologists in several spheres of investigation.

Keywords
Artificial Intelligence, Butterfly Identification, Convolutional Neural Network, Moth Identification.

References
- Gerven, M. V., Computational foundations of natural intelligence. In Artificial Neural Networks as Models of Neural Information Processing, 2017, vol. 11, 112, pp. 7–30.
- Schmidhuber, J., Deep learning in neural networks: an overview. Neural Networks, 2015, 61, 85–117.
- Bengio, Y., Learning deep architectures for AI. Found. Trends Mach. Learn., 2009, 2, 1–127.
- Hornik, K., Approximation capabilities of multilayer feedforward networks. Neural Networks, 1991, 4, 251–257.
- Csáji, B. C., Approximation with Artificial Neural Networks. M Sc thesis submitted to the Faculty of Sciences, Eötvös Loránd University, Hungary, 2001.
- Ruder, S., An overview of gradient descent optimization algorithms, 2016, vol. 9; arXiv:1609.04747 [cs].
- Fukushima, K., Neocognitron: a self-organizing neural network for a mechanism of pattern recognition unaffected by shift in position. Biol. Cybernetics, 1980, 36(4), 193–202.
- Ioffe, S. and Szegedy, C., Batch normalization: accelerating deep network training by reducing internal covariate shift. In Proceedings of the 32nd International Conference on Machine Learning, Lille, France, 2015, vol. 37.
- He, K., Zhang, X., Ren, S. and Sun, J., Delving deep into rectifiers: surpassing human-level performance on imagenet classification. In Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV), Washington, USA, 2015.
- Xie, Q., Luong, M.-T., Hovy, E. and Le, Q. V., Self-training with Noisy Student improves ImageNet classification, 2020; arXiv:abs/1911.04252.
- Russakovsky, O. et al., ImageNet large scale visual recognition challenge. Int. J. Comput. Vision, 2015, 115(12), 211–252.
- Schroff, F., Kalenichenko, D. and Philbin, J., FaceNet: a unified embedding for face recognition and clustering. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, USA, 2015.
- Wang, M. and Deng, W., Deep face recognition: a survey, 2018, vol. 4; arXiv, abs/1804.06655.
- Chang, Q., Qu, H., Wu, P. and Yi, J., Fine-Grained Butterfly and Moth Classification Using Deep Convolutional Neural Networks, Machine Learning course project report, submitted to the Department of Computer Science, Rutgers University, 2017; https://pdfs.semanticscholar.org/4cf2/045b811c9e0807f9c94fc991566a6f5adbf4.pdf
- Poremski, A., Introducing LepSnap, 2017; https://medium.com/@andrporemski/introducing-lepsnap-ff356c4c9da6 (accessed in December 2018).
- Indian Bioresource Information Network; http://www.ibin.gov.in (accessed in 2019).
- Krause, J. et al., The unreasonable effectiveness of noisy data for fine-grained recognition. In Computer Vision – ECCV 2016 (eds Leibe, B. et al.), Springer International Publishing, Cham, Switzerland, 2016, vol. 9907, pp. 301–320.
- Tan, M. and Le, Q. V., EfficientNet: rethinking model scaling for convolutional neural networks. In International Conference on Machine Learning (ICML), Long Beach, California, 2019.
- Yosinski, J., Clune, J., Bengio, Y. and Lipson, H., How transferable are features in deep neural networks? In Advances in Neural Information Processing Systems 27 (NIPS), Montreal, Canada, 2014, pp. 3320–3328.
- Melas-Kyriazi, L., EfficientNet PyTorch; https://github.com/lukemelas/EfficientNet-PyTorch (accessed on 15 October 2019).
- CS231n Convolutional Neural Networks for Visual Recognition; http://cs231n.stanford.edu; http://cs231n.github.io/neural-networks-3/#anneal (accessed in 2019).
- Loshchilov, I. and Hutter, F., SGDR: stochastic gradient descent with warm restarts. In International Conference on Learning Representations (ICLR), Toulon, France, 2017.
- Balanced Accuracy Score, scikit-learn; https://scikit-learn.org/stable/modules/generated/sklearn.metrics.balanced_accuracy_score.html (accessed on 10 January 2020).
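The evaluation metrics quoted in the abstract above (balanced top-1 accuracy, top-k accuracy) can be illustrated with a minimal pure-Python sketch. This is not the authors' code; the toy class scores and labels below are invented solely to show how the two metrics are computed:

```python
from collections import defaultdict

def top_k_accuracy(scores, labels, k):
    """Fraction of samples whose true class is among the k highest-scored classes."""
    hits = 0
    for row, label in zip(scores, labels):
        top_k = sorted(range(len(row)), key=lambda c: row[c], reverse=True)[:k]
        hits += label in top_k
    return hits / len(labels)

def balanced_accuracy(preds, labels):
    """Mean of per-class recall, so rare species count as much as common ones."""
    correct, total = defaultdict(int), defaultdict(int)
    for p, y in zip(preds, labels):
        total[y] += 1
        correct[y] += (p == y)
    return sum(correct[c] / total[c] for c in total) / len(total)

# Toy example: 3 classes, 4 samples (hypothetical softmax-like scores).
scores = [
    [0.7, 0.2, 0.1],  # true class 0, ranked 1st
    [0.1, 0.5, 0.4],  # true class 2, ranked 2nd
    [0.3, 0.3, 0.4],  # true class 2, ranked 1st
    [0.6, 0.1, 0.3],  # true class 1, ranked 3rd
]
labels = [0, 2, 2, 1]
preds = [max(range(3), key=lambda c: row[c]) for row in scores]

print(top_k_accuracy(scores, labels, 1))  # 0.5
print(top_k_accuracy(scores, labels, 2))  # 0.75
print(balanced_accuracy(preds, labels))   # 0.5
```

Balanced accuracy averages recall over classes rather than over samples, which matters for a 1300-species dataset where some species inevitably have far fewer images than others; scikit-learn's `balanced_accuracy_score` (cited in the references) computes the same quantity.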