
Analysing the Stability of India Rankings


Authors

Marisha
Department of Computer Science, Institute of Science, Banaras Hindu University, Varanasi 221 005, India

Abstract

India Rankings, released under the National Institutional Ranking Framework (NIRF), has ranked higher education institutions in India since 2016. It plays an important role in providing a proper assessment of the large number of higher education institutions in India that fail to be represented in international rankings. In the present study, I have analysed the stability of India Rankings. In particular, uncertainty and sensitivity analyses are used to assess the stability of the ranking results produced by India Rankings 2020. The results indicate that the rankings are highly volatile: only the ranks of the top 10–15 institutions are found to be relatively stable, while the ranks assigned to most other institutions are unstable. The results of the study provide useful inputs to policy makers and stakeholders for improving the ranking methodology, and will also help general readers understand to what extent the ranking results can be trusted.
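To give a concrete sense of the kind of uncertainty analysis referred to above, the sketch below is a minimal Python illustration, not the code used in the study: the weights of a weighted-sum composite indicator are randomly perturbed over repeated Monte Carlo runs and the resulting shifts in institution ranks are summarised. The parameter labels and nominal weights mirror the NIRF framework as an assumption, and the institution scores are synthetic placeholders.

# Minimal, illustrative sketch (not the study's code) of a Monte Carlo
# uncertainty analysis of a weighted-sum composite ranking: indicator weights
# are randomly perturbed and the resulting rank shifts are summarised.
# Parameter labels and nominal weights are assumed; scores are synthetic.
import numpy as np

rng = np.random.default_rng(42)

parameters = ["TLR", "RP", "GO", "OI", "PR"]            # assumed NIRF parameter groups
nominal_weights = np.array([0.30, 0.30, 0.20, 0.10, 0.10])

n_institutions, n_runs = 100, 1000
scores = rng.uniform(20, 90, size=(n_institutions, len(parameters)))   # synthetic scores

def ranks_from_weights(weights):
    # Composite score = weighted sum of parameter scores; rank 1 = best.
    composite = scores @ weights
    return composite.argsort()[::-1].argsort() + 1

baseline_ranks = ranks_from_weights(nominal_weights)

# Uncertainty analysis: jitter each nominal weight by up to +/-20% and renormalise.
all_ranks = np.empty((n_runs, n_institutions), dtype=int)
for run in range(n_runs):
    w = nominal_weights * rng.uniform(0.8, 1.2, size=len(parameters))
    all_ranks[run] = ranks_from_weights(w / w.sum())

# Median rank, 5th-95th percentile interval and mean shift summarise volatility.
median_rank = np.median(all_ranks, axis=0)
rank_interval = np.percentile(all_ranks, [5, 95], axis=0)
mean_shift = np.abs(all_ranks - baseline_ranks).mean(axis=0)

for i in np.argsort(baseline_ranks)[:5]:
    print(f"Institution {i:3d}: baseline rank {baseline_ranks[i]:3d}, "
          f"median {median_rank[i]:5.1f}, 90% interval "
          f"[{rank_interval[0, i]:.0f}, {rank_interval[1, i]:.0f}], "
          f"mean shift {mean_shift[i]:.2f}")

In such a simulation, institutions whose 90% rank intervals stay narrow can be regarded as stably ranked, whereas wide intervals correspond to the volatility reported in the study.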

Keywords

India Rankings, NIRF, Scientometrics, University Rankings, Uncertainty and Sensitivity.
References

  • Shehatta, I. and Mahmood, K., Correlation among top 100 universities in the major six global rankings: policy implications. Scientometrics, 2016, 109(2), 1231–1254; doi:10.1007/s11192-016-2065-4.
  • Johnes, J., University rankings: What do they really show? Scientometrics, 2018, 115(1), 585–606; doi:10.1007/s11192-018-2666-1
  • http://www.shanghairanking.com/index.html (accessed on 17 February 2021).
  • https://www.timeshighereducation.com/world-university-rankings (accessed on 17 February 2021).
  • https://www.topuniversities.com/ (accessed on 17 February 2021).
  • https://www.leidenranking.com/ (accessed on 17 February 2021).
  • https://www.scimagojr.com/countryrank.php (accessed on 17 February 2021).
  • http://nturanking.csti.tw/ (accessed on 17 February 2021).
  • https://www.urapcenter.org/ (accessed on 17 February 2021).
  • Robinson-García, N., Torres-Salinas, D., Delgado López-Cózar, E. and Herrera, F., An insight into the importance of national university rankings in an international context: the case of the I-UGR rankings of Spanish universities. Scientometrics, 2014, 101(2), 1309–1324; doi:10.1007/s11192-014-1263-1.
  • van Raan, A. F. J., Fatal attraction: conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 2005, 62(1), 133–143; doi:10.1007/s11192-005-0008-6.
  • Safón, V., What do global university rankings really measure? the search for the X factor and the X entity. Scientometrics, 2013, 97(2), 223–244; doi:10.1007/s11192-013-0986-8.
  • Kaycheng, S., Multicolinearity and indicator redundancy problem in world university rankings: an example using times higher education world university ranking 2013–2014 data. High Educ. Q., 2015, 69(2), 158–174; doi:10.1111/hequ.12058.
  • Kivinen, O. and Hedman, J., World-wide university rankings: a scandinavian approach. Scientometrics, 2008, 74(3), 391–408; doi:10.1007/s11192-007-1820-y.
  • Bornmann, L., Mutz, R. and Daniel, H.-D., Multilevel-statistical reformulation of citation-based university rankings: the Leiden ranking 2011/2012. J. Am. Soc. Inf. Sci. Technol., 2013, 64(8), 1649–1658; doi:10.1002/asi.22857.
  • Morphew, C. C. and Swanson, C., On the efficacy of raising your university’s rankings. In University Rankings, Springer, The Netherlands, 2011, pp. 185–199; doi:10.1007/978-94-007-1116-7_10.
  • Bookstein, F. L., Seidler, H., Fieder, M. and Winckler, G., Too much noise in the Times Higher Education rankings. Scientometrics, 2010, 85(1), 295–299; doi:10.1007/s11192-010-0189-5.
  • Bowman, N. A. and Bastedo, M. N., Anchoring effects in world university rankings: exploring biases in reputation scores. High. Educ., 2011, 61(4), 431–444; doi:10.1007/s10734-010-9339-1.
  • Soh, K., The seven deadly sins of world university ranking: a summary from several papers. J. High. Educ. Policy Manage., 2017, 39(1), 104–115; doi:10.1080/1360080X.2016.1254431.
  • Soh, K., Rectifying an honest error in world university rankings: a solution to the problem of indicator weight discrepancies. J. High Educ. Policy Manage., 2013, 35(6), 574–585; doi:10.1080/1360080X.2013.844670.
  • Soh, K., Misleading university rankings: cause and cure for discrepancies between nominal and attained weights. J. High. Educ. Policy Manage., 2013, 35(2), 206–214; doi:10.1080/1360080X.2013.775929.
  • Moed, H. F., A critical comparative analysis of five world university rankings. Scientometrics, 2017, 110(2), 967–990; doi:10.1007/s11192-016-2212-y.
  • Kivinen, O., Hedman, J. and Artukka, K., Scientific publishing and global university rankings. How well are top publishing universities recognized? Scientometrics, 2017, 112(1), 679–695; doi:10.1007/s11192-017-2403-1.
  • Saisana, M., Saltelli, A. and Tarantola, S., Uncertainty and sensitivity analysis techniques as tools for the quality assessment of composite indicators. J. Roy. Stat. Soc. Ser. A. Stat. Soc., 2005, 168(2), 307–323; doi:10.1111/j.1467-985X.2005.00350.x.
  • Paruolo, P., Saisana, M. and Saltelli, A., Ratings and rankings: Voodoo or science? J. Roy. Stat. Soc. Ser. A. Stat. Soc., 2013, 176(3), 609–634; doi:10.1111/j.1467-985X.2012.01059.x
  • Stefano, T., Saisana, M., Andrea, S., Frank, S. and Nicholas, L., Statistical Techniques and Participatory Approaches for the Composition of the European Internal Market Index 1992–2001, European Commission, Joint Research Centre, Italy, 2002; https://publications.jrc.ec.europa.eu/repository/bitstream/JRC24431/EUR%2020547%20EN.pdf (accessed on 18 February 2021).
  • Saltelli, A. et al., Global Sensitivity Analysis. The Primer, John Wiley, Chichester, UK, 2007; https://doi.org/10.1002/9780470725184.
  • Saisana, M., D’Hombres, B. and Saltelli, A., Rickety numbers: volatility of university rankings and policy implications. Res. Policy, 2011, 40(1), 165–177; doi: 10.1016/j.respol.2010.09.003
  • Dobrota, M. and Dobrota, M., ARWU ranking uncertainty and sensitivity: what if the award factor was excluded? J. Assoc. Inf. Sci. Technol., 2016, 67(2), 480–482; doi:10.1002/asi.23527.
  • Dobrota, M. and Jeremic, V., Shedding the light on the stability of University Rankings in the ICT field. IETE Tech. Rev., 2017, 34(1), 75–82; doi:10.1080/02564602.2016.1144487
  • Dobrota, M., Bulajic, M., Bornmann, L. and Jeremic, V., A new approach to the QS university ranking using the composite I-distance indicator: uncertainty and sensitivity analyses. J. Assoc. Inf. Sci. Technol., 2016, 67(1), 200–211; doi:10.1002/asi.23355
  • All India Survey on Higher Education 2018–19, 2019 (accessed on 18 February 2021).
  • Kumar, M. J., Global university rankings: What should India do? IETE Tech. Rev., 2015, 32(2), 81–83; doi:10.1080/02564602.2015.1026048.
  • https://www.nirfindia.org/2020/Ranking2020.html (accessed on 18 February 2021).
  • Jeremic, V. and Jovanovic-Milenkovic, M., Evaluation of Asian university rankings: position and perspective of leading Indian higher education institutions. Curr. Sci., 2014, 106(12), 1647–1653.
  • Banshal, S. K., Solanki, T. and Singh, V. K., Research performance of the National Institutes of Technology in India. Curr. Sci., 2018, 115(11), 2025–2036; doi:10.18520/cs/v115/i11/2025-2036.
  • Nishy, P., Panwar, Y., Prasad, S., Mandal, G. K. and Prathap, G., An impact-citations-exergy (iCX) trajectory analysis of leading research institutions in India. Scientometrics, 2012, 91(1), 245–251; doi:10.1007/s11192-011-0594-4.
  • Kaur, H. and Mahajan, P., Ranking of medical institutes of India for quality and quantity: a case study. Scientometrics, 2015, 105(2), 1129–1139; doi:10.1007/s11192-015-1720-5.
  • Marisha, Banshal, S. K. and Singh, V. K., Research performance of Central Universities in India. Curr. Sci., 2017, 112(11), 2198–2207; doi:10.18520/cs/v112/i11/2198-2207.
  • Banshal, S. K., Singh, V. K. and Mayr, P., Comparing research performance of private universities in India with IITs, central universities and NITs. Curr. Sci., 2019, 116(8), 1304–1313; doi:10.18520/cs/v116/i8/1304-1313.
  • Banshal, S. K., Singh, V. K., Basu, A. and Muhuri, P. K., Research performance of Indian Institutes of Technology. Curr. Sci., 2017, 112(05), 923–932; doi:10.18520/cs/v112/i05/923-932.
  • Prathap, G., Making scientometric sense out of NIRF scores. Curr. Sci., 2017, 112(06), 1240–1242; doi:10.18520/cs/v112/i06/1240-1242.
  • Prathap, G., Making scientometric and econometric sense out of NIRF 2017 data. Curr. Sci., 2017, 113(07), 1420–1423; doi:10.18520/cs/v113/i07/1420-1423.
  • Prathap, G., Danger of a single score: NIRF rankings of colleges. Curr. Sci., 2017, 113(4), 550–553.
  • Mandhirasalam, M., NIRF India Rankings 2016: Ranking of Engineering Institutions in Tamil Nadu. In Creativity, Innovation and Transformation in Libraries (SALIS 2016), Tamil Nadu, 2016, pp. 25–30.
  • Udaiyar, N., National Institutional Ranking Framework (NIRF) 2018: an analysis of engineering colleges. J. Indian Libr. Assoc., 2020, 55(4), 9–15.
  • Biswas, T. K., Chaki, S. and Das, M. C., MCDM technique application to the selection of an Indian Institute of Technology. Oper. Res. Eng. Sci. Theor. Appl., 2019, 2(3), 65–76.
  • Mukherjee, B., Ranking Indian Universities through Research and Professional Practices of National Institutional Ranking Framework (NIRF): A case study of Selected Central Universities in India. J. Indian Libr. Assoc., 2019, 52(4), 93–107.
  • Panneerselvam, P., Performance of Indian Institute of Technology in National Institutional Ranking Framework (NIRF): a comparative study. Int. J. Inf. Dissem. Technol., 2019, 9(3), 121; doi:10.5958/2249-5576.2019.00025.6.
  • Mandhirasalam, M., NIRF India Rankings 2016: Ranking of Universities and other Higher Education Institutions in Tamil Nadu. In National Conference on Electronic Resources and Academic Libraries: Empowering New Trends, Technologies, Practices, Services and Management, St Xavier’s College of Education, Palayamkottai, Tamil Nadu, 2016, pp. 577–583.
  • Brahma, K. and Verma, M. K., Evaluation of selected universities library websites listed by National Institutional Ranking Framework (NIRF) during the year 2017: a webometric analysis. J. Scientometric Res., 2019, 7(3), 173–180; doi:10.5530/jscires.7.3.28.
  • Kumar, A., Tiwari, S., Chauhan, A. K. and Ahirwar, R., Impact of NIRF on research publications: A study on top 20 (ranked) Indian Universities. Collnet J. Scientometr. Inf. Manage., 2019, 13(2), 219–229; doi:10.1080/09737766.2020.1741194.
  • Saisana, M. and D’Hombres, B., Higher Education Rankings: Robustness Issues and Critical Assessment, European Commission, Joint Research Centre, Italy, 2008, ISBN: 978-92-79-09704-1; https://publications.jrc.ec.europa.eu/repository/bitstream/JRC47028/eur23487_saisana_dhombres.pdf (accessed on 18 February 2021).
  • OECD/European Union/JRC, Handbook on Constructing Composite Indicators: Methodology and User Guide, OECD Publishing, Paris, 2008; https://doi.org/10.1787/9789264043466-en.


DOI: https://doi.org/10.18520/cs/v120/i7/1144-1151