From Our Readers
Risky and unwise to depend on university rankings

LETTER | Last week, we celebrated the success of the country’s first university, Universiti Malaya (UM), in earning a place in the Top 100 of the QS World University Rankings. This has been regarded as a sign that the government’s efforts to strengthen research and higher education in the country are bearing fruit.

While this is certainly an improvement, we should be more cautious about these university rankings.

A report published in May 2018 by Universitas 21, covering 50 countries, reveals more about the rankings and may lead us to rethink the cause for celebration. Universitas 21 (U21) is an international consortium of research-intensive universities with members from across all continents.

It was founded in 1997 for the purpose of “fostering global citizenship and institutional innovation through research-inspired teaching and learning”. Led by researchers from the University of Melbourne, the report “U21 Ranking of National Higher Education Systems 2018”, known as the “U21 Ranking”, is the only ranking in the world that assesses countries’ higher education systems as a whole.

Coupled with university ranking information provided by four other renowned ranking bodies, namely the QS World University Rankings, the Times Higher Education World University Rankings (THE), the Academic Ranking of World Universities (ARWU) and the Center for World University Rankings (CWUR), closer scrutiny of all these ranking statistics reveals several flaws and shortcomings:

1. Among the four indicators (Resources, Environment, Connectivity and Output) used by the U21 Report 2018 (which evaluated the universities of 50 countries in total), Malaysia is ranked 12th for “Resources”, which includes attributes such as “Government expenditure on tertiary education institutions as a percentage of GDP” and “Expenditure in tertiary education institutions for research and development as a percentage of GDP”.

Malaysia’s “Resources” ranking is better than that of peers such as Hong Kong (13), Australia (14), UK (16), Germany (18), Korea (19), Japan (23), Taiwan (32), Argentina (40), South Africa (41) and China (44). Variables are standardised for population size and GDP.

Nevertheless, Malaysia is placed 42nd in U21’s “Output” ranking (whose variables include “Total articles produced”, “Average impact of articles as measured by citations”, “The excellence of a nation’s best universities based on the ARWU scores”, “Number of researchers in the nation” and “Unemployment rates among tertiary-educated aged 25–64 years”).

This means that Malaysia is much weaker in research output than the abovementioned peers: Hong Kong (21), UK (2), Australia (3), Germany (11), Korea (18), Japan (17), Taiwan (23), China (22), South Africa (36) and Argentina (38).

The U21 also reports that Malaysia’s “Total expenditure on higher education as a percentage of GDP” is ranked fifth (after Ukraine, Saudi Arabia, Finland and Austria), indicating an investment-output inefficiency among the public universities in Malaysia. This darker side of the picture, revealed by the U21 report, implies that Malaysian public universities need to improve their expenditure efficiency.

2. All four ranking bodies (QS, THE, ARWU and CWUR) emphasise research output. None of the “teaching” criteria they use directly or effectively reflects actual teaching quality, especially at undergraduate level. At the 2016 Australian International Education Conference, the data and analytics director of THE agreed that there is a strong inverse correlation between research productivity and student engagement, and that “rankings put teaching-focused universities at a disadvantage”.

Similarly, in June 2017, the results of the UK’s Teaching Excellence Framework (TEF, a system that assesses teaching quality) showed that its gold-rated universities have traditionally ranked low in other university rankings. In this regard, choosing a university for undergraduate study based on world university rankings (e.g. QS, THE, ARWU, CWUR) is risky and unwise.

3. Universities ranked on par with our top university, UM (87th), in the QS world top 100 are also ranked highly or moderately in the THE, ARWU and CWUR rankings.

The list includes the University of Auckland (QS 85th, THE 192nd, ARWU 201-250, CWUR 236th), Korea University (86th, 201-250, 201-300, 183rd), Rice University (87th, 40th, 74th, 129th), Ohio State University (89th, 70th, 80th, 40th), Moscow State University (90th, 194th, 93rd, 126th) and the University of Western Australia (91st, 111th, 91st, 145th).

However, UM’s QS ranking shows a much larger discrepancy with its THE, ARWU and CWUR rankings than those of the aforementioned universities: QS 87th, THE 351-400, ARWU 401-500 and CWUR 451st. A possible explanation is the strategy UM has applied in the past few years of focusing on QS ranking measurements for the purpose of obtaining a higher place in that particular ranking system.

By over-emphasising rankings, a university may sacrifice the quality of education as a whole. It may use its limited resources to earn ‘points’ on measurements that are easier to quantify and evaluate, such as “student-to-faculty ratio”, “proportion of international students” and “citations per faculty”.

It is good that over the past few years the government has been supportive of research in local universities, including encouraging publication in academic journals. It is, however, crucial to note that these research activities, though highly important, are only one part of a university’s mission, alongside other equally important elements such as teaching quality and community engagement. Unfortunately, these elements are not effectively measured in any of the main ranking systems.

In December 2016, the president of the UK-based Higher Education Policy Institute, Bahram Bekhradnia, published a report “International university rankings: For good or ill?”

Based on statistics and analyses, he suggested that governments and universities should not use world university rankings when making decisions on higher education. He further argued that the data used to compile the QS and THE rankings are unreliable, because a university normally supplies its own data (except for the survey-based “reputation” indicators) and the ranking bodies accept the data as supplied. There is no effective attempt to ensure the quality of the data supplied.

We appreciate the government’s effort in promoting research, which is good for improving higher education. However, the authorities (especially now, with the new government after May 9) need to be careful when investing taxpayers’ money, and should not pursue the measurements of the ranking bodies at the expense of teaching quality or other important elements of a university, such as community engagement, the impact of published knowledge, financial and academic autonomy, and knowledge transfer between university and industry.

Most importantly, cost efficiency and productivity play a crucial role in ensuring investment-output efficiency among the universities in Malaysia. If the abovementioned peer governments ‘boleh’ (can) spend less yet achieve much more than Malaysia, there is no reason why Malaysia ‘tak boleh’ (cannot) do the same.

In December 2016, a research paper entitled “How efficient are Malaysian public universities?” was published in the Asian Academy of Management Journal. The researchers from UUM and USM compared expenditure efficiency between the public and private/foreign universities in Malaysia and revealed inefficiencies in managing inputs (input slacks), including the government operating grants, among Malaysian public universities.

Other statistics, such as those in the auditor-general’s annual reports, echo similar concerns about such slacks in the use of public funding and other resource inputs in public universities.

For instance, the 2012, 2014 and 2015 AG’s Reports revealed, among other things, that hundreds of thousands of ringgit had been ineffectively spent on equipment sitting unused in public universities, and that construction projects on campuses had been delayed.

Unfortunately, all these years the rakyat (people) and taxpayers have merely observed the audit process, and its recommendations have served as a formality rather than helping to ensure the accountability of the government.

Taken together with the abovementioned ‘dark sides’ embedded within the world university ranking systems, these findings provide concrete evidence that our public universities and the Education Ministry are in desperate need of a well-tailored transformation and effective management to improve the expenditure efficiency of public funds.


The views expressed here are those of the author/contributor and do not necessarily represent the views of Malaysiakini.
