It is “virtually impossible to capture the quality of an entire university with all the different courses and disciplines in one figure,” the university said.
Utrecht University said it did not provide any data to the makers of the British trade magazine Times Higher Education’s World University Rankings, so they couldn’t determine a score. According to Utrecht University, the rankings place “too much emphasis on scoring and competition,” while the university says it considers collaboration and openness of scientific research to be more important.
University rankings are prime examples of Goodhart’s law: “When a measure becomes a target, it ceases to be a good measure.”
The number of worthless papers, international exchanges, etc. I’ve seen at my university is astounding.
A MedUni near me used to be like that. It deteriorated the quality of the doctors they turned out so considerably that hospitals in the country, which historically would hire their graduates on the spot, started recruiting primarily elsewhere.
They have since moved away from the “churning out papers” focus and put a lot of time and effort into applicable research and hands-on training, and that turned everything around.
Their ranking on that shitlist Times puts out dropped (considerably) as a result, yet applications from foreign students surged.
Your reputation in your region and sector is still more important than some fictitious score some media outlet gives you.
Nothing is perfect, but it seems reasonable to rank universities based on their performance in educational attainment and research?
How do you rank research output consistently? And since every university is expected to create its own exam content, how do you effectively measure educational attainment across universities?
The push for research output can also create perverse incentives for rapid publishing in whatever are considered quality journals, which are usually themselves associated with universities, rather than for the pursuit of quality research and academic integrity.
It leads to increased strain on researchers and an unfruitful obsession with any kind of academic output that can be easily counted. Of course research output is inherently important, but the whole academic publishing industry has just gotten weird over the last decade or so.
Yes, in the business world this often comes with the Icarus paradox: a phenomenon in which a business is eventually brought down by the very elements that produced its earlier, temporary success.
The “integrity” part is a very valid criticism. A big part of these rankings is how many papers the faculty publish per year. So you see professors at a lot of higher-ranked universities adding their names to papers where they didn’t really do anything except review or advise.