It is “virtually impossible to capture the quality of an entire university with all the different courses and disciplines in one figure,” the university said.

Utrecht University said it did not provide any data to the makers of the British trade magazine Times Higher Education’s World University Rankings, so they couldn’t determine a score. According to Utrecht University, the rankings place “too much emphasis on scoring and competition,” while the university says it considers collaboration and openness of scientific research to be more important.

    • Endorkend@kbin.social · 1 year ago

      The medical university near me used to be like that, and it deteriorated the quality of the doctors they put out so considerably that the hospitals in the country that had historically hired its graduates on the spot started recruiting primarily elsewhere.

      They have since switched to being far less focused on churning out papers, putting a lot of time and effort into applied research and practical training instead, and that turned everything around again.

      Their ranking on that shitlist Times puts out dropped considerably as a result, yet demand from foreign students to study there surged.

      Your reputation in your region and sector is still more important than some fictitious score some media outlet gives you.

      • oroboros@sh.itjust.works · 1 year ago

        How do you rank research output consistently? Every university is expected to create its own exam content, so how do you effectively measure educational attainment across universities?

        • sab@kbin.social · 1 year ago

          The push for research output can also create perverse incentives to publish rapidly in whatever is considered a quality journal (journals that are often themselves associated with universities) rather than to pursue quality research and academic integrity.

          It leads to increased strain on researchers and an unfruitful obsession with any kind of academic output that can be easily counted. Of course research output is inherently important, but the whole academic publishing industry has just gotten weird over the last decade or so.

    • 0x815@feddit.de (OP) · 1 year ago

      Yes, in the business world this often shows up as the Icarus paradox, where a business is eventually brought down by the very elements that produced its earlier success.

  • ratskrad@kbin.social · 1 year ago

    The “integrity” part is a very valid criticism. A big part of these rankings is how many papers the faculty publish per year. So you see a lot of professors at higher-ranked universities adding their names to papers where they didn’t really do anything except review or advise.