
Are University Rankings Biased (2022)?

Updated: Nov 7, 2022



Yes, university rankings are biased, because rankers sell analytical and advertising services to the very universities they rank. This creates a conflict of interest: rankers are supposed to evaluate universities' performance objectively, yet they also collect fees from those universities in exchange for services. In fact, recent research (Chirikov, 2022) shows that universities with frequent fee-based service contracts with the renowned ranker QS improved their rankings by 191 positions on average between 2016 and 2021. In contrast, universities that never or rarely used the ranker's paid services improved by only 74 positions, on average.


This has important implications not only for students but also for universities, public institutions and governments, since universities are an important factor in a nation's attractiveness.


How to choose the right ranking system

Most of us know the big global university rankings, including QS, Times Higher Education (THE), US News & World Report, and the Academic Ranking of World Universities ("Shanghai"). Whether researching undergraduate programmes, graduate programmes, or PhDs, we've all, as prospective students, checked these rankings to get a sense of where we should apply.


But how reliable are these global rankings exactly? Which ranking systems should we consider, if any? This article provides an overview of how to approach university rankings and suggests how much weight you should give them.


We have also written an article about whether university rankings matter; make sure to check it out, as it complements this one!


5 most popular university-ranking publishers


1. Quacquarelli Symonds (QS)


Quacquarelli Symonds, best known for its QS World University Rankings platform, is one of the most popular university ranking systems worldwide. Despite the bias and conflict of interest outlined above, QS was officially audited and approved by the IREG Observatory on Academic Ranking and Excellence in 2013.


Its "IREG approved" rankings include:

  • QS World University Rankings

  • QS University Rankings: Asia

  • QS University Rankings: Latin America

Ranking factors (each weighted differently; see the quick illustration after this list)
  • Academic reputation

  • Employer reputation

  • Research citations per paper

  • H-index (impact of research and citations)

  • "International Research Network"

Pitfalls

It's important to underline that this ranking is produced by a for-profit organisation: the rankings are free for students to access, but QS generates revenue by selling services to the very universities it ranks.


2. Times Higher Education (THE)


Times Higher Education (THE), which has published its world university rankings under its own methodology since 2010, is another ranker you have probably heard of. Like QS, THE is a for-profit organisation and generates revenue through subscription-based content and services sold to universities, such as access to data, advertising, and branding.


Ranking factors (weighted differently)
  • Teaching

  • Research

  • Citations

  • International outlook

  • Industry income

  • Reputation

Pitfalls

One of the pitfalls of Times Higher Education, in addition to the conflict of interest it faces, is its reliance on reputational surveys. These can be misleading: a university specialised in computer science may be outranked by Harvard University, even though Harvard doesn't necessarily have the best computer science programmes.


3. Academic Ranking of World Universities (ARWU), a.k.a. the Shanghai Ranking


While not as popular as the two rankers above, the Academic Ranking of World Universities (ARWU), commonly known as the Shanghai Ranking, was actually the first global university ranking, first published in 2003.


Initially, ARWU was operated and published by Shanghai Jiao Tong University. Today, the Shanghai Ranking Consultancy (surprise: another for-profit) publishes the ranking.


Ranking factors (weighted differently)
  • Quality of education (indicator: alumni winning Nobel Prizes and Fields Medals)

  • Quality of faculty (indicators: staff winning Nobel Prizes and Fields Medals, and highly cited researchers across 21 broad subject categories)

  • Research output (indicators: number of published papers in Nature and Science and number of indexed papers in the Science Citation Index and Social Sciences Citation Index)

  • Per capita academic performance (the weighted scores of the indicators above divided by the number of full-time equivalent academic staff)

Pitfalls

One of the shortcomings of this ranking system is the weight given to Nobel Prizes (30% of the overall score), which leaves the majority of universities with lower scores, and thus lower rankings, than in other systems. In addition, it benefits the handful of countries that have Nobel Prize recipients (the USA, European countries, and Japan) and arguably penalises countries (in South America and Africa, for example) that have excellent academic institutions but haven't necessarily invested in, or had the means to invest in, producing Nobel Prize winners.


Some researchers thus argue other scientific awards should be taken into account, as they might provide better indicators of overall university performance.


In addition, ARWU is very research-focused and does not take teaching quality (which we could argue is one of the most important factors for students) and future employability (not all students want to become academic researchers!) into account.


4. Leiden University Ranking


The Leiden Ranking, unlike QS, THE, or ARWU, is published by a university: Leiden University, based in the Netherlands. Its rankings are all based on the Web of Science database (formerly owned by Thomson Reuters, now by Clarivate), which indexes academic articles and reviews.


Any document that is not indexed in this database is not used as an indicator for the ranking, which makes the system very transparent and straightforward. It is a very research-focused ranking.


Ranking factors
  • Volume of scientific publications

  • Total citations of those publications

  • Citations per paper

  • Number of papers among the top 10% most cited in their field

  • Proportion of a university's publications in that top 10%

Some scholars consider the Leiden Ranking more user-friendly and transparent than rankings such as QS and THE, and even superior to them.


Indeed, unlike other rankers, the Leiden Ranking doesn't collapse different dimensions (e.g. teaching quality, employability, and international outlook) into a single indicator, which makes its rankings less ambiguous. In addition, it does not rely on surveys or on data supplied by universities themselves, making it more reliable.


Pitfalls

The Leiden Ranking will be most useful to prospective students who wish to attend research universities. However, it lacks insights into dimensions such as employability and international outlook. In addition, it focuses on universities' outputs (citations, papers) rather than their inputs (teaching quality), due to a lack of reliable data.


5. Ranking Web (also known as Webometrics)


The Ranking Web of Universities, also known as Webometrics, lists close to 12,000 universities, more than any other ranking platform we've covered so far. It is operated by the Cybermetrics Lab and started indexing universities in 2004.


Ranking factors
  • University web access

  • University web presence (on Google, Yahoo and other search engines)

  • University web visibility (on Google, Yahoo and other search engines)

Pitfalls

As the ranking factors suggest, an important flaw of Webometrics is that universities with greater online visibility get higher rankings, which doesn't necessarily reflect the quality of the education they offer or the employment opportunities their degrees lead to.


Universities with larger marketing budgets that invest more in their branding therefore have a substantial advantage in the Webometrics ranking.


Summary of differences between university ranking systems

Source: Fauzi et al. (2020), University rankings: A review of methodological flaws

| Indicators | QS 2018 | THE 2018 | ARWU 2018 | Leiden | Webometrics |
|---|---|---|---|---|---|
| No. of universities ranked | 1,000 | 1,400 | 1,000 | 963 | 11,997 |
| Methodology | 6 metrics | 13 performance indicators | 6 objective indicators | Bibliometric scientific-impact indicators | Web presence and impact |
| Owner | Quacquarelli Symonds | Times Higher Education | Shanghai Consultancy | Leiden University | Cybermetrics Lab |
| Advantages | Rankings by subject and region | Teaching as a performance indicator | Research-focused | Research-focused and transparent | Captures the impact of an institution's web presence |
| Disadvantages | Conflict of interest; sells services to ranked universities | Conflict of interest; sells services to ranked universities | Large weight given to Nobel Prize winners | Only research-focused | Universities with low marketing budgets rank lower |

Examples of differences in rankings

Source: Fauzi et al. (2020), University rankings: A review of methodological flaws

[Table: examples of how the same universities rank differently across systems]

The table above illustrates the extent to which results differ depending on the ranking system you use. This demonstrates that each and every ranking needs to be taken with a pinch of salt and that none holds a universal truth.


University rankings: Key takeaways


Are all university rankings biased?

As research has demonstrated, some rankers face a conflict of interest: they sell services (data insights, branding, promotion) to the universities they rank, and universities that buy more of these services may end up ranked higher than others, as is notably the case with QS. In addition, the fact that some rankings rely on survey data as ranking factors can lead to inaccurate and biased results.


On the other hand, the Leiden Ranking is based on databases of factual information, such as the number of papers published by each university. This largely removes the bias problem and makes for a more reliable approach. That being said, the Leiden Ranking takes fewer factors into account, leaving out dimensions such as teaching quality and international outlook, which are important to many prospective students.


Which university rankings are most reliable?

The reliability of university rankings depends on the type of university you're interested in attending, the specific academic specialization you are aiming for, and the importance you give to university prestige.


For instance, if you're interested in research-focused academic programmes, the Leiden Ranking may be more accurate, as it relies on factors such as publication volume, citation counts, and the proportion of a university's papers among the most highly cited.


In contrast, the Times Higher Education ranking is the only one covered here that uses teaching quality as a ranking factor, which may be of more interest to prospective students seeking a high-quality teaching experience beyond university prestige and diplomas.


Should rankings be used to make your choice of university?

Despite their flaws, university rankings are useful databases for learning more about your higher-education options. In short, use them, but cautiously: take each ranking with a pinch of salt and allocate time to researching which universities are really suitable for you and capable of fulfilling your needs as a student and future professional.


In addition, bear in mind that some universities are less renowned but more highly specialized in certain areas, some of which could fall into your academic niche (e.g. law, astrophysics, psychology).


For instance, while less renowned, the University of Groningen may have a far more complete Work and Organisational Psychology master's programme than Harvard. So, if you want to become an organisational psychologist, why go to Harvard, even if it's ranked among the top 5 universities worldwide?


Which other sources of information can I use instead of university rankings?

A great way to learn more about which university is suitable for you is to go straight to the university's website and gather information on:

  • Academic curricula

  • Lists of courses

  • Alumni testimonials, salaries, and careers

  • University location

It could also help you to reach out to current students and alumni on LinkedIn: that way you'll get first-hand information!


Wrap up

To conclude, we hope this article provided you with a better understanding of how university rankings work and the extent to which they should be trusted.


In our opinion, university rankings shouldn't be the main factor you take into account when choosing a university. We also wrote an article discussing whether university rankings matter, so make sure to check it out!


References:

Fauzi, M. A., Tan, C. N. L., Daud, M., & Awalludin, M. M. N. (2020). University rankings: A review of methodological flaws. Issues in Educational Research, 30(1), 79-96.


Chirikov, I. (2022). Does conflict of interest distort global university rankings? Higher Education, 1-18.

Did this article help?

Let's go the extra mile together and land you that job.

Improve your CV & interview skills with us!
